CN116095221B - Frame rate adjusting method in game and related device - Google Patents

Publication number: CN116095221B
Authority: CN (China)
Prior art keywords: frame image, frame, matrix, game, similarity
Legal status: Active
Application number: CN202210956515.3A
Other languages: Chinese (zh)
Other versions: CN116095221A
Inventors: 龙云, 陈聪儿
Assignee (original and current): Honor Device Co Ltd


Classifications

    • H: Electricity
    • H04: Electric communication technique
    • H04M: Telephonic communication
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72427: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • A: Human necessities
    • A63: Sports; games; amusements
    • A63F: Card, board, or roulette games; indoor games using small moving playing bodies; video games; games not otherwise provided for
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • H: Electricity
    • H04: Electric communication technique
    • H04M: Telephonic communication
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/7243: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M 1/72439: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the application provides a method for adjusting the frame rate in a game and a related device, relating to the field of terminal technologies. The method includes: determining a target scene in a game application; acquiring motion parameters between a first frame image and a second frame image, where the first frame image and the second frame image are adjacent frame images in the target scene, the motion parameters are amounts of change measured with the target object as reference, and the motion parameters include a viewing angle change value of the target object and/or a movement amount of the target object; comparing the similarity between the first frame image and the second frame image when the viewing angle change value is less than or equal to a first preset value and/or the movement amount is less than or equal to a second preset value; and adjusting the frame rate according to the similarity. In this way, the computation required for frame image similarity comparison can be reduced without affecting the accuracy of the similarity calculation, so that the game load can be optimized, power consumption reduced, heating of the terminal device alleviated, and the user experience improved.

Description

Frame rate adjusting method in game and related device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method and an apparatus for adjusting a frame rate in a game.
Background
With the development of terminal technology, more and more terminal devices support game applications. To meet users' growing expectations for the gaming experience, game vendors keep raising the scene richness, rendering quality, and frame rate of game rendering on mobile platforms. However, rendering scene-rich games at a high frame rate on a mobile platform causes severe heating due to high power consumption and related problems, and mobile platforms consequently face the major challenges of high memory usage and high power consumption.
In some implementations, the terminal device may calculate the difference between the current frame and a historical frame and, when the difference is small, discard the current frame or reduce the frame rate, so as to reduce power consumption and alleviate heating.
However, in the above implementation there are still problems of serious heat generation and high power consumption, which affect the user experience.
Disclosure of Invention
The embodiment of the application provides a method for adjusting the frame rate in a game and a related device, which can reduce the computation required for frame image similarity comparison without affecting the accuracy of the similarity calculation, thereby optimizing the game load, reducing power consumption, and alleviating heating of the terminal device.
In a first aspect, an embodiment of the present application provides a method for adjusting a frame rate in a game, including:
Determining a target scene in a game application; acquiring motion parameters between a first frame image and a second frame image, where the first frame image and the second frame image are adjacent frame images in the target scene, the motion parameters are amounts of change measured with the target object as reference, and the motion parameters include a viewing angle change value of the target object and/or a movement amount of the target object; comparing the similarity between the first frame image and the second frame image when the viewing angle change value is less than or equal to a first preset value and/or the movement amount is less than or equal to a second preset value; and adjusting the frame rate according to the similarity. In this way, the computation required for frame image similarity comparison can be reduced without affecting the accuracy of the similarity calculation, so that the game load can be optimized and heating of the terminal device alleviated.
In one possible implementation, a scene tag in the rendering instruction stream of the game application is read, and the target scene in the game application is determined from the scene tag. In this way, the target scene can be identified more quickly with less computing power, and non-target scenes with low frame rate requirements are excluded.
In a possible implementation manner, before the motion parameters between the first frame image and the second frame image are acquired, the method further includes: acquiring a rendering instruction stream corresponding to the game application; and acquiring a first V matrix corresponding to the first frame image and a second V matrix corresponding to the second frame image, which includes: obtaining the first V matrix and the second V matrix based on a preset V matrix field in the rendering instruction stream. In this way, the motion parameters can be calculated based on the V matrices.
In a possible implementation manner, acquiring the motion parameters between the first frame image and the second frame image includes: calculating the viewing angle change value from the first V matrix and the second V matrix. In this way, the computation required for frame image similarity comparison can be reduced without unduly affecting the accuracy of the similarity calculation.
In a possible implementation manner, acquiring the motion parameters between the first frame image and the second frame image includes: acquiring, from the rendering instruction stream corresponding to the game application, a first coordinate value of the target object in world space in the first frame image and a second coordinate value of the target object in world space in the second frame image; and calculating the movement amount from the first coordinate value and the second coordinate value. In this way, before the similarity comparison of two adjacent frame images is performed, adjacent frame images whose motion parameters change rapidly can be excluded.
In one possible implementation, the similarity between the first frame image and the second frame image is compared when the viewing angle change value is less than or equal to the first preset value and/or the movement amount is less than or equal to the second preset value. In this way, the probability that the first frame image and the second frame image are similar can be estimated in advance, and the similarity comparison of the two adjacent frame images is performed only when that probability is high, so that similar frame images can be identified more accurately, unnecessary similarity calculation is reduced, and operation efficiency is improved.
In one possible implementation, when the similarity is greater than or equal to a similarity threshold, a frame rate reduction adjustment is performed; the frame rate reduction adjustment includes dropping frames, and/or executing a wait instruction to limit the frequency at which frames are sent for display, and/or reducing the screen refresh rate. In this way, the game load can be reduced and stutter of the game interface alleviated, achieving the purpose of reducing the power consumption of the terminal device.
In a second aspect, an embodiment of the present application provides a device for adjusting a frame rate in a game, where the device for adjusting a frame rate may be a terminal device, or may be a chip or a chip system in the terminal device. The frame rate adjustment means may comprise a processing unit. The processing unit is configured to implement the first aspect or any method related to processing in any possible implementation manner of the first aspect. The processing unit may be a processor when the means for frame rate adjustment is a terminal device. The frame rate adjustment device may further include a storage unit, which may be a memory. The storage unit is configured to store instructions, and the processing unit executes the instructions stored in the storage unit, so that the terminal device implements a method described in the first aspect or any one of possible implementation manners of the first aspect. The processing unit may be a processor when the means for frame rate adjustment is a chip or a system of chips within the terminal device. The processing unit executes instructions stored by the storage unit to cause the terminal device to implement a method as described in the first aspect or any one of the possible implementations of the first aspect. The memory unit may be a memory unit (e.g., a register, a cache, etc.) in the chip, or a memory unit (e.g., a read-only memory, a random access memory, etc.) located outside the chip in the terminal device.
The processing unit is used for determining a target scene in the game application; acquiring motion parameters between a first frame image and a second frame image; the motion parameters comprise a visual angle change value of the target object and/or a moving amount of the target object; comparing the similarity between the first frame image and the second frame image when the viewing angle change value is smaller than or equal to a first preset value and/or the moving amount is smaller than or equal to a second preset value; frame rate adjustment is performed based on the similarity.
In a possible implementation manner, a processing unit is used for reading scene tags in a rendering instruction stream of the game application; a target scene in the gaming application is determined from the scene tags.
In a possible implementation manner, the processing unit is further configured to obtain a rendering instruction stream corresponding to the game application; the processing unit is specifically further configured to obtain, in the rendering instruction stream, a first V matrix corresponding to the first frame image and a second V matrix corresponding to the second frame image based on a preset V matrix field.
In a possible implementation manner, the processing unit is specifically configured to calculate the viewing angle change value according to the first V matrix and the second V matrix.
In a possible implementation manner, the processing unit is specifically further configured to obtain, in a rendering instruction stream corresponding to the game application, a first coordinate value of the target object in the world space in the first frame image, and a second coordinate value of the target object in the world space in the second frame image; and calculating the movement amount according to the first coordinate value and the second coordinate value.
In a possible implementation manner, the processing unit is further configured to compare the similarity between the first frame image and the second frame image when the viewing angle variation value is less than or equal to a first preset value and/or the movement amount is less than or equal to a second preset value.
In a possible implementation manner, the processing unit is further configured to perform a frame rate reduction adjustment when the similarity is greater than or equal to the similarity threshold, where the frame rate reduction adjustment includes dropping frames, and/or executing a wait instruction to limit the frequency at which frames are sent for display, and/or reducing the screen refresh rate.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, the memory being configured to store code instructions, the processor being configured to execute the code instructions to perform the method described in the first aspect or any one of the possible implementations of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored therein a computer program or instructions which, when run on a computer, cause the computer to perform the frame rate adjustment method in a game described in the first aspect or any one of the possible implementations of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the frame rate adjustment method in a game described in the first aspect or any one of the possible implementations of the first aspect.
In a sixth aspect, the present application provides a chip or chip system comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by wires, the at least one processor being for executing a computer program or instructions to perform the frame rate adjustment method in a game as described in the first aspect or any one of the possible implementations of the first aspect. The communication interface in the chip can be an input/output interface, a pin, a circuit or the like.
In one possible implementation, the chip or chip system described above further includes at least one memory, where the at least one memory has instructions stored therein. The memory may be a storage unit within the chip, such as a register or a cache, or may be a storage unit located outside the chip (e.g., a read-only memory, a random access memory, etc.).
It should be understood that, the second aspect to the sixth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the advantages obtained by each aspect and the corresponding possible embodiments are similar, and are not repeated.
Drawings
FIG. 1 is a schematic diagram of a coordinate system provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic diagram of a software framework of an electronic device according to an embodiment of the present application;
fig. 4 is a flowchart of a method for adjusting a frame rate in a game according to an embodiment of the present application;
FIG. 5 is a schematic view of an observation coordinate system according to an embodiment of the present application;
fig. 6 is a schematic diagram of transformation of camera view angles in an observation coordinate system according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a camera position in world space coordinates according to an embodiment of the present application;
FIG. 8 is a flowchart illustrating another method for adjusting frame rate in a game according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a frame rate adjustment device in a game according to an embodiment of the present application.
Detailed Description
In order to facilitate the clear description of the technical solutions of the embodiments of the present application, the following simply describes some terms and techniques involved in the embodiments of the present application:
1. Frame rate: the number of frames displayed per second, measured in frames per second (FPS). A frame is the smallest-unit single image picture in image animation: one frame is a still picture, and successive frames form an animation. A high frame rate yields smoother and more realistic animation; the more frames per second, the smoother the displayed motion.
The frame rate includes an output frame rate and a display frame rate. The output frame rate refers to the frame rate at which an interface or application is output to the display screen through the graphics card. The display frame rate refers to the frame rate at which an interface or application is actually displayed on the display screen.
2. And (3) frame rendering: refers to the process of generating images from models using software. The model may be a description of the three-dimensional object in a well-defined language or data structure, and may include geometric, viewpoint, texture, and illumination information. The frame rendering can be used for two-dimensionally projecting a model in the three-dimensional scene into a digital image according to the set environment, lamplight, materials and rendering parameters, and can also comprise an image drawing process.
3. MVP matrix: the collective name for three matrices: model (M), view (V), and projection (P). As shown in fig. 1, vertex coordinates start in local space, are transformed successively into world coordinates, view coordinates, and clip coordinates, and finally end up as screen coordinates. Local coordinates are the coordinates of an object relative to its local origin, i.e., the coordinates the object starts with.
4. M matrix: a local space to world space conversion matrix. The local coordinates are converted into world coordinates, which are a coordinate system of a larger spatial extent, with respect to the origin of the world.
5. V matrix: also known as view observation matrix, is the world space to camera space conversion matrix. The world coordinates are converted into viewing coordinates, which refer to coordinates viewed at the angle of the camera or observer. When defining a camera, the position of the camera in world space, the direction of observation, a vector pointing to its right and a vector pointing above it are required. In effect a coordinate system is created with three unit axes perpendicular to each other, with the camera position as the origin.
6. P matrix: the transformation matrix from camera space to clip space. After coordinates have been processed into view space, they need to be projected to clip coordinates. Clip coordinates are processed into the range -1.0 to 1.0 and determine which vertices appear on the screen: a vertex within the visible range is rendered, and a vertex outside it is clipped. The clip coordinates then need to be converted to screen coordinates, a process called the viewport transform. The viewport transform maps coordinates in the range -1.0 to 1.0 into the coordinate range defined by a viewport function, which may be the glViewport function. The final transformed coordinates are sent to the rasterizer, which converts them into fragments.
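For ease of understanding, the following is a minimal C++ sketch of the coordinate pipeline described in points 3 to 6, assuming the GLM math library; the model position, camera parameters, and projection parameters are illustrative values rather than values mandated by the embodiment of the application.

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Sketch of the local -> world -> view -> clip chain described above.
glm::vec4 toClipSpace(const glm::vec4& localPos) {
    // M matrix: local space -> world space (illustrative translation).
    glm::mat4 M = glm::translate(glm::mat4(1.0f), glm::vec3(10.0f, 0.0f, 0.0f));
    // V matrix: world space -> camera space (illustrative camera).
    glm::mat4 V = glm::lookAt(glm::vec3(0.0f, 0.0f, 3.0f),  // camera position
                              glm::vec3(0.0f, 0.0f, 0.0f),  // point the camera looks at
                              glm::vec3(0.0f, 1.0f, 0.0f)); // up vector
    // P matrix: camera space -> clip space (illustrative projection).
    glm::mat4 P = glm::perspective(glm::radians(45.0f), 16.0f / 9.0f, 0.1f, 100.0f);
    // Clip coordinates; the viewport transform (e.g. glViewport) comes afterwards.
    return P * V * M * localPos;
}
```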
7. Other terms
In embodiments of the present application, the words "first", "second", and the like are used to distinguish between identical or similar items that have substantially the same function and effect. For example, the first chip and the second chip are distinguished merely to tell different chips apart, without limiting their order. It will be appreciated by those skilled in the art that the words "first", "second", and the like do not limit quantity or execution order, nor do they require the items to be different.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of" the following items means any combination of these items, including any combination of single items or plural items. For example, "at least one of a, b, or c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be single or plural.
8. Terminal equipment
The terminal device of the embodiment of the application may be an electronic device in any form; for example, the electronic device may include a handheld device with an image processing function, a vehicle-mounted device, and the like. Some examples of electronic devices are: a mobile phone, a tablet computer, a palmtop computer, a notebook computer, a mobile internet device (MID), a wearable device, a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), a handheld device with a wireless communication function, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, a wearable device, a terminal device in a future communication network, or a terminal device in a future evolved public land mobile network (PLMN), etc.; the embodiment of the application does not limit this.
By way of example and not limitation, in embodiments of the application, the electronic device may also be a wearable device. A wearable device, also called a wearable smart device, is the general term for devices that apply wearable technology to intelligently design everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a hardware device; it realizes powerful functions through software support, data interaction, and cloud interaction. Broadly, wearable smart devices include devices that are full-featured and large-sized and can implement all or part of their functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only one type of application function and need to be used together with other devices such as smartphones, for example various smart bracelets and smart jewelry for physical-sign monitoring.
In addition, in the embodiment of the application, the electronic device may also be a terminal device in an internet of things (IoT) system. IoT is an important component of future information technology development; its main technical characteristic is connecting things to a network through communication technology, thereby realizing an intelligent network of human-machine interconnection and interconnection between things.
The electronic device in the embodiment of the application may also be referred to as: a terminal device, a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, a user equipment, or the like.
In an embodiment of the present application, the electronic device or each network device includes a hardware layer, an operating system layer running on top of the hardware layer, and an application layer running on top of the operating system layer. The hardware layer includes hardware such as a central processing unit (central processing unit, CPU), a memory management unit (memory management unit, MMU), and a memory (also referred to as a main memory). The operating system may be any one or more computer operating systems that implement business processes through processes (processes), such as a Linux operating system, a Unix operating system, an Android operating system, an iOS operating system, or a windows operating system. The application layer comprises applications such as a browser, an address book, word processing software, instant messaging software and the like.
By way of example, fig. 2 shows a schematic structural diagram of an electronic device.
The electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device. In other embodiments of the application, the electronic device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the connection relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device. In other embodiments of the present application, the electronic device may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The electronic device implements display functions via a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The electronic device may implement shooting functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device (e.g., audio data, phonebook, etc.), and so forth. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor. For example, the frame rate adjustment method in the game of the embodiment of the present application may be performed.
Fig. 3 is a software structure block diagram of the terminal device 100 according to the embodiment of the present application. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, Android Runtime together with the system libraries, and the kernel layer.
The application layer may include a series of application packages. As shown in fig. 3, the application package may include applications for cameras, calendars, phones, maps, games, and the like.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, resource manager, notification manager, graphics rendering, and the like.
The window manager is used for managing window programs. The window manager may obtain the display screen size, determine if there is a status bar, lock screen, touch screen, drag screen, intercept screen, etc.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages that automatically disappear after a short stay without user interaction; for example, the notification manager is used to notify that a download is complete, to give message alerts, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications that appear on the screen as a dialog window, for example prompting text information in the status bar, sounding a prompt tone, vibrating the terminal device, or flashing an indicator light.
Graphics drawing is used to draw graphics, for example the frame images of a game scene in a game application.
Graphics rendering is used to render the drawn graphics; for example, after graphics drawing finishes drawing the game scene of a game application, graphics rendering renders that game scene.
Android Runtime includes a core library and virtual machines. Android Runtime is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part comprises the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), graphics composition, etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
Graphics compositing is used to composite one or more rendered views into a display interface, such as compositing a game background picture and a game player motion picture in a game application after rendering.
The kernel layer is a layer between hardware and software. The kernel layer may include display drivers, camera drivers, audio drivers, central processor drivers, and the like.
The workflow of the terminal device 100 software and hardware is illustrated below in connection with the scenario of application launch or interface switching occurring in an application.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as touch coordinates, touch strength, and the time stamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking as an example that the touch operation is a touch click operation and the control corresponding to the click operation is the control of a game application icon: the game application calls the interface of the application framework layer to start the game application, and then calls the kernel layer to start the display driver so as to display the functional interface of the game application.
With the development of terminal technology, more and more terminal devices support game applications. To meet users' growing expectations for the gaming experience, game vendors keep raising the scene richness, rendering quality, and frame rate of game rendering on mobile platforms. However, rendering scene-rich games at a high frame rate on a mobile platform generates serious heat due to problems such as high memory usage and high power consumption.
In one implementation, the terminal device may indiscriminately compare the similarity of every two adjacent frame images while a game application is running, and when the similarity of two adjacent frame images is high, discard the current frame image or reduce the frame rate, so as to reduce power consumption and alleviate heating through frame dropping or frame rate reduction. However, calculating image similarity places certain demands on CPU computing power and itself consumes power, so indiscriminately calculating the similarity for every pair of adjacent frame images entails a large amount of computation and causes high memory usage and high power consumption; as a result, the power-saving and heat-mitigation effect of this approach is poor.
Therefore, in the method for adjusting the frame rate in a game provided by the embodiment of the application, before the similarity comparison of two adjacent frame images is performed, adjacent frame images whose motion parameters change rapidly are excluded. Since two frame images with rapidly changing motion parameters generally have low similarity, high similarity occurs in such cases with low probability. In this way, the computation required for frame image similarity comparison can be reduced without affecting the accuracy of the similarity calculation, so that the game load can be optimized, power consumption reduced, heating of the terminal device alleviated, and the user experience improved.
The method for adjusting the frame rate in the game according to the embodiment of the present application will be described in detail by way of specific examples. The following embodiments may be combined with each other or implemented independently, and the same or similar concepts or processes may not be described in detail in some embodiments.
Fig. 4 shows a frame rate adjustment method in a game according to an embodiment of the present application. The method comprises the following steps:
s401, determining a target scene in a game application.
In the embodiment of the application, the target scene may be a scene whose power-consumption load can be reduced by adjusting the frame rate, including a scene with a high frame rate requirement, a scene with a heavy load in the game, or any other game scene prone to causing stutter of the game interface. For example, the target scene may include a game match scene, a man-machine practice scene, a novice guidance scene, or a game replay scene in a game application; the embodiment of the present application does not limit the specific target scene. A game scene may be the in-game environment a player is in, and a frame image of the game scene may include the space, environment, buildings, props, and the like in the game, serving the development of the game plot and the needs of human-computer interaction.
It will be appreciated that a game may also include lobby scenes, settings-interface scenes, and other scenes outside actual play. These scenes do not demand a high frame rate, so their load may likewise be reduced by lowering the frame rate.
In one possible implementation, the target scene in the game application may be determined by obtaining a status identifier of the game application. The status identifier may include a game scene tag, the game player's status, a detected game title, picture recognition, or the like; the embodiment of the present application does not limit the way the target scene is judged.
Illustratively, identification of the target scene may determine the game player's status by acquiring relevant data through a function or interface in the program; or the character strings corresponding to specific tags of drawn objects in the target scene may be obtained and matched against character strings analyzed in advance, and if a specified character string is matched successfully, the target scene can be determined.
S402, acquiring motion parameters between the first frame image and the second frame image.
In the embodiment of the application, the first frame image and the second frame image are adjacent frame images in the target scene, and the motion parameters can be used for representing the change speed of the target object in the target scene. It will be appreciated that the target object may be an object in the target scene that accompanies the movement of the player, for example the target object may be the player himself; or if the player is driving, the target object may be a moving car; or if the player is taking an airplane, the target object may be a moving airplane; the embodiment of the application does not limit the target object.
The motion parameter may include a viewing angle change amount or a movement amount. The view angle change amount may be a view angle change amount of a target object in the target scene, for example, the view angle change amount may be a view angle change amount in an observation coordinate system, and the obtaining manner of the view angle change amount is not limited in the embodiment of the present application.
The movement amount may include a movement distance, a movement speed, or other parameters representing the movement amount of the target object, and the embodiment of the present application does not limit the parameters representing the movement amount. The movement amount may be a movement amount of a target object in the target scene, for example, the movement amount may be a movement amount of a target object in world space, may be a movement amount of a target object in an observation coordinate system, or may be a movement amount of a target object obtained in other manners.
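For ease of understanding, the following is a minimal C++ sketch of one way the movement amount could be computed as a world-space movement distance; the function name and the use of a Euclidean distance are illustrative assumptions, since the embodiment of the application does not limit the parameter representing the movement amount.

```cpp
#include <cmath>

// Hypothetical movement amount: Euclidean distance between the target
// object's world-space coordinates in two adjacent frame images.
float movementAmount(float x1, float y1, float z1,
                     float x2, float y2, float z2) {
    float dx = x2 - x1, dy = y2 - y1, dz = z2 - z1;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}
```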
S403, comparing the similarity between the first frame image and the second frame image when the visual angle change value is smaller than or equal to a first preset value and/or the moving amount is smaller than or equal to a second preset value.
In the embodiment of the present application, the first preset value may be a viewing angle change value preset in the terminal device, and the second preset value may be a movement amount preset in the terminal device. The first preset value may be the same or different in different game applications. The second preset value may be the same or different in different gaming applications. In one possible implementation, the first preset value and the second preset value may be set according to an empirical preset value obtained by analyzing the data. The specific values of the first preset value and the second preset value are not limited herein.
In one possible implementation of the embodiment of the present application, when the viewing angle change value is less than or equal to the first preset value, it may be understood that the viewing angle change between the first frame image and the second frame image is small. In this case the probability that the two images are highly similar is large, so the similarity comparison between them can proceed, and when the similarity is indeed high, power-saving steps such as frame rate reduction can be performed. It will be appreciated that if the viewing angle change value is greater than the first preset value, it can be predicted that the similarity between the first frame image and the second frame image is very probably low. Computing the similarity in that case would not yield a high-similarity result, and the frame rate reduction triggered by high similarity would not occur anyway. Therefore, when the viewing angle change value is greater than the first preset value, the similarity comparison between the first frame image and the second frame image is not performed, which saves computing power and reduces power consumption.
In another possible implementation, when the movement amount is less than or equal to the second preset value, it may be understood that the position change of the first frame image and the second frame image is small, so that it may be predicted that the probability that the similarity between the first frame image and the second frame image is relatively high is large, and then the similarity comparison between the first frame image and the second frame image may be further performed. Similarly, when the movement amount is larger than a second preset value, similarity comparison between the first frame image and the second frame image is not performed, so that calculation force can be saved, and power consumption can be reduced.
In still another possible implementation, when the viewing angle change value is less than or equal to the first preset value and the movement amount is less than or equal to the second preset value, it may be understood that the change between the first frame image and the second frame image is small. In this case it can be predicted that the probability of high similarity between the first frame image and the second frame image is large, and the similarity comparison between them can then be performed. Similarly, when the viewing angle change value is greater than the first preset value and the movement amount is greater than the second preset value, the similarity comparison between the first frame image and the second frame image is not performed, which saves computing power and reduces power consumption.
It should be noted that there is no required order between the judgment of the viewing angle change value and the judgment of the movement amount.
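For ease of understanding, the following is a minimal C++ sketch of the gating logic of this step, using the "and" variant of the "and/or" condition; the structure, parameter names, and threshold values are illustrative assumptions, since the embodiment of the application does not limit the specific values of the first preset value and the second preset value.

```cpp
// Hypothetical pre-filter: compare similarity only when motion between the
// two adjacent frame images is small enough; thresholds are illustrative.
struct MotionParams {
    float viewAngleChange; // degrees, derived from the two V matrices
    float movementAmount;  // world-space distance of the target object
};

bool shouldCompareSimilarity(const MotionParams& m,
                             float firstPreset  = 2.0f,   // first preset value (viewing angle)
                             float secondPreset = 0.5f) { // second preset value (movement)
    // If either change is large, the two frame images are unlikely to be
    // similar, so the similarity computation is skipped to save computing power.
    return m.viewAngleChange <= firstPreset && m.movementAmount <= secondPreset;
}
```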
In the embodiment of the present application, comparing the similarity between the first frame image and the second frame image may include comparing the similarity of pixels and the like between the first frame image and the second frame image; the specific implementation of the similarity comparison is not limited.
For example, the similarity comparison of the first frame image and the second frame image may be implemented as follows: measuring image similarity based on vector similarity, by calculating a histogram for each of the two frame images, normalizing the histograms, and measuring the similarity by some distance metric; or performing image template matching, searching with a sliding window for the similarity between each position in the image and a template image; or generating a fingerprint character string for each image through a perceptual hash algorithm, where the closer the results, the more similar the images.
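For ease of understanding, the following is a minimal C++ sketch of the perceptual-hash option mentioned above, assuming the two frame images have already been downsampled to 8x8 grayscale thumbnails; the average-hash variant and the function names are illustrative assumptions rather than the method mandated by the embodiment of the application.

```cpp
#include <bitset>
#include <cstdint>
#include <vector>

// Average hash over a 64-pixel grayscale thumbnail: each bit records whether
// a pixel is at least as bright as the mean, yielding a fingerprint per image.
uint64_t averageHash(const std::vector<uint8_t>& gray8x8) { // exactly 64 pixels assumed
    uint32_t sum = 0;
    for (uint8_t p : gray8x8) sum += p;
    uint8_t mean = static_cast<uint8_t>(sum / 64);
    uint64_t hash = 0;
    for (int i = 0; i < 64; ++i)
        if (gray8x8[i] >= mean) hash |= (1ULL << i);
    return hash;
}

// The fewer bits differ between two fingerprints, the more similar the images.
int hammingDistance(uint64_t a, uint64_t b) {
    return static_cast<int>(std::bitset<64>(a ^ b).count());
}
```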
S404, performing frame rate adjustment according to the similarity.
In the embodiment of the application, when the similarity between the first frame image and the second frame image is high, frame dropping, frame rate reduction, or the like can be performed, so as to reduce the power consumption of the terminal device.
For example, frame rate adjustment may include dropping frames and/or lowering the screen refresh rate. Either dropping frames or lowering the screen refresh rate reduces the in-game frame rate and thus achieves the purpose of reducing power consumption. When lowering the screen refresh rate, the original screen refresh rate may be multiplied by a coefficient smaller than 1, or reduced by a fixed value; frame images can then be drawn at the lowered frame rate, reducing power consumption.
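For ease of understanding, the following is a minimal C++ sketch of the two refresh-rate-lowering strategies just described; the coefficient, fixed step, and lower bound are illustrative assumptions rather than values specified by the embodiment of the application.

```cpp
// Hypothetical refresh-rate reduction: multiply by a coefficient smaller
// than 1, or subtract a fixed value, then draw frames at the lowered rate.
int lowerRefreshRate(int currentHz, bool useCoefficient) {
    constexpr float kCoefficient = 0.5f; // illustrative coefficient smaller than 1
    constexpr int   kFixedStep   = 30;   // illustrative fixed reduction in Hz
    int lowered = useCoefficient
        ? static_cast<int>(currentHz * kCoefficient)
        : currentHz - kFixedStep;
    return lowered < 30 ? 30 : lowered;  // assumed floor to keep the interface usable
}
```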
In summary, according to the in-game frame rate adjustment method provided by the embodiment of the application, before image similarity comparison, by calculating the motion parameters of two adjacent frames of images, pictures with lower similarity can be rapidly eliminated, the calculated amount of similarity comparison is reduced, the game load is optimized, the power consumption is reduced, and the user experience is improved.
Optionally, based on the embodiment corresponding to fig. 4, determining the target scene in the game application in S401 may include: reading a scene tag in the rendering instruction stream of the game application; and determining the target scene in the game application according to the scene tag.
The rendering instruction stream is an instruction set that instructs the graphics processor to render a game scene in a game, rendering data in the rendering instruction stream may include scene tags, which may be distinguishing identifications identifying different image scenes, that is, different image scenes correspond to different scene tags.
In one possible implementation, the scene tags may be obtained through offline data analysis and statistics using graphics analysis software, and preset tags for the different scenes are configured in the terminal device, so that the scene tag can be read from the rendering instruction stream.
In another possible implementation, the terminal device may read the scene tag by acquiring a corresponding scene tag field or scene tag function in the rendering program.
The target scene in the game application is determined by acquiring the rendering instruction stream of the game application and determining the target scene according to the scene tag in the rendering instruction stream. For example, in the image drawing process, when the rendering task of a frame image starts, the scene tag in the rendering instruction stream is intercepted and it is judged whether the scene tag of the current frame image matches a preset target scene tag; if so, the frame image belongs to the target scene.
In the embodiment of the application, scene tags may be obtained through offline data analysis, so the scene tags in the rendering instruction stream can be determined more accurately, improving the recognition accuracy of the target scene. Scene tags may also be obtained through corresponding fields or functions in the rendering program, so the target scene can be identified more quickly and intelligently with less computing power. Moreover, determining the target scene according to the scene tag before frame image analysis allows non-target scenes with low frame rate requirements to be quickly excluded without calculating motion parameters or comparing the similarity of adjacent frame images, which saves computing power, reduces power consumption, and improves the user experience.
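For ease of understanding, the following is a minimal C++ sketch of the tag-matching check just described; the tag strings and function name are illustrative assumptions, since the actual tags would come from offline analysis of the specific game.

```cpp
#include <string>
#include <unordered_set>

// Hypothetical scene-tag check on a tag intercepted from the rendering
// instruction stream; the preset target tags below are invented examples.
bool isTargetScene(const std::string& sceneTag) {
    static const std::unordered_set<std::string> kTargetTags = {
        "game_match", "practice_mode", "novice_guide"
    };
    return kTargetTags.count(sceneTag) > 0;
}
```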
Optionally, on the basis of the embodiment corresponding to fig. 4, the target object is an object that accompanies the movement of the player in the game, and before the motion parameters between the first frame image and the second frame image are acquired, the method further includes: acquiring a rendering instruction stream corresponding to the game application; and acquiring a first V matrix corresponding to the first frame image and a second V matrix corresponding to the second frame image, which includes: obtaining the first V matrix and the second V matrix based on a preset V matrix field in the rendering instruction stream.
The observation coordinate system is a coordinate system established with the camera as the origin, so that objects can be observed at different distances and angles. As shown in fig. 5, the observation matrix (V matrix) corresponding to the observation coordinate system can be acquired from the rendering instruction stream; it converts world coordinates into observation coordinates, which are relative to the position and orientation of the camera. The V matrix is built from the camera's position in world space, the camera's direction vector, a vector pointing to its right, and a vector pointing above it. Using these three mutually perpendicular axes together with the camera position as origin, the V matrix transforms all world coordinates into camera space very efficiently.
In one possible implementation, the first observation matrix (V matrix) corresponding to the first frame image and the second V matrix of the second frame image may be obtained through a preset V matrix field in the rendering instruction stream, where each V matrix satisfies the following matrix formula:

$$V=\begin{bmatrix}R_x & R_y & R_z & 0\\ U_x & U_y & U_z & 0\\ D_x & D_y & D_z & 0\\ 0 & 0 & 0 & 1\end{bmatrix}\begin{bmatrix}1 & 0 & 0 & -P_x\\ 0 & 1 & 0 & -P_y\\ 0 & 0 & 1 & -P_z\\ 0 & 0 & 0 & 1\end{bmatrix}$$

where $(R_x, R_y, R_z)$ is the right vector, $(U_x, U_y, U_z)$ is the up vector, $(D_x, D_y, D_z)$ is the direction vector of the camera, and $(P_x, P_y, P_z)$ is the position vector of the camera.
For example, different rendering programs may correspond to different rendering fields or rendering functions, and the terminal device can determine the V matrix field value in the rendering instruction stream by acquiring the corresponding rendering field or rendering function through the rendering program. For instance, a field named view may represent the V matrix rendering field or rendering function in a rendering program; the field name or function name may be any combination of letters and numbers. A sketch of assembling such a V matrix is shown below.
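As a rough illustration, the following sketch assembles an observation matrix from the four camera vectors in the standard form given above. The vector values are made up for illustration; in the method described here they would be read from the preset V matrix field of the rendering instruction stream.

```python
# Hypothetical sketch: build the observation (V) matrix from the
# camera's right, up, and direction vectors and its position.
import numpy as np

def view_matrix(right, up, direction, position):
    """Rotation from the 3 mutually perpendicular camera axes,
    translation from the negated camera position."""
    r, u, d, p = (np.asarray(v, dtype=float) for v in (right, up, direction, position))
    rotation = np.identity(4)
    rotation[0, :3], rotation[1, :3], rotation[2, :3] = r, u, d
    translation = np.identity(4)
    translation[:3, 3] = -p
    return rotation @ translation   # maps world coordinates to camera space

V = view_matrix(right=(1, 0, 0), up=(0, 1, 0),
                direction=(0, 0, 1), position=(3, 2, 5))
print(V)
```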
Optionally, acquiring the motion parameters between the first frame image and the second frame image in S402 includes: calculating the viewing angle change value according to the first V matrix and the second V matrix.
The viewing angle change may be the change of the target object's viewing angle in the observation coordinate system, or equivalently the rotation angle of the camera. As shown in fig. 6, the viewing angle change value can be obtained by calculating how quickly the target object's angle changes in the world space coordinate system. The viewing angle change value of the target object can be obtained from the direction vector $(D_x, D_y, D_z)$ in the V matrix, where the direction vectors satisfy the following dot product formula:

$$d_{N-1} \cdot d_N = |d_{N-1}|\,|d_N|\cos\theta$$

where $\theta$ is the viewing angle change value, $d_{N-1} = (D_{N-1,x}, D_{N-1,y}, D_{N-1,z})$ is the direction vector in the first V matrix, and $d_N = (D_{N,x}, D_{N,y}, D_{N,z})$ is the direction vector in the second V matrix.
Therefore, the viewing angle change value calculated according to the first V matrix and the second V matrix satisfies the following formula:

$$\theta = \arccos\!\left(\frac{d_{N-1} \cdot d_N}{|d_{N-1}|\,|d_N|}\right)$$
In a possible implementation, the first direction vector corresponding to the first frame image and the second direction vector corresponding to the second frame image may be obtained from the direction vector $(D_x, D_y, D_z)$ of the target object in the V matrix.

In another possible implementation, the first direction vector corresponding to the first frame image and the second direction vector corresponding to the second frame image may be obtained through a preset direction vector field in the rendering instruction stream. Different rendering programs may correspond to different rendering fields or rendering functions, and the terminal device can obtain the corresponding direction vector field value through the rendering program. For example, a field named camera forward may represent the direction vector rendering field or rendering function of the target object in a rendering program; the field name or function name may be any combination of letters and numbers. A sketch of the angle computation is shown below.
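As a rough illustration, the following sketch computes the viewing angle change between two adjacent frames from the camera direction vectors of their V matrices, using the dot product formula above. The example vectors are made up for illustration.

```python
# Hypothetical sketch: viewing angle change from two direction vectors.
import numpy as np

def viewing_angle_change(d_prev, d_curr):
    """theta = arccos( (d_prev . d_curr) / (|d_prev| |d_curr|) ), in degrees."""
    d_prev = np.asarray(d_prev, dtype=float)
    d_curr = np.asarray(d_curr, dtype=float)
    cos_theta = d_prev @ d_curr / (np.linalg.norm(d_prev) * np.linalg.norm(d_curr))
    # Clamp to [-1, 1] to guard against floating-point rounding.
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

theta = viewing_angle_change((0.0, 0.0, 1.0), (0.10, 0.0, 0.99))
print(f"viewing angle change: {theta:.2f} degrees")   # roughly 5.8 degrees
```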
Optionally, on the basis of the embodiment corresponding to fig. 4, acquiring the motion parameters between the first frame image and the second frame image includes: acquiring, from the rendering instruction stream corresponding to the game application, a first coordinate value of the target object in world space in the first frame image and a second coordinate value of the target object in world space in the second frame image; and calculating the movement amount according to the first coordinate value and the second coordinate value.
In one possible implementation of calculating the movement amount, since the movement amount may be the movement of the target object in world space, it can be obtained from the target object's coordinate values in world space. The movement amount includes the movement distance of the target object in world space; calculated according to the first coordinate value and the second coordinate value, it satisfies the following formula:

$$\Delta s = \sqrt{(x_N - x_{N-1})^2 + (y_N - y_{N-1})^2 + (z_N - z_{N-1})^2}$$

where $\Delta s$ is the movement distance, $(x_{N-1}, y_{N-1}, z_{N-1})$ is the first coordinate value in world space, and $(x_N, y_N, z_N)$ is the second coordinate value in world space.
In another possible implementation of calculating the movement amount, since the movement amount may be the movement of the target object in the observation coordinate system, as shown in fig. 7, it can be obtained from the position vector $(P_x, P_y, P_z)$ of the target object in the observation coordinate system. The movement amount includes the movement distance of the target object in the observation coordinate system. This involves acquiring, from the rendering instruction stream corresponding to the game application, a first position vector of the target object corresponding to the first frame image and a second position vector of the target object corresponding to the second frame image. The movement amount calculated according to the first position vector and the second position vector satisfies the following formula:

$$\Delta s = |P_N - P_{N-1}|$$

where $\Delta s$ is the movement distance, $P_{N-1}$ is the position vector in the first V matrix, and $P_N$ is the position vector in the second V matrix.
In a possible implementation, the first position vector corresponding to the first frame image and the second position vector corresponding to the second frame image may be obtained from the position vector $(P_x, P_y, P_z)$ of the target object in the V matrix.

In another possible implementation, the first position vector corresponding to the first frame image and the second position vector corresponding to the second frame image may be obtained through a preset position vector field in the rendering instruction stream. Different rendering programs may correspond to different rendering fields or rendering functions, and the terminal device can obtain the corresponding position vector field value through the rendering program. For example, a field named camera_origin may represent the position vector rendering field or rendering function of the target object in a rendering program; the field name or function name may be any combination of letters and numbers.
In another possible implementation of calculating the movement amount, the movement amount may include a movement speed, calculated from the movement distance according to the following formula:

$$v = \frac{\Delta s}{t}$$

where $v$ is the movement speed and $t$ is the time interval between the first frame image and the second frame image. A sketch of these movement computations is shown below.
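As a rough illustration, the following sketch computes the movement distance between two frames from two coordinate values (or, equally, two position vectors) and derives the movement speed from the frame interval. The coordinates and interval are made up for illustration.

```python
# Hypothetical sketch: movement distance and speed between two frames.
import numpy as np

def movement_distance(p_prev, p_curr):
    """Euclidean distance between the first and second coordinate values."""
    return float(np.linalg.norm(np.asarray(p_curr, dtype=float)
                                - np.asarray(p_prev, dtype=float)))

def movement_speed(distance, t):
    """v = distance / t, with t the interval between the two frames."""
    return distance / t

dist = movement_distance((3.0, 2.0, 5.0), (3.2, 2.0, 5.1))
speed = movement_speed(dist, t=1 / 60)   # e.g. adjacent frames at 60 fps
print(f"moved {dist:.3f} units at {speed:.1f} units/s")
```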
According to the in-game frame rate adjustment method provided by the embodiments of the application, before the similarity comparison of two adjacent frame images, adjacent frames whose motion parameters change quickly are excluded. Because two frames whose motion parameters change quickly usually have low similarity, high similarity between them occurs only with low probability. The computation of the frame image similarity comparison can therefore be reduced without affecting the accuracy of the similarity calculation, which optimizes the game load, reduces power consumption, improves the heating of the terminal device, and improves the user experience.
Optionally, on the basis of the embodiment corresponding to fig. 4, the similarity between the first frame image and the second frame image is compared in S403 when the viewing angle change value is less than or equal to the first preset value and/or the movement amount is less than or equal to the second preset value.
In one possible implementation, when the viewing angle change value is less than or equal to the first preset value, the viewing angle change between the first frame image and the second frame image is small, so the probability that the similarity between the two frames is relatively high can be predicted to be large, and the similarity comparison between the first frame image and the second frame image can then be performed.

In another possible implementation, when the movement amount is less than or equal to the second preset value, the position change between the first frame image and the second frame image is small, so the probability that the similarity between the two frames is relatively high can be predicted to be large, and the similarity comparison can then be performed.

In still another possible implementation, when the viewing angle change value is less than or equal to the first preset value and the movement amount is less than or equal to the second preset value, both changes between the first frame image and the second frame image are small, so the probability that the similarity between the two frames is relatively high can be predicted to be large, and the similarity comparison can then be performed.
It should be noted that the judgment of the viewing angle change value and the judgment of the movement amount may be performed in either order.
In the embodiments of the application, the probability that the first frame image and the second frame image are similar can be pre-judged by comparing the viewing angle change value with the first preset value and the movement amount with the second preset value. Only when that probability is large are the similarity comparison of the two adjacent frame images and the frame rate reduction performed, so similar frame images are identified more accurately, unnecessary similarity calculations are avoided, and operation efficiency is improved. A minimal sketch of this gating is shown below.
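As a rough illustration, the following sketch gates the comparatively expensive similarity comparison on the cheap motion-parameter checks. The threshold values are assumptions; the patent only requires a first and a second preset value.

```python
# Hypothetical sketch: pre-judge similarity from the motion parameters.
ANGLE_THRESHOLD_DEG = 2.0   # assumed first preset value
MOVE_THRESHOLD = 0.05       # assumed second preset value

def should_compare_similarity(angle_change_deg: float, movement: float) -> bool:
    """Small viewing angle change and small movement -> similarity is
    likely high, so the per-frame comparison is worth running.
    The two judgments may be made in either order."""
    return angle_change_deg <= ANGLE_THRESHOLD_DEG and movement <= MOVE_THRESHOLD

if should_compare_similarity(angle_change_deg=0.8, movement=0.01):
    print("motion is small: run the similarity comparison")
else:
    print("motion is large: skip the comparison, keep the full frame rate")
```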
Optionally, on the basis of the embodiment corresponding to fig. 4, performing the frame rate adjustment according to the similarity in S404 includes: performing frame rate reduction when the similarity is greater than or equal to the similarity threshold, where the frame rate reduction includes dropping frames, and/or executing a wait instruction to reduce the display sending frequency, and/or reducing the screen refresh rate.
Optionally, performing the frame rate reduction includes: keeping the screen refresh rate unchanged and reducing the display sending frequency in the application framework layer by executing a wait instruction; or calling an interface of the kernel layer for reducing the screen refresh rate and reducing the screen refresh rate based on the kernel layer; or performing the step of reducing the screen refresh rate at the application framework layer.
In one possible implementation, during image drawing, the drawing function corresponding to a frame whose rate needs to be reduced is skipped; that is, the corresponding frame image is discarded so that it does not participate in image drawing. This achieves frame-drop adjustment by executing a wait instruction in the application framework layer while keeping the screen refresh rate unchanged.
In another possible implementation, the screen refresh rate may be reduced either by calling an interface of the kernel layer for reducing the screen refresh rate and reducing it based on the kernel layer, or by performing the step of reducing the screen refresh rate at the application framework layer. For example, if the current image drawing frequency is 60 Hz, i.e., 60 frames per second, and the screen refresh rate needs to be reduced, an instruction to lower the image drawing frequency from 60 Hz to 30 Hz can be output to the kernel layer, so that the output drops from 60 frames per second to 30 frames per second. A rough sketch of such frame dropping is shown below.
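As a rough illustration, the following sketch halves an effective 60 Hz drawing rate to 30 Hz by skipping every other frame's drawing function while the screen refresh rate itself is left unchanged. The loop and function names are illustrative, not a real framework API.

```python
# Hypothetical sketch: drop every other frame, 60 fps -> 30 fps.
def render_loop(frames, draw):
    drawn = 0
    for n, frame in enumerate(frames):
        if n % 2 == 1:
            continue   # skip the drawing function: this frame is dropped
        draw(frame)
        drawn += 1
    return drawn

frames = [f"frame_{i}" for i in range(60)]          # one second at 60 fps
count = render_loop(frames, draw=lambda f: None)    # stand-in draw function
print(f"{count} of {len(frames)} frames drawn")     # 30 of 60
```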
The in-game frame rate adjustment method provided by the embodiments of the application can, without affecting the continuity of the game picture, reduce the display sending frequency in the application (for example, by writing a null value) or perform the step of reducing the screen refresh rate there, and can also reduce the screen refresh rate based on the kernel layer, thereby reducing the game load, reducing stutter in the game interface, and reducing the power consumption of the terminal device.
Specifically, fig. 8 is a flowchart of another in-game frame rate adjustment method according to an embodiment of the application. It should be understood that Frame N-1 in the figure may be the first frame image, representing the frame preceding the acquired Nth frame image, where N is greater than or equal to 2. Similarly, Frame N may be the second frame image, representing the acquired Nth frame image, and Frame N+1 is the frame following the Nth frame image. As shown in fig. 8, the specific steps may be:
s801, a second Frame image Frame N is acquired.
S802, determining whether Frame N is in a target scene in the game application. In the determining process of the target scene, reference may be made to S401 in the above embodiment, and the embodiments of the present application are not described herein again.
If Frame N is in the target scene, which may be a game scene in the game, S803 is executed;
If Frame N is not in the target scene, for example in a non-gameplay scene, S808 is performed.
S803, a first V matrix corresponding to the first Frame image Frame N-1 and a second V matrix corresponding to the second Frame image Frame N are obtained.
In one possible implementation, a first observation matrix (V matrix) corresponding to Frame N-1 and a second observation matrix corresponding to Frame N may be obtained through a preset V matrix field in the rendering instruction stream.
S804, acquiring the motion parameters between Frame N-1 and Frame N through the first V matrix and the second V matrix, where the motion parameters include at least one of the viewing angle change or the movement amount.
In a possible implementation manner, the calculation of the motion parameters between Frame N-1 and Frame N can be referred to S402 in the above embodiment, and the embodiments of the present application are not described herein again.
S805, determining the relationship between the viewing angle change of the first Frame image Frame N-1 and the second Frame image Frame N and the first threshold, and the relationship between the movement amount and the second threshold.

If the viewing angle change between Frame N-1 and Frame N is less than or equal to the first threshold, and/or the movement amount is less than or equal to the second threshold, S806 is performed. If the viewing angle change is greater than the first threshold and the movement amount is greater than the second threshold, S809 is performed. For the comparison of the viewing angle change with the first threshold and of the movement amount with the second threshold, reference may be made to S403 in the above embodiment; the details are not repeated here.
S806, comparing the similarity of the first Frame image Frame N-1 and the second Frame image Frame N. The similarity comparison may refer to S403 in the above embodiment, and the embodiments of the present application are not described herein again.
S807, judging whether Frame N-1 and Frame N are similar frame images. If the similarity of the two frames is high, S808 is executed; if the similarity of the two frames is low, S809 is performed.
S808, frame rate adjustment is performed. The frame rate adjustment may be referred to S404 in the above embodiment, and the embodiments of the present application are not described herein again.
S809, acquiring the next Frame image Frame N+1. When a new round of frame rate adjustment is executed, the next Frame image Frame N+1 serves as the second Frame image Frame N in the new round of the flow, and the judgment flow shown in fig. 8 is executed again, and so on; the details are not repeated.
According to the in-game frame rate adjustment method shown in fig. 8, before the similarity comparison of two adjacent frame images, adjacent frames whose motion parameters change quickly are excluded. Because two frames whose motion parameters change quickly usually have low similarity, high similarity between them occurs only with low probability. The computation of the frame image similarity comparison can therefore be reduced without affecting the accuracy of the similarity calculation, which optimizes the game load, reduces power consumption, improves the heating of the terminal device, and improves the user experience. A consolidated sketch of the fig. 8 flow is shown below.
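As a rough end-to-end illustration, the following sketch strings steps S801–S809 together for one incoming frame. The helper callables stand in for the mechanisms described above (scene tag check, V matrix extraction, motion parameters, similarity comparison); their names and the threshold values are assumptions, not the patent's API.

```python
# Hypothetical sketch of the fig. 8 flow for one frame pair.
def process_frame(frame_prev, frame_curr,
                  is_target_scene, get_v_matrix, motion_params, similarity,
                  angle_t=2.0, move_t=0.05, sim_t=0.95):
    if not is_target_scene(frame_curr):                  # S802: non-target scene
        return "reduce_frame_rate"                       # S808
    v_prev, v_curr = get_v_matrix(frame_prev), get_v_matrix(frame_curr)  # S803
    angle, move = motion_params(v_prev, v_curr)          # S804
    if angle > angle_t and move > move_t:                # S805: motion too fast
        return "keep_frame_rate"                         # S809: fetch Frame N+1
    if similarity(frame_prev, frame_curr) >= sim_t:      # S806/S807: similar
        return "reduce_frame_rate"                       # S808
    return "keep_frame_rate"                             # S809

decision = process_frame(
    "Frame N-1", "Frame N",
    is_target_scene=lambda f: True,
    get_v_matrix=lambda f: None,
    motion_params=lambda a, b: (0.5, 0.01),
    similarity=lambda a, b: 0.97)
print(decision)   # reduce_frame_rate
```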
The foregoing description of the solution provided by the embodiments of the application has been presented mainly in terms of the method. To achieve the above functions, the solution includes corresponding hardware structures and/or software modules that perform the respective functions. Those skilled in the art will readily appreciate that the method steps of the examples described in connection with the embodiments disclosed herein may be implemented in hardware or in a combination of hardware and computer software. Whether a function is implemented as hardware or as computer-software-driven hardware depends on the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as a departure from the scope of the present application.
The embodiments of the application may divide the device implementing the in-game frame rate adjustment method into functional modules according to the above method examples; for example, each function may be assigned its own functional module, or two or more functions may be integrated in one processing module. The integrated module may be implemented in hardware or as a software functional module. It should be noted that the division of modules in the embodiments of the application is schematic and is merely a logical function division; other division manners may be used in actual implementation.
Fig. 9 is a schematic structural diagram of a chip according to an embodiment of the present application. Chip 900 includes one or more (including two) processors 901, communication lines 902, communication interfaces 903, and memory 904.
In some implementations, the memory 904 stores the following elements: executable modules or data structures, or a subset thereof, or an extended set thereof.
The method described in the above embodiments of the application may be applied to the processor 901 or implemented by the processor 901. The processor 901 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 901 or by instructions in the form of software. The processor 901 may be a general-purpose processor (e.g., a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate, transistor logic, or a discrete hardware component, and the processor 901 may implement or perform the methods, steps, and logic diagrams associated with the processes disclosed in the embodiments of the application.
The steps of the method disclosed in connection with the embodiments of the application may be embodied directly as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well established in the art, such as random access memory, read-only memory, programmable read-only memory, or electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 904, and the processor 901 reads the information in the memory 904 and performs the steps of the above method in combination with its hardware.
The processor 901, the memory 904, and the communication interface 903 may communicate via a communication line 902.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
Embodiments of the present application also provide a computer program product comprising one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired means (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, or microwave). The storage medium may also be a semiconductor medium (e.g., a solid state disk (SSD)) or the like.
The embodiment of the application also provides a computer readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
As one possible design, the computer-readable medium may include compact disc read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disc storage; it may also include magnetic disk storage or other magnetic storage devices. Moreover, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or those wireless technologies are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

Claims (11)

1. An in-game frame rate adjustment method, applied to an electronic device, characterized in that the method comprises:
determining a target scene or a non-target scene in a game application, wherein the demand for frame rate or load is greater in the target scene than in the non-target scene;
when in the target scene, acquiring motion parameters between a first frame image and a second frame image, wherein the first frame image and the second frame image are adjacent frame images in the target scene, the motion parameters are variations obtained with a target object as reference, and the motion parameters comprise a viewing angle change value of the target object and a movement amount of the target object;
comparing the similarity between the first frame image and the second frame image when the viewing angle change value is less than or equal to a first preset value and the movement amount is less than or equal to a second preset value; and
performing frame rate reduction when the similarity is greater than or equal to a similarity threshold.
2. The method of claim 1, wherein the target object is an object accompanying the player's movement in the game, and the acquiring of the motion parameters between the first frame image and the second frame image comprises:
acquiring a first observation matrix V matrix corresponding to the first frame image and a second V matrix of the second frame image;
and calculating according to the first V matrix and the second V matrix to obtain the visual angle change value.
3. The method of claim 2, wherein the viewing angle change value calculated from the first V matrix and the second V matrix satisfies the following formula:

$$\theta = \arccos\!\left(\frac{d_{N-1} \cdot d_N}{|d_{N-1}|\,|d_N|}\right)$$

wherein $\theta$ is the viewing angle change value, $d_{N-1}$ is the observation direction vector of the target object in the first V matrix, and $d_N$ is the observation direction vector of the target object in the second V matrix.
4. The method according to claim 2 or 3, wherein before the acquiring of the first observation matrix V matrix corresponding to the first frame image and the second V matrix of the second frame image, the method further comprises:
Acquiring a rendering instruction stream corresponding to the game application;
the obtaining the first V matrix corresponding to the first frame image and the second V matrix of the second frame image includes: and obtaining a first V matrix corresponding to the first frame image and a second V matrix of the second frame image based on a preset field in the rendering instruction stream.
5. The method according to any one of claims 1-4, wherein the acquiring a motion parameter between the first frame image and the second frame image comprises:
acquiring a first coordinate value of the target object in the world space in the first frame image and a second coordinate value of the target object in the world space in the second frame image in a rendering instruction stream corresponding to the game application;
and calculating the movement amount according to the first coordinate value and the second coordinate value.
6. The method of claim 5, wherein the movement amount comprises a movement distance, and the movement distance calculated according to the first coordinate value and the second coordinate value satisfies the following formula:

$$\Delta s = \sqrt{(x_N - x_{N-1})^2 + (y_N - y_{N-1})^2 + (z_N - z_{N-1})^2}$$

wherein $\Delta s$ is the movement distance, $(x_{N-1}, y_{N-1}, z_{N-1})$ is the first coordinate value, and $(x_N, y_N, z_N)$ is the second coordinate value.
7. The method of any of claims 1-6, wherein the frame rate reduction comprises: dropping frames, and/or executing a wait instruction to reduce the display sending frequency, and/or reducing the screen refresh rate.
8. The method of claim 7, wherein said performing a frame rate down adjustment comprises:
keeping the screen refresh rate unchanged and reducing the display sending frequency in the application framework layer by executing a wait instruction;
or, calling an interface of the kernel layer for reducing the screen refresh rate, and reducing the screen refresh rate based on the kernel layer;
alternatively, the step of reducing the screen refresh rate is performed at the application framework layer.
9. The method of any of claims 1-8, wherein the target scene comprises a game scene in the gaming application, and wherein the determining the target scene in the gaming application comprises:
reading scene tags in a rendering instruction stream of the game application;
and determining a target scene in the game application according to the scene tag.
10. A terminal device, comprising: a memory for storing a computer program, and a processor for executing the computer program to perform the in-game frame rate adjustment method according to any one of claims 1-9.
11. A computer readable storage medium storing instructions that, when executed, cause a computer to perform the in-game frame rate adjustment method of any one of claims 1-9.
CN202210956515.3A 2022-08-10 2022-08-10 Frame rate adjusting method in game and related device Active CN116095221B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210956515.3A CN116095221B (en) 2022-08-10 2022-08-10 Frame rate adjusting method in game and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210956515.3A CN116095221B (en) 2022-08-10 2022-08-10 Frame rate adjusting method in game and related device

Publications (2)

Publication Number Publication Date
CN116095221A CN116095221A (en) 2023-05-09
CN116095221B (en) 2023-11-21

Family

ID=86185540

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210956515.3A Active CN116095221B (en) 2022-08-10 2022-08-10 Frame rate adjusting method in game and related device

Country Status (1)

Country Link
CN (1) CN116095221B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116664630B (en) * 2023-08-01 2023-11-14 荣耀终端有限公司 Image processing method and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103400386A (en) * 2013-07-30 2013-11-20 清华大学深圳研究生院 Interactive image processing method used for video
CN105045367A (en) * 2015-01-16 2015-11-11 中国矿业大学 Android system equipment power consumption optimization method based on game load prediction
JP2019144827A (en) * 2018-02-20 2019-08-29 キヤノン株式会社 Image processing device, and control method and program of the same
CN112507799A (en) * 2020-11-13 2021-03-16 幻蝎科技(武汉)有限公司 Image identification method based on eye movement fixation point guidance, MR glasses and medium
WO2022068326A1 (en) * 2020-09-30 2022-04-07 华为技术有限公司 Image frame prediction method and electronic device
CN114452645A (en) * 2021-07-09 2022-05-10 荣耀终端有限公司 Method, apparatus and storage medium for generating scene image
CN114724055A (en) * 2021-01-05 2022-07-08 华为技术有限公司 Video switching method, device, storage medium and equipment
CN114740965A (en) * 2022-05-05 2022-07-12 Oppo广东移动通信有限公司 Processing method and device for reducing power consumption of terminal, terminal and readable storage medium

Also Published As

Publication number Publication date
CN116095221A (en) 2023-05-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant