CN115253291A - Motion state identification method and device, storage medium and electronic equipment


Info

Publication number
CN115253291A
Authority
CN
China
Prior art keywords
target
application
motion state
drawing instruction
application element
Prior art date
Legal status
Pending
Application number
CN202210814544.6A
Other languages
Chinese (zh)
Inventor
Zhu Xiuli (朱秀丽)
Gao Guanglei (高光磊)
Yao Shifeng (姚士峰)
Huang Wentao (黄文涛)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210814544.6A
Publication of CN115253291A

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/64 Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/65 Methods for processing data by generating or executing the game program for computing the condition of a game character

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a motion state identification method and apparatus, a storage medium, and an electronic device. The method includes: acquiring a target graphics drawing instruction for a target application, and acquiring, from the target graphics drawing instruction, a first camera position matrix for an application element in a first frame; acquiring a recorded second camera position matrix for the application element in a second frame, where the second frame is obtained from the target application before the first frame and is separated from the first frame by no more than a preset number of frames; and determining the motion state of the application element based on the first camera position matrix and the second camera position matrix. In this way, even when the target application client does not send any action or time information for the application element, the motion state of the application element in the target application can be judged from the camera position matrices of different frames while the target application runs.

Description

Motion state identification method and device, storage medium and electronic equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to a motion state identification method and apparatus, a storage medium, and an electronic device.
Background
With the development of computer technology, various game applications have been developed to enrich people's entertainment. With the continuous iteration of game applications and the popularization of large-scale games, many popular game applications demand increasingly high hardware configurations. When the hardware configuration is insufficient, the game application runs poorly and the user experience suffers.
At present, improving the running effect of game applications and the game experience of users under limited hardware configurations has become another research direction for related enterprises.
Disclosure of Invention
The embodiment of the application provides a motion state identification method and device, a storage medium and an electronic device. The technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a motion state identification method, where the method includes:
acquiring a target graph drawing instruction aiming at a target application, and acquiring a first camera position matrix aiming at an application element in a first frame picture in the target graph drawing instruction;
acquiring a second camera position matrix of an application element in a recorded second frame picture, wherein the second frame picture is obtained before the first frame picture based on the target application, and the number of frames separated from the first frame picture is less than or equal to a preset number of frames;
determining a motion state of the application element based on the first and second camera position matrices.
In a second aspect, an embodiment of the present application provides an apparatus for identifying a motion state, where the apparatus includes:
the first matrix acquisition module is used for acquiring a target graph drawing instruction aiming at a target application and acquiring a first camera position matrix aiming at an application element in a first frame picture in the target graph drawing instruction;
a second matrix obtaining module, configured to obtain a second camera position matrix of an application element in a recorded second frame of picture, where the second frame of picture is obtained before the first frame of picture based on the target application, and a frame number that is separated from the first frame of picture is less than or equal to a preset frame number;
a motion state determination module to determine a motion state of the application element based on the first camera position matrix and the second camera position matrix.
In a third aspect, embodiments of the present application provide a computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the above-mentioned method steps.
In a fourth aspect, an embodiment of the present application provides an electronic device, which may include: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the above-mentioned method steps.
In one or more embodiments of the present application, a target graphics drawing instruction for a target application is obtained, and a first camera position matrix for an application element in a first frame is obtained from that instruction. A recorded second camera position matrix for the application element in a second frame is then obtained, where the second frame is obtained from the target application before the first frame and is separated from it by no more than a preset number of frames. Finally, the motion state of the application element is determined based on the first and second camera position matrices. The motion state of the application element in the target application can thus be determined from the camera position matrices of different frames while the target application runs, without the target application client sending any action or time information for the application element.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description are obviously only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from them without creative effort.
Fig. 1 is a schematic flow chart of a motion state identification method according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of a motion state identification method according to an embodiment of the present disclosure;
fig. 3 is a schematic flow chart of a motion state identification method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a motion state identification apparatus according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a motion state identification apparatus according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a motion state determining module according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an image quality adjustment module according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. The described embodiments are obviously only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
In the description of the present application, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In the description of the present application, it is noted that, unless explicitly stated or limited otherwise, "including" and "having" and any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus. The specific meaning of the above terms in this application will be understood to be a specific case for those of ordinary skill in the art. In addition, in the description of the present application, "a plurality" means two or more unless otherwise specified. "and/or" describes the association relationship of the associated object, indicating that there may be three relationships, for example, a and/or B, which may indicate: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
In the prior art, in the running process of a game application, a user terminal can acquire a motion state of a game picture of the game application according to information such as actions and time of elements in the game picture issued by a game client, and optimize running parameters of the game application in real time according to the current motion state of the game picture so as to improve the running effect of the game application.
However, some game applications are not given, at the development stage, the function of having the game client send action and time information to the user terminal while running. In that case the user terminal cannot know the motion state of the game screen while the game runs, and therefore cannot optimize the running parameters of the game application in real time according to the current motion state of the game screen.
Based on the above, the present application provides a motion state identification method. While a target application runs, a target graphics drawing instruction for the target application is obtained, and a first camera position matrix for an application element in a first frame is obtained from that instruction. A recorded second camera position matrix for the application element in a second frame is then obtained, where the second frame is obtained from the target application before the first frame and is separated from it by no more than a preset number of frames. Finally, the motion state of the application element is determined based on the two matrices. Thus, even when the target application client sends no action or time information for any application element, the motion state of an application element in the target application can be judged by examining the camera position matrices of different frames while the target application runs.
The following is a detailed description with reference to specific examples. The embodiments described below do not represent all embodiments consistent with the present application; rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims. The flow diagrams depicted in the figures are merely exemplary and need not be performed in the order of the steps shown; for example, some steps are independent of one another, so the actual execution order may vary.
Fig. 1 is a schematic flow chart of a motion state identification method according to an embodiment of the present application. In a specific embodiment, the motion state identification method is applied to a motion state identification apparatus or an electronic device equipped with such an apparatus. As will be described in detail with respect to the flow shown in fig. 1, the motion state identification method may specifically include the following steps:
s101, acquiring a target graph drawing instruction aiming at a target application, and acquiring a first camera position matrix aiming at an application element in a first frame picture in the target graph drawing instruction;
it is understood that, during the running process of the target application, the central processing unit in the electronic device sends the graphics drawing instruction for the target application to the image processor so as to update each frame of application picture of the target application.
Specifically, in the running process of the target application, the central processing unit in the electronic device sends a target graph drawing instruction for the target application to the image processor, and obtains a first camera position matrix of an application element in an application picture for the target application in the target graph drawing instruction.
It should be noted that, during the running of the target application, when an application screen of the target application needs to be displayed, the central processing unit calls the graphics API to send graphics drawing instructions to the graphics processor, which renders the data of each frame; the rendered frame is then displayed on the display screen. For each frame of the application screen, the central processing unit sends multiple graphics drawing instructions to the graphics processor, which fetches and executes them in order; once the graphics processor has executed all the instructions corresponding to a frame, the rendered frame is complete and can be displayed on the display screen.
Because the central processing unit and the graphics processor run at mismatched speeds, the central processing unit stores each graphics drawing instruction in a buffer of the graphics processor, from which the graphics processor fetches the instructions in order.
A graphics drawing instruction is a graphics Application Programming Interface (API) instruction used to make the graphics processor render and draw the display picture of the target application; it may be, for example, an OpenGL, OpenGL ES, or DirectX instruction.
The application element is one of the elements of the application screen of the target application, and may be, for example, a person, a vehicle, an animal, or a character.
The first camera position matrix is one or more matrixes in a model view projection matrix used by a shader in the graphics processor when the shader renders the application elements in the first frame picture, and can represent the motion state of the application elements in the application picture.
The model view projection matrix, also called the MVP matrix, is obtained by composing the model matrix, the view matrix, and the projection matrix; a shader can use it to render and display the application picture under the transformed view angle. The matrices composing the model view projection matrix are each carried in the data passed by a graphics drawing instruction, and the first camera position matrix is carried in the data passed by the target graphics drawing instruction.
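For orientation, the composition of an MVP matrix can be sketched as follows. This is a minimal illustration using the GLM math library; the function name and all concrete parameter values are assumptions for illustration, not taken from this application:

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Build an MVP matrix from its three components: MVP = P * V * M.
// The camera position and look-at target are hypothetical parameters.
glm::mat4 buildMvp(const glm::vec3& cameraPos, const glm::vec3& lookTarget) {
    glm::mat4 model = glm::mat4(1.0f);                 // element -> world space
    glm::mat4 view  = glm::lookAt(cameraPos, lookTarget,
                                  glm::vec3(0.0f, 1.0f, 0.0f)); // world -> camera;
                                                       // this component encodes the camera position
    glm::mat4 proj  = glm::perspective(glm::radians(60.0f),     // camera -> clip space
                                       16.0f / 9.0f, 0.1f, 1000.0f);
    return proj * view * model;
}
```

The view matrix is the component that changes when the camera follows a moving application element, which is why a camera position matrix drawn from it can represent the element's motion state.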
In one embodiment, when the target application is first run, the target graphics rendering instructions may be determined by:
before a target graph drawing instruction which is sent to a graph processor by a central processing unit and aims at a target application is obtained, a plurality of frame pictures of the target application when application elements are in different motion states are intercepted, differences between model view projection matrixes corresponding to the application elements in different motion states are compared, and a camera position matrix is determined in a plurality of matrixes forming the model view projection matrixes according to the differences between the model view projection matrixes corresponding to the different motion states. Then, a first buffer area where the camera position matrix is located is determined in each buffer area of the graphics processor, a first buffer area identifier of the first buffer area is determined, the buffer area named by the first buffer area identifier is used as a target buffer area, and a graphics drawing instruction stored in the target buffer area is used as a target graphics drawing instruction.
While the target application runs, every graphics drawing instruction for the target application sent by the central processing unit to the graphics processor is monitored in real time, and it is checked whether the buffer in which the instruction is stored is the target buffer named by the first buffer identifier. If so, the instruction is determined to be the target graphics drawing instruction, and the first camera position matrix for the application element in the first frame is obtained from it.
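A minimal sketch of this runtime monitoring might look like the following. The interception mechanism (wrapped glBindBuffer/glBufferSubData entry points) and the callback name are assumptions for illustration; the application does not prescribe a specific hooking API:

```cpp
#include <GLES3/gl3.h>

// Hypothetical callback, defined elsewhere: handle a drawing instruction
// identified as a target instruction (e.g. extract the camera position matrix).
void onTargetDrawInstruction(const void* data, GLsizeiptr size);

static GLuint g_targetBufferId = 0;  // identifier of the target buffer, found on the first run
static GLuint g_currentBuffer  = 0;  // buffer currently bound by the application

void hooked_glBindBuffer(GLenum target, GLuint buffer) {
    g_currentBuffer = buffer;
    glBindBuffer(target, buffer);    // forward to the real API call
}

void hooked_glBufferSubData(GLenum target, GLintptr offset,
                            GLsizeiptr size, const void* data) {
    // An instruction whose data is stored in the target buffer is a target instruction.
    if (g_currentBuffer == g_targetBufferId)
        onTargetDrawInstruction(data, size);
    glBufferSubData(target, offset, size, data);  // forward to the real API call
}
```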
In one embodiment, when the target application is not first run, the target graphics rendering instructions may be determined by:
and determining a second buffer area with the same size as the first buffer area in each buffer area of the graphics processor, determining a second buffer area identifier of the second buffer area, taking the buffer area named by the second buffer area identifier as a target buffer area, and taking the graphics drawing instruction stored in the target buffer area as a target graphics drawing instruction.
While the target application runs, every graphics drawing instruction sent to the graphics processor is monitored in real time, and it is checked whether the buffer in which the instruction is stored is the target buffer named by the second buffer identifier. If so, the instruction is determined to be the target graphics drawing instruction, and the first camera position matrix for the application element in the first frame is obtained from it.
S102, acquiring a second camera position matrix of application elements in a recorded second frame picture, wherein the second frame picture is obtained before the first frame picture based on target application and the number of frames separated from the first frame picture is less than or equal to a preset number of frames;
specifically, a second camera position matrix of application elements in a recorded second frame picture is obtained, the second frame picture is a frame picture before a first frame picture in the running process of the target application, the number of frames between the second frame picture and the first frame picture is less than or equal to a preset number of frames, and the second camera position matrix is a camera position matrix corresponding to the application elements in the second frame picture obtained before the first camera position matrix of the first frame picture is obtained.
S103, determining the motion state of the application element based on the first camera position matrix and the second camera position matrix.
Specifically, the difference between the first camera position matrix corresponding to the application element in the first frame and the second camera position matrix corresponding to the application element in the second frame is calculated. If the difference is zero, the application element has not moved between the second frame and the first frame, and its motion state is determined to be a static state. If the difference is not zero, the application element has moved between the two frames, and its motion state is determined to be a moving state.
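A minimal sketch of this comparison, assuming the camera position matrix is a 4x4 matrix and using a small epsilon tolerance to absorb floating-point noise (the text itself compares against zero):

```cpp
#include <glm/glm.hpp>
#include <cmath>

enum class MotionState { Static, Moving };

// Sum of absolute element-wise differences between the two matrices.
// The epsilon tolerance is our assumption, not a value from the application.
MotionState classifyMotion(const glm::mat4& first, const glm::mat4& second,
                           float epsilon = 1e-6f) {
    float diff = 0.0f;
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row)
            diff += std::fabs(first[col][row] - second[col][row]);
    return diff <= epsilon ? MotionState::Static : MotionState::Moving;
}
```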
In one embodiment, after determining the motion state of the application element, the picture display quality of the target application may be adjusted based on the motion state of the application element.
Optionally, if it is determined that the motion state of the application element is the moving state, the picture display quality of the current picture is reduced based on the load reduction technique, and if it is determined that the motion state of the application element is the static state, the picture display quality of the current picture is kept unchanged.
It can be understood that, when the motion state of the application element is a static state, successive frames displayed by the target application change little or not at all. Rendering and displaying each frame then puts little computing pressure on the central processing unit and the graphics processor, and no stutter caused by insufficient hardware performance occurs, so the picture display quality of the target application does not need to be reduced and a good display effect can be maintained to meet the user's needs. When the motion state of the application element is a moving state, the frames displayed by the target application change rapidly as the element moves; the computing pressure on the central processing unit and the graphics processor when rendering and displaying each frame is high, and stutter caused by insufficient hardware performance occurs easily.
In one or more embodiments of the present application, while the target application runs, a target graphics drawing instruction for the target application, sent by the central processing unit to the graphics processor, is obtained, and a first camera position matrix for an application element in a first frame is obtained from it. A recorded second camera position matrix for the application element in a second frame is then obtained, where the second frame is obtained from the target application before the first frame and is separated from it by no more than a preset number of frames. Finally, the motion state of the application element is determined based on the first camera position matrix and the second camera position matrix, so the motion state can be judged from the camera position matrices of different frames even when the target application client sends no action or time information for the element.
In one embodiment, the target buffer in which the target graphics drawing instruction is stored also needs to be determined prior to fetching the target graphics drawing instruction. Referring to fig. 2, a schematic flow chart of a motion state identification method provided in an embodiment of the present application is shown when a target application is first run. As shown in fig. 2, the motion state identification method may include the steps of:
s201, intercepting a plurality of frame pictures when application elements in target application are in different motion states, and acquiring difference data between model view projection matrixes of each frame picture in the plurality of frame pictures;
specifically, when the target application runs for the first time, a plurality of frame images of the target application with application elements in different motion states are captured, and differences between model view projection matrixes corresponding to the application elements in the frame images are compared to obtain difference data between the model view projection matrixes.
For example, a frame picture in which an application element in the target application is in a static state and a frame picture in which the application element in the target application is in a running state are captured, model view projection matrices corresponding to the application elements in the two frame pictures are compared, and difference data between the two model view projection matrices is obtained.
S202, determining a camera position matrix in a plurality of matrixes forming a model view projection matrix based on difference data;
it can be understood that when the application element is in different motion states, the model view projection matrices used by the shader for rendering and displaying the target element are different, difference data between the model view projection matrices is obtained by comparing differences between the model view projection matrices in which the application element is in different motion states, and a camera position matrix for representing the motion state of the application element can be determined from the difference data among a plurality of matrices constituting the model view projection matrices.
S203, determining a first buffer area where the camera position matrix is located in a buffer area set, wherein each buffer area in the buffer area set is used for storing a graphic drawing instruction for a target application;
s204, taking the first buffer area as a target buffer area, and taking the graph drawing instruction stored in the target buffer area as a target graph drawing instruction;
and each buffer zone in the buffer zone set is each buffer zone in the graphics processor and is used for storing the graphics drawing instruction which is sent to the graphics processor by the central processing unit and aims at the target application.
Specifically, in steps S203 to S204, after the camera position matrix representing the motion state of the application element is determined from the plurality of matrices constituting the model view projection matrix, a first buffer area in which the camera position matrix is located is determined in each buffer area of the graphics processor, the first buffer area is used as a target buffer area, and the graphics drawing instruction stored in the target buffer area is used as a target graphics drawing instruction. The target graph drawing instruction is a graph drawing instruction carrying a camera position matrix.
It should be appreciated that the camera position matrix representing the motion state of the application element is not known in advance: when the target application is first run, the electronic device does not yet know which matrix can represent the element's motion state, and the camera position matrix has to be found by comparing the model view projection matrices for the element in different motion states. For the same target element in the same target application, the camera position matrix representing the motion state does not change afterwards, so these identification steps only need to be executed when the target application runs for the first time.
The camera position matrix is carried in the data of a graphics drawing instruction sent by the central processing unit to the graphics processor. After the camera position matrix is determined, the first buffer in which it is stored is located; that buffer is taken as the target buffer, and the graphics drawing instructions stored in it as target graphics drawing instructions, so that the target graphics drawing instructions stored in the buffer can be monitored while the target application runs.
It should be noted that, from the starting operation of the target application to the ending operation of the target application, in a complete operation process, the buffer area stored by the target graphic drawing instruction carrying the camera position matrix is not changed. That is, after the target buffer area is determined, the graphics drawing instruction stored in the target buffer area is the target graphics drawing instruction carrying the camera position matrix.
S205, monitoring a graph drawing instruction for the target application in real time, and judging whether a cache region stored by the graph drawing instruction is a target buffer region;
s206, if the cache region stored by the graph drawing instruction is determined to be the target buffer region, determining the graph drawing instruction to be the target graph drawing instruction;
in step S205 to step S206, a graphics drawing instruction sent from the central processing unit to the graphics processing unit is monitored in real time, whether a buffer identifier of a buffer area stored in the graphics drawing instruction is the same as a target buffer identifier of a target buffer area is determined, and if the buffer identifier of the buffer area stored in the graphics drawing instruction is the same as the target buffer area, the cache area stored in the graphics drawing instruction is determined as the target buffer area, and the graphics drawing instruction is determined as the target graphics drawing instruction carrying the camera position matrix.
The buffer identification may be a name identification of the buffer or a size identification of the buffer.
Optionally, if it is determined that the cache region stored in the graphics drawing instruction is not the target buffer region, it is determined that the graphics drawing instruction is not the target graphics drawing instruction.
S207, acquiring a first camera position matrix aiming at application elements in a first frame picture in a target graph drawing instruction;
specifically, after the target graphic drawing instruction is determined, a first camera position matrix for the application element in the first frame of the picture is acquired in data passed by the target graphic drawing instruction based on a HOOK function.
The first frame picture is an application picture of a target application rendered and displayed by the graphics processor based on the target graphics drawing instruction.
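As an illustration of the HOOK-based extraction, the sketch below wraps the uniform-upload call. The assumption that the matrix travels through glUniformMatrix4fv at a known uniform location is ours, for illustration only:

```cpp
#include <GLES3/gl3.h>
#include <glm/glm.hpp>
#include <cstring>

static glm::mat4 g_firstCameraMatrix(1.0f);  // latest camera matrix captured
static GLint     g_cameraMatrixLoc = 3;      // hypothetical uniform location

void hooked_glUniformMatrix4fv(GLint location, GLsizei count,
                               GLboolean transpose, const GLfloat* value) {
    // Copy out the camera position matrix when it passes through the hook.
    if (location == g_cameraMatrixLoc && count == 1)
        std::memcpy(&g_firstCameraMatrix, value, sizeof(glm::mat4));
    glUniformMatrix4fv(location, count, transpose, value);  // forward to the real API
}
```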
S208, acquiring a second camera position matrix of application elements in a recorded second frame picture, wherein the second frame picture is obtained before the first frame picture based on target application, and the number of frames separated from the first frame picture is less than or equal to a preset number of frames;
specifically, please refer to step S208 for a detailed description of step S102 in another embodiment, which is not repeated herein.
S209, calculating a difference value between the first camera position matrix and the second camera position matrix;
s210, if the difference value is zero, determining that the motion state of the application element is a static state;
s211, if the difference is not zero, determining that the motion state of the application element is a moving state;
alternatively, the movement state may include a walking state, a running state, and a driving state. In one embodiment, the moving speed of the application element may also be calculated based on a time interval between the first frame picture and the second frame picture and a difference value between the first camera position matrix and the second camera position matrix. If the moving speed of the application element is larger than zero and smaller than a first speed threshold value, determining that the motion state of the application element is a walking state; if the moving speed of the application element is greater than the first speed threshold and less than the second speed threshold, determining that the motion state of the application element is a running state; and if the moving speed of the application element is greater than the second speed threshold value, determining that the motion state of the application element is a driving state.
S212, if the motion state of the application element is determined to be the moving state, reducing the picture display quality of the current picture based on a load reduction technology;
it can be understood that, when the motion state of the application element is a moving state, each frame displayed in the moving target application along with the application element changes rapidly, at this time, the change of each frame displayed by the target application is large, the computing pressure of the central processing unit and the graphics processing unit when rendering and displaying each frame is large, and a pause phenomenon caused by insufficient hardware performance is easy to occur, so that the image display quality of the target application can be reduced based on a load reduction technology to relieve the computing pressure of the central processing unit and the graphics processing unit, and the target application is kept running smoothly.
The load reduction technique may be, for example, a GPU load reduction technique or a variable resolution rendering technique.
Alternatively, the moving state may include a walking state, a running state, and a driving state. In one embodiment, if the motion state of the application element is determined to be the walking state, the game image quality is reduced to a first resolution based on the load reduction technique; if it is determined to be the running state, the game image quality is reduced to a second resolution; and if it is determined to be the driving state, the game image quality is reduced to a third resolution, where the first resolution is greater than the second resolution and the second resolution is greater than the third.
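A minimal sketch of this state-to-quality mapping, standing in for the load reduction technique with a simple render-resolution scale factor (the concrete values are illustrative, not taken from this application):

```cpp
// Same enum as in the speed-classification sketch above.
enum class MoveState { Still, Walking, Running, Driving };

// Map the recognized motion state to a render-resolution scale factor.
float resolutionScaleFor(MoveState state) {
    switch (state) {
        case MoveState::Still:   return 1.0f;   // keep quality unchanged
        case MoveState::Walking: return 0.9f;   // first (highest) resolution
        case MoveState::Running: return 0.75f;  // second resolution
        case MoveState::Driving: return 0.6f;   // third (lowest) resolution
    }
    return 1.0f;
}
```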
And S213, if the motion state of the application element is determined to be a static state, keeping the picture display quality of the current picture unchanged.
It can be understood that, when the motion state of the application element is a static state, successive frames displayed by the target application change little or not at all. Rendering and displaying each frame then puts little computing pressure on the central processing unit and the graphics processor, and no stutter caused by insufficient hardware performance occurs, so the picture display quality of the target application does not need to be reduced and a good display effect can be maintained to meet the user's needs.
In the embodiment of the application, when the target application is first run, several frames with the application element in different motion states are captured; the camera position matrix is determined from the difference data between the model view projection matrices of those frames; the target buffer corresponding to the camera position matrix is determined; and the graphics drawing instructions stored in the target buffer are the target graphics drawing instructions carrying the camera position matrix. The target graphics drawing instructions stored in the target buffer are then monitored in real time, the camera position matrix identifying the motion state of the application element is obtained from them based on a HOOK function, and the motion state is determined by computing the difference between the first camera position matrix of the element in the first frame and the second camera position matrix of the element in the second frame. Thus, even when the target application client sends no action or time information for any application element, the motion state of an element in the target application can be judged by inspecting the camera position matrices of different frames while the target application runs, and the picture display quality of the target application can be adjusted according to that motion state, avoiding stutter while the element moves quickly and keeping the target application running smoothly.
In one embodiment, when the target application is not run for the first time, the position of the target buffer in which the camera position matrix representing the motion state of the application element is stored may change, but the size of the target buffer does not. Thus, when the target application is not run for the first time, the target buffer storing the camera position matrix needs to be re-determined based on the first buffer size determined on the first run. Please refer to fig. 3, which is a schematic flow chart of a motion state identification method according to an embodiment of the present application. As will be explained in detail with respect to the flow shown in fig. 3, the motion state identification method may include the following steps:
s301, determining a second buffer area with the same size as the first buffer area in the buffer area set;
specifically, when the target application is not run for the first time, a second buffer area with the same size as the first buffer area is determined in each buffer area of the graphics processor based on the size of the first buffer area determined when the target application is run for the first time.
S302, taking the second buffer area as a target buffer area, and taking the graph drawing instruction stored in the target buffer area as a target graph drawing instruction;
specifically, please refer to step S302 for a detailed description of step S204 in another embodiment, which is not repeated herein.
S303, monitoring a graph drawing instruction aiming at the target application in real time, and judging whether a cache region stored by the graph drawing instruction is a target buffer region or not;
s304, if the cache region stored by the graph drawing instruction is determined to be the target buffer region, determining the graph drawing instruction to be the target graph drawing instruction;
specifically, please refer to detailed descriptions of steps S205 to S206 in another embodiment for step S303 and step S304, which are not repeated herein.
S305, acquiring a first camera position matrix aiming at application elements in a first frame of picture in a target graph drawing instruction;
specifically, after the target graphic drawing instruction is determined, a first camera position matrix for the application element in the first frame picture is acquired in data transferred by the target graphic drawing instruction based on the HOOK function.
The first frame picture is an application picture of a target application rendered and displayed by the graphics processor based on the target graphics drawing instruction.
S306, acquiring a second camera position matrix of an application element in a recorded second frame picture, wherein the second frame picture is obtained before the first frame picture based on target application, and the number of frames separated from the first frame picture is less than or equal to a preset number of frames;
specifically, please refer to step S306 for a detailed description of step S102 in another embodiment, which is not repeated herein.
S307, calculating a difference value between the first camera position matrix and the second camera position matrix;
s308, if the difference value is zero, determining that the motion state of the application element is a static state;
s309, if the difference value is not zero, determining that the motion state of the application element is a moving state;
alternatively, the movement state may include a walking state, a running state, and a driving state. In one embodiment, the moving speed of the application element may also be calculated based on a time interval between the first frame picture and the second frame picture and a difference value between the first camera position matrix and the second camera position matrix. If the moving speed of the application element is larger than zero and smaller than a first speed threshold value, determining that the motion state of the application element is a walking state; if the moving speed of the application element is greater than the first speed threshold and less than the second speed threshold, determining that the motion state of the application element is a running state; and if the moving speed of the application element is greater than the second speed threshold value, determining that the motion state of the application element is the driving state.
S310, if the motion state of the application element is determined to be the moving state, reducing the picture display quality of the current picture based on a load reduction technology;
it can be understood that, when the motion state of the application element is a moving state, each frame displayed in the moving target application along with the application element changes rapidly, at this time, the change of each frame displayed by the target application is large, the computing pressure of the central processing unit and the graphics processing unit when rendering and displaying each frame is large, and a pause phenomenon caused by insufficient hardware performance is easy to occur, so that the image display quality of the target application can be reduced based on a load reduction technology to relieve the computing pressure of the central processing unit and the graphics processing unit, and the target application is kept running smoothly.
Alternatively, the moving state may include a walking state, a running state, and a driving state. In one embodiment, if the motion state of the application element is determined to be the walking state, the game image quality is reduced to a first resolution based on the load reduction technique; if it is determined to be the running state, the game image quality is reduced to a second resolution; and if it is determined to be the driving state, the game image quality is reduced to a third resolution, where the first resolution is greater than the second resolution and the second resolution is greater than the third.
S311, if the motion state of the application element is determined to be the static state, maintaining the display quality of the current frame.
It can be understood that, when the motion state of the application element is a static state, successive frames displayed by the target application change little or not at all. Rendering and displaying each frame then puts little computing pressure on the central processing unit and the graphics processor, and no stutter caused by insufficient hardware performance occurs, so the picture display quality of the target application does not need to be reduced and a good display effect can be maintained to meet the user's needs.
In the embodiment of the application, when the target application is not run for the first time, a second buffer with the same size as the first buffer determined on the first run is found among the buffers of the graphics processor; that second buffer is taken as the target buffer corresponding to the camera position matrix, and the graphics drawing instructions stored in it are the target graphics drawing instructions carrying the camera position matrix. The target graphics drawing instructions stored in the target buffer are then monitored in real time, the camera position matrix identifying the motion state of the application element is obtained from them based on a HOOK function, and the motion state is determined by computing the difference between the first camera position matrix of the element in the first frame and the second camera position matrix of the element in the second frame. Thus, even when the target application client sends no action or time information for any application element, the motion state of an element in the target application can be judged by inspecting the camera position matrices of different frames while the target application runs, and the picture display quality of the target application can be adjusted according to that motion state, avoiding stutter while the element moves quickly and ensuring smooth running of the target application.
The motion state identification apparatus provided in the embodiments of the present application will be described in detail below with reference to fig. 4. It should be noted that the apparatus shown in fig. 4 is used to execute the methods of the embodiments shown in fig. 1, fig. 2, and fig. 3 of the present application. For convenience of description, only the portions related to the embodiments of the present application are shown; for specific technical details not disclosed, please refer to the embodiments shown in fig. 1, fig. 2, and fig. 3.
Please refer to fig. 4, which is a schematic structural diagram of a motion state identification apparatus according to an embodiment of the present disclosure. As shown in fig. 4, the motion state recognition apparatus 1 may be implemented by software, hardware, or a combination of both as all or a part of an electronic device. According to some embodiments, the motion state identification apparatus 1 includes a first matrix obtaining module 11, a second matrix obtaining module 12, and a motion state determining module 13, and specifically includes:
the first matrix obtaining module 11 is configured to obtain a target graph drawing instruction for a target application, and obtain a first camera position matrix for an application element in a first frame of picture in the target graph drawing instruction;
a second matrix obtaining module 12, configured to obtain a second camera position matrix of an application element in a recorded second frame, where the second frame is obtained before the first frame based on the target application, and a frame number that is separated from the first frame is less than or equal to a preset frame number;
a motion state determination module 13 configured to determine a motion state of the application element based on the first camera position matrix and the second camera position matrix.
Optionally, please refer to fig. 5, which is a schematic structural diagram of a motion state identification device according to an embodiment of the present application. As shown in fig. 5, the apparatus further includes a first instruction determining module 14, where the first instruction determining module 14 is specifically configured to:
intercepting a plurality of frame pictures when application elements in a target application are in different motion states, and acquiring difference data between model view projection matrixes of each frame picture in the plurality of frame pictures;
determining a camera position matrix among a plurality of matrices constituting the model view projection matrix based on the disparity data;
determining a first buffer area where the camera position matrix is located in a buffer area set, wherein each buffer area in the buffer area set is used for storing a graphic drawing instruction for a target application;
and taking the first buffer area as a target buffer area, and taking the graph drawing instruction stored in the target buffer area as a target graph drawing instruction.
Optionally, please refer to fig. 5, which is a schematic structural diagram of a motion state identification device according to an embodiment of the present application. As shown in fig. 5, the apparatus further includes a second instruction determining module 15, where the second instruction determining module 15 is specifically configured to:
determining a second buffer area with the same size as the first buffer area in the buffer area set;
and taking the second buffer area as a target buffer area, and taking the graph drawing instruction stored in the target buffer area as a target graph drawing instruction.
Optionally, the first matrix obtaining module 11 is specifically configured to:
monitoring a graph drawing instruction aiming at a target application in real time, and judging whether a cache region stored by the graph drawing instruction is a target buffer region or not;
if the cache region stored by the graph drawing instruction is determined to be a target buffer region, determining the graph drawing instruction to be a target graph drawing instruction;
and acquiring a first camera position matrix aiming at the application element in the first frame of picture in the target graph drawing instruction.
Optionally, please refer to fig. 6, which is a schematic structural diagram of a motion state determining module according to an embodiment of the present application. As shown in fig. 6, the motion state determining module 13 includes:
a difference value calculation unit 131 for calculating a difference value between the first camera position matrix and the second camera position matrix;
a first determining unit 132, configured to determine that the motion state of the application element is a static state if the difference is zero;
a second determining unit 133, configured to determine that the motion state of the application element is a moving state if the difference is not zero.
Optionally, the movement state includes a walking state, a running state, and a driving state, and the second determining unit 133 is further configured to:
calculating a moving speed of the application element based on a time interval between the first frame picture and the second frame picture and the difference value;
if the moving speed of the application element is larger than zero and smaller than a first speed threshold value, determining that the motion state of the application element is a walking state;
if the moving speed of the application element is greater than a first speed threshold and less than a second speed threshold, determining that the motion state of the application element is a running state;
and if the moving speed of the application element is greater than a second speed threshold value, determining that the motion state of the application element is a driving state.
Optionally, please refer to fig. 5, which is a schematic structural diagram of a motion state identification device according to an embodiment of the present application. As shown in fig. 5, the apparatus further includes a quality adjustment module 16, where the quality adjustment module 16 is specifically configured to:
adjusting the picture display quality of the target application based on the motion state of the application element.
Fig. 7 is a schematic structural diagram of an image quality adjustment module according to an embodiment of the present disclosure. As shown in fig. 7, the image quality adjustment module 16 includes:
an image quality reducing unit 161, configured to reduce, if it is determined that the motion state of the application element is a moving state, the picture display quality of the current picture based on a load reduction technique;
and an image quality maintaining unit 162, configured to maintain the image display quality of the current image unchanged if the motion state of the application element is determined to be the still state.
Optionally, the image quality reducing unit 161 is specifically configured to:
if the motion state of the application element is determined to be the walking state, reducing the game image quality to a first resolution based on a load reduction technique;
if the motion state of the application element is determined to be the running state, reducing the game image quality to a second resolution based on a load reduction technique;
if the motion state of the application element is determined to be the driving state, reducing the game image quality to a third resolution based on a load reduction technique;
wherein the first resolution is greater than the second resolution, and the second resolution is greater than the third resolution.
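By way of illustration, the tiered adjustment can be sketched as follows, assuming placeholder resolutions and a hypothetical set_resolution callback into the renderer; the embodiment fixes only the ordering of the three resolutions:

```python
# The concrete resolutions are placeholders; the embodiment fixes only their
# ordering (first resolution > second resolution > third resolution).
RESOLUTION_BY_STATE = {
    "walking": (1920, 1080),  # first resolution
    "running": (1600, 900),   # second resolution
    "driving": (1280, 720),   # third resolution
}

def adjust_display_quality(motion_state, set_resolution):
    """Lower the picture display quality while moving; leave it unchanged when
    static. set_resolution is a hypothetical callback into the renderer."""
    if motion_state == "static":
        return  # picture display quality is maintained
    set_resolution(*RESOLUTION_BY_STATE[motion_state])
```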
In one or more embodiments of the present application, a target graphics drawing instruction for a target application, sent from the central processing unit to the graphics processing unit, is obtained, and a first camera position matrix for an application element in a first frame picture in the target graphics drawing instruction is acquired. A second camera position matrix of the application element in a recorded second frame picture of the target application is then acquired, where the second frame picture is obtained before the first frame picture and is separated from it by no more than a preset number of frames. Finally, the motion state of the application element is determined based on the first camera position matrix and the second camera position matrix.
It should be noted that, when the motion state identification apparatus provided in the above embodiments executes the motion state identification method, the division into the above functional modules is merely illustrative; in practical applications, the above functions may be assigned to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the motion state identification apparatus and the motion state identification method provided by the above embodiments belong to the same concept; details of the implementation process can be found in the method embodiments and are not repeated here.
The serial numbers of the above embodiments of the present application are for description only and do not imply any ranking of the embodiments' merits.
An embodiment of the present application further provides a computer storage medium that may store a plurality of instructions suitable for being loaded by a processor to execute the motion state identification method of the embodiments shown in fig. 1 to fig. 3; for the specific execution process, refer to the descriptions of those embodiments, which are not repeated here.
The present application further provides a computer program product storing at least one instruction, where the at least one instruction is loaded by a processor to execute the motion state identification method of the embodiments shown in fig. 1 to fig. 3; for the specific execution process, refer to the descriptions of those embodiments, which are not repeated here.
Please refer to fig. 8, which is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device in the present application may comprise one or more of the following components: a processor 110, a memory 120, an input device 130, an output device 140, and a bus 150. The processor 110, the memory 120, the input device 130, and the output device 140 may be coupled by the bus 150.
Processor 110 may include one or more processing cores. The processor 110 connects various parts within the electronic device using various interfaces and lines, and performs various functions of the terminal 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and calling data stored in the memory 120. Optionally, the processor 110 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is responsible for rendering and drawing display content; and the modem handles wireless communications. It is understood that the modem may also not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 120 includes a non-transitory computer-readable storage medium. The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the above method embodiments, and the like. The operating system may be an Android system (including systems developed in depth on the basis of the Android system), an iOS system developed by Apple (including systems developed in depth on the basis of the iOS system), or another system.
The memory 120 may be divided into an operating system space, in which the operating system runs, and a user space, in which native and third-party applications run. To ensure that different third-party applications achieve a good operating effect, the operating system allocates corresponding system resources to them. However, different application scenarios within the same third-party application also place different requirements on system resources: for example, in a local resource loading scenario, the application has a high requirement on disk reading speed, whereas in an animation rendering scenario it has a high requirement on GPU performance. Because the operating system and the third-party application are independent of each other, the operating system often cannot perceive the application's current scenario in time and therefore cannot adapt system resources to that specific scenario.
To enable the operating system to distinguish the specific application scenario of a third-party application, a data communication channel between the third-party application and the operating system needs to be opened, so that the operating system can obtain the application's current scenario information at any time and adapt system resources accordingly.
The input device 130 is used for receiving input commands or data, and includes, but is not limited to, a keyboard, a mouse, a camera, a microphone, or a touch device. The output device 140 is used for outputting instructions or data, and includes, but is not limited to, a display device, a speaker, and the like. In one example, the input device 130 and the output device 140 may be combined into a single touch display screen.
The touch display screen may be designed as a full screen, a curved screen, or an irregularly shaped screen, or as a combination of these, such as a full screen combined with a curved screen or an irregularly shaped screen combined with a curved screen, which is not limited in this application.
In addition, those skilled in the art will appreciate that the structures of the electronic device illustrated in the above figures do not constitute a limitation on the electronic device; it may include more or fewer components than illustrated, combine certain components, or use a different arrangement of components. For example, the electronic device may further include a radio frequency circuit, an input unit, a sensor, an audio circuit, a Wireless Fidelity (WiFi) module, a power supply, a Bluetooth module, and other components, which are not described here again.
In the electronic device shown in fig. 8, the processor 110 may be configured to call the motion state identification program stored in the memory 120 and execute the program to implement the motion state identification method according to the various method embodiments of the present application.
In the embodiments of the present application, a target graphics drawing instruction for a target application, sent by the central processing unit to the graphics processor, is obtained, and a first camera position matrix for an application element in a first frame picture in the target graphics drawing instruction is acquired. A second camera position matrix of the application element in a recorded second frame picture is then acquired, where the second frame picture is obtained before the first frame picture based on the target application and is separated from it by no more than a preset number of frames. Finally, the motion state of the application element is determined based on the first camera position matrix and the second camera position matrix.
It will be understood by those skilled in the art that all or part of the processes of the above method embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium; when executed, the program may carry out the processes of the above method embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory, or a random access memory.
The above disclosure presents only preferred embodiments of the present application and is not intended to limit its scope; equivalent variations and modifications made in accordance with the claims of the present application therefore remain within its scope.

Claims (12)

1. A motion state identification method, the method comprising:
acquiring a target graphics drawing instruction for a target application, and acquiring a first camera position matrix for an application element in a first frame picture in the target graphics drawing instruction;
acquiring a second camera position matrix of the application element in a recorded second frame picture, wherein the second frame picture is obtained before the first frame picture based on the target application, and the number of frames separating it from the first frame picture is less than or equal to a preset number of frames;
determining a motion state of the application element based on the first and second camera position matrices.
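(For illustration only; the following sketch is not part of the claim language.) Assuming the classify_motion helper sketched earlier in this document, the per-frame bookkeeping implied by claim 1 could look roughly like this; MotionTracker and PRESET_FRAMES are hypothetical names, and approximating the inter-picture interval as the per-frame time multiplied by the number of recorded frames is an assumption:

```python
# Hypothetical glue for the flow of claim 1, reusing classify_motion() from the
# earlier sketch. PRESET_FRAMES stands in for the preset number of frames that
# may separate the first and second frame pictures.
from collections import deque

PRESET_FRAMES = 5

class MotionTracker:
    def __init__(self):
        # Recorded camera position matrices of the most recent frame pictures.
        self.history = deque(maxlen=PRESET_FRAMES)

    def on_frame(self, first_matrix, dt_seconds):
        """first_matrix: camera position matrix taken from the current frame's
        target graphics drawing instruction; dt_seconds: time per frame."""
        state = None
        if self.history:
            second_matrix = self.history[0]  # oldest picture within the window
            interval = dt_seconds * len(self.history)
            state = classify_motion(first_matrix, second_matrix, interval)
        self.history.append(first_matrix)
        return state
```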
2. The method according to claim 1, wherein, when the target application is run for the first time, before the acquiring a target graphics drawing instruction for the target application and acquiring a first camera position matrix for the application element in the first frame picture in the target graphics drawing instruction, the method further comprises:
intercepting a plurality of frame pictures when the application element in the target application is in different motion states, and acquiring difference data between model view projection matrices of the frame pictures;
determining a camera position matrix among a plurality of matrices constituting the model view projection matrix based on the difference data;
determining a first buffer area where the camera position matrix is located in a buffer area set, wherein each buffer area in the buffer area set is used for storing a graphics drawing instruction for the target application;
and taking the first buffer area as a target buffer area, and taking the graphics drawing instruction stored in the target buffer area as a target graphics drawing instruction.
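(For illustration only; the following sketch is not part of the claim language.) One plausible realization of the first-run calibration in claim 2, under the assumption that the camera position matrix is the component of the model view projection matrix that varies most across the captured frame pictures:

```python
# Hypothetical sketch of the first-run calibration: frames is a list of
# dictionaries, one per captured frame picture, mapping each candidate buffer
# area id to the 4x4 matrix read from it. Picking the matrix that varies most
# across frames is an assumed heuristic, not claim language.
def find_camera_buffer(frames):
    def variation(buffer_id):
        total = 0.0
        for prev, cur in zip(frames, frames[1:]):
            a, b = prev[buffer_id], cur[buffer_id]
            total += sum(abs(a[i][j] - b[i][j])
                         for i in range(4) for j in range(4))
        return total
    # The buffer area whose matrix changed most is taken as holding the
    # camera position matrix; it becomes the target buffer area.
    return max(frames[0], key=variation)
```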
3. The method according to claim 2, wherein, when the target application is not run for the first time, before the acquiring a target graphics drawing instruction for the target application and acquiring a first camera position matrix for the application element in the first frame picture in the target graphics drawing instruction, the method further comprises:
determining a second buffer area with the same size as the first buffer area in the buffer area set;
and taking the second buffer area as the target buffer area, and taking the graphics drawing instruction stored in the target buffer area as the target graphics drawing instruction.
4. The method according to any one of claims 2 to 3, wherein the obtaining a target graphics drawing instruction for a target application, and obtaining a first camera position matrix for an application element in a first frame of picture in the target graphics drawing instruction comprises:
monitoring, in real time, graphics drawing instructions for the target application, and determining whether the buffer area in which a graphics drawing instruction is stored is the target buffer area;
if the buffer area in which the graphics drawing instruction is stored is determined to be the target buffer area, determining that the graphics drawing instruction is the target graphics drawing instruction;
and acquiring the first camera position matrix for the application element in the first frame picture in the target graphics drawing instruction.
5. The method of claim 1, wherein the determining a motion state of the application element based on the first and second camera position matrices comprises:
calculating a difference between the first and second camera position matrices;
if the difference is zero, determining that the motion state of the application element is a static state;
and if the difference is not zero, determining that the motion state of the application element is a moving state.
6. The method of claim 5, wherein the moving state comprises a walking state, a running state and a driving state, and the determining that the motion state of the application element is a moving state if the difference is not zero comprises:
calculating a moving speed of the application element based on the difference and the time interval between the first frame picture and the second frame picture;
if the moving speed of the application element is greater than zero and less than a first speed threshold, determining that the motion state of the application element is the walking state;
if the moving speed of the application element is greater than the first speed threshold and less than a second speed threshold, determining that the motion state of the application element is the running state;
and if the moving speed of the application element is greater than the second speed threshold, determining that the motion state of the application element is the driving state.
7. The method of claim 1, further comprising:
adjusting the picture display quality of the target application based on the motion state of the application element.
8. The method of claim 7, wherein the adjusting the picture display quality of the target application based on the motion state of the application element comprises:
if the motion state of the application element is determined to be the moving state, reducing the picture display quality of the current picture based on a load reduction technique;
and if the motion state of the application element is determined to be a static state, keeping the picture display quality of the current picture unchanged.
9. The method according to claim 8, wherein the reducing the picture display quality of the current picture based on the load reduction technique if the motion state of the application element is determined to be the moving state comprises:
if the motion state of the application element is determined to be the walking state, reducing the game image quality to a first resolution based on a load reduction technique;
if the motion state of the application element is determined to be the running state, reducing the game image quality to a second resolution based on a load reduction technique;
if the motion state of the application element is determined to be the driving state, reducing the game image quality to a third resolution based on a load reduction technique;
wherein the first resolution is greater than the second resolution, and the second resolution is greater than the third resolution.
10. A motion state recognition apparatus, characterized in that the apparatus comprises:
a first matrix acquisition module, configured to acquire a target graphics drawing instruction for a target application, and acquire a first camera position matrix for an application element in a first frame picture in the target graphics drawing instruction;
a second matrix acquisition module, configured to acquire a second camera position matrix of the application element in a recorded second frame picture, wherein the second frame picture is obtained before the first frame picture based on the target application, and the number of frames separating it from the first frame picture is less than or equal to a preset number of frames;
a motion state determination module, configured to determine a motion state of the application element based on the first camera position matrix and the second camera position matrix.
11. A storage medium having stored thereon a plurality of instructions, wherein the instructions, when executed by a processor, implement the steps of the method of any one of claims 1 to 9.
12. An electronic device, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the steps of the method according to any of claims 1-9.
CN202210814544.6A 2022-07-12 2022-07-12 Motion state identification method and device, storage medium and electronic equipment Pending CN115253291A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210814544.6A CN115253291A (en) 2022-07-12 2022-07-12 Motion state identification method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210814544.6A CN115253291A (en) 2022-07-12 2022-07-12 Motion state identification method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN115253291A true CN115253291A (en) 2022-11-01

Family

ID=83765416

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210814544.6A Pending CN115253291A (en) 2022-07-12 2022-07-12 Motion state identification method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN115253291A (en)

Similar Documents

Publication Publication Date Title
EP3754490B1 (en) User interface rendering method and apparatus, and terminal
CN112004086B (en) Video data processing method and device
CN110209501B (en) Frequency adjusting method and device of graphic processor, terminal and storage medium
CN110827378B (en) Virtual image generation method, device, terminal and storage medium
WO2020038128A1 (en) Video processing method and device, electronic device and computer readable medium
US20210168441A1 (en) Video-Processing Method, Electronic Device, and Computer-Readable Storage Medium
CN109151966B (en) Terminal control method, terminal control device, terminal equipment and storage medium
CN112529995B (en) Image rendering calculation method and device, storage medium and terminal
CN112053449A (en) Augmented reality-based display method, device and storage medium
EP3779690A1 (en) Processor core scheduling method and apparatus, terminal, and storage medium
CN109361950B (en) Video processing method and device, electronic equipment and storage medium
CN110750664B (en) Picture display method and device
CN111790148B (en) Information interaction method and device in game scene and computer readable medium
CN112053370A (en) Augmented reality-based display method, device and storage medium
CN113625983B (en) Image display method, device, computer equipment and storage medium
CN111127469A (en) Thumbnail display method, device, storage medium and terminal
CN114697568B (en) Special effect video determining method and device, electronic equipment and storage medium
CN114089896A (en) Rendering image intercepting method and device
CN113034653A (en) Animation rendering method and device
CN118159341A (en) Image frame rendering method and related device
CN116596748A (en) Image stylization processing method, apparatus, device, storage medium, and program product
CN115253291A (en) Motion state identification method and device, storage medium and electronic equipment
CN112684962B (en) Canvas extension method, device, storage medium and terminal
CN110659024A (en) Graphic resource conversion method, apparatus, electronic device and storage medium
CN113934500A (en) Rendering method, rendering device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination