CN113641488A - Method and device for resource optimization based on a user usage scenario

Method and device for resource optimization based on a user usage scenario

Info

Publication number
CN113641488A
CN113641488A
Authority
CN
China
Prior art keywords
frequency
user
scene
preset
gesture
Prior art date
Legal status
Pending
Application number
CN202110767941.8A
Other languages
Chinese (zh)
Inventor
朱潇
赵俊民
张威
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110767941.8A
Publication of CN113641488A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/50Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5005Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F9/5027Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308Details of the user interface
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6045Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/65Methods for processing data by generating or executing the game program for computing the condition of a game character

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

An embodiment of the present application provides a method and a device for resource optimization based on a user usage scenario, relates to the field of terminals, and can optimize resources according to the user usage scenario and improve user experience. The method comprises the following steps: in response to a user operation of opening a game application, collecting the user's gesture information on the interface of the game application; if the gesture information satisfies a first condition, the user usage scenario is a first scenario (for example, a roadmap scenario) and the processor frequency is adjusted to a first frequency, where the first condition relates to the sliding duration and the sliding trajectory; if the gesture information satisfies a second condition, the user usage scenario is a second scenario (for example, a battle scenario) and the processor frequency is adjusted to a second frequency different from the first frequency, where the second condition relates to the sliding duration, the sliding trajectory, the click frequency, and the click area.

Description

Method and device for resource optimization based on a user usage scenario
Technical Field
The present application relates to the field of terminals, and in particular, to a method and an apparatus for resource optimization based on a user usage scenario.
Background
Currently, for electronic devices, system resource optimization can be performed based on the resource dimension or the device dimension. For example, the frequency of a central processing unit (CPU) may be adjusted according to the system load, the device temperature, and the remaining battery level: when the temperature is too high, the load is low, or the battery is low, the CPU frequency can be reduced; conversely, it can be increased.
However, the above frequency-scaling strategy may harm the user experience. For example, when a user plays a game on an electronic device, the device temperature generally rises; if the CPU frequency is then reduced, the system may stutter and the user experience suffers.
Disclosure of Invention
The embodiment of the application provides a method and a device for resource optimization based on a user use scene, which can optimize resources based on the user use scene and improve user experience.
In a first aspect, an embodiment of the present application provides a method for resource optimization based on a user usage scenario, applied to an electronic device, including: in response to a user operation of opening a first application, collecting the user's gesture information on an interface of the first application, where the gesture information includes the gesture operation type, the number of times the gesture is received, and the coordinates corresponding to the gesture; when the category of the first application is a game application, if the gesture information satisfies a first condition, determining that the user usage scenario is a first scenario, where the first condition includes: the sliding duration is greater than or equal to a preset duration, and the sliding trajectory does not exceed a first preset range; if the gesture information satisfies a second condition, determining that the user usage scenario is a second scenario, where the second condition includes: the sliding duration is greater than or equal to the preset duration, the sliding trajectory does not exceed the first preset range, the click frequency is greater than or equal to a first preset frequency, and the click area does not exceed a second preset range; and adjusting the processor frequency according to the first scenario or the second scenario, where the first scenario and the second scenario correspond to a first frequency and a second frequency respectively, and the first frequency and the second frequency are different.
Based on the method provided by the embodiment of the application, the user usage scenario can be identified from the category of the application and the user's gesture information, and the processor frequency adjusted accordingly. It should be understood that, because gestures are the main way a user interacts with the device, and gesture interaction differs across usage scenarios, the user usage scenario can be identified from the user's gesture information. The processor can be adjusted to different frequencies for different user usage scenarios to meet the requirements of the current scenario. For example, a game application may include different types of user usage scenarios, such as a roadmap scenario, a battle scenario, and the like. In a battle scenario, the frequency is not limited, so that the system runs at a higher frequency to ensure fluency and improve user experience; in a roadmap scenario, the frequency may be limited to reduce power consumption and device temperature.
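For illustration, the following is a minimal Python sketch of how these two conditions might drive scenario classification; the GestureInfo fields, the threshold values T1_S and F1_HZ, and the scene labels are assumptions for the example, not values taken from this application:

```python
# Minimal sketch: classify a game's user scenario from gesture statistics.
# All thresholds and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class GestureInfo:
    slide_duration_s: float   # single or cumulative sliding duration
    slide_within_r1: bool     # sliding trajectory stays inside range R1
    click_freq_hz: float      # click frequency
    clicks_within_r2: bool    # click area stays inside range R2

T1_S = 2.0    # preset sliding duration (assumed)
F1_HZ = 1.5   # first preset click frequency (assumed)

def classify(g: GestureInfo) -> str:
    sliding = g.slide_duration_s >= T1_S and g.slide_within_r1
    # Check the battle scene first: its condition is a superset of the
    # roadmap condition, so the order of the checks matters.
    if sliding and g.click_freq_hz >= F1_HZ and g.clicks_within_r2:
        return "battle"    # second scenario -> second frequency
    if sliding:
        return "roadmap"   # first scenario -> first frequency
    return "unknown"
```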
In a possible implementation manner, if the gesture information satisfies a third condition, it is determined that the user usage scenario is a third scenario, where the third condition includes: the number of clicks is less than a first preset number, the click frequency is less than a second preset frequency, the number of slides is less than a second preset number, and the slide frequency is less than a third preset frequency; and the processor frequency is adjusted according to the third scenario, where the third scenario corresponds to a third frequency, and the first frequency, the second frequency, and the third frequency are different.
For example, for a game scenario, three different CPU frequencies may be preset, corresponding to max, medium, and min, decreasing in that order. The system can run at the max CPU frequency in the second scenario to ensure system performance and improve the gaming experience; at the medium CPU frequency in the first scenario to reduce heating and power consumption; and at the min CPU frequency in the third scenario to reduce heating and power consumption further.
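As a sketch, on a Linux-based device the max/medium/min tiers could be applied through the cpufreq sysfs interface; the kHz values below are assumed for illustration, and a production implementation would typically go through the vendor's power management interface rather than writing sysfs directly:

```python
# Hedged sketch: map the three scenarios to CPU frequency caps and apply
# them via the Linux cpufreq sysfs interface. Frequency values are assumed.
SCENE_FREQ_KHZ = {
    "battle": 2_800_000,   # max: full performance for the second scenario
    "roadmap": 1_800_000,  # medium: less heat/power in the first scenario
    "static": 1_000_000,   # min: lowest tier for the third scenario
}

def apply_scene_frequency(scene: str, cpu: int = 0) -> None:
    # scaling_max_freq caps the frequency the governor may choose
    path = f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/scaling_max_freq"
    with open(path, "w") as f:  # requires root on real devices
        f.write(str(SCENE_FREQ_KHZ[scene]))
```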
In one possible implementation, if the gesture information satisfies none of the first, second, and third conditions, the processor frequency is adjusted according to at least one of temperature, remaining battery level, and resource load. For example, an interactive governor strategy can be adopted for CPU frequency scaling, that is, increasing or decreasing the frequency according to system load. Alternatively, a temperature control module can be provided that limits the CPU frequency to prevent the device temperature from becoming too high. For example, when the temperature is higher than the temperature control threshold T, the frequency is reduced directly, regardless of the current user usage scenario.
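The fallback and temperature-control logic might look like the following sketch, assuming an illustrative temperature threshold and a load value normalized to [0, 1]:

```python
# Hedged sketch of the fallback: when no gesture condition matches, scale
# the frequency cap with system load (an interactive-governor-style policy),
# and let a temperature check override everything. All numbers are assumed.
MIN_KHZ, MAX_KHZ = 1_000_000, 2_800_000
TEMP_LIMIT_C = 45.0   # temperature control threshold T (assumed)

def fallback_frequency(load: float, temp_c: float) -> int:
    if temp_c > TEMP_LIMIT_C:
        return MIN_KHZ                      # downclock regardless of scenario
    load = max(0.0, min(load, 1.0))         # clamp load to [0, 1]
    return int(MIN_KHZ + (MAX_KHZ - MIN_KHZ) * load)
```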
In one possible implementation, the method further includes: in response to a user operation of opening a second application, querying the user usage scenario through an application programming interface (API) provided by the second application, and adjusting the processor frequency according to the user usage scenario. In one possible design, an electronic device vendor may cooperate with an application developer so that the application provides an API for system applications of the electronic device to query the user usage scenario. In this way, a system application of the electronic device can query the current user usage scenario, such as roadmap, battle, or static, through the application's API.
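Since the application does not specify the API surface, the following is a purely hypothetical sketch of such a query; the interface name get_user_scene and its return values are assumptions:

```python
# Purely hypothetical sketch of the cooperation model: the system queries
# the scenario through an API the second application chose to expose.
from typing import Protocol

class SceneApi(Protocol):
    """Hypothetical interface a cooperating application might provide."""
    def get_user_scene(self) -> str: ...  # e.g. "roadmap", "battle", "static"

def frequency_tier_for(app_api: SceneApi) -> str:
    scene = app_api.get_user_scene()
    # map the reported scene to a frequency tier (illustrative)
    return {"battle": "max", "roadmap": "medium", "static": "min"}.get(scene, "default")
```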
In a possible implementation manner, in the second scenario, when the temperature is higher than a first preset temperature threshold and lower than a second preset temperature threshold, if the processor frequency is lower than the second frequency, frequency-boosting processing is performed. That is, the frequency adjustment strategy (increase or decrease) can be decided jointly by the temperature and the user usage scenario. This prevents the electronic device from overheating while still ensuring high performance, achieving dual optimization of system performance and power consumption.
In one possible implementation, the first scenario is a roadmap scenario, and the first condition includes: SliTime > T1 && SliCutRecArea ∈ R1; where SliTime represents the duration of a single slide or the total duration of multiple consecutive slides, T1 represents the preset duration, SliCutRecArea represents the rectangular section range corresponding to the sliding trajectory, R1 represents the first preset range, and && represents logical AND. The second scenario is a battle scenario, and the second condition includes: SliTime > T1 && SliCutRecArea ∈ R1 && CliFreq > F1 && CliCutRecArea ∈ R2; where CliFreq represents the click frequency, CliCutRecArea represents the rectangular section range corresponding to the click area, F1 represents the first preset frequency, and R2 represents the second preset range.
In a possible implementation manner, if the electronic device is connected through dual connectivity to both a fourth-generation (4G) mobile communication system base station (4G base station for short) and a fifth-generation (5G) mobile communication system base station (5G base station for short), the method further includes: maintaining the dual connectivity of the electronic device with the 4G base station and the 5G base station in the battle scenario, and releasing the connection between the electronic device and the 5G base station in the roadmap scenario. In this way, the electronic device maintains dual connectivity in the battle scenario to ensure high performance and user experience; in the roadmap scenario, the connection between the UE and the 5G base station can be released (equivalent to frequency reduction) to reduce device power consumption and avoid wasting network resources.
In a possible implementation manner, the third scenario is a static scenario, and the third condition includes: CliCou < C1 && CliFreq < F2 && SliCou < C2 && SliFreq < F3; where CliCou represents the number of clicks, C1 represents a first preset number, CliFreq represents the click frequency, F2 represents a second preset frequency (F2 is smaller than F1), SliCou represents the number of slides, C2 represents a second preset number, SliFreq represents the slide frequency, and F3 represents a third preset frequency.
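For clarity, the three conditions can be transcribed into executable predicates using the application's own variable names; the range-membership tests (SliCutRecArea ∈ R1, CliCutRecArea ∈ R2) are reduced here to booleans assumed to be computed elsewhere:

```python
# Hedged transcription of the first, second, and third conditions.
def first_condition(SliTime, sli_in_R1, T1):
    # roadmap scenario: long slide confined to the first preset range
    return SliTime > T1 and sli_in_R1

def second_condition(SliTime, sli_in_R1, CliFreq, cli_in_R2, T1, F1):
    # battle scenario: roadmap-style sliding plus fast, localized clicking
    return first_condition(SliTime, sli_in_R1, T1) and CliFreq > F1 and cli_in_R2

def third_condition(CliCou, CliFreq, SliCou, SliFreq, C1, F2, C2, F3):
    # static scenario: few, infrequent clicks and slides (with F2 < F1)
    return CliCou < C1 and CliFreq < F2 and SliCou < C2 and SliFreq < F3
```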
In one possible implementation, the method further includes: releasing the connection between the electronic device and the 5G base station in the static scenario. That is, in the static scenario, the connection between the UE and the 5G base station can be released (equivalent to frequency reduction) to reduce device power consumption and avoid wasting network resources.
In one possible implementation, the method further includes: when first gesture information of the user is collected, monitoring the resource load of the electronic device, where the resource load includes at least one of processor frequency, memory occupancy, and network occupancy; and if the resource load of the electronic device is below a preset threshold and remains unchanged, discarding the first gesture information. This increases fault tolerance and avoids the effects of "false" gestures (unintended gestures).
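A minimal sketch of this filtering step follows, assuming an illustrative load threshold and a tolerance for "remains unchanged":

```python
# Hedged sketch: keep a gesture only if the resource load moved while it
# was collected; a flat, low load suggests an unintended touch.
LOAD_THRESHOLD = 0.05   # "preset threshold" for resource load (assumed)
EPSILON = 0.01          # tolerance for "remains unchanged" (assumed)

def keep_gesture(load_samples: list[float]) -> bool:
    flat = max(load_samples) - min(load_samples) < EPSILON
    low = max(load_samples) < LOAD_THRESHOLD
    return not (low and flat)   # discard when load is low and unchanged
```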
In one possible implementation, F(Class) ∝ {F(App), F(Gesture), F(Load)}; where F(Class) represents the user usage scenario, F(App) represents the category of the application, F(Load) represents the resource load information, F(Gesture) represents the condition satisfied by the gesture information, and ∝ denotes positive correlation. In this way, the user usage scenario can be identified from the application category, the user's gesture information, and the device's resource load characteristics, so that resource optimization can be performed on the electronic device according to the identified user usage scenario.
In one possible implementation,

F(Gesture) = F(CliCou, CliFreq, CliCutRecArea, SliCou, SliFreq, SliTime, SliCutRecArea) = R(CliCou) & R(CliFreq) & R(CliCutRecArea) & R(SliCou) & R(SliFreq) & R(SliTime) & R(SliCutRecArea)

where SliTime represents the duration of a single slide or the total duration of multiple consecutive slides, SliCutRecArea represents the rectangular section range corresponding to the sliding trajectory, CliFreq represents the click frequency, CliCutRecArea represents the rectangular section range corresponding to the click area, CliCou represents the number of clicks, SliCou represents the number of slides, SliFreq represents the slide frequency, R() represents the condition satisfied by the variable in parentheses, and & represents a joint decision based on each R().
In one possible implementation, the user usage scenario model may be: F(Class) ∝ {F(App), F(Gesture), F(Load), F(Experience)}; where F(Experience) represents a user experience factor indicating at least one of network delay information, stutter or frame-loss information, and user-feedback satisfaction. The user usage scenario model can be adaptively adjusted according to the user experience factor, enhancing the robustness of the model.
In one possible implementation,

P(Class | App, Gesture, Load) = P(App, Gesture, Load | Class) · P(Class) / P(App, Gesture, Load)

where P(Class | App, Gesture, Load) represents the probability of each user usage scenario given the category of the first application, the user's gesture information, and the resource load, and the scenario with the highest probability is taken as the user usage scenario; P(App, Gesture, Load | Class) represents the probability of the first application's category, the user's gesture information, and the resource load occurring in each user usage scenario; P(Class) represents the prior probability of each user usage scenario; and P(App, Gesture, Load) represents the probability of the first application's category, the user's gesture information, and the resource load occurring. P(App, Gesture, Load | Class), P(Class), and P(App, Gesture, Load) may be probability values calculated from historical data.
In one possible implementation, the method further includes: performing at least one of network scheduling, memory resource scheduling, and user profile generation according to the identified user usage scenario, so that the results of network scheduling, memory resource scheduling, and user profile generation are more accurate and better fit the user experience.
In a second aspect, the present application provides a chip system that includes one or more interface circuits and one or more processors. The interface circuit and the processor are interconnected by a line.
The above chip system may be applied to an electronic device including a communication module and a memory. The interface circuit is configured to receive signals from a memory of the electronic device and to transmit the received signals to the processor, the signals including computer instructions stored in the memory. When executed by a processor, the computer instructions may cause an electronic device to perform the method as described in the first aspect and any of its possible designs.
Alternatively, the above-described chip system may be applied to a server (server apparatus) including a communication module and a memory. The interface circuit is configured to receive signals from the memory of the server and to send the received signals to the processor, the signals including computer instructions stored in the memory. The server may perform the method as described in the first aspect and any of its possible designs when the computer instructions are executed by the processor.
In a third aspect, the present application provides a computer-readable storage medium comprising computer instructions. When the computer instructions are run on an electronic device, such as a mobile phone, they cause the electronic device to perform the method according to the first aspect and any of its possible designs.
In a fourth aspect, the present application provides a computer program product for causing a computer to perform the method according to the first aspect and any one of its possible designs when the computer program product runs on the computer.
In a fifth aspect, the present application provides an apparatus comprising a processor coupled with a memory, the memory storing program instructions; when the program instructions stored in the memory are executed by the processor, the apparatus implements the method of the first aspect and any possible design thereof. The apparatus may be an electronic device or a server device, or may be a component of the electronic device or the server device, such as a chip.
In a sixth aspect, the present application provides an apparatus, which may be functionally divided into different logical units or modules, and each unit or module performs different functions, so that the apparatus performs the method described in the first aspect and any possible design manner thereof.
It should be understood that, for the advantageous effects achieved by the chip system according to the second aspect, the computer-readable storage medium according to the third aspect, the computer program product according to the fourth aspect, and the apparatuses according to the fifth and sixth aspects, reference may be made to the advantageous effects of the first aspect and any possible design thereof, which are not repeated herein.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a method for resource optimization based on a user usage scenario according to an embodiment of the present application;
FIG. 3 is a schematic illustration of a display provided by an embodiment of the present application;
FIG. 4 is a schematic illustration of yet another display provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a sliding operation provided by an embodiment of the present application;
FIG. 6 is a schematic illustration of yet another display provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of a continuous clicking operation provided in an embodiment of the present application;
fig. 8 is a schematic diagram of a learning framework of a user usage scenario model according to an embodiment of the present application;
fig. 9 is a schematic diagram of an identification architecture of a user usage scenario according to an embodiment of the present application;
fig. 10 is a flowchart illustrating a further method for resource optimization based on a user usage scenario according to an embodiment of the present application;
fig. 11 is a schematic diagram of a chip structure according to an embodiment of the present application.
Detailed Description
Currently, for electronic devices, system resource optimization is basically limited to the resource dimension or the device dimension. For example, a CPU frequency-scaling algorithm may make decisions by combining the current CPU load, the device temperature, and the remaining battery level: when the temperature is too high, the load is low, or the battery is low, the frequency may be decreased; conversely, it may be increased.
However, the above strategy considers neither the current usage scenario of the device nor the user experience. For example, when a user plays a game on an electronic device, the device temperature generally rises; if the CPU frequency is then reduced, the system may stutter and the user experience suffers. Therefore, a method for optimizing the resources of an electronic device according to the user usage scenario is needed. However, identifying a user usage scenario currently faces the following problems:
1. Current mainstream operating systems all use a sandbox mechanism to isolate the processes of each application to the greatest extent, to meet the security requirements of application code and data. The sandbox mechanism makes it harder to obtain an application's internal information, so the user usage scenario is difficult to identify. This is especially true for game applications: because games are usually drawn through the Open Graphics Library (OpenGL) rather than built from components provided by the system, further scene recognition is hardly possible, and targeted resource optimization cannot be performed for different user usage scenarios.
2. At present, a user usage scenario can be identified from screenshots of the application's operating interface, but this approach incurs a high load, image recognition is difficult, the risk of misrecognition is high, and feasibility is poor.
The embodiment of the application provides a method for optimizing resources based on a user use scene, which can identify the user use scene according to the category of an application program, gesture information of a user and resource load characteristics of equipment, so that resource optimization is performed on electronic equipment according to the identified user use scene. It can be understood that, because the gesture is a main way for the user to interact with the device, and the user has a certain difference from the gesture interaction mode of the electronic device in different usage scenarios, the usage scenarios of the user can be identified according to the gesture information of the user. Moreover, on the basis of identifying the use scene of the user according to the gesture information of the user, the type of the application program and the resource load characteristics of the equipment are considered, so that the identified use scene of the user can be more accurate.
The method and the device are applicable to operating systems that adopt a sandbox mechanism, and the load is lower than that of recognition based on screenshots of the current interface.
Based on the method provided by the embodiment of the application, the specific user usage scenario can be accurately identified, so that optimization can be performed for the corresponding user usage scenario. For example, a game application may include different types of user usage scenarios, such as a login scenario, a roadmap scenario, a battle scenario, and a settlement scenario. In a battle scenario, the frequency is not limited, so that the system runs at a higher frequency to ensure fluency and improve user experience; in login, roadmap, settlement, and similar scenarios, the frequency can be limited to reduce power consumption and device temperature.
Of course, besides game applications, other types of applications, such as social chat applications, shopping applications, and news applications, may also include various usage scenarios and can be optimized for the corresponding user usage scenario. For example, a social chat application may include a browsing scenario (e.g., browsing articles pushed by official accounts) and a chat scenario (sending text, pictures, videos, etc. on a chat interface). For another example, shopping applications may include a live-streaming scenario (watching a host's live stream), a product browsing scenario, a payment and settlement scenario, and the like. Based on the different user usage scenarios, corresponding resource optimization can be performed in a targeted manner.
In addition, identification of the user usage scenario can be used not only for resource optimization (system optimization) but also for refined identification of the user type, so that scenario-based recommendations or reminders can be made in a targeted manner, further benefiting the user's daily life. For example, if it is identified that the user has been in a shopping scenario for a long time, information on new products or promotions can be pushed to the user; if it is identified that the user has been in a video-watching scenario for a long time, the user can be reminded to rest and avoid eye strain.
The resource optimization method provided by the embodiment of the application can be applied to an electronic device. The electronic device may be, for example, a mobile phone, a tablet computer, a desktop computer, a handheld computer, a notebook computer (laptop), an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or an augmented reality (AR)/virtual reality (VR) device; the embodiment of the present application does not limit the specific form of the electronic device. Alternatively, the method provided by the embodiment of the present application may be applied to a server device.
As shown in fig. 1, the electronic device may be a mobile phone 100. The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a radio frequency module 150, a communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, a SIM card interface 195, and the like. The sensor module may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the mobile phone 100. The mobile phone 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be independent devices or may be integrated in the same processor.
The controller may be a decision maker that directs the various components of the handset 100 to work in concert as instructed; it is the neural center and command center of the handset 100. The controller generates operation control signals according to the instruction operation code and timing signals, completing the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor is a cache. It may hold instructions or data that the processor has just used or uses cyclically. If the processor needs the instructions or data again, it can call them directly from this memory, avoiding repeated accesses and reducing processor latency, thereby improving system efficiency.
In some embodiments, the processor 110 may include an interface. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor may include multiple sets of I2C buses. The processor may be coupled to the touch sensor, charger, flash, camera, etc. via different I2C bus interfaces. For example: the processor may be coupled to the touch sensor via an I2C interface, such that the processor and the touch sensor communicate via an I2C bus interface to implement the touch functionality of the cell phone 100.
The I2S interface may be used for audio communication. In some embodiments, the processor may include multiple sets of I2S buses. The processor may be coupled to the audio module via an I2S bus to enable communication between the processor and the audio module. In some embodiments, the audio module can transmit audio signals to the communication module through the I2S interface, so as to realize the function of answering the call through the bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module and the communication module may be coupled by a PCM bus interface. In some embodiments, the audio module may also transmit the audio signal to the communication module through the PCM interface, so as to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication, with different sampling rates for the two interfaces.
The UART interface is a universal serial data bus used for asynchronous communications. The bus is a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor with the communication module 160. For example: the processor communicates with the Bluetooth module through the UART interface to realize the Bluetooth function. In some embodiments, the audio module may transmit the audio signal to the communication module through the UART interface, so as to realize the function of playing music through the bluetooth headset.
The MIPI interface can be used to connect a processor with peripheral devices such as a display screen and a camera. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor and the camera communicate through a CSI interface to implement the camera function of the handset 100. The processor and the display screen communicate through a DSI interface to implement the display function of the mobile phone 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, the GPIO interface may be used to connect the processor with a camera, display screen, communication module, audio module, sensor, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, etc. The USB interface may be used to connect a charger to charge the mobile phone 100, or to transfer data between the mobile phone 100 and peripheral devices. It can also be used to connect an earphone and play audio through the earphone, or to connect other electronic devices such as AR devices.
The interface connection relationship between the modules in the embodiment of the present invention is only schematically illustrated, and does not limit the structure of the mobile phone 100. The mobile phone 100 may adopt different interface connection modes or a combination of multiple interface connection modes in the embodiment of the present invention.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module may receive charging input from a wired charger via a USB interface. In some wireless charging embodiments, the charging management module may receive a wireless charging input through a wireless charging coil of the cell phone 100. The charging management module can also supply power to the electronic device through the power management module 141 while charging the battery.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module receives the input of the battery and/or the charging management module and supplies power to the processor, the internal memory, the external memory, the display screen, the camera, the communication module and the like. The power management module may also be used to monitor parameters such as battery capacity, battery cycle number, battery state of health (leakage, impedance), etc. In some embodiments, the power management module 141 may also be disposed in the processor 110. In some embodiments, the power management module 141 and the charging management module may also be disposed in the same device.
The wireless communication function of the mobile phone 100 can be implemented by the antenna module 1, the antenna module 2, the rf module 150, the communication module 160, a modem, and a baseband processor.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the cellular network antenna may be multiplexed into a wireless local area network diversity antenna. In some embodiments, the antenna may be used in conjunction with a tuning switch.
The RF module 150 may provide a solution for wireless communication including second generation (2G)/third generation (3G)/fourth generation (4G)/fifth generation (5G) communication applied to the handset 100. It may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc. The radio frequency module receives electromagnetic waves through the antenna 1, performs filtering, amplification, and other processing on the received electromagnetic waves, and transmits them to the modem for demodulation. The radio frequency module can also amplify the signal modulated by the modem and convert it into electromagnetic waves for radiation through the antenna 1. In some embodiments, at least some of the functional modules of the RF module 150 may be disposed in the processor 110. In some embodiments, at least some functional modules of the RF module 150 may be disposed in the same device as at least some modules of the processor 110.
The modem may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to a speaker, a receiver, etc.) or displays an image or video through a display screen. In some embodiments, the modem may be a stand-alone device. In some embodiments, the modem may be separate from the processor, in the same device as the rf module or other functional module.
The communication module 160 may provide a communication processing module including a solution for wireless communication, such as Wireless Local Area Network (WLAN) (e.g., WiFi), bluetooth, Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like, which is applied to the mobile phone 100. The communication module 160 may be one or more devices integrating at least one communication processing module. The communication module receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor. The communication module 160 may also receive a signal to be transmitted from the processor, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the handset 100 is coupled to the radio frequency module and the antenna 2 is coupled to the communication module. So that the handset 100 can communicate with networks and other devices via wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), LTE, 5G New wireless communication (New Radio, NR), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, and the like. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The mobile phone 100 implements the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing and is connected with a display screen and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the handset 100 may include 1 or N display screens, N being a positive integer greater than 1.
As also shown in fig. 1, the cell phone 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen, an application processor, and the like.
The ISP is used for processing data fed back by the camera. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the handset 100 may include 1 or N cameras, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the handset 100 is in frequency bin selection, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. Handset 100 may support one or more codecs. Thus, the handset 100 can play or record video in a variety of encoding formats, such as: MPEG1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent recognition of the mobile phone 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor through the external memory interface to realize the data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes the various functional applications and data processing of the mobile phone 100 by running the instructions stored in the internal memory 121. The memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, application programs required by at least one function (such as a sound playing function and an image playing function), and the like. The data storage area may store data created during use of the mobile phone 100 (such as audio data and a phone book). In addition, the memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, another nonvolatile solid-state storage device, or a universal flash storage (UFS).
The mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module is used for converting digital audio information into analog audio signals to be output and converting the analog audio input into digital audio signals. The audio module may also be used to encode and decode audio signals. In some embodiments, the audio module may be disposed in the processor 110, or some functional modules of the audio module may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The cellular phone 100 can listen to music through a speaker or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the handset 100 receives a call or voice information, it can receive voice by placing the receiver close to the ear.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or sending voice information, a user can input a voice signal into the microphone by making a sound by approaching the microphone through the mouth of the user. The handset 100 may be provided with at least one microphone. In some embodiments, the handset 100 may be provided with two microphones to achieve a noise reduction function in addition to collecting sound signals. In some embodiments, the mobile phone 100 may further include three, four or more microphones to collect sound signals and reduce noise, and may further identify sound sources and implement directional recording functions.
The headphone interface 170D is used to connect a wired headphone. The earphone interface may be a USB interface, or may be an open mobile platform (OMTP) standard interface of 3.5mm, or a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor may be disposed on the display screen. There are many types of pressure sensors, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor, the capacitance between the electrodes changes. The handset 100 determines the intensity of the pressure from the change in capacitance. When a touch operation is applied to the display screen, the mobile phone 100 detects the intensity of the touch operation according to the pressure sensor. The cellular phone 100 can also calculate the touched position based on the detection signal of the pressure sensor. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the cellular phone 100. In some embodiments, the angular velocity of the handset 100 about three axes (i.e., the x, y, and z axes) may be determined by a gyroscope sensor. The gyro sensor may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyroscope sensor detects the shake angle of the mobile phone 100, and calculates the distance to be compensated for the lens module according to the shake angle, so that the lens can counteract the shake of the mobile phone 100 through reverse movement, thereby achieving anti-shake. The gyroscope sensor can also be used for navigation and body feeling game scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the handset 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by a barometric pressure sensor.
The magnetic sensor 180D includes a hall sensor. The handset 100 may detect the opening and closing of the flip holster using a magnetic sensor. In some embodiments, when the handset 100 is a flip phone, the handset 100 may detect the opening and closing of the flip based on the magnetic sensor. And then according to the opening and closing state of the leather sheath or the opening and closing state of the flip cover, the automatic unlocking of the flip cover is set.
The acceleration sensor 180E can detect the magnitude of acceleration of the cellular phone 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the handset 100 is stationary. The method can also be used for recognizing the terminal gesture, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The handset 100 may measure distance by infrared or laser. In some embodiments, the scene is photographed and the cell phone 100 may utilize a range sensor to measure the distance to achieve fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared LED. Infrared light is emitted outward through the LED, and infrared light reflected from nearby objects is detected with the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone 100; when insufficient reflected light is detected, it can be determined that there is none. Using the proximity light sensor, the mobile phone 100 can detect that the user is holding it close to the ear and automatically turn off the screen to save power. The proximity light sensor can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The mobile phone 100 may adaptively adjust the display screen brightness according to the perceived ambient light level. The ambient light sensor can also be used to automatically adjust the white balance when taking a picture. The ambient light sensor may also cooperate with the proximity light sensor to detect whether the cell phone 100 is in a pocket to prevent inadvertent contact.
The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, take a photograph of the fingerprint, answer an incoming call with the fingerprint, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the handset 100 implements a temperature processing strategy using the temperature detected by the temperature sensor. For example, when the temperature reported by the temperature sensor exceeds a threshold, the mobile phone 100 reduces the performance of a processor located near the temperature sensor, so as to reduce power consumption and implement thermal protection.
The touch sensor 180K is also referred to as a "touch panel" and may be arranged on the display screen. It is used to detect a touch operation acting on or near it. The detected touch operation may be passed to the application processor to determine the touch event type, and a corresponding visual output may be provided via the display screen.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor may acquire the vibration signal of the bone mass vibrated by the human voice. The bone conduction sensor can also contact the human pulse and receive the blood-pressure pulsation signal. In some embodiments, the bone conduction sensor may also be disposed in the earpiece. The audio module 170 may parse out a voice signal based on the vibration signal, acquired by the bone conduction sensor, of the bone mass vibrated by the vocal part, so as to implement a voice function. The application processor can parse heart-rate information based on the blood-pressure pulsation signal acquired by the bone conduction sensor, so as to implement a heart-rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys may be mechanical keys. Or may be touch keys. The cellular phone 100 receives a key input, and generates a key signal input related to user setting and function control of the cellular phone 100.
The motor 191 may generate a vibration cue. The motor can be used for incoming call vibration prompt and can also be used for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The touch operation on different areas of the display screen can also correspond to different vibration feedback effects. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a Subscriber Identity Module (SIM) card. A SIM card can be attached to or detached from the cellular phone 100 by being inserted into or pulled out of the SIM card interface. The handset 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface can support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards can be inserted into the same SIM card interface at the same time; the types of the cards may be the same or different. The SIM card interface may also be compatible with different types of SIM cards and with external memory cards. The mobile phone 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the handset 100 employs an eSIM, namely an embedded SIM card, which can be embedded in the mobile phone 100 and cannot be separated from it.
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the present application, unless otherwise specified, "at least one" means one or more and "a plurality" means two or more. In addition, to facilitate a clear description of the technical solutions of the embodiments of the present application, terms such as "first" and "second" are used to distinguish identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that the terms "first", "second", etc. do not limit quantity or execution order, nor do they denote relative importance.
For convenience of understanding, a method for resource optimization based on a user usage scenario provided in the embodiments of the present application is specifically described below with reference to the accompanying drawings.
As shown in fig. 2, an embodiment of the present application provides a method for resource optimization based on a user usage scenario, including:
201. A user usage scenario is identified.
First, the class of the foreground application may be identified. It should be appreciated that an application may be a foreground application when the terminal device is displaying the relevant interface for the application. For example, as shown in (a) of fig. 3, a user may open a game Application (APP) 302 in a main interface 301 and enter a game interface. In response to the operation of the user, the cellular phone may display a game interface 303 as shown in (b) of fig. 3. At this time, the game APP is the APP running in the foreground.
Here, foreground running means that a foreground task is running on the CPU. The foreground application may be determined as follows: whether an application is a foreground program can be judged through mechanisms such as the running-process information (for example, ActivityManager.RunningAppProcessInfo), ActivityLifecycleCallbacks, and UsageStatsManager. Alternatively, if the terminal device runs an Android system, whether an application is a foreground program can be judged through the Android accessibility (barrier-free) service. Alternatively, if the terminal device runs a Linux system, the process information stored in the /proc directory of the Linux system can be read to determine whether an application is a foreground program. For the specific determination process, reference may be made to the prior art, and details are not described here.
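Illustratively, the running-process route can be sketched as follows. This is a non-authoritative sketch: the helper class is an assumption, and on recent Android versions getRunningAppProcesses() only returns the caller's own processes to ordinary applications, so a system-level component would more likely rely on UsageStatsManager.

    import android.app.ActivityManager;
    import android.content.Context;
    import java.util.List;

    public class ForegroundChecker {
        /** Returns true if the given package currently has a foreground process. */
        public static boolean isForeground(Context context, String packageName) {
            ActivityManager am =
                    (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
            List<ActivityManager.RunningAppProcessInfo> processes = am.getRunningAppProcesses();
            if (processes == null) {
                return false;
            }
            for (ActivityManager.RunningAppProcessInfo info : processes) {
                // IMPORTANCE_FOREGROUND marks the process the user is interacting with.
                if (info.importance == ActivityManager.RunningAppProcessInfo.IMPORTANCE_FOREGROUND
                        && info.processName.equals(packageName)) {
                    return true;
                }
            }
            return false;
        }
    }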
The electronic device may establish an application category list. For example, as shown in Table 1, the application category list may include different application categories and the specific applications corresponding to each category (Table 1 illustrates the package names or process names of some applications).
TABLE 1
[Table 1 is presented as an image in the original publication; it lists the application categories together with the package names or process names of the corresponding applications.]
After the process name or package name of the foreground application is collected, the category of the foreground application can be determined by querying the application category list (e.g., Table 1). For example, if the package name of the foreground application matches an entry in the game category of the list, the foreground application is determined to be a game application; if it matches an entry in the video category, it is determined to be a video application.
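Illustratively, this lookup can be reduced to a simple map from package name to category, as in the sketch below. The package names here are hypothetical placeholders, not the entries of Table 1:

    import java.util.HashMap;
    import java.util.Map;

    public class AppCategoryList {
        private final Map<String, String> categoryByPackage = new HashMap<>();

        public AppCategoryList() {
            // Hypothetical entries; a real list would hold the package or
            // process names collected in Table 1.
            categoryByPackage.put("com.example.somegame", "game");
            categoryByPackage.put("com.example.somevideo", "video");
        }

        /** Returns the category of the foreground application, or "unknown" if unlisted. */
        public String categoryOf(String packageName) {
            return categoryByPackage.getOrDefault(packageName, "unknown");
        }
    }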
The relationship between the user usage scenario and the application program category can be shown as formula (1):
F(Class) ∝ F(App)    formula (1)
Wherein F(Class) represents the user usage scenario and F(App) represents the category of the application program. F(Class) is positively correlated with F(App); that is, the category of the application influences the decision of the user usage scenario in a certain proportion.
After the category of the application program is determined, the gesture information of the user can be recognized. Because gestures are the main way in which the user interacts with the device, and the user's gesture-interaction pattern with the electronic device differs across usage scenarios, the user usage scenario can be identified from the user's gesture information.
In the process of the user using the electronic device, the electronic device may collect the user's gesture information, which includes the gesture operation type, the number of times the gesture is received, and the coordinates corresponding to the gesture (the coordinates may reflect the position, area, or range of the gesture operation). The gesture operation types include clicking on the screen (single click, double click, knuckle tap), sliding (one-finger slide, two-finger slide, three-finger slide, and the like), long press, and the like.
Optionally, the gesture operation type may further include a spatial gesture (e.g., waving a palm, making a fist or opening five fingers, etc.) that does not contact the screen, and the gesture information may further include a shape of the gesture operation, and the like, which is not limited in this application.
Taking a game APP as an example, the scenes of the game APP may include roadmap (map-running), battle, stationary, and similar scenes. These scenes can be determined based on factors such as the number of clicks, the click frequency, the section rectangle of the click range, the number of slides, the slide frequency, the duration of a single slide, and the section rectangle of the slide.
It can be understood that, in a game, a roadmap may refer to running around the map; due to team competition or the need to increase personal ability, players may need to practice running the map to improve their familiarity with it. A battle may be conducted between the game player and another player, or between the game player and a virtual player. A battle may also be referred to as PK, battlefield, attack battle, etc., which is not limited in this application. A stationary scene may refer to a scene in which the user does not operate for a long time; for example, the user may stay for a long time without operating at the login interface or at the equipment selection/upgrade interface.
Illustratively, as shown in fig. 4, when in a roadmap scenario, a user's gesture is typically characterized by a long sliding motion in one or more directions within a fixed range of the screen (typically the lower left of the screen). I.e. the user's gesture satisfies the following condition:
SliTime>T1&&SliCutRecArea∈[R1]
where SliTime represents the sliding duration, T1 represents a preset duration threshold (e.g., 5 seconds, 10 seconds, etc.), SliCutRecArea represents the section rectangle of a single slide, R1 represents a preset first rectangle range, which may be located at the lower left of the screen, and && represents a logical AND.
That is, the user's gesture satisfies the condition that the sliding duration is greater than T1 and the section rectangle of a single slide is within the preset rectangle range R1. As shown in fig. 5, the section rectangle corresponding to the sliding track is within the preset rectangular range R1. It can be understood that the user may perform multiple sliding operations in the roadmap scene, and the sliding tracks corresponding to the multiple sliding operations are all within the preset rectangular range R1.
As shown in fig. 6, in a battle scene, a user's gesture generally has the following 2 features:
1. Long slides in one or more directions within a fixed range of the screen (generally the lower left of the screen).
2. Clicking continuously in a certain area of the screen (generally the lower right of the screen).
That is, in the battle scene, the user's gesture satisfies the following condition:
SliTime>T1&&SliCutRecArea∈[R1]&&CliFreq>F1&&CliCutRecArea∈[R2]
where CliFreq represents the frequency of clicks and CliCutRecArea represents the rectangular range of the click section. F1 represents a preset first click frequency threshold, and R2 represents a preset second rectangular range, which may be located at the lower right of the screen. Other parameters are referred to above and will not be described herein.
That is, the user's gesture satisfies the conditions that the sliding duration (which may be, for example, the duration of a single slide or the total duration of multiple slides) is greater than T1 with the section rectangle of a single slide within the preset rectangle range R1, and that the click frequency is greater than F1 with the click section rectangle within the preset rectangle range R2. As shown in fig. 5, the section rectangle corresponding to the sliding track is within the preset rectangular range R1. As shown in fig. 7, the section rectangle corresponding to the continuous click operations is within the preset rectangular range R2. It can be understood that the user may perform multiple sliding and clicking operations in the battle scene; the sliding tracks corresponding to the multiple sliding operations are all within the preset rectangular range R1, and the click section rectangles corresponding to the multiple click operations are all within the preset rectangular range R2.
In a stationary scene, the user's gesture is characterized by no operation for a long time, e.g., no operation for a long time at the login interface or at the equipment selection/upgrade interface.
In the stationary scene, the user's gesture satisfies the following condition:
CliCou<C1&&CliFreq<F2&&SliCou<C2&&SliFreq<F3
wherein CliCou represents the number of clicks, C1 represents a preset click-count threshold, CliFreq represents the click frequency, F2 represents a preset second click-frequency threshold (which is smaller than the first click-frequency threshold F1), SliCou represents the number of slides, C2 represents a preset slide-count threshold, SliFreq represents the sliding frequency, and F3 represents a preset sliding-frequency threshold.
That is, the gesture of the user satisfies the conditions that the number of clicks is less than C1, the click frequency is less than F2, the number of slides is less than C2, and the sliding frequency is less than F3.
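Illustratively, the three gesture conditions above amount to threshold checks over aggregated gesture statistics, as in the following sketch. The statistics container, the rectangle encoding, and all concrete threshold values are assumptions for illustration, since the embodiment only names the symbols T1, F1, F2, F3, C1, and C2:

    public class GestureSceneClassifier {

        /** Aggregated gesture statistics over a sampling window (hypothetical container). */
        public static class GestureStats {
            long slideDurationMs;  // SliTime: single-slide or total slide duration
            int slideCount;        // SliCou
            double slideFreq;      // SliFreq (slides per second)
            int clickCount;        // CliCou
            double clickFreq;      // CliFreq (clicks per second)
            int[] slideRect;       // SliCutRecArea as {left, top, right, bottom}
            int[] clickRect;       // CliCutRecArea as {left, top, right, bottom}
        }

        // Hypothetical threshold values.
        static final long T1_MS = 5_000;
        static final double F1 = 3.0, F2 = 0.2, F3 = 0.2;
        static final int C1 = 2, C2 = 2;

        /** Returns true when the inner rectangle lies within the outer one. */
        static boolean contains(int[] outer, int[] inner) {
            return inner[0] >= outer[0] && inner[1] >= outer[1]
                    && inner[2] <= outer[2] && inner[3] <= outer[3];
        }

        /** r1 and r2 are the preset rectangle ranges R1 and R2. */
        public static String classify(GestureStats g, int[] r1, int[] r2) {
            boolean roadmap = g.slideDurationMs > T1_MS && contains(r1, g.slideRect);
            boolean battle = roadmap && g.clickFreq > F1 && contains(r2, g.clickRect);
            boolean stationary = g.clickCount < C1 && g.clickFreq < F2
                    && g.slideCount < C2 && g.slideFreq < F3;
            if (battle) return "battle";
            if (roadmap) return "roadmap";
            return stationary ? "stationary" : "unknown";
        }
    }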
Therefore, a gesture model can be obtained according to the gesture information, and the gesture model can be as shown in formula (2):
F(Gesture) = F(CliCou, CliFreq, CliCutRecArea, SliCou, SliFreq, SliTime, SliCutRecArea) = R(CliCou) & R(CliFreq) & R(CliCutRecArea) & R(SliCou) & R(SliFreq) & R(SliTime) & R(SliCutRecArea)    formula (2)
Where F(Gesture) represents the condition that the gesture information satisfies, and parameters such as CliCou are as described above. R() represents the condition satisfied by the variable in the parentheses. For example, in the roadmap scenario, R(SliTime) indicates that SliTime satisfies the condition SliTime > T1, and R(SliCutRecArea) indicates that SliCutRecArea satisfies the condition SliCutRecArea ∈ [R1]. The & symbol concatenates the R() terms of the different variables, representing a joint decision based on the R() of each variable.
Under different user use scenarios, different condition combinations can be considered.
For example, in the roadmap scene, F(Gesture) = F(SliTime, SliCutRecArea) = R(SliTime) & R(SliCutRecArea).
In the battle scene, F(Gesture) = F(SliTime, SliCutRecArea, CliFreq, CliCutRecArea) = R(SliTime) & R(SliCutRecArea) & R(CliFreq) & R(CliCutRecArea).
In the stationary scene, F(Gesture) = F(CliCou, CliFreq, SliCou, SliFreq) = R(CliCou) & R(CliFreq) & R(SliCou) & R(SliFreq).
The relationship between the user usage scenario and the gesture model can be shown as formula (3):
F(Class) ∝ F(Gesture)    formula (3)
Wherein F(Class) is positively correlated with F(Gesture); that is, the gesture information influences the decision of the user usage scenario in a certain proportion.
In addition, in the process of establishing the gesture model from the gesture information, the resource load of the electronic device can be continuously monitored, where the resource load includes at least one of the CPU frequency, the memory occupancy, and the network occupancy. This increases fault tolerance and avoids the effect of "false" gestures (unintended gestures). For example, if the CPU frequency, memory occupancy, or network occupancy remains low and essentially unchanged after a certain gesture operation is received, the gesture operation may be considered invalid, and the related information of that gesture operation may be deleted.
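Illustratively, the invalid-gesture filtering can be sketched as follows; the utilization representation and both thresholds are assumptions, since the embodiment only requires that the load stays low and essentially unchanged after the gesture:

    public class GestureFilter {
        // Hypothetical thresholds for "low" and "essentially unchanged" load.
        static final double LOW_LOAD = 0.10;    // 10% utilization
        static final double FLAT_DELTA = 0.02;  // at most 2 percentage points of change

        /** Returns false when the gesture should be discarded as a "false" gesture. */
        public static boolean isValidGesture(double loadBeforeGesture, double loadAfterGesture) {
            boolean low = loadBeforeGesture < LOW_LOAD && loadAfterGesture < LOW_LOAD;
            boolean flat = Math.abs(loadAfterGesture - loadBeforeGesture) < FLAT_DELTA;
            return !(low && flat);  // low and flat load => the gesture had no effect
        }
    }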
The relationship between the user usage scenario and the resource load can be shown as formula (4):
F(Class) ∝ F(Load)    formula (4)
Wherein F(Class) represents the user usage scenario and F(Load) represents the resource load. F(Class) is positively correlated with F(Load); that is, F(Load) influences F(Class) in a certain proportion.
From a practical perspective, as shown in fig. 8, the electronic device may collect, offline, the foreground application categories, the user's gesture information, and the resource load in different user usage scenarios. The collected big data can then be input into a classification engine, which can establish, offline, the user usage scenario model shown in formula (5).
F(Class) ∝ {F(App), F(Gesture), F(Load)}    formula (5)
Namely, the user usage scenario model can be trained based on information such as the application category, the gesture information, and the resource load. The application category is used to roughly classify user usage scenarios, while the gesture information and the resource load are used to classify them more finely. For example, when the application category is game, the user usage scenario may be roughly considered a game scenario, and it may be further determined from the gesture information and the resource load whether the user usage scenario is a roadmap, battle, or stationary scene within the game scenario.
The classification engine may be provided in the electronic device, i.e., the user usage scenario model is trained by the electronic device. Alternatively, the electronic device may send the application category, the gesture information, the resource load, and other information to a cloud server, and the cloud server trains the user usage scenario model, which is not limited in this application.
As shown in fig. 9, after the user starts an application program, the category of the current application program, the gesture information of the current user, and the current resource load may be collected and input into the classification engine, which can identify the current user usage scenario based on the user usage scenario model. Identifying the user usage scenario from multiple kinds of information reduces the probability of misidentifying it.
In some embodiments, the classification engine may implement the identification of the user usage scenario according to a bayesian classification method, as shown in equation (6):
P(Class | App, Gesture, Load) = P(App, Gesture, Load | Class) × P(Class) / P(App, Gesture, Load)    formula (6)
wherein P(Class | App, Gesture, Load) represents the probability of each user usage scenario decided from the category of the current application program, the gesture information of the current user, and the current resource load; the user usage scenario with the highest probability is taken as the current user usage scenario.
P(App, Gesture, Load | Class) represents the probability that the category of the current application program, the gesture information of the current user, and the current resource load occur in different user usage scenarios. P(Class) represents the probability that different user usage scenarios occur. P(App, Gesture, Load) represents the probability that the category of the current application program, the gesture information of the current user, and the current resource load occur. P(App, Gesture, Load | Class), P(Class), and P(App, Gesture, Load) may each be a probability value calculated from historical data.
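Illustratively, because the evidence term P(App, Gesture, Load) is identical for every class, the decision reduces to an argmax over P(App, Gesture, Load | Class) × P(Class). The sketch below illustrates this; all probability values are hypothetical, whereas in the embodiment they would be estimated from historical data:

    import java.util.HashMap;
    import java.util.Map;

    public class SceneBayes {
        // P(Class): prior probability of each user usage scenario (from history data).
        private final Map<String, Double> prior = new HashMap<>();
        // P(App, Gesture, Load | Class): likelihood of the observed feature tuple.
        private final Map<String, Double> likelihood = new HashMap<>();

        public SceneBayes() {
            // Hypothetical numbers for one observed (App, Gesture, Load) tuple.
            prior.put("battle", 0.3);
            prior.put("roadmap", 0.5);
            prior.put("stationary", 0.2);
            likelihood.put("battle", 0.50);
            likelihood.put("roadmap", 0.10);
            likelihood.put("stationary", 0.01);
        }

        /** Returns the scenario maximizing P(features | class) * P(class). */
        public String classify() {
            String best = "unknown";
            double bestScore = -1.0;
            for (Map.Entry<String, Double> e : prior.entrySet()) {
                double score = likelihood.get(e.getKey()) * e.getValue();
                if (score > bestScore) {
                    bestScore = score;
                    best = e.getKey();
                }
            }
            return best;
        }
    }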
As shown in fig. 9, the electronic device may further obtain the user's experience factors in the current user usage scenario (e.g., network delay, whether stutter or frame drops occur, whether the user directly feeds back satisfaction, etc.) and adaptively adjust the user usage scenario model according to these experience factors; for example, the classification weights of F(App), F(Gesture), and F(Load) in the model may be adjusted to enhance the robustness of the user usage scenario model.
Considering the user experience, the user usage scenario model may be as shown in equation (7):
F(Class) ∝ {F(App), F(Gesture), F(Load), F(Experience)}    formula (7)
Where F(Experience) represents the user experience factor.
When the current user usage scenario is judged using the Bayesian classification algorithm, the user experience factor can be added, as shown in formula (8):
P(Class | App, Gesture, Load, Experience) = P(App, Gesture, Load, Experience | Class) × P(Class) / P(App, Gesture, Load, Experience)    formula (8)
wherein P(Class | App, Gesture, Load, Experience) represents the probability of each user usage scenario decided from the category of the current application program, the gesture information of the current user, the current resource load, and the current user experience factor; the user usage scenario with the highest probability is taken as the current user usage scenario.
P(App, Gesture, Load, Experience | Class) represents the probability that the category of the current application program, the gesture information of the current user, the current resource load, and the current user experience factor occur in different user usage scenarios. P(Class) represents the probability that different user usage scenarios occur. P(App, Gesture, Load, Experience) represents the probability that the category of the current application program, the gesture information of the current user, the current resource load, and the current user experience factor occur. P(App, Gesture, Load, Experience | Class), P(Class), and P(App, Gesture, Load, Experience) may each be a probability value calculated from historical data.
In other embodiments, an electronic device vendor may cooperate with an application developer so that the application provides an API interface through which system applications of the electronic device can query the user usage scenario. In this way, a system application of the electronic device can query the current user usage scenario, such as roadmap, battle, stationary, etc., through the API interface.
An API interface is a set of definitions, programs, and protocols that enables communication between the operating system of the electronic device and applications. The API is also a kind of middleware that provides data sharing across platforms. APIs may include the following four types. Remote Procedure Call (RPC) type: inter-program communication is achieved through procedures (or tasks) acting on a shared data buffer. Structured Query Language (SQL) type: data sharing among applications is achieved through a common database, using the standard query language for accessing data. File transfer type: data sharing among applications is achieved by sending formatted files. Message delivery type: small formatted messages are exchanged between loosely or tightly coupled applications, achieving data sharing through direct communication between the programs.
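Illustratively, for the cooperation route, the queryable interface could look like the following sketch. This is purely an assumption for illustration; it is not a real platform API, and the interface and constant names are invented placeholders:

    /** Hypothetical scene-query interface a cooperating game could expose. */
    public interface UsageSceneProvider {
        String SCENE_ROADMAP = "roadmap";
        String SCENE_BATTLE = "battle";
        String SCENE_STATIONARY = "stationary";

        /** Returns the scene the application is currently in. */
        String getCurrentUsageScene();
    }

A system application holding a reference to such a provider would then select a resource policy from the returned scene, in the same way as for the scenes recognized by the classification engine above.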
202. System resources are optimized according to the user usage scenario.
In one possible design, the frequency of the CPU may be adjusted based on the system load, device temperature, remaining battery level, etc. The frequency of the CPU is the frequency of the digital clock signal inside the CPU and may be referred to as the clock frequency. Illustratively, an interactive strategy can be adopted as the CPU frequency-modulation strategy, i.e., the frequency is increased or decreased according to the system load. Alternatively, a temperature control module can be provided, which limits the frequency of the CPU to prevent the device temperature from becoming too high. For example, when the temperature is higher than the temperature control threshold T, the frequency is directly decreased regardless of the current user usage scenario.
For example, assuming that the temperature control threshold T is 50 degrees, the processor frequency may be reduced when the processor temperature is greater than 50 degrees (e.g., 52 degrees). Taking a Kirin 980 as an example, the default frequency of the processor may be 2.5GHz, and when the processor temperature is greater than 50 degrees (e.g., 52 degrees), the processor frequency may be reduced to 1.9GHz. Taking an Intel Core i3-8350K as an example, the default frequency may be 4GHz, and when the processor temperature is greater than 50 degrees (e.g., 52 degrees), the frequency may be reduced to 3.9GHz. Taking an AMD Ryzen 3 1300X as an example, the default frequency may be 3.5GHz, and when the processor temperature is greater than 50 degrees, the frequency may be reduced to 3.4GHz.
In another possible design, the frequency of the CPU may be adjusted according to the user usage scenario.
In some embodiments, for a game scene, 3 different CPU frequencies may be preset, denoted max, medium, and min, with sequentially decreasing values; for example, max, medium, and min may correspond to CPU frequencies of 2.6GHz, 2.2GHz, and 1.9GHz, respectively. The game can run at the CPU frequency corresponding to max in a battle scene to ensure system performance and improve the game experience, at the CPU frequency corresponding to medium in a roadmap scene to reduce heating and power consumption, and at the CPU frequency corresponding to min in a stationary scene to further reduce heating and power consumption.
TABLE 2
[Table 2 is presented as an image in the original publication; it maps the game scenes (battle, roadmap, stationary) to the preset CPU frequency gears (max, medium, min).]
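Illustratively, the gear selection can be expressed as a lookup from the recognized scene to a preset frequency, as sketched below with the example values from the text (2.6/2.2/1.9 GHz); the class name and the default gear are assumptions:

    import java.util.Map;

    public class SceneFrequencyPolicy {
        // Gear values from the example above: max / medium / min.
        private static final Map<String, Double> GEAR_GHZ = Map.of(
                "battle", 2.6,       // max: keep performance high during battle
                "roadmap", 2.2,      // medium: balance performance against heat
                "stationary", 1.9);  // min: minimize heating and power consumption

        /** Returns the target CPU frequency (GHz) for a recognized game scene. */
        public static double targetFrequencyGhz(String scene) {
            return GEAR_GHZ.getOrDefault(scene, 2.2);  // fall back to the medium gear
        }
    }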
Alternatively, in addition to adjusting the CPU frequency based on the user usage scenario, the temperature factor may be further considered. For example, T_safe may be introduced on the basis of the above temperature control threshold T, e.g., T_safe = T × 1.2. In a game battle scene, the frequency may still be increased when the temperature is higher than T, and is gradually reduced only when the temperature exceeds T_safe. This prevents the electronic device from overheating while still ensuring high performance, achieving dual optimization of system performance and power consumption.
For example, assuming that the temperature control threshold T is 50 degrees and T_safe = 50 × 1.2 = 60 degrees, when the processor temperature is greater than 50 and less than 60 degrees, the frequency may still be increased or kept high (at the highest frequency of the processor) to ensure performance; when the processor temperature is greater than 60 degrees, the processor frequency may be gradually decreased. For example, for a Kirin 980 processor with a default frequency of 2.5GHz, when the processor temperature is greater than 50 degrees and less than 60 degrees, the frequency may be increased to 2.6GHz to improve performance; when the temperature exceeds 60 degrees, the frequency may be stepped down, for example, to 2.5GHz or below.
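Illustratively, the two-threshold control for the battle scene can be sketched as follows; the step size, the frequency floor, and the decision cadence are assumptions, while T = 50 and T_safe = T × 1.2 follow the example above:

    public class ThermalPolicy {
        static final double T = 50.0;          // temperature control threshold (deg C)
        static final double T_SAFE = T * 1.2;  // 60 deg C, per T_safe = T * 1.2
        static final double STEP_GHZ = 0.1;    // hypothetical per-decision frequency step
        static final double FLOOR_GHZ = 0.5;   // hypothetical minimum frequency

        /** Returns the next CPU frequency (GHz) for a battle scene at the given temperature. */
        public static double nextFrequencyGhz(double currentGhz, double maxGhz, double tempC) {
            if (tempC <= T_SAFE) {
                // Up to T_safe (even above T) the frequency may still be raised or held high.
                return Math.min(currentGhz + STEP_GHZ, maxGhz);
            }
            // Above T_safe, step the frequency down gradually.
            return Math.max(currentGhz - STEP_GHZ, FLOOR_GHZ);
        }
    }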
Alternatively, if the electronic device (e.g., a UE) is connected to a 4G base station and a 5G base station through LTE-NR dual connectivity, the dual connectivity can be maintained in a battle scene to ensure high performance and user experience, and the connection between the UE and the 5G base station can be released (equivalent to frequency reduction) in a roadmap scene or a stationary scene, so as to reduce device power consumption and avoid wasting network resources. Releasing the connection between the UE and the 5G base station includes: the RRC layer receives a first RRC reconfiguration message sent by the 5G base station, where the first RRC reconfiguration message instructs the RRC layer to release the radio resources occupied on the UE side by the connection configuration between the UE and the 5G base station; and the RRC layer releases, according to the indication of the first RRC reconfiguration message, the radio resources occupied on the UE side, so as to release the connection between the UE and the 5G base station on the UE side.
Alternatively, the electronic device may enable a performance mode in the battle scene to achieve a high frame rate and high performance, thereby improving user experience, and enable a power-saving mode or a balanced mode in the roadmap or stationary scene to reduce device power consumption and avoid wasting network resources.
In addition, processing such as network scheduling, memory resource scheduling, and generation of accurate user portraits can also be performed according to the identified user usage scenario, which is not limited in this application.
Based on the method provided by the embodiment of the application, the user use scene can be identified according to the category of the application program and the gesture information of the user, so that the frequency of the processor is adjusted according to the identified user use scene. It should be understood that, because the gesture is a main way for the user to interact with the device, and the user has a certain difference from the gesture interaction mode of the electronic device in different usage scenarios, the usage scenario of the user can be identified according to the gesture information of the user. The processor can be adjusted to different frequencies based on different user use scenes so as to meet the requirements of the current user use scenes.
As shown in fig. 10, the embodiment of the present application provides a method for resource optimization based on a user usage scenario, which is applied to an electronic device, and includes:
1001. In response to an operation of the user opening a first application program, collect gesture information of the user.
The gesture information includes the gesture operation type, the number of times the gesture is received, and the coordinates corresponding to the gesture.
When the category of the first application program is game application, if the gesture information meets a first condition, determining that a user usage scenario is a first scenario, wherein the first condition comprises: the sliding time length is greater than or equal to the preset time length, and the sliding track does not exceed a first preset range.
If the gesture information meets a second condition, determining that the user usage scene is a second scene, wherein the second condition comprises: the sliding time length is larger than or equal to the preset time length, the sliding track does not exceed a first preset range, the clicking frequency is larger than or equal to the first preset frequency, and the clicking area does not exceed a second preset range.
1002. The processor frequency is adjusted according to the first scenario or the second scenario.
The first scene and the second scene correspond to a first frequency and a second frequency, respectively, and the first frequency and the second frequency are different.
In addition, if the gesture information satisfies a third condition, it is determined that the user usage scenario is a third scenario, where the third condition includes: the number of clicks is less than a first preset number and/or the click frequency is less than a second preset frequency, and the number of slides is less than a second preset number and/or the sliding frequency is less than a third preset frequency. The processor frequency may then be adjusted according to the third scenario; the third scenario corresponds to a third frequency, and the first frequency, the second frequency, and the third frequency are different.
In one possible design, if the gesture information does not satisfy the first condition, the second condition, and the third condition, the processor frequency is adjusted according to at least one of a temperature, a remaining power, and a resource load.
In one possible implementation manner, in response to the operation of opening the second application program by the user, querying the user usage scenario from an API (application programming interface) provided by the second application program; the processor frequency is adjusted according to the user usage scenario. That is, the user usage scenario of the second application may be queried according to the API interface.
In a possible implementation manner, in the second scenario, when the temperature is higher than a first preset temperature threshold and lower than a second preset temperature threshold, if the processor frequency is lower than the second frequency, frequency-boosting processing is performed. The first preset temperature threshold may be T in the above embodiment, and the second preset temperature threshold may be T_safe.
In one possible implementation, the first scenario is a roadmap scenario, and the first condition includes: SliTime > T1 && SliCutRecArea ∈ [R1]; where SliTime represents the duration of a single slide or the total duration of multiple slides, T1 represents a preset duration, SliCutRecArea represents the section rectangle range corresponding to the sliding track, R1 represents the preset first preset range, and && represents a logical AND. The second scenario is a battle scenario, and the second condition includes: SliTime > T1 && SliCutRecArea ∈ [R1] && CliFreq > F1 && CliCutRecArea ∈ [R2]; where CliFreq represents the click frequency, CliCutRecArea represents the section rectangle range corresponding to the click area, F1 represents the first preset frequency, and R2 represents the second preset range.
In a possible implementation manner, if the electronic device is connected to the 4G base station and the 5G base station through a dual connection technology, the method further includes: the dual connectivity of the electronic device with the 4G base station and the 5G base station is maintained in the battle scene, and the connectivity of the electronic device with the 5G base station is released in the roadmap scene.
In a possible implementation manner, the third scenario is a static scenario, and the third condition includes: CliCou < C1& & CliFreq < F2& & SliCou < C2& & SliFreq < F3; where CliCou represents the number of clicks, C1 represents a first preset number, CliFreq represents the click frequency, F2 represents a second preset frequency, F2 is smaller than F1, SliCou represents the number of slips, C2 represents the second preset number, SliFreq represents the slip frequency, and F3 represents a third preset frequency.
In one possible implementation, the method further includes: and releasing the connection between the electronic equipment and the 5G base station in the static scene.
In one possible implementation, the method further includes: when first gesture information of a user is collected, monitoring resource load of electronic equipment, wherein the resource load comprises at least one resource information of processor frequency, memory occupancy rate and network occupancy rate; and if the resource load of the electronic equipment is smaller than the preset threshold value and keeps unchanged, discarding the first gesture information.
In one possible implementation, F(Class) ∝ {F(App), F(Gesture), F(Load)}; wherein F(Class) represents the user usage scenario, F(App) represents the category of the application program, F(Load) represents the resource load information, F(Gesture) represents the condition satisfied by the gesture information, and ∝ represents a positive correlation.
Wherein F(Gesture) = F(CliCou, CliFreq, CliCutRecArea, SliCou, SliFreq, SliTime, SliCutRecArea) = R(CliCou) & R(CliFreq) & R(CliCutRecArea) & R(SliCou) & R(SliFreq) & R(SliTime) & R(SliCutRecArea); where SliTime represents the duration of a single slide or the total duration of multiple consecutive slides, SliCutRecArea represents the section rectangle range corresponding to the sliding track, CliFreq represents the click frequency, CliCutRecArea represents the section rectangle range corresponding to the click region, CliCou represents the number of clicks, SliCou represents the number of slides, SliFreq represents the sliding frequency, R() represents the condition satisfied by the variable in the parentheses, and & represents a joint decision based on each R().
F(Class) ∝ {F(App), F(Gesture), F(Load), F(Experience)}; wherein F(Experience) represents a user experience factor, which is used to indicate at least one of network delay information, stutter or frame-loss information, and the satisfaction fed back by the user.
Figure BDA0003152636320000181
The method comprises the following steps that P (Class | App, Gesture, Load) represents the probability of different user use scenes obtained according to the type of a first application program, Gesture information of a user and resource Load decision, and the user use scene with the highest probability is used as the user use scene; p (App, gettrue, Load | Class) represents the probability of the occurrence of the category of the first application, the Gesture information of the user, and the resource Load in different user usage scenarios. P (class) represents the probability of occurrence of usage scenarios of different users. P (App, gettrue, Load) represents the category of the first application, the Gesture information of the user, and the probability of the resource Load occurring. P (App, gettrue, Load | Class), P (Class), and P (App, gettrue, Load) may be a probability value calculated from history data.
In one possible implementation, the method further includes: and performing at least one of network scheduling, memory resource scheduling and user portrait generation according to the identified user use scene.
Based on the method provided by the embodiment of the application, the user use scene can be identified according to the category of the application program and the gesture information of the user, so that the frequency of the processor is adjusted according to the identified user use scene. It should be understood that, because the gesture is a main way for the user to interact with the device, and the user has a certain difference from the gesture interaction mode of the electronic device in different usage scenarios, the usage scenario of the user can be identified according to the gesture information of the user. The processor can be adjusted to different frequencies based on different user use scenes so as to meet the requirements of the current user use scenes.
It should be noted that, for parts not described in detail in the embodiment of fig. 10, reference may be made to the foregoing embodiment (for example, the embodiment shown in fig. 2), and details are not described herein.
Embodiments of the present application further provide a chip system, as shown in fig. 11, where the chip system includes at least one processor 1101 and at least one interface circuit 1102. The processor 1101 and the interface circuit 1102 may be interconnected by wires. For example, the interface circuit 1102 may be used to receive signals from other devices (e.g., a memory of an electronic device). As another example, the interface circuit 1102 may be used to send signals to other devices (e.g., the processor 1101).
For example, the interface circuit 1102 may read instructions stored in a memory in the electronic device and send the instructions to the processor 1101. The instructions, when executed by the processor 1101, may cause an electronic device (such as the handset 100 shown in fig. 1) to perform the various steps in the embodiments described above.
Of course, the chip system may further include other discrete devices, which is not specifically limited in this embodiment of the present application.
Embodiments of the present application also provide a computer-readable storage medium, which includes computer instructions, and when the computer instructions are executed on an electronic device (such as the mobile phone 100 shown in fig. 1), the mobile phone 100 executes various functions or steps performed by the electronic device in the above-described method embodiments.
Embodiments of the present application further provide a computer program product, which, when running on a computer, causes the computer to execute each function or step performed by the electronic device in the above method embodiments.
The embodiment of the present application further provides an apparatus, where the apparatus may be divided into different logic units or modules according to functions, and each unit or module executes a different function, so that the apparatus executes each function or step executed by the electronic device in the above method embodiments.
From the above description of the embodiments, those skilled in the art will clearly understand that the above functions can be allocated to different function modules as required; that is, the internal structure of the device can be divided into different function modules to perform all or part of the functions described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product; the software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (20)

1. A method for optimizing resources based on user usage scenarios is applied to electronic equipment and is characterized by comprising the following steps:
responding to an operation of opening a first application program by a user, and collecting gesture information of the user on an interface of the first application program; the gesture information comprises a gesture operation type, the receiving times of the gesture and coordinates corresponding to the gesture;
when the type of the first application program is game application, if the gesture information meets a first condition, determining that a user usage scenario is a first scenario, where the first condition includes: the sliding time length is greater than or equal to the preset time length, and the sliding track does not exceed a first preset range;
if the gesture information meets a second condition, determining that the user usage scene is a second scene, wherein the second condition comprises: the sliding time length is greater than or equal to the preset time length, the sliding track does not exceed a first preset range, the clicking frequency is greater than or equal to the first preset frequency, and the clicking area does not exceed a second preset range;
adjusting processor frequency according to the first scene or the second scene, wherein the first scene and the second scene respectively correspond to a first frequency and a second frequency, and the first frequency and the second frequency are different.
2. The method of claim 1,
if the gesture information satisfies a third condition, determining that the user usage scenario is a third scenario, wherein the third condition comprises: the number of clicks is less than a first preset number and/or the click frequency is less than a second preset frequency, and the number of slides is less than a second preset number and/or the sliding frequency is less than a third preset frequency;
and adjusting the processor frequency according to the third scene, wherein the third scene corresponds to a third frequency, and the first frequency, the second frequency and the third frequency are different.
3. The method of claim 2,
if the gesture information does not satisfy the first condition, the second condition and the third condition, adjusting the processor frequency according to at least one of temperature, remaining power and resource load.
4. The method according to any one of claims 1-3, further comprising:
responding to the operation of opening a second application program by a user, and inquiring a user usage scene from an Application Program Interface (API) provided by the second application program;
and adjusting the processor frequency according to the user use scene.
5. The method according to any one of claims 1 to 4,
in the second scenario, when the temperature is higher than a first preset temperature threshold and lower than a second preset temperature threshold, if the processor frequency is lower than the second frequency, frequency boosting processing is performed.
6. The method of any of claims 1-5, wherein the first scenario is a roadmap scenario, and wherein the first condition comprises:
SliTime>T1&&SliCutRecArea∈[R1]
wherein SliTime represents the duration of a single slide or the total duration of multiple consecutive slides, T1 represents a preset duration, SliCutRecArea represents the section rectangle range corresponding to the sliding track, R1 represents the preset first preset range, and && represents a logical AND;
the second scene is a battle scene, and the second condition includes:
SliTime>T1&&SliCutRecArea∈[R1]&&CliFreq>F1&&CliCutRecArea∈[R2]
wherein CliFreq represents the click frequency, CliCutRecArea represents the section rectangle range corresponding to the click area, F1 represents the first preset frequency, and R2 represents the second preset range.
7. The method of claim 6, wherein if the electronic device is connected to a 4G base station and a 5G base station via dual connectivity techniques, the method further comprises:
maintaining the dual connectivity of the electronic device with the fourth-generation 4G base station and the fifth-generation 5G base station in the battle scene, and releasing the connection between the electronic device and the 5G base station in the roadmap scene.
8. The method of claim 6 or 7, wherein the third scenario is a rest scenario, and wherein the third condition comprises:
CliCou<C1&&CliFreq<F2&&SliCou<C2&&SliFreq<F3
where CliCou represents the number of clicks, C1 represents a first preset number, CliFreq represents the click frequency, F2 represents a second preset frequency, F2 is smaller than F1, SliCou represents the number of slips, C2 represents the second preset number, SliFreq represents the slip frequency, and F3 represents a third preset frequency.
9. The method of claim 8, further comprising:
and releasing the connection between the electronic equipment and the 5G base station in the standing scene.
10. The method according to any one of claims 1-9, further comprising:
monitoring the resource load of the electronic equipment when first gesture information of a user is collected, wherein the resource load comprises at least one resource information of processor frequency, memory occupancy rate and network occupancy rate;
and if the resource load of the electronic equipment is smaller than a preset threshold value and keeps unchanged, discarding the first gesture information.
11. The method according to any one of claims 1 to 10,
F(Class)∝{F(App)、F(Gesture)、F(Load)}
wherein F(Class) represents the user usage scenario, F(App) represents the category of the application program, F(Load) represents the resource load information, F(Gesture) represents the condition satisfied by the gesture information, and ∝ represents a positive correlation.
12. The method of claim 11,
F(Gesture)
=F(CliCou,CliFreq,CliCutRecArea,SliCou,SliFreq,SliTime,SliCutRecArea)
=R(CliCou)&R(CliFreq)&R(CliCutRecArea)&R(SliCou)&R(SliFreq)&R(SliTime)&R(SliCutRecArea)
where SliTime represents the duration of a single slide or the total duration of multiple consecutive slides, SliCutRecArea represents the section rectangle range corresponding to the sliding track, CliFreq represents the click frequency, CliCutRecArea represents the section rectangle range corresponding to the click region, CliCou represents the number of clicks, SliCou represents the number of slides, SliFreq represents the sliding frequency, R() represents the condition satisfied by the variable in the parentheses, and & represents a joint decision based on each R().
13. The method according to claim 11 or 12,
F(Class)∝{F(App)、F(Gesture)、F(Load)、F(Experience)}
wherein f (experience) represents a user experience factor, and the user experience factor is used for indicating at least one of network delay information, stuck or frame loss information, and satisfaction of user feedback.
14. The method according to any one of claims 1 to 13,
P(Class | App, Gesture, Load) = P(App, Gesture, Load | Class) × P(Class) / P(App, Gesture, Load)
wherein P(Class | App, Gesture, Load) represents the probability of different user usage scenarios decided from the category of the first application program, the gesture information of the user, and the resource load, and the user usage scenario with the highest probability is selected; P(App, Gesture, Load | Class) represents the probability that the category of the first application program, the gesture information of the user, and the resource load occur in different user usage scenarios; P(Class) represents the probability that different user usage scenarios occur; P(App, Gesture, Load) represents the probability that the category of the first application program, the gesture information of the user, and the resource load occur; P(App, Gesture, Load | Class), P(Class), and P(App, Gesture, Load) may each be a probability value calculated from historical data.
15. The method according to any one of claims 1-14, further comprising:
and performing at least one of network scheduling, memory resource scheduling and user portrait generation according to the identified user use scene.
16. An electronic device, comprising:
the processing unit is used for responding to the operation of opening a first application program by a user and acquiring gesture information of the user on an interface of the first application program; the gesture information comprises a gesture operation type, the receiving times of the gesture and coordinates corresponding to the gesture;
the processing unit is further configured to determine that a usage scenario of a user is a first scenario if the gesture information satisfies a first condition when the type of the first application program is a game application, where the first condition includes: the sliding time length is greater than or equal to the preset time length, and the sliding track does not exceed a first preset range;
if the gesture information meets a second condition, determining that the user usage scene is a second scene, wherein the second condition comprises: the sliding time length is greater than or equal to the preset time length, the sliding track does not exceed a first preset range, the clicking frequency is greater than or equal to the first preset frequency, and the clicking area does not exceed a second preset range;
the processing unit is further configured to adjust a processor frequency according to the first scene or the second scene, where the first scene and the second scene correspond to a first frequency and a second frequency, respectively, and the first frequency and the second frequency are different.
17. The electronic device of claim 16,
the processing unit is further configured to determine that the user usage scenario is a third scenario if the gesture information satisfies a third condition, wherein the third condition comprises: the number of clicks is less than a first preset number and/or the click frequency is less than a second preset frequency, and the number of slides is less than a second preset number and/or the sliding frequency is less than a third preset frequency;
the processing unit is further configured to adjust a processor frequency according to the third scenario, where the third scenario corresponds to a third frequency, and the first frequency, the second frequency, and the third frequency are different.
18. A chip system, comprising one or more interface circuits and one or more processors; the interface circuit and the processor are interconnected through a line;
the chip system is applied to an electronic device comprising a communication module and a memory; the interface circuit to receive signals from the memory and to send the signals to the processor, the signals including computer instructions stored in the memory; the electronic device performs the method of any of claims 1-15 when the processor executes the computer instructions.
19. A computer-readable storage medium comprising computer instructions;
the computer instructions, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-15.
20. An electronic device comprising a processor coupled to a memory, the memory storing program instructions that, when executed by the processor, cause the electronic device to implement the method of any of claims 1-15.
CN202110767941.8A 2021-07-07 2021-07-07 Method and device for optimizing resources based on user use scene Pending CN113641488A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110767941.8A CN113641488A (en) 2021-07-07 2021-07-07 Method and device for optimizing resources based on user use scene

Publications (1)

Publication Number Publication Date
CN113641488A (en) 2021-11-12

Family

ID=78416839

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107861814A (en) * 2017-10-31 2018-03-30 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Resource allocation method and equipment
CN109582463A (en) * 2018-11-30 2019-04-05 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Resource allocation method, device, terminal and storage medium
WO2020182016A1 (en) * 2019-03-09 2020-09-17 Huawei Technologies Co., Ltd. Network connection processing method, related device and computer storage medium
CN110377359A (en) * 2019-07-11 2019-10-25 Nubia Technology Co., Ltd. Game performance optimization method, mobile terminal and computer readable storage medium
CN110809297A (en) * 2019-09-27 2020-02-18 Huawei Technologies Co., Ltd. Data transmission method and electronic equipment
CN110704191A (en) * 2019-09-29 2020-01-17 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Game optimization method, game optimization device and mobile terminal

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115599513A (en) * 2022-04-07 2023-01-13 Honor Device Co., Ltd. Resource scheduling method and electronic equipment
CN115599513B (en) * 2022-04-07 2023-11-03 Honor Device Co., Ltd. Resource scheduling method and electronic equipment
CN115543061A (en) * 2022-04-12 2022-12-30 Honor Device Co., Ltd. Power consumption control method and electronic equipment
CN115543061B (en) * 2022-04-12 2023-11-07 Honor Device Co., Ltd. Power consumption control method and electronic equipment
CN116027880A (en) * 2022-05-16 2023-04-28 Honor Device Co., Ltd. Resource scheduling method and electronic equipment
CN116089055A (en) * 2022-05-16 2023-05-09 Honor Device Co., Ltd. Resource scheduling method and device
CN116027880B (en) * 2022-05-16 2023-11-24 Honor Device Co., Ltd. Resource scheduling method and electronic equipment
CN116089055B (en) * 2022-05-16 2024-04-02 Honor Device Co., Ltd. Resource scheduling method and device
CN116661584A (en) * 2022-10-11 2023-08-29 Honor Device Co., Ltd. Resource scheduling method and related equipment
CN116661584B (en) * 2022-10-11 2024-04-12 Honor Device Co., Ltd. Resource scheduling method and related equipment
CN116089096A (en) * 2023-04-09 2023-05-09 Honor Device Co., Ltd. Load resource scheduling method and electronic equipment
CN116089096B (en) * 2023-04-09 2023-09-01 Honor Device Co., Ltd. Load resource scheduling method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 2021-11-12)