CN111273769A - Equipment control method and device, electronic equipment and storage medium - Google Patents

Equipment control method and device, electronic equipment and storage medium

Info

Publication number
CN111273769A
CN111273769A
Authority
CN
China
Prior art keywords
gesture
hardware
parameter
target
target air
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010044212.5A
Other languages
Chinese (zh)
Other versions
CN111273769B (en)
Inventor
高龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010044212.5A priority Critical patent/CN111273769B/en
Publication of CN111273769A publication Critical patent/CN111273769A/en
Application granted granted Critical
Publication of CN111273769B publication Critical patent/CN111273769B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Abstract

An embodiment of the present application discloses a device control method and apparatus, an electronic device, and a storage medium. The method includes: if an air gesture control scene is entered, recognizing a currently acquired target air gesture; acquiring a hardware operating parameter corresponding to the target air gesture; and configuring the hardware operating parameter to the electronic device, so that the electronic device executes a control function corresponding to the target air gesture based on the hardware operating parameter. In this way, the electronic device is controlled to operate based on the hardware operating parameter corresponding to the currently acquired target air gesture, so that the electronic device can adapt to different air gestures more flexibly and can run the control function corresponding to the target air gesture with performance better matched to that gesture, improving the adaptation between the performance of the electronic device and the air gesture.

Description

Equipment control method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a device control method and apparatus, an electronic device, and a storage medium.
Background
With the development of sensing technology, the operation modes of electronic devices have become more and more abundant. For example, an electronic device may be triggered to perform a corresponding function by an air gesture. However, in related air gesture operations there is a problem that the performance of the electronic device is not adapted to the control function corresponding to the air gesture, which results in a poor user experience.
Disclosure of Invention
In view of the above problems, the present application provides a device control method and apparatus, an electronic device, and a storage medium to alleviate the above problems.
In a first aspect, the present application provides a device control method applied to an electronic device, the method including: if an air gesture control scene is entered, recognizing a currently acquired target air gesture; acquiring a hardware operating parameter corresponding to the target air gesture; and configuring the hardware operating parameter to the electronic device, so that the electronic device executes a control function corresponding to the target air gesture based on the hardware operating parameter.
In a second aspect, the present application provides a device control apparatus, operable on an electronic device, the apparatus including: a gesture recognition unit, configured to recognize a currently acquired target air gesture if an air gesture control scene is entered; an operating parameter acquisition unit, configured to acquire a hardware operating parameter corresponding to the target air gesture; and an operating parameter configuration unit, configured to configure the hardware operating parameter to the electronic device, so that the electronic device executes a control function corresponding to the target air gesture based on the hardware operating parameter.
In a third aspect, the present application provides an electronic device comprising a processor and a memory; one or more programs are stored in the memory and configured to be executed by the processor to implement the methods described above.
In a fourth aspect, the present application provides a computer readable storage medium having program code stored therein, wherein the method described above is performed when the program code is executed by a processor.
According to the device control method and apparatus, the electronic device, and the storage medium, if an air gesture control scene is entered, the currently acquired target air gesture is recognized, the hardware operating parameter corresponding to the target air gesture is then acquired, and the hardware operating parameter is then configured to the electronic device, so that the electronic device can execute the control function corresponding to the target air gesture based on the hardware operating parameter. In this way, the electronic device is controlled to operate based on the hardware operating parameter corresponding to the currently acquired target air gesture, so that the electronic device can adapt to different air gestures more flexibly and can run the control function corresponding to the target air gesture with performance better matched to that gesture, improving the adaptation between the performance of the electronic device and the air gesture.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
FIG. 1 illustrates a schematic diagram of an air gesture operation in an embodiment of the present application;
fig. 2 is a flowchart illustrating an apparatus control method according to an embodiment of the present application;
FIG. 3 is a diagram illustrating a configuration control proposed by an embodiment of the present application;
fig. 4 is a flowchart illustrating a method for controlling a device according to another embodiment of the present application;
fig. 5 is a flowchart illustrating an apparatus control method according to still another embodiment of the present application;
fig. 6 is a flowchart illustrating a device control method according to still another embodiment of the present application;
fig. 7 is a flowchart illustrating a device control method according to still another embodiment of the present application;
fig. 8 is a flowchart illustrating a device control method according to still another embodiment of the present application;
fig. 9 is a block diagram showing a structure of a device control apparatus according to an embodiment of the present application;
fig. 10 is a block diagram showing a configuration of an apparatus control device according to another embodiment of the present application;
fig. 11 is a block diagram of an electronic device of the present application for executing the device control method according to an embodiment of the present application;
fig. 12 illustrates a storage unit in an embodiment of the present application, configured to store or carry program code for implementing a device control method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
With the development of touch display screens, users can perform a series of touch gestures on the touch display screen to complete required functions. For example, in the incoming call interface, the user can trigger to answer the call through a sliding operation. For another example, in a map interface, the user can switch the area currently displayed in the screen area by a slide operation. For another example, in a browser interface, a user may perform page refreshing through a sliding operation to load more content into the current screen.
However, a touch gesture performed on a screen requires that the user's hand be in contact with the screen, which is inconvenient in some scenarios. For example, when the user's hands are wet, touch operations on the screen may not be performed effectively. Air gesture operation therefore emerged. In an operation mode based on air gestures, the user can make a gesture at a certain distance from the electronic device to control it. For example, as shown in fig. 1, when a gesture operation is performed in the air gesture action area shown in fig. 1, an electronic device at a certain distance can be controlled.
However, in research on related air gestures, the inventor found that in related air gesture operations the performance of the electronic device is not adapted to the air gesture, which results in a poor user experience. The inventor found that, in an air gesture control scene, different air gestures correspond to different control functions, and different control functions have different hardware operating parameter requirements. For example, an air gesture representing a click may trigger the start of an application program or the rendering of a new interface; in this case, the inventor found that better performance parameters are needed so that the application program starts faster or the new interface finishes rendering sooner. An air gesture representing a left-right or up-down slide may trigger page turning or the display of content already loaded in the background, which comparatively does not require better hardware performance parameters. However, the inventor found that in the related scenario, no matter which air gesture the current user makes, the electronic device executes the corresponding control function with the same hardware operating parameter, so that the performance of the electronic device is not adapted to the air gesture.
Therefore, the inventor proposes the device control method and apparatus, the electronic device, and the storage medium provided in the present application: if an air gesture control scene is entered, the currently acquired target air gesture is recognized, the hardware operating parameter corresponding to the target air gesture is then acquired, and the hardware operating parameter is then configured to the electronic device, so that the electronic device can execute the control function corresponding to the target air gesture based on the hardware operating parameter. In this way, the electronic device is controlled to operate based on the hardware operating parameter corresponding to the currently acquired target air gesture, so that the electronic device can adapt to different air gestures more flexibly and can run the control function corresponding to the target air gesture with performance better matched to that gesture, improving the adaptation between the performance of the electronic device and the air gesture.
Before the embodiments of the present application are described in further detail, the terms and expressions used in the embodiments are explained below.
Target air gesture: the air gesture currently recognized by the electronic device.
Hardware operating parameter: an operating parameter corresponding to a piece of hardware in the electronic device. Optionally, the hardware may include a processor, a graphics processor, and a random access memory. The hardware operating parameters may include an operating frequency of the processor, the number of cores enabled in the processor, an operating frequency of the graphics processor, and so on.
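As a minimal sketch of how such a set of hardware operating parameters might be represented in code, the following Java class groups the parameters named above. The class and field names are illustrative assumptions and are not part of the patent.

```java
// Illustrative container for the hardware operating parameters listed above.
public final class HardwareOperatingParams {
    public final int cpuMinFreqMHz;   // minimum CPU operating frequency
    public final int cpuMaxFreqMHz;   // maximum CPU operating frequency
    public final int onlineCpuCores;  // number of CPU cores enabled (online)
    public final int gpuFreqMHz;      // GPU operating frequency

    public HardwareOperatingParams(int cpuMinFreqMHz, int cpuMaxFreqMHz,
                                   int onlineCpuCores, int gpuFreqMHz) {
        this.cpuMinFreqMHz = cpuMinFreqMHz;
        this.cpuMaxFreqMHz = cpuMaxFreqMHz;
        this.onlineCpuCores = onlineCpuCores;
        this.gpuFreqMHz = gpuFreqMHz;
    }
}
```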
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Referring to fig. 2, fig. 2 is a flowchart illustrating an apparatus control method according to an embodiment of the present application, where the method includes:
step S110: and if the target space gesture control scene is entered, identifying the currently acquired target space gesture.
It should be noted that there are many ways to detect whether to enter the air gesture control scenario in this embodiment.
As one way, a configuration control for the air gesture operation function can be provided in a settings interface of the electronic device. In this manner, the user can turn the air gesture operation function on or off by touching the configuration control, and it is determined that the air gesture control scene is entered when the configuration control is detected to be in the state representing that the air gesture operation function is enabled. When the system detects that the configuration control is switched to the state representing that the air gesture operation function is enabled, the system can notify other local programs by sending a broadcast. Illustratively, as shown in fig. 3, a configuration control 10 is provided in the settings interface shown in fig. 3; in the left diagram of fig. 3, the configuration control 10 is in the state representing that the air gesture operation function is enabled, and in the right diagram the configuration control 10 is in the state representing that the function is disabled.
It should be noted that, to allow the system to identify the current state of the configuration control, the configuration control may correspond to a configuration file in which a parameter is stored, and the current state of the configuration control is identified by that parameter. Illustratively, a parameter a is stored in the configuration file, where a = 0 indicates that the configuration control is currently in the state representing that air gesture operation is disabled, and a = 1 indicates that it is currently in the state representing that air gesture operation is enabled. The system or another application can therefore identify the state of the configuration control by querying the value of the parameter a in the configuration file, and thereby determine whether the air gesture control scene has currently been entered.
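A hedged sketch of this check is shown below using Android's SharedPreferences as the backing store for the parameter a. The preference file name "gesture_config" and the key "a" are assumptions for illustration; the patent does not specify where the parameter is kept.

```java
import android.content.Context;
import android.content.SharedPreferences;

public final class AirGestureConfig {
    /** Returns true when the configuration control is in the "enabled" state (a == 1). */
    public static boolean isAirGestureSceneEntered(Context context) {
        SharedPreferences prefs =
                context.getSharedPreferences("gesture_config", Context.MODE_PRIVATE);
        return prefs.getInt("a", 0) == 1; // a == 0: disabled, a == 1: enabled
    }
}
```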
Alternatively, whether the air gesture control scene is entered may be determined according to the application currently running in the foreground. It should be noted that some applications enable the air gesture operation function by default when they are started; when such an application is detected to be running in the foreground, it is determined that the air gesture control scene is entered.
Optionally, the applications that enable the air gesture operation function by default at startup may be stored by means of a configuration list, where an application may be recorded in the list by storing its package name. In this way, the package name of the application currently running in the foreground can be matched against the package names in the list; if the match succeeds, the application currently running in the foreground is determined to be one that enables the air gesture operation function by default at startup, and it is therefore determined that the air gesture control scene has been entered.
As one way, an application may run multiple processes, and each process corresponds to a process priority, which corresponds to the current state of the application to which the process belongs in the system (foreground state, background state, etc.). When an interface of an application is being displayed, the priority of the process running that interface is IMPORTANCE_FOREGROUND (value 100). The system allows an application to obtain the priorities of all of its processes at runtime. Using this principle, the priorities of all processes of an application can be traversed; when one of the processes is detected to have the priority IMPORTANCE_FOREGROUND, the application is judged to be running in the foreground, and the package name of the foreground application can then be obtained.
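The process-priority check described above can be sketched with the public Android ActivityManager API, as below. The helper class and method names are illustrative assumptions; only the ActivityManager calls and the IMPORTANCE_FOREGROUND constant come from the Android SDK.

```java
import android.app.ActivityManager;
import android.app.ActivityManager.RunningAppProcessInfo;
import android.content.Context;
import java.util.List;

public final class ForegroundAppDetector {
    /** Returns the name of a process whose importance is IMPORTANCE_FOREGROUND (100), or null. */
    public static String getForegroundPackage(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        List<RunningAppProcessInfo> processes = am.getRunningAppProcesses();
        if (processes == null) {
            return null;
        }
        for (RunningAppProcessInfo info : processes) {
            if (info.importance == RunningAppProcessInfo.IMPORTANCE_FOREGROUND) {
                return info.processName; // usually equals the application's package name
            }
        }
        return null;
    }
}
```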
Step S120: acquire the hardware operating parameter corresponding to the target air gesture.
In this embodiment, there may be multiple ways to acquire the hardware operating parameter corresponding to the target air gesture.
As one way, the correspondence between air gestures and hardware operating parameters may be configured in advance. For example, the correspondence may be stored in a configuration file; in this way, once the target air gesture is acquired, the corresponding hardware operating parameter can be queried based on the correspondence. Optionally, if the target air gesture represents a click operation, the hardware operating parameter corresponding to the target air gesture is determined to be a first hardware operating parameter; if the target air gesture represents a slide operation along a designated direction, the hardware operating parameter corresponding to the target air gesture is determined to be a second hardware operating parameter, where the hardware operating efficiency represented by the first hardware operating parameter is higher than that represented by the second hardware operating parameter.
It should be noted that an air gesture representing a click may trigger the start of an application program or the rendering of a new interface; in this case, better performance parameters are needed so that the application program starts faster or the new interface finishes rendering sooner. An air gesture representing a left-right or up-down slide may trigger page turning or the display of content already loaded in the background, which comparatively does not require better hardware performance parameters. The first hardware operating parameter corresponding to a target air gesture representing a click operation is therefore better, in terms of the hardware performance it represents, than the second hardware operating parameter corresponding to a target air gesture representing a slide operation along a designated direction.
Better performance can be understood as the corresponding hardware operating parameter having larger values. For example, if the hardware operating parameters include the minimum CPU frequency, the maximum CPU frequency, and the number of online CPU cores, the first hardware operating parameter may be a minimum frequency of 1.6 GHz and a maximum frequency of 2.3 GHz with a given number of online cores, and the second hardware operating parameter may be a minimum frequency of 1.3 GHz and a maximum frequency of 2.3 GHz with 6 online cores. Since the minimum CPU frequency of the first hardware operating parameter is greater than that of the second hardware operating parameter, the electronic device can process more data per unit time when operating based on the first hardware operating parameter. Processing more data can be understood as rendering more images, performing more floating-point operations, and so on.
It should be noted that the minimum frequency and the maximum frequency mentioned above are operating frequencies of the CPU.
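A hedged sketch of this pre-configured correspondence is given below: a click gesture maps to the first (higher-performance) parameter set and a directional slide to the second. The numeric values mirror the example figures in the text; the enum, class, and string format are illustrative assumptions.

```java
import java.util.HashMap;
import java.util.Map;

public final class GestureParamTable {
    public enum AirGesture { CLICK, SLIDE_DIRECTIONAL }

    // Simple textual representation of a hardware operating parameter set.
    private static final Map<AirGesture, String> TABLE = new HashMap<>();
    static {
        TABLE.put(AirGesture.CLICK,
                "cpuMin=1.6GHz, cpuMax=2.3GHz");          // first hardware operating parameter
        TABLE.put(AirGesture.SLIDE_DIRECTIONAL,
                "cpuMin=1.3GHz, cpuMax=2.3GHz, cores=6");  // second hardware operating parameter
    }

    public static String lookup(AirGesture gesture) {
        return TABLE.get(gesture);
    }
}
```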
Step S130: configure the hardware operating parameter to the electronic device, so that the electronic device executes a control function corresponding to the target air gesture based on the hardware operating parameter.
Optionally, in the Android system, the hardware operating parameter corresponding to the target air gesture may be configured to the hardware of the electronic device by calling a setAction function, for example setAction(S1, timeout), where the parameter S1 is the determined hardware operating parameter corresponding to the target air gesture, and timeout is the determined duration for which the electronic device maintains that hardware operating parameter. For example, if timeout is 2, the electronic device operates for 2 seconds based on the hardware operating parameter corresponding to the target air gesture.
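The sketch below shows how such a call might look. setAction(S1, timeout) is the interface named by the text; it is assumed here to be a vendor performance service available on the device rather than a standard Android SDK API, so the PerfService interface and the helper name are illustrative stubs.

```java
public final class GestureParamConfigurator {
    /** Assumed vendor performance interface matching the setAction(S1, timeout) call in the text. */
    public interface PerfService {
        // s1: hardware operating parameter set; timeoutSeconds: how long to keep it applied.
        void setAction(String s1, int timeoutSeconds);
    }

    public static void applyForGesture(PerfService perfService,
                                       String hardwareOperatingParams,
                                       int timeoutSeconds) {
        // e.g. applyForGesture(service, "cpuMin=1.6GHz,cpuMax=2.3GHz", 2)
        // keeps the parameter set for 2 seconds, as in the example above.
        perfService.setAction(hardwareOperatingParams, timeoutSeconds);
    }
}
```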
It should be noted that if the electronic device keeps operating in a high-performance state for a long time, more processing resources and power are inevitably consumed. Therefore, in order to complete the control function corresponding to the target air gesture more efficiently and improve interaction fluency, while effectively saving processing resources and power overall, the duration for which the electronic device maintains the hardware operating parameter corresponding to the target air gesture can be determined according to the actual situation.
As one way, the duration for which the electronic device maintains the hardware operating parameter corresponding to the target air gesture may be determined according to the current battery level. For example, if the battery of the electronic device is in a sufficient state, for example the remaining charge is above 80%, the duration for maintaining the hardware operating parameter corresponding to the target air gesture may be relatively longer, where "relatively longer" is with respect to the duration used when the remaining charge is below 80%. For example, if the current remaining charge is 85%, the determined duration for maintaining the hardware operating parameter corresponding to the target air gesture may be 3 seconds; if the current remaining charge is 65%, the determined duration may be 1 second.
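A minimal sketch of this battery-dependent hold time is shown below; the 80% boundary and the 3 s / 1 s values follow the example in the text, while the class and method names are assumptions.

```java
public final class HoldTimePolicy {
    /** Returns how long (in seconds) to keep the gesture's hardware operating parameter. */
    public static int holdSecondsFor(int remainingBatteryPercent) {
        // Above 80% remaining charge the parameter is held relatively longer.
        return remainingBatteryPercent > 80 ? 3 : 1;
    }
}
```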
It can be understood that, after the duration ends, the electronic device may restore the hardware operating parameter originally used before switching to the hardware operating parameter corresponding to the target air gesture, or it may continue to maintain the hardware operating parameter corresponding to the target air gesture until a different target air gesture is recognized.
In the device control method provided by this embodiment, if an air gesture control scene is entered, the currently acquired target air gesture is recognized, the hardware operating parameter corresponding to the target air gesture is then acquired, and the hardware operating parameter is then configured to the electronic device, so that the electronic device can execute the control function corresponding to the target air gesture based on the hardware operating parameter. In this way, the electronic device is controlled to operate based on the hardware operating parameter corresponding to the currently acquired target air gesture, so that the electronic device can adapt to different air gestures more flexibly and can run the control function corresponding to the target air gesture with performance better matched to that gesture, improving the adaptation between the performance of the electronic device and the air gesture.
Referring to fig. 4, a flowchart of an apparatus control method according to an embodiment of the present application is shown, where the method includes:
step S210: and if the target space gesture control scene is entered, identifying the currently acquired target space gesture.
Step S220: and acquiring a current operation scene.
Step S230: and acquiring a hardware operation parameter mapping relation corresponding to the current operation scene, wherein the hardware operation parameter mapping relation represents a hardware operation parameter corresponding to each spaced gesture in the current operation scene.
It should be noted that, even for the same air-separating gesture, the corresponding control functions in different operation scenarios may be different. For example, for an empty gesture representing a click, an application program is triggered to start in a desktop scene, and in an interface of the application program, the control function corresponding to the empty gesture representing the click is touched to a certain text, a button or a link. Thus, even if the corresponding control functions may be different in different operational scenarios for the same air-break gesture, the corresponding required hardware operational parameters may be different.
As one way, a configuration file (which may be the configuration file storing the correspondence between air gestures and hardware operating parameters) may be stored in the electronic device. When the electronic device detects that the operating scene has changed, the hardware operating parameters corresponding to the air gestures stored in the configuration file can be updated in advance, so that when the hardware operating parameter corresponding to the target air gesture needs to be acquired, the configuration file can be queried directly without identifying the operating scene again; this improves the configuration efficiency of the hardware operating parameters and also improves the overall response timeliness of air gesture operation. Illustratively, a hardware operating parameter set S2 corresponds to the desktop scene, and a hardware operating parameter set S3 corresponds to the browser operating scene. Then, when the device is detected to be currently in the desktop scene, the correspondence between air gestures and hardware operating parameters in the configuration file can immediately be switched to the set S2, and when the device is detected to be currently in the browser operating scene, the correspondence can immediately be switched to the set S3.
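A hedged sketch of switching the active gesture-to-parameter set when the operating scene changes is shown below, mirroring the sets S2 (desktop) and S3 (browser) in the example; all class, enum, and map names are illustrative assumptions.

```java
import java.util.Map;

public final class SceneMappingSwitcher {
    public enum Scene { DESKTOP, BROWSER }

    private final Map<String, String> desktopSetS2; // gesture name -> parameter set
    private final Map<String, String> browserSetS3;
    private volatile Map<String, String> activeMapping;

    public SceneMappingSwitcher(Map<String, String> desktopSetS2,
                                Map<String, String> browserSetS3) {
        this.desktopSetS2 = desktopSetS2;
        this.browserSetS3 = browserSetS3;
        this.activeMapping = desktopSetS2;
    }

    /** Called when a scene change is detected; the active mapping is updated in advance. */
    public void onSceneChanged(Scene scene) {
        activeMapping = (scene == Scene.BROWSER) ? browserSetS3 : desktopSetS2;
    }

    /** Direct lookup at gesture time, without re-identifying the scene. */
    public String parametersFor(String gestureName) {
        return activeMapping.get(gestureName);
    }
}
```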
Step S240: acquire the hardware operating parameter corresponding to the target air gesture based on the hardware operating parameter mapping relationship, and configure the hardware operating parameter to the electronic device so that the electronic device executes a control function corresponding to the target air gesture based on the hardware operating parameter.
In the device control method provided by this embodiment, if an air gesture control scene is entered, the currently acquired target air gesture is recognized, the hardware operating parameter corresponding to the target air gesture is then acquired, and the hardware operating parameter is then configured to the electronic device, so that the electronic device can execute the control function corresponding to the target air gesture based on the hardware operating parameter. In this way, the electronic device is controlled to operate based on the hardware operating parameter corresponding to the currently acquired target air gesture, so that the electronic device can adapt to different air gestures more flexibly and can run the control function corresponding to the target air gesture with performance better matched to that gesture, improving the adaptation between the performance of the electronic device and the air gesture. In addition, in this embodiment, the hardware operating parameter mapping relationship corresponding to the current operating scene can be acquired in combination with the current operating scene, further improving the accuracy and granularity of hardware operating parameter acquisition.
Referring to fig. 5, a flowchart of an apparatus control method according to an embodiment of the present application is shown, where the method includes:
step S310: and if the target space gesture control scene is entered, identifying the currently acquired target space gesture.
Step S320: and acquiring the application program currently running in the foreground.
Step S330: and determining the application program running in the foreground as a current running scene.
Step S340: and acquiring a hardware operation parameter mapping relation corresponding to the application program operated by the foreground, wherein the hardware operation parameter mapping relation represents a hardware operation parameter corresponding to each spaced gesture in the current operation scene.
Step S350: and acquiring the hardware operating parameters corresponding to the target air-separating gesture based on the hardware operating parameter mapping relation. Configuring the hardware operating parameter to the electronic device for the electronic device to execute a control function corresponding to the target spaced gesture based on the hardware operating parameter.
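A hedged sketch of treating the foreground application as the operating scene is shown below: the package name keys into a per-application gesture-to-parameter mapping. The class name and the map layout are illustrative assumptions, not from the patent.

```java
import java.util.HashMap;
import java.util.Map;

public final class PerAppMappingStore {
    // packageName -> (gesture name -> hardware operating parameter set)
    private final Map<String, Map<String, String>> byPackage = new HashMap<>();

    public void register(String packageName, Map<String, String> gestureToParams) {
        byPackage.put(packageName, gestureToParams);
    }

    /** Returns the mapping relationship for the application currently running in the foreground. */
    public Map<String, String> mappingForForegroundApp(String foregroundPackageName) {
        return byPackage.get(foregroundPackageName);
    }
}
```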
In the device control method provided by this embodiment, if an air gesture control scene is entered, the currently acquired target air gesture is recognized, the hardware operating parameter corresponding to the target air gesture is then acquired, and the hardware operating parameter is then configured to the electronic device, so that the electronic device can execute the control function corresponding to the target air gesture based on the hardware operating parameter. In this way, the electronic device is controlled to operate based on the hardware operating parameter corresponding to the currently acquired target air gesture, so that the electronic device can adapt to different air gestures more flexibly and can run the control function corresponding to the target air gesture with performance better matched to that gesture, improving the adaptation between the performance of the electronic device and the air gesture. In addition, in this embodiment, the hardware operating parameter mapping relationship corresponding to the current operating scene can be acquired in combination with the application currently running in the foreground, further improving the accuracy and granularity of hardware operating parameter acquisition.
Referring to fig. 6, a flowchart of an apparatus control method according to an embodiment of the present application is shown, where the method includes:
step S410: and if the target space gesture control scene is entered, identifying the currently acquired target space gesture.
Step S420: and acquiring the operation performance grade of the system configuration, wherein the power consumption corresponding to different operation performance grades is different.
As one approach, the operational performance level may be set for a user in a settings interface. Illustratively, the operational performance level may include a high level, a medium level, and a low level itself. Advanced characterization electronics can operate with higher performance and correspondingly higher power consumption. The intermediate level representation electronics will operate with a more balanced performance and the corresponding power consumption will be lower than the high level. The medium level characterization electronics will operate at a lower performance and the corresponding power consumption will be lower than the medium level.
Illustratively, taking the CPU operating frequency included in the hardware operating parameters as an example, in the case that the operating performance level is high, the CPU operating frequency corresponding to the target space gesture operation representing the click may be at least 2.0 GHz. And under the condition that the operation performance level is middle, the lowest CPU working frequency corresponding to the target air-spaced gesture operation representing the click can be 1.6 GHz. Under the condition that the operation performance level is low, the CPU working frequency corresponding to the target air-separating gesture operation representing clicking can be 1.3GHz at the lowest.
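A minimal sketch of this level-dependent minimum CPU frequency for the click gesture is given below, using the 2.0 / 1.6 / 1.3 GHz figures from the example; the enum and method names are illustrative assumptions.

```java
public final class PerformanceLevelPolicy {
    public enum Level { HIGH, MEDIUM, LOW }

    /** Minimum CPU frequency (in MHz) for the air gesture representing a click. */
    public static int clickGestureMinCpuFreqMHz(Level level) {
        switch (level) {
            case HIGH:   return 2000; // 2.0 GHz
            case MEDIUM: return 1600; // 1.6 GHz
            default:     return 1300; // 1.3 GHz (LOW)
        }
    }
}
```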
Step S430: determine the operating performance level as the current operating scene.
Step S440: acquire the hardware operating parameter mapping relationship corresponding to the operating performance level, where the mapping relationship represents the hardware operating parameter corresponding to each air gesture in the current operating scene.
Step S450: acquire the hardware operating parameter corresponding to the target air gesture based on the hardware operating parameter mapping relationship, and configure the hardware operating parameter to the electronic device so that the electronic device executes a control function corresponding to the target air gesture based on the hardware operating parameter.
It should be noted that all of the parameters included in the hardware operating parameter corresponding to a higher operating performance level may be better than those included in the hardware operating parameter corresponding to a lower operating performance level, or only some of them may be better. For example, where the hardware operating parameters include the operating frequency of the processor, the number of cores enabled in the processor, and the operating frequency of the graphics processor, all three may be higher in the hardware operating parameter corresponding to the higher operating performance level than in the hardware operating parameter corresponding to the lower operating performance level, so as to achieve better performance. Alternatively, at most two of the operating frequency of the processor, the number of cores enabled in the processor, and the operating frequency of the graphics processor may be higher in the hardware operating parameter corresponding to the higher operating performance level than in the hardware operating parameter corresponding to the lower operating performance level.
In the device control method provided by this embodiment, if an air gesture control scene is entered, the currently acquired target air gesture is recognized, the hardware operating parameter corresponding to the target air gesture is then acquired, and the hardware operating parameter is then configured to the electronic device, so that the electronic device can execute the control function corresponding to the target air gesture based on the hardware operating parameter. In this way, the electronic device is controlled to operate based on the hardware operating parameter corresponding to the currently acquired target air gesture, so that the electronic device can adapt to different air gestures more flexibly and can run the control function corresponding to the target air gesture with performance better matched to that gesture, improving the adaptation between the performance of the electronic device and the air gesture. In addition, in this embodiment, the hardware operating parameter mapping relationship corresponding to the current operating scene can be acquired in combination with the current operating performance level, further improving the accuracy and granularity of hardware operating parameter acquisition.
Referring to fig. 7, a flowchart of an apparatus control method according to an embodiment of the present application is shown, where the method includes:
step S510: and if the target space gesture control scene is entered, identifying the currently acquired target space gesture.
Step S520: and acquiring the target air gesture acquired last time.
Step S530: and detecting whether the target air-separating gesture collected last time is the same as the target air-separating gesture collected currently.
And if the current acquired target air-separating gesture is the same as the last acquired target air-separating gesture, ending the process.
Step S540: and if the current acquired target space-exclusion gesture is different from the last acquired target space-exclusion gesture, acquiring hardware operating parameters corresponding to the target space-exclusion gesture.
It should be noted that, in the same operation scenario, the hardware operation parameters corresponding to the same air-separating operation gesture may be the same, so as to avoid resource waste caused by continuously configuring the same hardware operation parameters to the electronic device for multiple times, when the currently acquired target air-separating gesture is different from the last acquired target air-separating gesture, the hardware operation parameters corresponding to the currently acquired target air-separating gesture may be further acquired so as to be configured to the electronic device. Correspondingly, when the currently acquired target air-separating gesture is detected to be the same as the last acquired target air-separating gesture, subsequent hardware operation parameters corresponding to the currently acquired target air-separating gesture are not acquired.
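A minimal sketch of this duplicate-gesture check is shown below; reconfiguration is only triggered when the newly recognized gesture differs from the previous one. The class and method names are illustrative assumptions.

```java
public final class GestureChangeFilter {
    private String lastGesture;

    /** Returns true when the hardware operating parameter should be (re)acquired and configured. */
    public synchronized boolean shouldReconfigure(String currentGesture) {
        if (currentGesture.equals(lastGesture)) {
            return false; // same gesture as last time: keep the existing configuration
        }
        lastGesture = currentGesture;
        return true;
    }
}
```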
Step S550: configure the hardware operating parameter to the electronic device, so that the electronic device executes a control function corresponding to the target air gesture based on the hardware operating parameter.
In the device control method provided by this embodiment, if an air gesture control scene is entered, the currently acquired target air gesture is recognized, the hardware operating parameter corresponding to the target air gesture is then acquired, and the hardware operating parameter is then configured to the electronic device, so that the electronic device can execute the control function corresponding to the target air gesture based on the hardware operating parameter. In this way, the electronic device is controlled to operate based on the hardware operating parameter corresponding to the currently acquired target air gesture, so that the electronic device can adapt to different air gestures more flexibly and can run the control function corresponding to the target air gesture with performance better matched to that gesture, improving the adaptation between the performance of the electronic device and the air gesture. In addition, in this embodiment, the acquisition of the hardware operating parameter corresponding to the target air gesture is triggered only if the currently acquired target air gesture is different from the last acquired one, avoiding the resource consumption caused by repeatedly configuring the same hardware operating parameter.
Referring to fig. 8, a flowchart of an apparatus control method according to an embodiment of the present application is shown, where the method includes:
step S610: and if the target space gesture control scene is entered, identifying the currently acquired target space gesture.
Step S620: and acquiring hardware operation parameters corresponding to the target air-separating gesture, wherein each hardware operation parameter corresponds to an operation performance parameter, and the operation performance parameters represent the data volume processed in each unit time.
Step S630: and acquiring the operation performance parameters of the hardware operation parameters corresponding to the target air-separating gesture.
Step S640: and detecting whether the operation performance parameter of the hardware operation parameter corresponding to the target air-separating gesture is a target operation performance parameter, wherein the target operation performance parameter is the operation performance parameter with the maximum data volume processed in unit time.
Step S650: and if so, configuring the electronic equipment to keep running based on the target running performance parameter when the electric quantity is higher than a target threshold value, so that the electronic equipment executes a control function corresponding to the target air-separating gesture based on the target hardware running parameter.
It should be noted that configuring hardware operating parameters to hardware is also resource consuming in certain situations. Then, in order to better achieve a fast response to the operation of the spaced gesture, and at the same time also to achieve the purpose of reducing power consumption, operation based on the target operating performance parameter may be maintained when the electronic device is sufficiently charged (e.g., during a period when the charge is above a target threshold), and in such a case, after the electronic device operates based on the aforementioned target hardware operating parameters, when a new target air-separating operation gesture is detected, a stage of judging whether the electric quantity of the electronic device is higher than a target threshold value is needed first, if the electric quantity of the electronic equipment is higher than the target threshold value, the operation of acquiring the hardware operating parameters corresponding to the new target air-separating operation gesture is not executed, and if the electric quantity of the electronic equipment is not higher than the target threshold value, acquiring the hardware operating parameters corresponding to the new target air-separating operation gesture, and then configuring the hardware operating parameters corresponding to the new target air-separating operation gesture to the electronic equipment.
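A hedged sketch of this battery-gated behaviour is shown below: while the charge is above the target threshold and the top-performance parameter is already applied, a new gesture does not trigger re-acquisition; otherwise the new gesture's parameter is acquired and configured. The threshold value, class, and method names are assumptions for illustration.

```java
import java.util.function.Consumer;
import java.util.function.Function;

public final class BatteryGatedConfigurator {
    private static final int TARGET_THRESHOLD_PERCENT = 80; // illustrative threshold
    private boolean topPerformanceApplied;

    /** Records whether the target (top) operating performance parameter is currently applied. */
    public void markTopPerformanceApplied(boolean applied) {
        topPerformanceApplied = applied;
    }

    public void onNewGesture(String gesture, int batteryPercent,
                             Function<String, String> acquireParams,
                             Consumer<String> configure) {
        if (topPerformanceApplied && batteryPercent > TARGET_THRESHOLD_PERCENT) {
            return; // keep running on the target operating performance parameter
        }
        String params = acquireParams.apply(gesture); // gesture -> hardware operating parameter
        configure.accept(params);                     // configure it to the electronic device
    }
}
```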
In the device control method provided by this embodiment, if an air gesture control scene is entered, the currently acquired target air gesture is recognized, the hardware operating parameter corresponding to the target air gesture is then acquired, and the hardware operating parameter is then configured to the electronic device, so that the electronic device can execute the control function corresponding to the target air gesture based on the hardware operating parameter. In this way, the electronic device is controlled to operate based on the hardware operating parameter corresponding to the currently acquired target air gesture, so that the electronic device can adapt to different air gestures more flexibly and can run the control function corresponding to the target air gesture with performance better matched to that gesture, improving the adaptation between the performance of the electronic device and the air gesture. In addition, in this embodiment, during the configuration of the hardware operating parameter it is detected whether the operating performance parameter of the hardware operating parameter corresponding to the target air gesture is the target operating performance parameter; if so, the electronic device is configured to keep operating based on that operating performance parameter while the battery level is above the target threshold, so that the electronic device maintains better hardware operating parameters whenever the charge is sufficient, improving the efficiency with which it responds to air gestures.
Referring to fig. 9, a block diagram of a device control apparatus 700 according to an embodiment of the present application is shown, where the apparatus 700 includes:
the gesture recognition unit 710 is configured to recognize a currently acquired target air gesture if the air gesture control scene is entered.
An operation parameter obtaining unit 720, configured to obtain a hardware operation parameter corresponding to the target spaced gesture.
As one mode, the operating parameter acquisition unit 720 is specifically configured to: acquire the current operating scene; acquire the hardware operating parameter mapping relationship corresponding to the current operating scene, where the mapping relationship represents the hardware operating parameter corresponding to each air gesture in the current operating scene; and acquire the hardware operating parameter corresponding to the target air gesture based on the mapping relationship. In this manner, optionally, the operating parameter acquisition unit 720 is specifically configured to: acquire the application currently running in the foreground; determine the application running in the foreground as the current operating scene; and acquire the hardware operating parameter mapping relationship corresponding to the application running in the foreground. In this manner, optionally, the operating parameter acquisition unit 720 is specifically configured to: acquire the operating performance level configured in the system, where different operating performance levels correspond to different power consumption; determine the operating performance level as the current operating scene; and acquire the hardware operating parameter mapping relationship corresponding to the operating performance level.
As another mode, the operating parameter acquisition unit 720 is specifically configured to: if the target air gesture represents a click operation, determine that the hardware operating parameter corresponding to the target air gesture is a first hardware operating parameter; and if the target air gesture represents a slide operation along a designated direction, determine that the hardware operating parameter corresponding to the target air gesture is a second hardware operating parameter, where the hardware operating efficiency represented by the first hardware operating parameter is higher than that represented by the second hardware operating parameter.
The operating parameter configuration unit 730 is configured to configure the hardware operating parameter to the electronic device, so that the electronic device executes a control function corresponding to the target air gesture based on the hardware operating parameter.
As one mode, as shown in fig. 10, the apparatus further includes:
the gesture comparison unit 740 is configured to obtain a target air gesture acquired last time; and comparing whether the currently acquired target air-separating gesture is different from the last acquired target air-separating gesture. In this way, the operation parameter obtaining unit 720 is configured to, if the currently acquired target air-separating gesture is different from the target air-separating gesture acquired last time, perform the obtaining of the hardware operation parameter corresponding to the target air-separating gesture.
As one way, each hardware operating parameter corresponds to an operating performance parameter that represents the amount of data processed per unit time. The operating parameter configuration unit 730 is specifically configured to: acquire the operating performance parameter of the hardware operating parameter corresponding to the target air gesture; detect whether that operating performance parameter is the target operating performance parameter, where the target operating performance parameter is the operating performance parameter with the largest amount of data processed per unit time; and if so, configure the electronic device to keep operating based on that operating performance parameter while the battery level is above a target threshold.
The present application provides a device control apparatus in which, if an air gesture control scene is entered, the currently acquired target air gesture is recognized, the hardware operating parameter corresponding to the target air gesture is then acquired, and the hardware operating parameter is then configured to the electronic device, so that the electronic device can execute the control function corresponding to the target air gesture based on the hardware operating parameter. In this way, the electronic device is controlled to operate based on the hardware operating parameter corresponding to the currently acquired target air gesture, so that the electronic device can adapt to different air gestures more flexibly and can run the control function corresponding to the target air gesture with performance better matched to that gesture, improving the adaptation between the performance of the electronic device and the air gesture.
It should be noted that the device embodiment and the method embodiment in the present application correspond to each other, and specific principles in the device embodiment may refer to the contents in the method embodiment, which is not described herein again.
An electronic device provided by the present application will be described below with reference to fig. 11.
Referring to fig. 11, based on the device control method, an embodiment of the present application further provides an electronic device 200 including a processor 102 capable of executing the device control method, where the electronic device 200 may be a smart phone, a tablet computer, a computer, or a portable computer. The electronic device 200 also includes a memory 104 and a network module 106. The memory 104 stores a program that can execute the content of the foregoing embodiments, and the processor 102 can execute the program stored in the memory 104. The internal structure of the processor 102 may be as shown in fig. 1.
Processor 102 may include, among other things, one or more cores for processing data and a message matrix unit. The processor 102 connects the various components throughout the electronic device 200 using various interfaces and circuitry, and performs the various functions of the electronic device 200 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 104 and invoking data stored in the memory 104. Alternatively, the processor 102 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 102 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like, where the CPU mainly handles the operating system, the user interface, application programs, and so on; the GPU is responsible for rendering and drawing display content; and the modem is used to handle wireless communication. It is understood that the modem may also not be integrated into the processor 102 and may instead be implemented by a separate communication chip.
The memory 104 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 104 may be used to store instructions, programs, code sets, or instruction sets. The memory 104 may include a stored-program area and a stored-data area, where the stored-program area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, and the like. The stored-data area may also store data created by the terminal 100 in use, such as a phonebook, audio and video data, chat log data, and the like.
The network module 106 is configured to receive and transmit electromagnetic waves, and implement interconversion between the electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices, for example, an audio playing device. The network module 106 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The network module 106 may communicate with various networks, such as the internet, an intranet, a wireless network, or with other devices via a wireless network. The wireless network may comprise a cellular telephone network, a wireless local area network, or a metropolitan area network. For example, the network module 106 may interact with a base station.
Referring to fig. 12, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable storage medium 1100 stores program code that can be called by a processor to perform the methods described in the above method embodiments.
The computer-readable storage medium 1100 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read-Only Memory), an EPROM, a hard disk, or a ROM. Optionally, the computer-readable storage medium 1100 includes a non-volatile computer-readable storage medium. The computer-readable storage medium 1100 has storage space for program code 1110 that performs any of the method steps described above. The program code can be read from or written to one or more computer program products. The program code 1110 may, for example, be compressed in a suitable form.
In summary, according to the device control method and apparatus, the electronic device, and the storage medium provided by the present application, when an air gesture control scene is entered, the currently acquired target air gesture is identified, the hardware operating parameter corresponding to the target air gesture is acquired, and the hardware operating parameter is then configured to the electronic device, so that the electronic device can execute the control function corresponding to the target air gesture based on that hardware operating parameter. Because the electronic device is controlled to operate based on the hardware operating parameter corresponding to the currently acquired target air gesture, it can adapt to different air gestures more flexibly and can run the control function corresponding to the target air gesture with performance that better matches that gesture, which improves how well the performance of the electronic device fits the air gesture.
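As an illustration of the flow summarized above, the following minimal Python sketch shows one way the mapping between air gestures and hardware operating parameters could be organized and applied. All scene names, gesture names, function names, and parameter values are hypothetical assumptions introduced for readability and are not taken from the disclosure.

# Illustrative sketch only: the scenes, gestures, and parameter values below are
# hypothetical assumptions, not values from the patent disclosure.

# Hardware operating parameter mapping relation, keyed by (operation scene, air gesture).
PARAMETER_MAP = {
    ("video_player", "slide_up"): {"cpu_freq_mhz": 1800, "refresh_rate_hz": 90},
    ("video_player", "click"):    {"cpu_freq_mhz": 1200, "refresh_rate_hz": 60},
    ("browser",      "slide_up"): {"cpu_freq_mhz": 1500, "refresh_rate_hz": 60},
}
DEFAULT_PARAMETER = {"cpu_freq_mhz": 1200, "refresh_rate_hz": 60}


def get_hardware_parameter(scene, gesture):
    # Look up the hardware operating parameter mapped to this gesture in this scene.
    return PARAMETER_MAP.get((scene, gesture), DEFAULT_PARAMETER)


def configure_hardware(parameter):
    # Stand-in for configuring the parameter to the electronic device.
    print("configuring hardware:", parameter)


def execute_control_function(gesture):
    # Stand-in for running the control function bound to the gesture.
    print("executing control function for gesture:", gesture)


def handle_air_gesture(scene, gesture):
    parameter = get_hardware_parameter(scene, gesture)
    configure_hardware(parameter)
    execute_control_function(gesture)


if __name__ == "__main__":
    handle_air_gesture("video_player", "slide_up")

In this sketch the lookup key combines the current operation scene with the recognized gesture, which mirrors the idea of keeping a separate hardware operating parameter mapping relation per scene.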
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application and not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications or replacements do not cause the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A device control method, applied to an electronic device, the method comprising:
if an air gesture control scene is entered, identifying a currently acquired target air gesture;
acquiring a hardware operating parameter corresponding to the target air gesture; and
configuring the hardware operating parameter to the electronic device, so that the electronic device executes a control function corresponding to the target air gesture based on the hardware operating parameter.
2. The method of claim 1, wherein the acquiring of the hardware operating parameter corresponding to the target air gesture comprises:
acquiring a current operation scene;
acquiring a hardware operating parameter mapping relation corresponding to the current operation scene, wherein the hardware operating parameter mapping relation represents the hardware operating parameter corresponding to each air gesture in the current operation scene; and
acquiring the hardware operating parameter corresponding to the target air gesture based on the hardware operating parameter mapping relation.
3. The method of claim 2, wherein the acquiring of the current operation scene comprises:
acquiring an application program currently running in the foreground; and
determining the application program running in the foreground as the current operation scene;
and the acquiring of the hardware operating parameter mapping relation corresponding to the current operation scene comprises:
acquiring a hardware operating parameter mapping relation corresponding to the application program running in the foreground.
4. The method of claim 2, wherein the acquiring of the current operation scene comprises:
acquiring an operation performance grade configured by the system, wherein different operation performance grades correspond to different power consumption; and
determining the operation performance grade as the current operation scene;
and the acquiring of the hardware operating parameter mapping relation corresponding to the current operation scene comprises:
acquiring a hardware operating parameter mapping relation corresponding to the operation performance grade.
5. The method of claim 1, wherein the acquiring of the hardware operating parameter corresponding to the target air gesture comprises:
if the target air gesture represents a click operation, determining that the hardware operating parameter corresponding to the target air gesture is a first hardware operating parameter; and
if the target air gesture represents a slide operation along a designated direction, determining that the hardware operating parameter corresponding to the target air gesture is a second hardware operating parameter;
wherein the hardware operating efficiency represented by the first hardware operating parameter is higher than the hardware operating efficiency represented by the second hardware operating parameter.
6. The method of claim 1, wherein before the acquiring of the hardware operating parameter corresponding to the target air gesture, the method further comprises:
acquiring the target air gesture acquired last time; and
if the currently acquired target air gesture is different from the target air gesture acquired last time, executing the step of acquiring the hardware operating parameter corresponding to the target air gesture.
7. The method of claim 1, wherein each hardware operating parameter corresponds to an operation performance parameter that characterizes an amount of data processed per unit time, and the configuring of the hardware operating parameter to the electronic device comprises:
acquiring the operation performance parameter of the hardware operating parameter corresponding to the target air gesture;
detecting whether the operation performance parameter of the hardware operating parameter corresponding to the target air gesture is a target operation performance parameter, wherein the target operation performance parameter is the operation performance parameter characterizing the largest amount of data processed per unit time; and
if so, configuring the electronic device to keep operating based on the operation performance parameter while the battery level is higher than a target threshold.
8. A device control apparatus, applied to an electronic device, the apparatus comprising:
a gesture recognition unit, configured to identify a currently acquired target air gesture if an air gesture control scene is entered;
an operating parameter acquisition unit, configured to acquire a hardware operating parameter corresponding to the target air gesture; and
an operating parameter configuration unit, configured to configure the hardware operating parameter to the electronic device, so that the electronic device executes a control function corresponding to the target air gesture based on the hardware operating parameter.
9. An electronic device, comprising a processor and a memory, wherein one or more programs are stored in the memory and configured to be executed by the processor to implement the method according to any one of claims 1 to 7.
10. A computer-readable storage medium having program code stored therein, wherein the program code, when executed by a processor, performs the method according to any one of claims 1 to 7.
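For readability, the parameter-selection logic recited in claims 5 to 7 can be illustrated by the following minimal Python sketch. All identifiers, gesture names, parameter values, and the battery threshold are hypothetical assumptions introduced for illustration and form no part of the claims.

# Illustrative sketch of the logic in claims 5-7; the gesture names, parameter
# values, and battery threshold are assumptions, not the patented implementation.
from typing import Optional

FIRST_PARAMETER = {"cpu_freq_mhz": 1800, "data_per_second": 240}   # e.g. for a click gesture
SECOND_PARAMETER = {"cpu_freq_mhz": 1400, "data_per_second": 160}  # e.g. for a directional slide

TARGET_PERFORMANCE = 240  # operation performance parameter with the largest data volume per unit time
BATTERY_THRESHOLD = 20    # hypothetical target threshold, in percent


def select_parameter(gesture: str) -> dict:
    # Claim 5: choose the parameter according to the kind of air gesture.
    if gesture == "click":
        return FIRST_PARAMETER
    return SECOND_PARAMETER  # e.g. a slide along a designated direction


def needs_reconfiguration(current_gesture: str, last_gesture: Optional[str]) -> bool:
    # Claim 6: only acquire a new parameter when the gesture differs from the last one.
    return current_gesture != last_gesture


def may_keep_parameter(parameter: dict, battery_percent: int) -> bool:
    # Claim 7: keep the highest-throughput parameter only while the battery level
    # stays above the target threshold.
    if parameter["data_per_second"] == TARGET_PERFORMANCE:
        return battery_percent > BATTERY_THRESHOLD
    return True


if __name__ == "__main__":
    last = None
    for gesture, battery in [("click", 80), ("click", 80), ("slide_right", 15)]:
        if needs_reconfiguration(gesture, last):
            p = select_parameter(gesture)
            print(gesture, "->", p, "keep:", may_keep_parameter(p, battery))
        last = gesture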
CN202010044212.5A 2020-01-15 2020-01-15 Equipment control method and device, electronic equipment and storage medium Active CN111273769B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010044212.5A CN111273769B (en) 2020-01-15 2020-01-15 Equipment control method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010044212.5A CN111273769B (en) 2020-01-15 2020-01-15 Equipment control method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111273769A true CN111273769A (en) 2020-06-12
CN111273769B CN111273769B (en) 2022-06-17

Family

ID=70997245

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010044212.5A Active CN111273769B (en) 2020-01-15 2020-01-15 Equipment control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111273769B (en)

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5288938A (en) * 1990-12-05 1994-02-22 Yamaha Corporation Method and apparatus for controlling electronic tone generation in accordance with a detected type of performance gesture
US20110173574A1 (en) * 2010-01-08 2011-07-14 Microsoft Corporation In application gesture interpretation
WO2012126103A1 (en) * 2011-03-23 2012-09-27 Mgestyk Technologies Inc. Apparatus and system for interfacing with computers and other electronic devices through gestures by using depth sensing and methods of use
CN102426675A (en) * 2011-11-07 2012-04-25 北京邮电大学 Method for optimizing working states of software and hardware by information related to user subjective feelings
US20170017306A1 (en) * 2013-01-15 2017-01-19 Leap Motion, Inc. Dynamic, free-space user interactions for machine control
CN104460935A (en) * 2013-09-18 2015-03-25 联想(北京)有限公司 Information processing method and electronic equipment
CN104748737A (en) * 2013-12-30 2015-07-01 华为技术有限公司 Multi terminal positioning method and related equipment and system
CN104978014A (en) * 2014-04-11 2015-10-14 维沃移动通信有限公司 Method for quickly calling application program or system function, and mobile terminal thereof
CN103984416A (en) * 2014-06-10 2014-08-13 北京邮电大学 Gesture recognition method based on acceleration sensor
CN104615366A (en) * 2014-12-31 2015-05-13 中国人民解放军国防科学技术大学 Gesture interactive method oriented to multiple devices
CN107660278A (en) * 2015-06-19 2018-02-02 英特尔公司 To the technology of the computing resource of control electronics
CN106445567A (en) * 2015-08-04 2017-02-22 西安中兴新软件有限责任公司 Starting method for terminal application and terminal
CN105242861A (en) * 2015-10-15 2016-01-13 广东欧珀移动通信有限公司 Ultrasonic wave-based parameter adjustment method and device
WO2017128676A1 (en) * 2016-01-29 2017-08-03 宇龙计算机通信科技(深圳)有限公司 Application starting method, starting device, and terminal
US20180088849A1 (en) * 2016-09-29 2018-03-29 Intel Corporation Multi-dimensional optimization of electrical parameters for memory training
CN110114256A (en) * 2016-10-03 2019-08-09 三菱电机株式会社 Automatic Pilot control parameter change device and automatic Pilot control parameter variation
CN108646938A (en) * 2018-03-13 2018-10-12 广东欧珀移动通信有限公司 Configuration method, device, terminal and the storage medium of touch screen
CN109271208A (en) * 2018-09-26 2019-01-25 Oppo广东移动通信有限公司 Parameter setting method, device, terminal and storage medium

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
NAUSHEEN QAISER QURESHI: "Interactive Control through Hand Gestures", 《2013 11TH INTERNATIONAL CONFERENCE ON FRONTIERS OF INFORMATION TECHNOLOGY》 *
WU CAIFANG ET AL.: "Research on human-computer interaction technology based on gesture recognition", 《计算机时代》 (Computer Era) *
SUN XIAOHUA ET AL.: "Design elements and principles of mid-air gesture interaction", 《包装工程》 (Packaging Engineering) *
GAO XIONG: "Research and implementation of a gesture-based natural user interface development environment", 《西北大学信息科技专辑》 (Northwest University Information Science and Technology Collection) *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113190106A (en) * 2021-03-16 2021-07-30 青岛小鸟看看科技有限公司 Gesture recognition method and device and electronic equipment
CN113190106B (en) * 2021-03-16 2022-11-22 青岛小鸟看看科技有限公司 Gesture recognition method and device and electronic equipment
CN113110887A (en) * 2021-03-31 2021-07-13 联想(北京)有限公司 Information processing method and device, electronic equipment and storage medium
CN113110887B (en) * 2021-03-31 2023-07-21 联想(北京)有限公司 Information processing method, device, electronic equipment and storage medium
CN115079822A (en) * 2022-05-31 2022-09-20 荣耀终端有限公司 Air-spaced gesture interaction method and device, electronic chip and electronic equipment
CN115079822B (en) * 2022-05-31 2023-07-21 荣耀终端有限公司 Alternate gesture interaction method and device, electronic chip and electronic equipment

Also Published As

Publication number Publication date
CN111273769B (en) 2022-06-17

Similar Documents

Publication Publication Date Title
CN109213539B (en) Memory recovery method and device
CN111273769B (en) Equipment control method and device, electronic equipment and storage medium
CN108363593B (en) Application program preloading method and device, storage medium and terminal
CN107943650B (en) Application program control method and device, storage medium and terminal
CN110764906B (en) Memory recovery processing method and device, electronic equipment and storage medium
CN109033247B (en) Application program management method and device, storage medium and terminal
CN108762831B (en) Application program preloading method and device, storage medium and terminal
CN103218137B (en) Automatically method and the device of control is adjusted according to user operation
CN109495875B (en) SIM card selection method and device, electronic equipment and storage medium
CN108647056B (en) Application program preloading method and device, storage medium and terminal
CN106937258B (en) A kind of control method of broadcast, device and mobile terminal
CN108770050B (en) Control method and device for carrier aggregation function
CN111124173A (en) Working state switching method and device of touch screen, mobile terminal and storage medium
CN108762836B (en) Management method and device for preloaded application, storage medium and intelligent terminal
CN108845838B (en) Application program preloading method and device, storage medium and terminal
CN112135081B (en) Mode control method and device, frame insertion chip and electronic equipment
CN111008090B (en) Battery electric quantity control method and device, storage medium and terminal equipment
CN107070670A (en) Broadcast transmission person is controlled to send method, device and the terminal device of broadcast message
CN109195197A (en) Dual-card dual-standby terminal network selecting method, device, terminal device and storage medium
CN105511587B (en) Method and device for controlling radio frequency link
CN104298505A (en) Operation method for application program
CN113783994A (en) Prompt information display method and device, electronic equipment and storage medium
CN105138107A (en) Mobile terminal downloading method and device and mobile terminal
CN111970749A (en) Network connection method and device, intelligent household equipment and intelligent household system
CN108700975B (en) Anti-interference method and device for touch panel

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant