CN114724233A - Method and device for gesture control of terminal equipment and terminal equipment - Google Patents
- Publication number
- CN114724233A (application CN202011519884.3A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- instruction
- posture
- terminal equipment
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Abstract
The application relates to the technical field of human-computer interaction and discloses a method for gesture control of a terminal device, comprising the following steps: obtaining a target gesture instruction corresponding to a user gesture; adjusting an initial clock period according to the historical gesture-instruction frequency and the resource occupancy value of the terminal device to obtain a current clock period; and, based on the current clock period, controlling the terminal device to execute the corresponding interactive operation according to the target gesture instruction. Even when the performance of the terminal device is limited, and regardless of how the frequency of data received by the user-interface interaction layer changes, the initial clock period is adjusted according to the historical gesture-instruction frequency and the resource occupancy value to obtain the current clock period, and the terminal device is controlled, based on that period, to execute the corresponding interactive operation according to the target gesture instruction, so that the terminal device operates at a uniform rate, focus movement is continuous, and user experience is improved. The application also discloses an apparatus for gesture control of a terminal device, and a terminal device.
Description
Technical Field
The present application relates to the field of human-computer interaction technologies, and in particular to a method and an apparatus for gesture control of a terminal device, and to a terminal device.
Background
Controlling a terminal device such as a television through user gestures, for example hand gestures or body postures, is a form of human-machine interaction. Compared with traditional input through a mouse, keyboard, or remote control, gesture control does not require the user to hold a specific input device; a specific user gesture alone controls the terminal device or inputs specific information into it. Given its convenience and appeal, gesture control is being widely applied to terminal devices in the industry.
A gesture-controlled terminal device captures static or dynamic gesture image data from the user through a camera. After preprocessing, the images are fed to an image-recognition neural network model, which crops and classifies the image data to obtain gesture information such as the position, size, and type of the user gesture in the image. A gesture-data processing algorithm then filters and converts this gesture information into the data required by the user-interface interaction layer, which finally renders the user-perceivable display and operations, where the operations map gestures to remote-control actions such as up, down, left, right, confirm, and return.
In the process of implementing the embodiments of the present disclosure, it is found that at least the following problems exist in the related art:
different terminal devices take different amounts of time to recognize the gesture information in an image, and even the same terminal device takes different amounts of time in different states, so the frequency at which data is handed to the user-interface interaction layer fluctuates greatly. Consequently, driving the interactive operations directly by the frequency at which the user-interface interaction layer receives data easily makes the focus movement of the terminal device discontinuous, resulting in a poor user experience.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key or critical elements or to delineate the scope of such embodiments; rather, it serves as a prelude to the more detailed description presented later.
The embodiments of the disclosure provide a method and an apparatus for gesture control of a terminal device, and a terminal device, so as to solve the problem that, when interactive operations are driven by the frequency at which the user-interface interaction layer receives data, the focus movement of the terminal device is discontinuous and the user experience is poor.
In some embodiments, a method for gesture control of a terminal device comprises: obtaining a target gesture instruction corresponding to a user gesture; adjusting an initial clock period according to the historical gesture-instruction frequency and the resource occupancy value of the terminal device to obtain a current clock period; and, based on the current clock period, controlling the terminal device to execute the corresponding interactive operation according to the target gesture instruction.
In some embodiments, an apparatus for terminal device gesture control includes a processor and a memory storing program instructions, the processor configured to, when executing the program instructions, perform the aforementioned method for terminal device gesture control.
In some embodiments, a terminal device comprises the above apparatus for gesture control of a terminal device.
The method and apparatus for gesture control of a terminal device and the terminal device provided by the embodiments of the disclosure can realize the following technical effects:
even when the performance of the terminal device is limited, and regardless of how the frequency of data received by the user-interface interaction layer changes, the initial clock period is adjusted according to the historical gesture-instruction frequency and the resource occupancy value of the terminal device to obtain the current clock period, and the terminal device is controlled, based on the current clock period, to execute the corresponding interactive operation according to the target gesture instruction, so that the terminal device operates at a uniform rate, focus movement is continuous, and user experience is improved.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example, and not by way of limitation, in the accompanying drawings, in which elements having the same reference numerals designate like elements:
fig. 1 is a flowchart illustrating a method for gesture control of a terminal device according to an embodiment of the present disclosure;
FIG. 2 is a flow chart illustrating a method for gesture control of a terminal device according to an embodiment of the present disclosure;
FIG. 3 is a flowchart illustrating a method for gesture control of a terminal device according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an apparatus for controlling a terminal device gesture according to an embodiment of the present disclosure.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, one or more embodiments may be practiced without these details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawings.
The terms "first," "second," and the like in the description, in the claims, and in the above-described drawings of embodiments of the present disclosure are used for distinguishing between similar elements and not necessarily for describing a particular sequence or chronological order. It should be understood that elements so described may be interchanged under appropriate circumstances, so that the embodiments of the present disclosure described herein can be practiced in sequences other than those illustrated. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusion.
The term "plurality" means two or more unless otherwise specified. In the embodiments of the present disclosure, the character "/" indicates that the preceding and following objects are in an "or" relationship; for example, A/B represents: A or B. The term "and/or" describes an association between objects in which three relationships may exist; for example, A and/or B represents: A, or B, or A and B.
The terminal device is, for example, a television, a refrigerator, an air conditioner, a washing machine, a water heater, or the like, or any combination thereof.
With reference to fig. 1, an embodiment of the present disclosure provides a method for controlling a terminal device gesture, including the following steps:
s101: and obtaining a target gesture instruction corresponding to the user gesture.
The target gesture instruction is a final gesture instruction for instructing the terminal device to perform a corresponding operation.
S102: and adjusting the initial clock period according to the historical posture instruction frequency and the resource occupation value of the terminal equipment to obtain the current clock period.
Here, the historical gesture-instruction frequency may be calculated from the number of gesture instructions acquired within a preset time period, that is, according to the following formula:

f = Δt / n

where f is the historical gesture-instruction frequency, Δt is the total duration of the preset time period, and n is the number of gesture instructions acquired within that period. (As the worked examples below show, f is thus the average interval between gesture instructions: 5 instructions in 2.5 seconds give f = 0.5.)
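As a minimal sketch (in Python, with hypothetical names; the patent does not specify an implementation), the frequency can be computed from the window length and the instruction count:

```python
def historical_gesture_frequency(window_seconds, instruction_count):
    """Historical gesture-instruction frequency f = Δt / n, i.e. the
    average interval between gesture instructions in the window."""
    if instruction_count == 0:
        raise ValueError("no gesture instructions in the window")
    return window_seconds / instruction_count
```

With the numbers used later in the description, 5 instructions in 2.5 seconds give 0.5, and 5 instructions in 1 second give 0.2.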
The resource occupancy value of the terminal device reflects the resource occupancy of its hardware.

Optionally, adjusting the initial clock period according to the historical gesture-instruction frequency and the resource occupancy value of the terminal device to obtain the current clock period comprises: calculating a resource overall-condition parameter of the terminal device based on the historical gesture-instruction frequency and the resource occupancy value; when the resource overall-condition parameter is greater than a preset threshold, calculating the difference between the parameter and the threshold; and adjusting the initial clock period by this difference to obtain the current clock period.

The resource overall-condition parameter of the terminal device reflects the overall state of its resources. The larger the parameter, the tighter the terminal device's resources and the worse the operating environment for gesture control, so the clock-period constraint needs to be relaxed; the smaller the parameter, the more resources remain and the better the operating environment, so the clock period can be shortened appropriately.
The current clock period can be calculated according to the following formula:

T = T0 + (α − β)

where T is the current clock period, T0 is the initial clock period, α is the resource overall-condition parameter, and β is the preset threshold.
For example, when the initial clock period T0 is 0.3, the preset threshold β is 0.5, and the resource overall-condition parameter α is 0.8, the current clock period T is 0.6; when α is 0.525, T is 0.325.
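The adjustment can be sketched as follows (a Python illustration; leaving T0 unchanged when α does not exceed β is an assumption, since the description only defines the adjustment for α > β):

```python
def current_clock_period(t0, alpha, beta):
    """Current clock period T = T0 + (α − β), applied only when the
    resource overall-condition parameter α exceeds the threshold β."""
    if alpha > beta:
        return t0 + (alpha - beta)
    return t0  # assumption: keep the initial period when resources are not tight
```

Replaying the worked numbers: (0.3, 0.8, 0.5) gives 0.6, and (0.3, 0.525, 0.5) gives 0.325.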
Optionally, calculating the resource overall-condition parameter of the terminal device based on the historical gesture-instruction frequency and the resource occupancy value comprises: obtaining a frequency weight for the historical gesture-instruction frequency and an occupancy weight for the resource occupancy value; and calculating the resource overall-condition parameter from the historical gesture-instruction frequency, the frequency weight, the resource occupancy value, and the occupancy weight.
Alternatively, the resource overall-condition parameter may be calculated according to the following formula:

α = θ1·f + θ2·r

where α is the resource overall-condition parameter, f is the historical gesture-instruction frequency, θ1 is the frequency weight, r is the resource occupancy value, and θ2 is the occupancy weight.
Optionally, the resource occupancy value includes a central-processing-unit occupancy value and a memory occupancy value; the occupancy weight of the resource occupancy value is the sum of the occupancy weight of the CPU occupancy value and the occupancy weight of the memory occupancy value.

The CPU occupancy value reflects the occupancy of the terminal device's central processing unit (CPU), and the memory occupancy value reflects the occupancy of its memory.
The resource overall-condition parameter can then be calculated according to the following formula:

α = θ1·f + θ21·c + θ22·m

where α is the resource overall-condition parameter, f is the historical gesture-instruction frequency, θ1 is the frequency weight, c is the CPU occupancy value, θ21 is its occupancy weight, m is the memory occupancy value, θ22 is its occupancy weight, θ2 = θ21 + θ22, and θ1 + θ21 + θ22 = 1.
Here, the historical gesture-instruction frequency f has a valid value range [A1, A2]; the frequency weight θ1 takes its preset value when f lies within this range, and is otherwise set to 0, in which case the remaining weights are renormalized.

For example, suppose the value range of f is [0.5, 10], and the frequency weight θ1, the CPU occupancy weight θ21, and the memory occupancy weight θ22 are 0.2, 0.6, and 0.2, respectively. If the historical gesture-instruction frequency f is 0.5 (that is, 5 gesture instructions acquired in 2.5 seconds), the CPU occupancy value is 0.9, and the memory occupancy value is 0.8, then the resource overall-condition parameter α = 0.2 × 0.5 + 0.6 × 0.9 + 0.2 × 0.8 = 0.8. If f is 0.2 (that is, 5 gesture instructions acquired in 1 second), which lies outside the value range, the CPU occupancy value is 0.4, and the memory occupancy value is 0.9, then α = (0 × 0.2 + 0.6 × 0.4 + 0.2 × 0.9) / (0.6 + 0.2) = 0.525.
S103: and controlling the terminal equipment to execute corresponding interactive operation according to the target posture instruction based on the current clock period.
And after the current clock period is obtained through calculation, controlling the terminal equipment to execute corresponding interactive operation according to the target posture instruction based on the current clock period.
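The overall flow of S101–S103 can be sketched as a simple dispatch loop (a minimal illustration only; all four callables are hypothetical stand-ins for the recognition pipeline, the UI interaction layer, and the period calculation of S102, not names from the patent):

```python
import time

def run_gesture_loop(next_instruction, execute, next_period, ticks):
    """Dispatch at most one interactive operation per clock period,
    pacing execution by the current clock period rather than by the
    arrival rate of recognition data."""
    for _ in range(ticks):
        instruction = next_instruction()   # target gesture instruction, or None
        if instruction is not None:
            execute(instruction)           # e.g. move focus up/down/left/right
        time.sleep(next_period())          # wait out the current clock period
```

Because the loop sleeps for the current clock period between operations, focus movement stays uniform even when recognition latency fluctuates.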
With this method for gesture control of a terminal device, even when the performance of the terminal device is limited, and regardless of how the frequency of data received by the user-interface interaction layer changes, the terminal device is controlled, based on the current clock period, to execute the corresponding interactive operation according to the target gesture instruction, so that the terminal device operates at a uniform rate, focus movement is continuous, and user experience is improved. Meanwhile, by adjusting the initial clock period according to the historical gesture-instruction frequency and the resource occupancy value, i.e., by monitoring the device's performance, the most suitable clock period for the terminal device's current state is found, ensuring high fault tolerance.
As shown in fig. 2, obtaining the target gesture command corresponding to the user gesture includes the following steps:
s201: obtaining a plurality of gesture instructions corresponding to the user gestures at different times.
S202: one gesture command is determined from the plurality of gesture commands to be a target gesture command.
Optionally, determining one gesture instruction from the plurality as the target gesture instruction comprises: obtaining a confidence for each gesture instruction; filtering out gesture instructions whose confidence is below a preset confidence; and determining one of the remaining gesture instructions as the target gesture instruction.
The confidence of a gesture instruction represents how trustworthy it is, i.e., the probability that the instruction data produced by the algorithm is correct. A confidence of 95% means the gesture instruction has a 95% probability of being correct. The preset confidence lies in the range [70%, 85%], for example 70%, 80%, or 85%. Filtering out gesture instructions whose confidence is below the preset confidence improves the accuracy of the target gesture instruction finally obtained by the terminal device.
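The confidence filter is straightforward; a Python sketch (the (name, confidence) pair layout is an illustrative assumption):

```python
def filter_by_confidence(instructions, preset_confidence=0.80):
    """Keep only gesture instructions whose confidence meets the preset
    threshold; each instruction is modelled as a (name, confidence) pair."""
    return [g for g in instructions if g[1] >= preset_confidence]
```

With the confidences from the worked example below (0.93, 0.76, 0.89, 0.82) and a preset confidence of 80%, only the 0.76 instruction is dropped.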
Optionally, determining one gesture instruction from the remaining gesture instructions as the target gesture instruction comprises: obtaining the characteristic parameters of each remaining gesture instruction, wherein the characteristic parameters include the hit probability, the gesture-instruction identifier, and/or the time interval from the last gesture instruction; and determining one of the remaining gesture instructions as the target gesture instruction based on the characteristic parameters.

The hit probability of a gesture instruction is its historical output probability within the preset time period; for example, if gesture instructions were output 5 times within the period and instruction A accounted for 3 of them, the hit probability (output probability) of A is 60%. The gesture-instruction identifier is the instruction's identification number (e.g., a unique identity code). The time interval from the last gesture instruction is the interval between the reception time of this instruction and that of the previous one.
Optionally, determining one gesture instruction from the remaining gesture instructions as the target gesture instruction based on the characteristic parameters comprises: when the time interval is greater than or equal to a preset time interval, determining the gesture instruction with the highest hit probability as the target gesture instruction; when the time interval is smaller than the preset time interval and the probability difference between the two gesture instructions with the two highest hit probabilities is greater than or equal to a preset probability difference, determining the gesture instruction with the highest hit probability as the target gesture instruction; when the time interval is smaller than the preset time interval, the probability difference between the two gesture instructions with the two highest hit probabilities is smaller than the preset probability difference, and the gesture-instruction identifier differs from that of the last output gesture instruction, determining the gesture instruction with the highest hit probability as the target gesture instruction; and when the time interval is smaller than the preset time interval, the probability difference between the two gesture instructions with the two highest hit probabilities is smaller than the preset probability difference, and the gesture-instruction identifier is the same as that of the last output gesture instruction, determining the last output gesture instruction as the target gesture instruction.
For example, let the preset confidence be 80%, the preset time interval be 0.2 s, and the preset probability difference be 0.05. Four groups of gesture-instruction data (e.g., hand-gesture instruction data), G1, G2, G3, and G4, are obtained over a period of time, with confidences 0.93, 0.76, 0.89, and 0.82, respectively. For each group, the two gesture instructions with the two highest hit probabilities (represented directly by their identifiers), the time interval, and the probability difference are: (3, 4, 0.2 s, 0.4), (4, 3, 0.2 s, 0.02), (3, 4, 0.1 s, 0.6), (1, 3, 0.15 s, 0.01). In the first group, the time interval equals the preset time interval, so gesture instruction 3, with the highest hit probability, is determined as the target; the second group's confidence (0.76) is below the preset confidence, so it is filtered out; in the third group, the time interval is below the preset interval but the probability difference exceeds the preset difference, so gesture instruction 3 is again the target; in the fourth group, the time interval is below the preset interval, the probability difference is below the preset difference, and the identifier matches that of the last output gesture instruction (3), so the last output instruction 3 is determined as the target.
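One possible reading of the four selection rules, sketched in Python (the tuple layout mirrors the example above; treating "the identifier is the same as the last output" as "either of the top-two identifiers matches the last output" is an interpretation, chosen because it reproduces the fourth worked group):

```python
def select_target(best_id, second_id, interval, prob_diff, last_output_id,
                  preset_interval=0.2, preset_prob_diff=0.05):
    """Pick the target gesture instruction from the two instructions
    with the highest hit probabilities."""
    # Rule 1: enough time since the last instruction -> highest hit probability.
    if interval >= preset_interval:
        return best_id
    # Rule 2: close in time, but clearly separated in probability.
    if prob_diff >= preset_prob_diff:
        return best_id
    # Rules 3/4: ambiguous -> prefer continuity with the last output instruction.
    if last_output_id in (best_id, second_id):
        return last_output_id
    return best_id
```

Replaying the three surviving groups with last output 3 yields 3, 3, and 3, matching the worked example.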
In the embodiment of the disclosure, gesture instructions whose confidence is below the preset confidence are first filtered out of the plurality of gesture instructions, and one of the remaining gesture instructions is then determined as the target gesture instruction based on its characteristic parameters; this improves the accuracy of the final target gesture instruction and, in turn, the accuracy of gesture control of the terminal device.
An implementation of obtaining a target gesture instruction corresponding to a user gesture is described below with reference to fig. 3, which specifically includes the following steps:
s301: obtaining a plurality of gesture instructions corresponding to the user gestures at different times.
S302: a confidence level is obtained for each gesture instruction.
S303: and judging whether the confidence coefficient of each gesture instruction is smaller than a preset confidence coefficient.
S304: when the confidence of a gesture instruction is below the preset confidence, filter out that instruction; otherwise, execute S305.

S305: obtain the characteristic parameters of each remaining gesture instruction.

The characteristic parameters include the hit probability, the gesture-instruction identifier, and/or the time interval from the last gesture instruction.
S306: and judging whether the time interval is greater than or equal to a preset time interval.
S307: determining the gesture instruction with the highest hit probability as a target gesture instruction when the time interval is greater than or equal to a preset time interval; otherwise, S308 is executed.
S308: and judging whether the probability difference of the two posture instructions with the hit probability positioned at the top two digits is larger than or equal to the preset probability difference.
S309: determining the gesture instruction with the highest hit probability as a target gesture instruction under the condition that the probability difference of two gesture instructions with the highest hit probability located at the first two bits is larger than or equal to the preset probability difference; otherwise, S310 is performed.
S310: it is determined whether the gesture command identifier is the same as the gesture command identifier of the last gesture command output.
S311: when the posture command identifier is the same as the posture command identifier of the previous output posture command, the previous output posture command is determined as the target posture command.
S312: when the posture command identifier is different from the posture command identifier of the previous output posture command, the posture command with the highest hit probability is determined as the target posture command.
In the embodiment of the disclosure, gesture instructions whose confidence is below the preset confidence are first filtered out of the plurality of gesture instructions, and one of the remaining gesture instructions is then determined as the target gesture instruction based on its characteristic parameters; this improves the accuracy of the final target gesture instruction and, in turn, the accuracy of gesture control of the terminal device.
The embodiment of the present disclosure shown in fig. 4 provides an apparatus for gesture control of a terminal device, which includes a processor 40 and a memory 41, and may further include a communication interface 42 and a bus 43. The processor 40, the communication interface 42, and the memory 41 communicate with one another through the bus 43. The communication interface 42 may be used for information transfer. The processor 40 may invoke logic instructions in the memory 41 to perform the method for gesture control of a terminal device of the above embodiments.
In addition, the logic instructions in the memory 41 may be implemented in the form of software functional units and, when sold or used as an independent product, stored in a computer-readable storage medium.
The memory 41 is a computer-readable storage medium, and can be used for storing software programs, computer-executable programs, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 40 executes functional applications and data processing by executing program instructions/modules stored in the memory 41, that is, implements the method for terminal device gesture control in the above-described method embodiments.
The memory 41 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal device, and the like. Further, the memory 41 may include a high-speed random access memory, and may also include a nonvolatile memory.
The embodiment of the disclosure provides a terminal device, which comprises the above apparatus for gesture control of the terminal device.
Embodiments of the present disclosure provide a computer-readable storage medium storing computer-executable instructions configured to perform the above-described method for terminal device gesture control.
Embodiments of the present disclosure provide a computer program product comprising a computer program stored on a computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the above-described method for terminal device gesture control.
The computer-readable storage medium described above may be a transitory computer-readable storage medium or a non-transitory computer-readable storage medium.
The technical solutions of the embodiments of the present disclosure may be embodied in the form of a software product stored in a storage medium, including one or more instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present disclosure. The aforementioned storage medium may be a non-transitory storage medium, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code, and may also be a transitory storage medium.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. The scope of the disclosed embodiments includes the full scope of the claims, as well as all available equivalents of the claims. Although the terms "first," "second," etc. may be used in this application to describe various elements, these elements should not be limited by these terms; the terms are only used to distinguish one element from another. For example, a first element could be termed a second element and, similarly, a second element could be termed a first element without changing the meaning of the description, so long as all occurrences of the "first element" are renamed consistently and all occurrences of the "second element" are renamed consistently. The first and second elements are both elements, but they need not be the same element. Furthermore, the words used in the specification are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application encompasses any and all possible combinations of one or more of the associated listed items.
Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of additional identical elements in the process, method, or apparatus comprising that element. In this document, each embodiment may be described with emphasis on its differences from other embodiments, and the same or similar parts of the respective embodiments may be referred to one another. For the methods, products, etc. disclosed in the embodiments, where they correspond to the method sections disclosed in the embodiments, reference may be made to the description of those method sections.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It can be clearly understood by the skilled person that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units may be merely a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than disclosed in the description, and sometimes there is no specific order between the different operations or steps. For example, two sequential operations or steps may in fact be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved. Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Claims (10)
1. A method for gesture control of a terminal device, comprising:
obtaining a target gesture instruction corresponding to a user gesture;
adjusting an initial clock cycle according to a historical gesture instruction frequency and a resource occupancy value of the terminal device to obtain a current clock cycle;
and controlling the terminal device to execute a corresponding interactive operation according to the target gesture instruction based on the current clock cycle.
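The three steps of claim 1 can be sketched as a polling loop. The recogniser/executor stubs, the weighting constants, and the pressure formula are all illustrative assumptions; the claim itself only states that the clock cycle is adjusted from instruction frequency and resource occupancy.

```python
# Sketch of the overall control flow in claim 1: each tick, recognise a
# target gesture instruction, retune the polling period from recent
# instruction frequency and resource load, then act on the instruction.
# All constants and stub interfaces are illustrative assumptions.
import time

def control_loop(recognise, execute, get_load, initial_period=0.05, ticks=3):
    period = initial_period
    history = []                          # timestamps of executed instructions
    for _ in range(ticks):
        target = recognise()              # target gesture instruction (or None)
        if target is not None:
            history.append(time.monotonic())
            execute(target)               # corresponding interactive operation
        # Adjust the clock cycle from instruction frequency and resource load:
        elapsed = time.monotonic() - history[0] if history else 1.0
        freq = len(history) / max(elapsed, 1e-6)
        load = get_load()                 # e.g. combined CPU/memory occupancy in [0, 1]
        pressure = 0.5 * min(freq / 10.0, 1.0) + 0.5 * load
        period = initial_period * (1.0 + max(0.0, pressure - 0.8))
        time.sleep(period)                # wait one (adjusted) clock cycle
    return period
```

Under this sketch a busy device (high `pressure`) lengthens the cycle, trading gesture-polling frequency for lower resource consumption.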
2. The method of claim 1, wherein the adjusting an initial clock cycle according to the historical gesture instruction frequency and the resource occupancy value of the terminal device to obtain a current clock cycle comprises:
calculating an overall resource situation parameter of the terminal device based on the historical gesture instruction frequency and the resource occupancy value;
when the overall resource situation parameter is greater than a preset threshold, calculating a difference between the overall resource situation parameter and the preset threshold;
and adjusting the initial clock cycle using the difference to obtain the current clock cycle.
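A minimal sketch of the adjustment in claim 2, assuming a simple linear scaling: the threshold and scale constants are illustrative, as the claim only specifies that the cycle is adjusted using the amount by which the situation parameter exceeds the preset threshold.

```python
# Sketch of the clock-cycle adjustment in claim 2. The threshold and
# scale values are illustrative assumptions, not from the disclosure.
def adjust_clock_cycle(initial_cycle_ms, situation_param,
                       threshold=0.75, scale=10.0):
    if situation_param <= threshold:
        return initial_cycle_ms           # at or below threshold: no adjustment
    diff = situation_param - threshold    # amount over the preset threshold
    # Lengthen the cycle proportionally to the overload, reducing how
    # often gestures are polled while resources are under pressure.
    return initial_cycle_ms * (1.0 + diff * scale)

print(adjust_clock_cycle(50, 1.0))  # 50 * (1 + 0.25 * 10) -> 175.0
```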
3. The method according to claim 2, wherein the calculating the overall resource situation parameter of the terminal device based on the historical gesture instruction frequency and the resource occupancy value comprises:
obtaining a frequency weight for the historical gesture instruction frequency and an occupancy weight for the resource occupancy value;
and calculating the overall resource situation parameter based on the historical gesture instruction frequency, the frequency weight, the resource occupancy value, and the occupancy weight.
4. The method of claim 3, wherein the resource occupancy value comprises a central processing unit occupancy value and a memory occupancy value, and the occupancy weight of the resource occupancy value is the sum of the occupancy weight of the central processing unit occupancy value and the occupancy weight of the memory occupancy value.
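The weighted combination in claims 3 and 4 can be sketched as below. The particular weight values and the assumption that the inputs are normalised to [0, 1] are illustrative only.

```python
# Sketch of the resource situation parameter from claims 3-4: a weighted
# combination of historical gesture-instruction frequency and resource
# occupancy, where the occupancy weight is split between CPU and memory.
# Weight values and input normalisation are illustrative assumptions.
def resource_situation(freq_norm, cpu_occ, mem_occ,
                       w_freq=0.4, w_cpu=0.3, w_mem=0.3):
    # Per claim 4, the occupancy weight is the sum of the CPU weight
    # and the memory weight (w_cpu + w_mem here).
    return w_freq * freq_norm + w_cpu * cpu_occ + w_mem * mem_occ

# e.g. moderately frequent gestures on a busy device
print(resource_situation(0.5, 0.9, 0.7))
```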
5. The method according to any one of claims 1 to 4, wherein the obtaining of the target gesture instruction corresponding to the user gesture comprises:
obtaining a plurality of gesture instructions corresponding to user gestures at different moments;
determining one gesture instruction from the plurality of gesture instructions as the target gesture instruction.
6. The method of claim 5, wherein determining one gesture instruction from the plurality of gesture instructions as the target gesture instruction comprises:
obtaining a confidence level of each gesture instruction;
filtering out, from the plurality of gesture instructions, the gesture instructions whose confidence is lower than a preset confidence;
and determining one of the remaining gesture instructions as the target gesture instruction.
7. The method of claim 6, wherein the determining one of the remaining gesture instructions as the target gesture instruction comprises:
obtaining feature parameters of each remaining gesture instruction, wherein the feature parameters comprise a hit probability, a gesture instruction identifier, and/or a time interval to the last gesture instruction;
and determining one gesture instruction from the remaining gesture instructions as the target gesture instruction based on the feature parameters.
8. The method of claim 7, wherein the determining one gesture instruction from the remaining gesture instructions as the target gesture instruction based on the feature parameters comprises:
determining the gesture instruction with the highest hit probability as the target gesture instruction when the time interval is greater than or equal to a preset time interval;
determining the gesture instruction with the highest hit probability as the target gesture instruction when the time interval is smaller than the preset time interval and the difference between the two highest hit probabilities is greater than or equal to a preset probability difference;
determining the gesture instruction with the highest hit probability as the target gesture instruction when the time interval is smaller than the preset time interval, the difference between the two highest hit probabilities is smaller than the preset probability difference, and its gesture instruction identifier differs from that of the last output gesture instruction;
and determining the last output gesture instruction as the target gesture instruction when the time interval is smaller than the preset time interval, the difference between the two highest hit probabilities is smaller than the preset probability difference, and the gesture instruction identifier is the same as that of the last output gesture instruction.
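The four branches of claim 8 reduce to a short decision function. The `Gesture` tuple and the threshold values below are illustrative assumptions; only the branch structure comes from the claim.

```python
# Sketch of claim 8's four-branch rule for picking the target gesture
# instruction from the filtered candidates. Thresholds are assumptions.
from collections import namedtuple

# Minimal stand-in for a recognised gesture instruction (illustrative).
Gesture = namedtuple("Gesture", "identifier hit_probability")

def decide_target(remaining, last_output, last_time, now,
                  min_interval=0.5, min_prob_gap=0.1):
    # Rank candidates by hit probability, highest first.
    ranked = sorted(remaining, key=lambda g: g.hit_probability, reverse=True)
    best = ranked[0]
    if now - last_time >= min_interval:
        return best              # branch 1: long gap, trust the top candidate
    gap = (best.hit_probability - ranked[1].hit_probability
           if len(ranked) > 1 else 1.0)
    if gap >= min_prob_gap:
        return best              # branch 2: clear winner despite the short gap
    if best.identifier != last_output.identifier:
        return best              # branch 3: ambiguous, but a new gesture
    return last_output           # branch 4: ambiguous repeat, keep last output
```

The intent behind branch 4, as this sketch reads it, is debouncing: when two candidates are nearly tied shortly after the previous instruction and the leader repeats it, the previous output is kept rather than re-decided.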
9. An apparatus for terminal device gesture control, comprising a processor and a memory storing program instructions, characterized in that the processor is configured to perform the method for terminal device gesture control according to any of claims 1 to 8 when executing the program instructions.
10. A terminal device characterized by comprising an apparatus for terminal device gesture control as claimed in claim 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011519884.3A CN114724233A (en) | 2020-12-21 | 2020-12-21 | Method and device for gesture control of terminal equipment and terminal equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011519884.3A CN114724233A (en) | 2020-12-21 | 2020-12-21 | Method and device for gesture control of terminal equipment and terminal equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114724233A true CN114724233A (en) | 2022-07-08 |
Family
ID=82229832
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011519884.3A Pending CN114724233A (en) | 2020-12-21 | 2020-12-21 | Method and device for gesture control of terminal equipment and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114724233A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110258417A1 (en) * | 2010-04-20 | 2011-10-20 | Senthilkannan Chandrasekaran | Power and throughput optimization of an unbalanced instruction pipeline |
CN102906696A (en) * | 2010-03-26 | 2013-01-30 | 维尔图尔梅特里克斯公司 | Fine grain performance resource management of computer systems |
CN109062715A (en) * | 2018-07-05 | 2018-12-21 | Oppo(重庆)智能科技有限公司 | The determination method, apparatus and terminal of memory clock frequency |
WO2020065931A1 (en) * | 2018-09-28 | 2020-04-02 | 日本電気株式会社 | Photographing control system, photographing control method, control device, control method, and storage medium |
CN111159660A (en) * | 2019-12-30 | 2020-05-15 | 龙芯中科技术有限公司 | Instruction execution method, processor and electronic equipment |
Non-Patent Citations (1)
Title |
---|
ZHANG KUN; BI FANGHONG; LI KELI; LIANG YING; YANG JUN: "Design and Implementation of a Dual-IP-Core UAV Flight Control System Based on Qsys" (in Chinese), Experiment Science and Technology, no. 01, 28 February 2020 (2020-02-28) *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3510529B1 (en) | Deep machine learning to perform touch motion prediction | |
CN105261361A (en) | Methods and systems for managing speech recognition in a multi-speech system environment | |
CN107450717B (en) | Information processing method and wearable device | |
CN113190006A (en) | Robot path planning method and device and storage medium | |
CN114091589B (en) | Model training method and device, electronic equipment and medium | |
CN113531819B (en) | Method and device for preheating air conditioner, air conditioner and air conditioning system | |
CN114494943A (en) | Novel video target detection and evaluation method, device, product and storage medium | |
CN114724233A (en) | Method and device for gesture control of terminal equipment and terminal equipment | |
CN110941187A (en) | Household appliance control method and device | |
CN115660941A (en) | Image moving method and device, electronic equipment and computer readable storage medium | |
CN113531797B (en) | Method and device for preheating air conditioner, air conditioner and air conditioning system | |
CN111563401A (en) | Vehicle-mounted gesture recognition method and system, storage medium and electronic equipment | |
CN115601781A (en) | Method and device for dynamic gesture recognition and electronic equipment | |
CN112085030A (en) | Similar image determining method and device | |
CN113639432B (en) | Method and device for controlling air conditioner, air conditioner and readable storage medium | |
WO2020248903A1 (en) | Drug storage method and apparatus for dual-mechanical hand pharmacy control system | |
CN113251632A (en) | Method and device for controlling air supply of air conditioner and electronic equipment | |
CN111857073A (en) | Method, device and equipment for controlling bathing equipment | |
CN113758010A (en) | Method and device for controlling electric water heater | |
CN113251620A (en) | Method and device for controlling primary and secondary air conditioners and intelligent air conditioner | |
CN112991008A (en) | Position recommendation method and device and electronic equipment | |
CN112613409A (en) | Hand key point detection method and device, network equipment and storage medium | |
CN113639430B (en) | Method and device for controlling air conditioner, air conditioner and readable storage medium | |
CN114888809B (en) | Robot control method and device, computer readable storage medium and robot | |
CN112084394B (en) | Search result recommending method and device based on image recognition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||