CN114529977A - Gesture control method and apparatus for an intelligent device, and intelligent device - Google Patents

Gesture control method and apparatus for an intelligent device, and intelligent device

Info

Publication number
CN114529977A
CN114529977A
Authority
CN
China
Prior art keywords
gesture
application program
application
requirement
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011203653.1A
Other languages
Chinese (zh)
Inventor
黄俊杰
谭心睿
瞿志
张宗锐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Haier Smart Home Co Ltd
Qingdao Haier Multimedia Co Ltd
Original Assignee
Haier Smart Home Co Ltd
Qingdao Haier Multimedia Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Haier Smart Home Co Ltd, Qingdao Haier Multimedia Co Ltd filed Critical Haier Smart Home Co Ltd
Priority to CN202011203653.1A priority Critical patent/CN114529977A/en
Publication of CN114529977A publication Critical patent/CN114529977A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the technical field of gesture control and discloses a gesture control method for an intelligent device. The method comprises: a first application program detects a gesture image and recognizes a gesture in the gesture image; and, under the condition that a second application program has a gesture requirement, the first application program sends gesture information of the gesture to the second application program, so that the second application program executes an operation instruction corresponding to the gesture information. The intelligent device comprises the first application program and the second application program. According to the application, when the second application program has a gesture requirement, the first application program sends gesture information of a gesture meeting that requirement to the second application program, so that the second application program can execute the operation instruction corresponding to the gesture detected by the first application program; this simplifies the data operation process of the second application program and improves the response speed to gestures. The application also discloses a gesture control apparatus for an intelligent device, and an intelligent device.

Description

Gesture control method and apparatus for an intelligent device, and intelligent device
Technical Field
The application relates to the technical field of gesture control, and relates, for example, to a gesture control method and apparatus for an intelligent device and to the intelligent device itself.
Background
At present, many intelligent devices, such as smart televisions, are controlled with a remote controller or by touch. In either case, the user must physically touch the remote controller or the screen during the control process.
In the process of implementing the embodiments of the present disclosure, it is found that at least the following problems exist in the related art:
in practical applications, it often happens that both of the user's hands are occupied and it is inconvenient to touch the remote controller or the screen; intelligent devices therefore need a control mode that does not require the user to touch anything.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key/critical elements or to delineate the scope of such embodiments; rather, it serves as a prelude to the more detailed description that is presented later.
The embodiments of the disclosure provide a gesture control method and apparatus for an intelligent device, and an intelligent device, and aim to solve the technical problem that, when control is performed via a remote controller or by touch, the intelligent device can only be controlled if the user physically contacts the remote controller or the screen.
In some embodiments, the gesture control method for the smart device includes: the first application program detects the gesture image and identifies a gesture in the gesture image; the first application program sends gesture information of the gesture to the second application program under the condition that the second application program has a gesture requirement, so that the second application program executes an operation instruction corresponding to the gesture information; the intelligent device comprises a first application program and a second application program.
In some embodiments, the gesture control method for the smart device includes: the second application expresses a gesture requirement so that the first application can acquire the gesture requirement; the second application program receives the gesture information sent by the first application program and executes an operation instruction corresponding to the gesture information; the intelligent device comprises a first application program and a second application program; the gesture information is gesture information corresponding to a gesture in the gesture image detected by the first application program.
In some embodiments, the gesture control apparatus for a smart device includes a processor and a memory storing program instructions, wherein the processor executes the program instructions to perform the gesture control method for a smart device.
In some embodiments, the smart device includes a first application, a second application, and the gesture control apparatus for a smart device described above.
The gesture control method and device for the intelligent device and the intelligent device provided by the embodiment of the disclosure can achieve the following technical effects:
detecting a gesture image through the first application program and recognizing a gesture in the gesture image; and, under the condition that the second application program has a gesture requirement, sending gesture information of the gesture from the first application program to the second application program, so that the second application program executes an operation instruction corresponding to the gesture information. In this way, a second application program that needs the gesture control function can, by expressing its gesture requirement, have the first application program recognize the gesture in the gesture image and send the gesture information of a gesture meeting the requirement to the second application program, and the second application program then executes the operation instruction corresponding to the gesture detected by the first application program. The user can thus control the intelligent device in a non-contact gesture control mode, which facilitates the user's control operations and improves the convenience of controlling the intelligent device.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example, and not by way of limitation, in the accompanying drawings, in which elements having the same reference numeral designations denote like elements, and wherein:
fig. 1 is a schematic diagram of a gesture control method for a smart device according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of another gesture control method for a smart device provided by an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of interaction between a first application and a second application provided by an embodiment of the present disclosure;
fig. 4 is a schematic diagram of interaction between a first application and a second application provided by an embodiment of the present disclosure;
fig. 5 is a schematic diagram of interaction between another first application and a second application provided by an embodiment of the present disclosure;
fig. 6 is a schematic diagram of a gesture control apparatus for a smart device according to an embodiment of the present disclosure.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, one or more embodiments may be practiced without these details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawing.
The terms "first," "second," and the like in the description, the claims, and the above-described drawings of the embodiments of the present disclosure are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the present disclosure described herein can be practiced in sequences other than those illustrated or described herein. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions.
The term "plurality" means two or more unless otherwise specified.
In the embodiments of the present disclosure, the character "/" indicates that the preceding and following objects are in an "or" relationship. For example, A/B represents: A or B.
The term "and/or" describes an associative relationship between objects, meaning that three relationships may exist. For example, A and/or B represents: A, or B, or A and B.
With reference to fig. 1, an embodiment of the present disclosure provides a gesture control method for a smart device, where the smart device includes a first application program and a second application program, and the method includes:
s11, the first application program detects the gesture image and recognizes the gesture in the gesture image.
And S12, the first application program sends the gesture information of the gesture to the second application program under the condition that the second application program has the gesture requirement, so that the second application program executes the operation instruction corresponding to the gesture information.
By adopting the gesture control method for the intelligent device, the gesture image can be detected through the first application program and the gesture in the gesture image recognized; and, under the condition that the second application program has a gesture requirement, the first application program sends gesture information of the gesture to the second application program, so that the second application program executes the operation instruction corresponding to the gesture information. In this way, a second application program that needs the gesture control function can, by expressing its gesture requirement, have the first application program recognize the gesture in the gesture image and send the gesture information of a gesture meeting the requirement to the second application program, and the second application program then executes the operation instruction corresponding to the gesture detected by the first application program. The user can thus control the intelligent device in a non-contact gesture control mode, which facilitates the user's control operations and improves the convenience of controlling the intelligent device.
Herein, the smart device may be a smart household appliance such as a smart refrigerator, a smart television, a smart air conditioner, a smart sound box or a smart washing machine, or a smart portable device such as a smart phone or a tablet computer, which is not limited herein. In the embodiments of the present disclosure the gesture control method is applied to an intelligent electronic device; other embodiments of the present disclosure apply the gesture control method to other intelligent devices. The intelligent device needs to have an image acquisition function and be able to install application programs, so that the first application program can detect a gesture image and, when the second application program has a gesture requirement, send the gesture information corresponding to the gesture in the gesture image to the second application program; the second application program then executes the operation instruction corresponding to the received gesture information and quickly implements the function corresponding to that operation instruction.
Generally, a smart television with a gesture control function is provided with a camera for collecting images; the collected gesture images are recognized and then mapped into key instructions of a remote controller, so that the user can control the smart television by making gestures. For example, when the user makes an upward swipe gesture and the smart television acquires the corresponding gesture image, the upward swipe is parsed from the gesture image and mapped into the key instruction of the remote controller's "up" key. Consequently, to control an application program of the smart television through gestures, for example to control the smart television to execute a video playing function, the user often needs to make a number of gestures, and the application program of the smart television needs to acquire gesture images many times, recognize the gesture in each gesture image, and map it into a key instruction. The amount of data calculation in completing the video playing function is therefore large, and the response speed of controlling the application program through gestures is slow.
By adopting the gesture control method for the intelligent device, the gesture image can be detected through the first application program and the gesture in the gesture image recognized; and, under the condition that the second application program has a gesture requirement, the first application program sends gesture information of the gesture to the second application program, so that the second application program executes the operation instruction corresponding to the gesture information. The processes of detecting the gesture image and recognizing the gesture are thus separated out, and each second application program no longer needs to carry out the steps of detecting the gesture image and recognizing the gesture itself; instead, these steps are carried out by the first application program, and any second application program with a gesture requirement simply expresses that requirement to the first application program. The second application program then receives the gesture information corresponding to the gesture and can execute the operation instruction corresponding to that gesture information. Non-contact control of the intelligent device is thereby realized, the user's control operations are facilitated, the convenience of controlling the intelligent device is improved, the data operation process of the second application program is simplified, and the response speed of controlling an application program by gesture is improved.
Optionally, the intelligent device may further include a camera and a master control switch. When the master control switch is turned off, the camera is deactivated and the first application program does not detect gesture images; when the master control switch is turned on, the camera is started, and the first application program acquires and detects gesture images by controlling the camera to capture images. A minimal sketch of this gating is shown below.
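As a minimal sketch, under assumed class and method names that are not part of the disclosure, the gating of gesture detection by the master control switch could look like the following:

```java
// Hypothetical sketch: the first application program only drives the camera and
// detects gesture images while the master control switch is turned on.
// Switch, camera, and detector types here are illustrative assumptions.
public class GestureDetectionLoop {

    public interface Camera { byte[] captureFrame(); }

    public interface GestureDetector { void detect(byte[] frame); }

    private volatile boolean masterSwitchOn = false;
    private final Camera camera;
    private final GestureDetector detector;

    public GestureDetectionLoop(Camera camera, GestureDetector detector) {
        this.camera = camera;
        this.detector = detector;
    }

    public void setMasterSwitch(boolean on) {
        masterSwitchOn = on;
    }

    // Called periodically; frames are captured and analysed only while the
    // master control switch is on, otherwise the camera stays deactivated.
    public void tick() {
        if (!masterSwitchOn) {
            return;
        }
        byte[] frame = camera.captureFrame();
        detector.detect(frame);
    }
}
```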
Here, the gesture requirement may include a gesture that the second application needs to use; the gesture information may include a gesture number corresponding to the gesture that needs to be used.
Optionally, the second application program contains a correspondence between the gestures it needs to use and operation instructions. When the second application program receives a gesture number, it can determine the corresponding gesture according to the gesture number and map that gesture to an operation instruction, which it then executes. In this way, the first application program detects the gesture image and recognizes the gesture in it, and, when the second application program has a gesture requirement, the gesture is converted into an operation instruction that the second application program is able to execute. The second application program can thus execute control instructions different from those of other application programs, so that customized gesture operation is realized by writing the preset correspondence between gesture information and operation instructions into the second application program.
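As an illustrative sketch only (the gesture numbers, class name, and instruction actions below are assumptions, not part of the disclosure), the correspondence kept by a second application program between gesture numbers and operation instructions could be expressed as follows:

```java
// Hypothetical mapping kept by a second application program between gesture
// numbers (as received from the first application program) and its own
// operation instructions. All numbers and actions are illustrative only.
import java.util.HashMap;
import java.util.Map;

public class GestureInstructionTable {

    public interface OperationInstruction {
        void execute();
    }

    private final Map<Integer, OperationInstruction> table = new HashMap<>();

    public GestureInstructionTable() {
        // Example: gesture number 1 (say, an upward swipe) scrolls a list up.
        table.put(1, () -> System.out.println("scroll up"));
        // Example: gesture number 2 (say, a palm push) pauses playback.
        table.put(2, () -> System.out.println("pause playback"));
    }

    // Determine the gesture from the received gesture number, map it to the
    // corresponding operation instruction, and execute that instruction.
    public void onGestureNumber(int gestureNumber) {
        OperationInstruction instruction = table.get(gestureNumber);
        if (instruction != null) {
            instruction.execute();
        }
    }
}
```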
Optionally, the gesture control method for the smart device further includes: the first application program detects whether the second application program has a gesture requirement before or after the gesture is recognized. Therefore, the second application program does not need to carry out the steps of detecting the gesture image and recognizing the gesture, the data operation process of the second application program can be simplified, and the response speed of the gesture control application program is improved.
Optionally, after the first application detects the gesture image, the gesture in the detected gesture image is recognized first, and then whether the second application has a gesture requirement is detected. And under the condition that the gesture requirement of the second application program is detected, sending the gesture information corresponding to the recognized gesture to the second application program, so that the second application program can execute the operation instruction corresponding to the gesture information.
Optionally, after the first application detects the gesture image, first detecting whether the second application has a gesture requirement, and under the condition that the second application has the gesture requirement, recognizing a gesture in the gesture image meeting the gesture requirement; and sending the gesture information corresponding to the recognized gesture to a second application program so that the second application program can execute an operation instruction corresponding to the gesture information.
Optionally, the detecting, by the first application program, whether the second application program has a gesture requirement includes: the first application program detects the shared file and judges whether the shared file contains the gesture requirement written by the second application program.
Optionally, when the first application detects the gesture image, the gesture in the detected gesture image may be recognized first, and then the shared file is detected, and it is determined whether the shared file includes a gesture requirement written by the second application. Under the condition that the shared file is detected to contain the gesture requirement written by the second application program, determining that the second application program has the gesture requirement; at this time, the gesture information corresponding to the recognized gesture is sent to the second application program, so that the second application program can execute the operation instruction corresponding to the gesture information.
Optionally, when the first application detects the gesture image, the shared file may be detected first, and it may be determined whether the shared file includes a gesture requirement written by the second application. And under the condition that the shared file is detected to contain the gesture requirement written by the second application program, determining that the second application program has the gesture requirement. At this time, the gesture in the gesture image is recognized, and the gesture information corresponding to the recognized gesture is sent to the second application program, so that the second application program can execute the operation instruction corresponding to the gesture information.
Optionally, sending the gesture information of the gesture to the second application program includes: sending a broadcast containing the gesture information to the second application program. When the second application program receives the gesture information, it determines the operation instruction corresponding to the gesture information and executes that operation instruction. The second application program therefore does not need to carry out the steps of detecting the gesture image and recognizing the gesture; non-contact control of the intelligent device is realized, the user's control operations are facilitated, the convenience of controlling the intelligent device is improved, the data operation process of the second application program is simplified, and the response speed of controlling an application program by gesture is improved.
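A minimal sketch of the first application program's side of this flow is given below; the shared-file path and format, the broadcast action string, and the extra key are placeholders chosen for illustration and are not specified by the disclosure:

```java
// Hypothetical sketch of the first application program under the broadcast
// scheme: read the shared file to see whether a second application program has
// written a gesture requirement and, if the recognized gesture satisfies it,
// broadcast the gesture number. Path, format, action, and extra names assumed.
import android.content.Context;
import android.content.Intent;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;

public class GestureBroadcaster {

    // Placeholder path; a real system would use a location both applications can access.
    private static final File SHARED_FILE = new File("/data/local/tmp/gesture_requirements.txt");
    private static final String ACTION_GESTURE = "com.example.action.GESTURE_INFO";
    private static final String EXTRA_GESTURE_NUMBER = "gesture_number";

    // Returns true if the shared file lists the given gesture number (one per line).
    private boolean hasRequirement(int gestureNumber) {
        try (BufferedReader reader = new BufferedReader(new FileReader(SHARED_FILE))) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.trim().equals(String.valueOf(gestureNumber))) {
                    return true;
                }
            }
        } catch (Exception e) {
            // No readable shared file: no second application program has expressed a requirement.
        }
        return false;
    }

    // Called after the gesture in the gesture image has been recognized.
    public void onGestureRecognized(Context context, int gestureNumber) {
        if (hasRequirement(gestureNumber)) {
            Intent intent = new Intent(ACTION_GESTURE);
            intent.putExtra(EXTRA_GESTURE_NUMBER, gestureNumber);
            context.sendBroadcast(intent);
        }
    }
}
```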
Here, the second application program may also register a broadcast receiver when writing the gesture requirement into the shared file, so that when the first application program sends a broadcast containing the gesture information, the second application program can receive the gesture information contained in the broadcast.
In this case, the second application program needs to adapt to the gesture recognition performance of the first application program, such as its recognition speed, recognition frequency and latency.
In this way, the gesture information is transmitted between the first application program and the second application program through the broadcast mechanism, and the second application program writes its gesture requirement into the shared file. Since the shared file can be accessed by any application, any second application program can write its gesture requirement into the shared file and register a broadcast receiver; no additional adaptation step needs to be added, and the scheme has high universality.
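Correspondingly, a second application program could express its requirement and register its receiver roughly as follows, reusing the hypothetical GestureInstructionTable and the placeholder file path and action string from the sketches above:

```java
// Hypothetical sketch of the second application program under the broadcast
// scheme: write the needed gesture numbers into the shared file and register a
// broadcast receiver so that gesture information broadcast by the first
// application program can be received and executed.
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class GestureRequirementClient {

    private static final File SHARED_FILE = new File("/data/local/tmp/gesture_requirements.txt");
    private static final String ACTION_GESTURE = "com.example.action.GESTURE_INFO";
    private static final String EXTRA_GESTURE_NUMBER = "gesture_number";

    private final GestureInstructionTable table = new GestureInstructionTable();

    // Express the gesture requirement by appending the needed gesture numbers, one per line.
    public void writeRequirement(int... gestureNumbers) throws IOException {
        try (FileWriter writer = new FileWriter(SHARED_FILE, true)) {
            for (int number : gestureNumbers) {
                writer.write(number + "\n");
            }
        }
    }

    // Register the receiver so that broadcasts from the first application program are handled.
    public void register(Context context) {
        BroadcastReceiver receiver = new BroadcastReceiver() {
            @Override
            public void onReceive(Context ctx, Intent intent) {
                int number = intent.getIntExtra(EXTRA_GESTURE_NUMBER, -1);
                if (number >= 0) {
                    // Map the gesture number to an operation instruction and execute it.
                    table.onGestureNumber(number);
                }
            }
        };
        context.registerReceiver(receiver, new IntentFilter(ACTION_GESTURE));
    }
}
```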
Optionally, the detecting, by the first application program, whether the second application program has a gesture requirement includes: the first application program detects whether the content transmitted by the second application program contains the gesture requirement of the second application program.
Optionally, when the first application detects the gesture image, the gesture in the detected gesture image may be recognized first, and then it is detected whether the content transmitted by the second application includes the gesture requirement of the second application. Under the condition that the content transmitted by the second application program is detected to contain the gesture requirement written by the second application program, determining that the second application program has the gesture requirement; at this time, the gesture information corresponding to the recognized gesture is sent to the second application program, so that the second application program can execute the operation instruction corresponding to the gesture information.
Optionally, when the first application detects the gesture image, it may be detected whether the content transmitted from the second application includes a gesture requirement of the second application. And under the condition that the content transmitted by the second application program is detected to contain the gesture requirement written by the second application program, determining that the second application program has the gesture requirement. At this time, the gesture in the gesture image is recognized, and the gesture information corresponding to the recognized gesture is sent to the second application program, so that the second application program can execute the operation instruction corresponding to the gesture information.
Optionally, before the first application program detects the gesture image, the gesture control method for the smart device further includes: the first application program establishes a connection with the second application program, so that the two application programs establish a connection channel over which information can be transmitted bidirectionally. The second application program can then transmit its gesture requirement to the first application program, and the first application program can transmit gesture information to the second application program.
Here, the connection between the first application program and the second application program may be established using the Android Interface Definition Language (AIDL). The gesture requirement and the gesture information are transmitted between the first application program and the second application program through the AIDL mechanism, with the second application program initiating a connection request. When the first application program returns a connection confirmation to the second application program, the connection between the two is established.
The connection request may include the package name or class name of the second application program, and the connection confirmation may include a synchronization (SYNC) identification. Here, the synchronization identification refers to identification information for processing data using a SYNC function; specifically, after the SYNC function queues all modified block buffers into the write queue, it returns without waiting for the actual disk write operations to finish.
In this way, the gesture requirement and the gesture information are transmitted between the first application program and the second application program through the AIDL mechanism, which offers higher information transmission security, a high degree of customization and a faster response speed. In addition, the second application program can also send requirements for gesture recognition parameters through the AIDL interface as needed; the gesture recognition parameters include the image acquisition resolution of the camera, the image acquisition interval of the camera, the data filtering standard and the like.
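For illustration only, the contract exposed by the first application program over AIDL might take roughly the following shape. The interface, method, and parameter names are assumptions, not part of the disclosure; in an actual implementation these declarations would live in .aidl files (with array parameters carrying the `in` directional tag), from which the Android toolchain generates the binder Stub classes.

```java
// Hypothetical shape of the AIDL contract between the second application
// program (client) and the first application program (gesture service),
// written here as plain Java interfaces for readability. Names are assumed.

// Implemented by the second application program; delivers gesture information.
interface IGestureCallback {
    // Gesture number of a recognized gesture that meets the expressed requirement.
    void onGestureInfo(int gestureNumber);
}

// Implemented by the first application program.
interface IGestureService {
    // Connection request carrying the second application program's package (or
    // class) name; a non-null return value stands in for the SYNC confirmation.
    String connect(String packageName, IGestureCallback callback);

    // Express the gesture requirement: the gesture numbers the client needs to use.
    void setGestureRequirement(int[] gestureNumbers);

    // Optionally request gesture recognition parameters such as the camera's image
    // acquisition resolution, the image acquisition interval, and a data filtering standard.
    void setRecognitionParameters(int resolutionWidth, int resolutionHeight,
                                  int captureIntervalMs, String filterStandard);
}
```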
Optionally, the connection-based way of acquiring the gesture requirement between the first application program and the second application program has a higher priority than acquiring the gesture requirement under the broadcast mechanism. The connection-based mode is applied to second application programs that are closely associated with the first application program, and the broadcast mechanism is applied to second application programs that are less closely associated with it.
With reference to fig. 2, an embodiment of the present disclosure further provides a gesture control method for a smart device, where the smart device includes a first application program and a second application program, and the method includes:
and S21, the second application program expresses a gesture requirement so that the first application program can acquire the gesture requirement.
And S22, the second application program receives the gesture information sent by the first application program and executes the operation instruction corresponding to the gesture information.
The gesture information is gesture information corresponding to a gesture in the gesture image detected by the first application program.
By adopting the gesture control method for the intelligent device, the gesture requirement can be expressed through the second application program so that the first application program can acquire it; the second application program then receives the gesture information sent by the first application program and executes the operation instruction corresponding to the gesture information. In this way, a second application program that needs the gesture control function can, by expressing its gesture requirement, have the first application program recognize the gesture in the gesture image and send the gesture information of a gesture meeting the requirement to the second application program, and the second application program then executes the operation instruction corresponding to the gesture detected by the first application program. Non-contact control of the intelligent device is thereby realized, the user's control operations are facilitated, the convenience of controlling the intelligent device is improved, the data operation process of the second application program is simplified, and the response speed to gestures is improved.
Optionally, expressing the gesture requirement comprises: writing the gesture requirement into the shared file, so that the first application program can acquire the gesture requirement of the second application program by detecting the shared file. In this way, when the second application program receives the gesture information, it can determine the operation instruction corresponding to the gesture information and execute that operation instruction. Non-contact control of the intelligent device is realized, the user's control operations are facilitated, the convenience of controlling the intelligent device is improved, the data operation process of the second application program is simplified, and the response speed of controlling an application program by gesture is improved.
Optionally, the second application program may also register a broadcast receiver when writing the gesture requirement into the shared file, so that when the first application program sends the broadcast containing the gesture information, the second application program can receive the gesture information contained in the broadcast. Here, the gesture requirement may include the gestures that the second application program needs to use, and the gesture information may include the gesture numbers corresponding to those gestures.
Optionally, expressing the gesture requirement comprises: in the event that the first application program has established a connection with the second application program, sending the gesture requirement to the first application program. In this way, the second application program establishes a connection with the first application program and information can be transmitted bidirectionally between them: the second application program transmits its gesture requirement to the first application program, and the first application program transmits gesture information to the second application program.
Optionally, the second application program contains a correspondence between the gestures it needs to use and operation instructions. When the second application program receives gesture information, it can determine the corresponding gesture according to the gesture information and map that gesture to an operation instruction, which it then executes; thus, when the second application program has a gesture requirement, the gesture is converted into an operation instruction that the second application program is able to execute. In this way, the second application program can respond to gesture information with control instructions different from those of other applications, so that customized gesture operation is realized by writing the preset correspondence between gesture information and operation instructions into the second application program.
In some embodiments, as shown in fig. 3 and 4, the first application detects the gesture image and recognizes the gesture in the gesture image, including:
and S111, detecting a gesture image.
And S112, recognizing the gesture in the gesture image.
In some embodiments, as shown in fig. 3 and fig. 4, when the second application has a gesture requirement, the first application sends gesture information of the gesture to the second application, so that the second application executes an operation instruction corresponding to the gesture information, including:
and S121, reading the shared file.
And S122, determining that the second application program has a gesture requirement.
And S123, sending gesture information of the gesture to the second application program.
In some embodiments, as shown in fig. 3 and 4, the second application expresses a gesture requirement to enable the first application to obtain the gesture requirement, including:
s211, writing the gesture requirement into the shared file.
As shown in fig. 3, when the first application detects the gesture image, the gesture in the gesture image may be recognized first, and then the shared file is read, so as to determine whether the shared file includes a gesture requirement written by the second application. Under the condition that the shared file is detected to contain the gesture requirement written by the second application program, determining that the second application program has the gesture requirement; at this time, the gesture information of the gesture is sent to the second application program, and the second application program executes the operation instruction corresponding to the gesture information.
As shown in fig. 4, when the first application detects a gesture image, the shared file may be read first, and it may be determined whether the shared file includes a gesture requirement written by the second application. And under the condition that the shared file is detected to contain the gesture requirement written by the second application program, determining that the second application program has the gesture requirement. At the moment, the gesture in the gesture image is recognized, the gesture information of the gesture is sent to the second application program, and the second application program executes the operation instruction corresponding to the gesture information.
In some embodiments, as shown in fig. 5, the establishing a connection between the first application and the second application includes:
s01, the second application sends a connection request to the first application.
S02, the first application returns a connection confirmation to the second application.
As shown in fig. 5, under the AIDL mechanism, the second application program first initiates a connection request, and the first application program returns a connection confirmation to the second application program; at this point the two application programs have established a connection channel over which information can be transmitted bidirectionally. The second application program expresses its gesture requirement to the first application program, the first application program determines that the second application program has the gesture requirement, and the first application program sends gesture information to the second application program; the second application program receives the gesture information sent by the first application program and executes the operation instruction corresponding to the gesture information.
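Under the same assumptions as the AIDL sketch above (and reusing the hypothetical GestureInstructionTable), the second application program's side of the sequence in fig. 5 could be approximated as follows; the service package and class names are placeholders, and the Stub classes are those the AIDL toolchain would generate from the sketched interfaces:

```java
// Hypothetical client-side sketch of the AIDL flow in fig. 5: the second
// application program binds to the first application program's gesture
// service, sends a connection request with its package name, expresses its
// gesture requirement, and receives gesture information through the callback.
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.os.IBinder;
import android.os.RemoteException;

public class GestureAidlClient {

    private final GestureInstructionTable table = new GestureInstructionTable();

    // Callback through which the first application program delivers gesture information.
    private final IGestureCallback.Stub callback = new IGestureCallback.Stub() {
        @Override
        public void onGestureInfo(int gestureNumber) {
            // Execute the operation instruction corresponding to the gesture information.
            table.onGestureNumber(gestureNumber);
        }
    };

    private final ServiceConnection connection = new ServiceConnection() {
        @Override
        public void onServiceConnected(ComponentName name, IBinder binder) {
            try {
                IGestureService service = IGestureService.Stub.asInterface(binder);
                // S01/S02: connection request and connection confirmation (SYNC identification).
                String confirmation = service.connect("com.example.secondapp", callback);
                if (confirmation != null) {
                    // Express the gesture requirement: gesture numbers 1 and 2 (illustrative).
                    service.setGestureRequirement(new int[] {1, 2});
                }
            } catch (RemoteException e) {
                e.printStackTrace();
            }
        }

        @Override
        public void onServiceDisconnected(ComponentName name) { }
    };

    public void bind(Context context) {
        // Placeholder component naming the first application program's gesture service.
        Intent intent = new Intent();
        intent.setComponent(new ComponentName(
                "com.example.firstapp", "com.example.firstapp.GestureService"));
        context.bindService(intent, connection, Context.BIND_AUTO_CREATE);
    }
}
```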
In this way, the gesture requirement is expressed through the second application program so that the first application program can acquire it; the second application program then receives the gesture information sent by the first application program and executes the operation instruction corresponding to the gesture information. A second application program that needs the gesture control function can therefore, by expressing its gesture requirement, have the first application program recognize the gesture in the gesture image and send the gesture information of a gesture meeting the requirement to the second application program, and the second application program then executes the operation instruction corresponding to the gesture detected by the first application program. Non-contact control of the intelligent device is realized, the user's control operations are facilitated, the convenience of controlling the intelligent device is improved, the data operation process of the second application program is simplified, and the response speed to gestures is improved.
As shown in fig. 6, an embodiment of the present disclosure provides a gesture control apparatus for a smart device, which includes a processor (processor) 100 and a memory (memory) 101. Optionally, the apparatus may also include a communication interface (Communication Interface) 102 and a bus 103. The processor 100, the communication interface 102, and the memory 101 may communicate with each other via the bus 103. The communication interface 102 may be used for information transfer. The processor 100 may invoke logic instructions in the memory 101 to perform the gesture control method for the smart device of the above-described embodiments.
In addition, the logic instructions in the memory 101 may be implemented in the form of software functional units and stored in a computer readable storage medium when the logic instructions are sold or used as independent products.
The memory 101, which is a computer-readable storage medium, may be used for storing software programs, computer-executable programs, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 100 executes functional applications and data processing by executing program instructions/modules stored in the memory 101, that is, implements the gesture control method for the smart device in the above embodiments.
The memory 101 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal device, and the like. In addition, the memory 101 may include a high-speed random access memory, and may also include a nonvolatile memory.
The embodiment of the disclosure provides an intelligent device, which comprises a first application program, a second application program and the gesture control device for the intelligent device.
The intelligent device may be an intelligent household appliance such as an intelligent refrigerator, an intelligent television, an intelligent air conditioner, an intelligent sound box, an intelligent washing machine and the like, or an intelligent portable device such as a smart phone, a tablet personal computer and the like, and is not particularly limited. The intelligent device has an image acquisition function and can be provided with an application program.
By adopting the intelligent device provided by the embodiments of the present disclosure, the gesture image can be detected through the first application program and the gesture in the gesture image recognized; and, under the condition that the second application program has a gesture requirement, the first application program sends gesture information of the gesture to the second application program, so that the second application program executes the operation instruction corresponding to the gesture information. In this way, a second application program that needs the gesture control function can, by expressing its gesture requirement, have the first application program recognize the gesture in the gesture image and send the gesture information of a gesture meeting the requirement to the second application program, and the second application program then executes the operation instruction corresponding to the gesture detected by the first application program. Non-contact control of the intelligent device is thereby realized, the user's control operations are facilitated, the convenience of controlling the intelligent device is improved, the data operation process of the second application program is simplified, and the response speed to gestures is improved.
The disclosed embodiments provide a computer-readable storage medium storing computer-executable instructions configured to perform the above-described gesture control method for an intelligent device.
The disclosed embodiments provide a computer program product comprising a computer program stored on a computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the above-described gesture control method for a smart device.
The computer-readable storage medium described above may be a transitory computer-readable storage medium or a non-transitory computer-readable storage medium.
The technical solution of the embodiments of the present disclosure may be embodied in the form of a software product, where the computer software product is stored in a storage medium and includes one or more instructions to enable a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method of the embodiments of the present disclosure. And the aforementioned storage medium may be a non-transitory storage medium comprising: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes, and may also be a transient storage medium.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. Furthermore, the words used in the specification are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed. Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an …" does not exclude the presence of other like elements in a process, method or apparatus that comprises the element. In this document, each embodiment may be described with emphasis on differences from other embodiments, and the same and similar parts between the respective embodiments may be referred to each other. For methods, products, etc. of the embodiment disclosures, reference may be made to the description of the method section for relevance if it corresponds to the method section of the embodiment disclosure.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It can be clearly understood by the skilled person that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units may be only one type of logical functional division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than disclosed in the description, and sometimes there is no specific order between the different operations or steps. For example, two sequential operations or steps may in fact be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved. Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

Claims (10)

1. A gesture control method for a smart device, the smart device including a first application and a second application, comprising:
the method comprises the steps that a first application program detects a gesture image and recognizes a gesture in the gesture image;
and the first application program sends the gesture information of the gesture to the second application program under the condition that the second application program has a gesture requirement, so that the second application program executes an operation instruction corresponding to the gesture information.
2. The gesture control method according to claim 1, further comprising:
the first application program detects whether the second application program has a gesture requirement before or after the gesture is recognized.
3. The gesture control method according to claim 2, wherein the first application detects whether the second application has a gesture requirement, and the method comprises the following steps:
the first application program detects a shared file and judges whether the shared file contains a gesture requirement written by the second application program.
4. The gesture control method according to claim 3, wherein the sending the gesture information of the gesture to the second application program comprises:
sending a broadcast containing the gesture information to the second application.
5. The gesture control method according to claim 2, wherein the first application detects whether the second application has a gesture requirement, and the method comprises the following steps:
the first application program detects whether the content transmitted by the second application program contains the gesture requirement of the second application program.
6. The gesture control method according to claim 5, wherein the first application further comprises, before detecting the gesture image:
the first application establishes a connection with the second application.
7. A gesture control method for a smart device, the smart device including a first application and a second application, comprising:
the second application expresses a gesture requirement so that the first application can acquire the gesture requirement;
the second application program receives the gesture information sent by the first application program and executes an operation instruction corresponding to the gesture information;
the gesture information is gesture information corresponding to a gesture in a gesture image detected by a first application program.
8. The gesture control method according to claim 7, wherein said expressing the gesture requirement comprises:
writing a gesture requirement into a shared file so that a first application program obtains the gesture requirement of the second application program by detecting the shared file; or,
in the event that the first application establishes a connection therewith, a gesture requirement is sent to the first application.
9. A gesture control apparatus for a smart device comprising a processor and a memory storing program instructions, characterized in that the processor is configured to perform the gesture control method for a smart device according to any one of claims 1 to 8 when executing the program instructions.
10. A smart device comprising a first application and a second application, further comprising the gesture control apparatus for a smart device of claim 9.
CN202011203653.1A 2020-11-02 2020-11-02 Gesture control method and device for intelligent equipment and intelligent equipment Pending CN114529977A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011203653.1A CN114529977A (en) 2020-11-02 2020-11-02 Gesture control method and device for intelligent equipment and intelligent equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011203653.1A CN114529977A (en) 2020-11-02 2020-11-02 Gesture control method and device for intelligent equipment and intelligent equipment

Publications (1)

Publication Number Publication Date
CN114529977A true CN114529977A (en) 2022-05-24

Family

ID=81619387

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011203653.1A Pending CN114529977A (en) 2020-11-02 2020-11-02 Gesture control method and device for intelligent equipment and intelligent equipment

Country Status (1)

Country Link
CN (1) CN114529977A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101826035A (en) * 2010-04-07 2010-09-08 深圳创维-Rgb电子有限公司 Communication method between application programs
CN102799297A (en) * 2011-05-27 2012-11-28 宏碁股份有限公司 Gesture control method and electronic device
US20150363044A1 (en) * 2014-06-13 2015-12-17 Shui Ho Hung Gesture identification system in tablet projector and gesture identification method thereof
CN105979363A (en) * 2015-11-09 2016-09-28 乐视致新电子科技(天津)有限公司 Identity identification method and device
CN105872685A (en) * 2016-03-24 2016-08-17 深圳市国华识别科技开发有限公司 Intelligent terminal control method and system, and intelligent terminal
CN108880898A (en) * 2018-06-29 2018-11-23 新华三技术有限公司 Active and standby containment system switching method and device
CN111723353A (en) * 2019-03-22 2020-09-29 北京小米移动软件有限公司 Identity authentication method, device, terminal and storage medium based on face recognition

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
毛晓洁 (Mao Xiaojie): "Research and Application of Gesture Recognition Technology Based on the Android Platform", China Master's Theses Full-text Database, Information Science and Technology, 15 March 2016 (2016-03-15), pages 1 - 8 *

Similar Documents

Publication Publication Date Title
TWI488130B (en) Fingerprint identification and verification system and method thereof
US20160269946A1 (en) Electronic device and a method of operating the same
KR100872178B1 (en) Apparatus and method for managing forwarding service of WUSB based on priority
US11935398B1 (en) Electronic apparatus, remote control apparatus, control method thereof, and electronic system
CN112188461B (en) Control method and device of near field communication device, medium and electronic equipment
WO2020165885A1 (en) Computer-implemented method and system for providing interaction rules in mixed reality
CN113238727A (en) Screen switching method and device, computer readable medium and electronic equipment
US20160127768A1 (en) Usb sharing method for combo tv set, combo tv set and computer readable storage medium
CN104461978B (en) Method and device for unidirectional data transmission
US8312078B2 (en) Data transfer system and data transfer method
WO2021043021A1 (en) Memory write-back method and apparatus, and terminal and storage medium
CN104765585B (en) Display device and its control method
CN111290689B (en) Electronic equipment, main control device, control method and touch control sharing system thereof
US20170026617A1 (en) Method and apparatus for real-time video interaction by transmitting and displaying user interface correpsonding to user input
CN114529977A (en) Gesture control method and device for intelligent equipment and intelligent equipment
CN113311719A (en) Method, system and device for controlling household appliance and electronic equipment
JP2015156526A (en) Communication device, information processing device and control method therefor, and communication system
JP5376689B2 (en) Information processing apparatus, apparatus search method, apparatus search support method, and recording medium
CN114690651A (en) Method and device for screen image transmission, smart home equipment and system
CN115600174A (en) Intelligent equipment control method and device
CN108920122B (en) Screen display method, device, terminal and computer readable storage medium
US11334745B2 (en) Electronic device and method for providing service information related to broadcast content in electronic device
EP0913775A1 (en) Modem control
JP2017033397A (en) Operation information sharing device, application execution device, operation information sharing system, information processing device and program
CN112087476B (en) Page starting method, first hardware equipment, mobile terminal, server and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination