CN117631836A - Gesture data processing method and device, electronic equipment and storage medium


Info

Publication number
CN117631836A
Authority
CN
China
Prior art keywords
monitoring node
event
gesture
data
changed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311598865.8A
Other languages
Chinese (zh)
Inventor
薛宇翔
程永胜
陈冠仪
庄竣程
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Lianbao Information Technology Co Ltd
Original Assignee
Hefei Lianbao Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei Lianbao Information Technology Co Ltd
Priority to CN202311598865.8A
Publication of CN117631836A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a gesture data processing method and apparatus, an electronic device, and a storage medium. The method is applied to an electronic device that comprises a sensor, a control chip, and a gesture processing program, and comprises the following steps: when an application is running, the control chip, on monitoring that the sensor has recognized a user gesture, obtains the gesture data from the sensor and sends it to the gesture processing program; the gesture processing program determines at least one event based on the gesture data and modifies the first monitoring node corresponding to the event, so that the driver corresponding to the event, on detecting that the first monitoring node has changed, controls the corresponding hardware to execute the event; the gesture processing program then modifies the second monitoring node based on the event execution result, so that the application, on detecting that the second monitoring node has changed, determines and executes the corresponding operation, the event execution result being returned to the gesture processing program after the driver controls the corresponding hardware to complete the corresponding event.

Description

Gesture data processing method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of data processing, and in particular, to a gesture data processing method, a gesture data processing device, an electronic device, and a storage medium.
Background
When a third-party application is running, it may need the electronic device to recognize a user gesture, execute the corresponding event, obtain the result returned after the event is executed, and give the user feedback based on that result (for example, when the third-party application is a music player and the electronic device recognizes a volume-down gesture and lowers the volume, the application needs the returned result of the volume-down event in order to prompt the user that the volume has been lowered). However, because the existing software framework does not integrate the related functions, a third-party application cannot use the software framework of the electronic device to recognize the user's gesture or obtain the result returned after the corresponding event is executed. Without such a framework, the device vendor would have to grant the third-party application direct access, which vendors decline to do for security reasons.
Disclosure of Invention
The application provides a gesture data processing method, a gesture data processing device, electronic equipment and a storage medium.
An aspect of the present application provides a gesture data processing method applied to an electronic device, where the electronic device comprises a sensor, a control chip, and a gesture processing program, and the method comprises:
when an application is running, the control chip, on monitoring that the sensor has recognized a user gesture, obtains the gesture data from the sensor and sends it to the gesture processing program;
the gesture processing program determines at least one event based on the gesture data and modifies the first monitoring node corresponding to the event, so that the driver corresponding to the event, on detecting that the first monitoring node has changed, controls the corresponding hardware to execute the event;
the gesture processing program modifies the second monitoring node based on an event execution result, so that the application, on detecting that the second monitoring node has changed, determines and executes the corresponding operation, the event execution result being returned to the gesture processing program after the driver controls the corresponding hardware to complete the corresponding event.
The electronic device further comprises a driver, and the driver corresponding to the event, on monitoring that the first monitoring node has changed, controls the corresponding hardware to execute the event in at least one of the following manners:
if the hardware is a display device, the driver, on monitoring that the first monitoring node has changed, determines a display type and a display attribute based on the changed first monitoring node, and controls the display device to execute the event based on the display type and the display attribute;
if the hardware is an input device and the event is a first event, the driver, on monitoring that the first monitoring node has changed, determines input data based on the changed first monitoring node, and controls the input device to execute the first event based on the input data;
if the hardware is an input device and the event is a second event, the driver, on monitoring that the first monitoring node has changed, determines target coordinates based on the changed first monitoring node, and controls the input device to execute the second event based on the target coordinates;
if the hardware is an input device and the event is a third event, the driver, on monitoring that the first monitoring node has changed, determines a click flag bit based on the changed first monitoring node, and controls the input device to execute the third event based on the click flag bit.
The application is an Android application, the electronic device further comprises a virtual machine, and the application determining and executing the corresponding operation on detecting that the second monitoring node has changed comprises:
the virtual machine, on monitoring that the second monitoring node has changed, acquires the data of the second monitoring node;
the virtual machine sends the data of the second monitoring node to the Android application, so that the Android application determines and executes the corresponding operation based on the received data.
The application is a browser, the electronic device further comprises a D-BUS system, and the application determining and executing the corresponding operation on detecting that the second monitoring node has changed comprises:
the D-BUS system, on monitoring that the second monitoring node has changed, acquires the data of the second monitoring node;
the D-BUS system sends the data of the second monitoring node to the browser, so that the browser determines and executes the corresponding operation based on that data.
Wherein obtaining the gesture data of the sensor comprises at least one of:
the control chip receives gesture data sent by the sensor;
the control chip sends a data request to the sensor and receives the gesture data that the sensor sends in response to the request.
Another aspect of an embodiment of the present application provides a gesture data processing apparatus, comprising:
a monitoring module, configured to, when an application is running, monitor that the sensor has recognized a user gesture, obtain the gesture data from the sensor, and send it to the gesture processing program;
a processing module, configured to have the gesture processing program determine at least one event based on the gesture data and modify the first monitoring node corresponding to the event, so that the driver corresponding to the event, on monitoring that the first monitoring node has changed, controls the corresponding hardware to execute the event;
the processing module is further configured to have the gesture processing program modify the second monitoring node based on an event execution result, so that the application determines and executes the corresponding operation on detecting that the second monitoring node has changed, where the event execution result is returned to the gesture processing program after the driver controls the corresponding hardware to complete the corresponding event.
The processing module is further used for acquiring data of a second monitoring node when the virtual machine monitors that the second monitoring node is changed;
the processing module is further configured to send, by the virtual machine, data of the second monitoring node to the android application, so that the android application determines and executes a corresponding operation based on the received data of the second monitoring node.
The processing module is further used for acquiring data of the second monitoring node when the D-BUS system monitors that the second monitoring node is changed;
the processing module is further configured to send data of the second monitoring node to the browser by using the D-BUS system, so that the browser determines and executes a corresponding operation based on the data of the second monitoring node.
Still another aspect of the present application provides an electronic device, comprising:
a processor, and a memory for storing instructions executable by the processor;
the processor is configured to read the executable instructions from the memory and execute them to implement the gesture data processing method.
In yet another aspect, the present application provides a computer-readable storage medium storing a computer program for executing the gesture data processing method.
It should be understood that the description of this section is not intended to identify key or critical features of the embodiments of the application or to delineate the scope of the application. Other features of the present application will become apparent from the description that follows.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. Several embodiments of the present application are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which:
in the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
FIG. 1 illustrates a flow chart of a gesture data processing method according to one embodiment of the present application;
FIG. 2 illustrates a flow chart of a gesture data processing method according to another embodiment of the present application;
FIG. 3 illustrates a flow chart of a gesture data processing method according to another embodiment of the present application;
FIG. 4 illustrates a flow chart of a gesture data processing method according to another embodiment of the present application;
FIG. 5 shows a schematic diagram of a gesture data processing apparatus according to one embodiment of the present application;
fig. 6 shows a schematic diagram of the composition structure of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, features and advantages of the present application more obvious and understandable, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
So that, after the electronic device recognizes a user gesture and executes the corresponding event, a third-party application can obtain the result returned after the event is executed, an embodiment of the present application provides a gesture data processing method applied to an electronic device, where the electronic device comprises a sensor, a control chip, and a gesture processing program. As shown in FIG. 1, the method includes:
step 101, under the condition that an application is started, the control chip monitors that the sensor recognizes a gesture of a user, obtains gesture data of the sensor, and sends the gesture data to the gesture processing program.
In this embodiment, when the third party application is started, the third party application needs to acquire the gesture recognized by the electronic device, and returns a result based on the event after the gesture executes the corresponding event.
A listening task is created on the control chip (e.g., EC embedded controller) side for monitoring the sensor. And actively acquiring gesture data collected by the sensor when the sensor is monitored to recognize the gesture of the user. After acquiring the gesture data collected by the sensor, the control chip sends the gesture data to a gesture processing program (the program is used for identifying the gesture data and performing corresponding processing, and the processor runs the program).
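The hand-off from the listening task to the gesture processing program can be sketched as follows. This is a minimal in-process simulation; the `Sensor` class, `listening_task` function, and callback are illustrative assumptions, not names from the patent.

```python
# Minimal sketch of the EC-side listening task described above.
# All names here are illustrative assumptions.

class Sensor:
    """Stands in for the gesture sensor: buffers data once a gesture is recognized."""
    def __init__(self):
        self._gesture_data = None

    def recognize(self, gesture):
        # In a real device this is the sensor's own recognition pipeline.
        self._gesture_data = {"gesture": gesture}

    def has_gesture(self):
        return self._gesture_data is not None

    def read(self):
        data, self._gesture_data = self._gesture_data, None
        return data


def listening_task(sensor, send_to_handler):
    """One iteration of the control chip's monitoring loop: if the sensor
    has recognized a gesture, actively fetch the data and forward it to
    the gesture processing program."""
    if sensor.has_gesture():
        send_to_handler(sensor.read())
        return True
    return False
```

In practice such a loop would run continuously on the control chip; a single iteration is enough to show the hand-off.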
Step 102, the gesture processing program determines at least one event based on the gesture data, and modifies the first monitoring node corresponding to the event, so that the driver corresponding to the event controls the corresponding hardware to execute the event when the driver corresponding to the event monitors the change of the first monitoring node.
After receiving the gesture data sent by the control chip, the gesture processing program identifies the gesture data and determines at least one corresponding event (an event is an operation that the user's gesture indicates the electronic device should perform; note that a single gesture may indicate several events).
After determining at least one event, the gesture processing program modifies the corresponding first monitoring node based on the event (the first monitoring node is stored in a namespace, a storage area allocated in memory). The driver corresponding to the event (for example, the display driver if the event operates the screen) monitors the first monitoring node corresponding to the event, and when it detects that the node has changed, it controls the corresponding hardware to complete the event based on the changed node.
For example, the gesture processing program determines event A based on the gesture data, where event A is adjusting the screen brightness to 60%. The gesture processing program modifies the first monitoring node corresponding to screen-brightness adjustment. When the display driver detects that this node has changed, it controls the display to adjust its brightness to 60% based on the changed node.
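This node-modification step can be sketched by modeling the namespace as a directory of small JSON files, which is an assumption made purely for illustration; the `modify_node` and `display_brightness` names are hypothetical.

```python
import json
import os
import tempfile

# The namespace holding monitoring nodes is modeled as a temporary
# directory of small JSON files; this layout is an illustrative assumption.
NODE_DIR = tempfile.mkdtemp(prefix="gesture_nodes_")

def modify_node(name, payload):
    """Gesture handler side: 'modify' a monitoring node by rewriting its
    file, so that a driver watching the file can detect the change."""
    with open(os.path.join(NODE_DIR, name), "w") as f:
        json.dump(payload, f)

def read_node(name):
    """Driver side: read the changed node to learn what to execute."""
    with open(os.path.join(NODE_DIR, name)) as f:
        return json.load(f)

# Event A from the example: adjust the screen brightness to 60%.
modify_node("display_brightness", {"display_type": "brightness", "value": 60})
```

A real driver would watch the node with a kernel notification mechanism rather than re-reading it, but the data flow is the same.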
Step 103, the gesture processing program modifies the second monitoring node based on the event execution result, so that the application determines and executes the corresponding operation on detecting that the second monitoring node has changed; the event execution result is returned to the gesture processing program after the driver controls the corresponding hardware to complete the corresponding event.
After the event is completed, the driver corresponding to the event returns an event execution result to the gesture processing program, and the gesture processing program modifies the second monitoring node based on that result. On detecting that the second monitoring node has changed, the application acquires the event execution result, then determines and executes the corresponding operation.
Continuing the example: when the display driver detects that the first monitoring node corresponding to screen-brightness adjustment has changed, it controls the display to adjust its brightness to 60% based on the changed node. After the adjustment succeeds, the display driver returns an event execution result indicating that the brightness was successfully adjusted to 60% to the gesture processing program, which modifies the second monitoring node based on it. On detecting that the second monitoring node has changed, the application obtains the node's value (i.e., data indicating that the brightness was successfully adjusted to 60%) and prompts the user accordingly (e.g., a pop-up window, voice, or light prompt).
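The return path can be sketched the same way: the second monitoring node as a file whose modification time the application polls. All names below are illustrative assumptions.

```python
import json
import os
import tempfile

# Second monitoring node modeled as a file; the application detects the
# handler's modification by comparing modification times (illustrative).
SECOND_NODE = os.path.join(tempfile.mkdtemp(prefix="nodes_"), "second_node")

def handler_write_result(result):
    """Gesture handler: modify the second node with the driver's result."""
    with open(SECOND_NODE, "w") as f:
        json.dump(result, f)

def app_check(last_mtime, on_change):
    """Application: if the node changed since the last check, read the
    event execution result and run the corresponding operation
    (e.g. show a prompt). Returns the new mtime marker."""
    try:
        mtime = os.stat(SECOND_NODE).st_mtime_ns
    except FileNotFoundError:
        return last_mtime
    if mtime != last_mtime:
        with open(SECOND_NODE) as f:
            on_change(json.load(f))
    return mtime
```

A production implementation would use a file-change notification API instead of polling, but the contract is the same: a changed node carries the execution result to the application.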
It should be noted that the gesture processing program may also modify the first and second monitoring nodes at the same time after determining at least one event based on the gesture data. In that case, modifying the second monitoring node early lets the application start monitoring the execution of the event in advance, so that once the driver returns the event execution result, the application can respond immediately, determining and executing the corresponding operation based on the result.
In the above scheme, while the application is running, the control chip monitors the sensor; when the sensor recognizes a user gesture, the control chip actively obtains the gesture data from the sensor and sends it to the gesture processing program, so that the user's gesture behavior can be effectively monitored and processed. After the gesture processing program acquires the gesture data, it identifies the data, determines at least one corresponding event, and modifies the first monitoring node corresponding to the event, so that the driver corresponding to the event can control the corresponding hardware to execute the event in real time on detecting that the node has changed. After the driver finishes executing the event, it returns an event execution result to the gesture processing program, which modifies the second monitoring node based on that result, so that on detecting the change the application acquires the event execution result and then determines and executes the corresponding operation. In this way, after the electronic device recognizes a user gesture and executes the corresponding event, the third-party application can obtain the result returned after the event is executed.
In an example of the present application, a gesture data processing method is further provided, where the electronic device further comprises a driver, and the driver corresponding to an event, on detecting that the first monitoring node has changed, controls the corresponding hardware to execute the event in at least one of the following manners:
mode one: and if the hardware is a display device, when the driver detects that the first monitoring node is changed, determining a display type and a display attribute based on the changed first monitoring node, and controlling the display device to execute the event based on the display type and the display attribute.
The display type may be an adjustment screen parameter (such as brightness, contrast, etc., and the corresponding display attribute is a value that the parameter needs to be adjusted), a switch screen (and the corresponding display attribute is on or off), etc.
Mode two: and if the hardware is input equipment and the event is a first event, when the driver monitors that the first monitoring node is changed, determining input data based on the changed first monitoring node, and controlling the input equipment to execute the first event based on the input data.
In the second mode, the input device may be a keyboard, a microphone, a camera, etc., and the first event is inputting text, capturing voice, taking an image or a photograph, etc.
For example, the input device is a keyboard, the first event is text input, and when the driver detects that the first monitoring node is changed, the driver controls the keyboard to virtually input text preset based on gestures.
For another example, the input device is a camera, the first event is shooting an image, and the driver controls the camera to shoot the image when detecting that the first monitoring node is changed.
Mode three: and if the hardware is input equipment and the event is a second event, when the driver monitors that the first monitoring node is changed, determining target coordinates based on the changed first monitoring node, and controlling the input equipment to execute the second event based on the target coordinates.
In the third mode, the input device may be a mouse, a touch pad, a touch screen, a stylus, or the like, and the second event is cursor movement, touch, or the like.
For example, the input device is a mouse, the second event is cursor movement, and when the driver detects that the first monitoring node is changed, the driver controls the mouse cursor to move to a position indicated by the target coordinates.
For another example, the input device is a touch pad, and the second event is a touch, and when the driver detects that the first monitoring node is changed, the driver controls the touch pad to virtually touch at the position indicated by the target coordinate.
Mode four: if the hardware is an input device and the event is a third event, the driver, on monitoring that the first monitoring node has changed, determines a click flag bit based on the changed first monitoring node and controls the input device to execute the third event based on the click flag bit.
In mode four, the input device may be a mouse, or a touch pad, touch screen, or stylus that has a corresponding cursor (meaning a cursor exists in the operating system of the electronic device whose movement and clicking can be controlled through the touch device), and the third event is a cursor click.
For example, the input device is a mouse, the third event is a cursor click, and when the driver detects that the first monitoring node is changed, the driver controls the mouse to click at the current position of the cursor.
For another example, the input device is a touch pad with a corresponding cursor, the third event is a cursor click, and when the driver detects that the first monitoring node is changed, the driver controls the touch pad to click at the current position of the cursor.
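The four modes above reduce to a dispatch on the payload found in the changed first monitoring node. The following is a hedged sketch: the payload layout and field names are assumptions for illustration, not defined by the patent.

```python
# Dispatch sketch for the four driver modes. The payload fields
# ("hardware", "event", "display_type", etc.) are illustrative assumptions.

def driver_dispatch(payload, actions):
    """Map a changed first-node payload to a hardware action record."""
    if payload["hardware"] == "display":                 # mode one
        actions.append(("display", payload["display_type"], payload["attribute"]))
    elif payload["hardware"] == "input":
        event = payload["event"]
        if event == "first":                             # mode two: input data
            actions.append(("input", payload["data"]))
        elif event == "second":                          # mode three: target coordinates
            actions.append(("move", payload["x"], payload["y"]))
        elif event == "third":                           # mode four: click flag bit
            actions.append(("click", payload["flag"]))
    return actions
```

A real driver would issue hardware commands where this sketch appends tuples; the branching structure is what the four modes share.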
In the above scheme, the driver controls different hardware in various manners to realize different events, making the user's gestures more flexible to use and markedly improving the user experience.
In an example of the present application, there is further provided a gesture data processing method, where the application is an android application, and the electronic device further includes a virtual machine, where the application determines and executes a corresponding operation when detecting that the second monitoring node is changed, as shown in fig. 2, including:
step 201, when the virtual machine monitors that the second monitoring node is changed, the virtual machine acquires data of the second monitoring node.
Android applications need a virtual machine to run on electronic devices (such as desktop or notebook computers) that cannot run them natively, and they also interact with the operating system of such devices through the virtual machine.
The android application monitors the second monitoring node through the virtual machine, and the virtual machine acquires the data of the second monitoring node when the virtual machine monitors that the second monitoring node is changed.
Step 202, the virtual machine sends the data of the second monitoring node to the android application, so that the android application determines and executes corresponding operations based on the received data of the second monitoring node.
After acquiring the data of the second monitoring node, the virtual machine sends it to the Android application. On receiving the data, the Android application determines the execution result of the event and the operation to perform based on that data, and performs the operation (for example, if the event executed successfully, prompting the user that it succeeded).
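A minimal sketch of this relay, with both the virtual machine and the Android application simulated in-process; the class and method names are illustrative assumptions.

```python
# Illustrative sketch: the virtual machine watches the second monitoring
# node on behalf of the Android application and forwards its data.

class AndroidApp:
    def __init__(self):
        self.received = []

    def on_node_data(self, data):
        # The app determines its operation from the execution result,
        # e.g. prompting the user that the event succeeded.
        self.received.append(data)


class VirtualMachine:
    """Monitors the second node and forwards changed data to the app."""
    def __init__(self, app):
        self.app = app
        self._last = None

    def check(self, node_value):
        if node_value != self._last:
            self._last = node_value
            self.app.on_node_data(node_value)
```

The D-BUS relay for browsers has the same shape: an intermediary watches the node and forwards its data to a consumer that cannot watch it directly.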
In the above scheme, monitoring the second monitoring node through the virtual machine allows an Android application, even on an electronic device that cannot run Android applications natively, to have the user's gesture recognized and, after the corresponding event is executed, to obtain the result returned after execution. This further improves the flexibility of gesture data processing and the user experience.
In an example of the present application, there is further provided a gesture data processing method, where the application is a browser, and the electronic device further includes a D-BUS system (a message BUS system), and the application determines and executes a corresponding operation when detecting a change of the second monitoring node, as shown in fig. 3, including:
step 301, the D-BUS system acquires data of the second monitoring node when it monitors that the second monitoring node is changed.
The browser cannot monitor the second monitoring node directly, and therefore, the second monitoring node needs to be monitored through the D-BUS system.
And when the D-BUS system monitors that the second monitoring node is changed, acquiring data of the second monitoring node.
Step 302, the D-BUS system sends the data of the second monitoring node to the browser, so that the browser determines and executes a corresponding operation based on the data of the second monitoring node.
And after the D-BUS system acquires the data of the second monitoring node, the data of the second monitoring node is sent to the browser. And after receiving the data of the second monitoring node, the browser determines an execution result of the event and an operation required to be performed based on the data of the second monitoring node and executes the operation.
In the above scheme, monitoring the second monitoring node through the D-BUS system enables the browser, which cannot monitor the node directly, to have the user's gesture recognized on the electronic device and to obtain the result returned after the corresponding event is executed. This further improves the flexibility of gesture data processing and the user experience.
In an example of the present application, there is further provided a gesture data processing method, where the obtaining gesture data of the sensor may be implemented by at least one of the following manners:
mode one: and the control chip receives gesture data sent by the sensor.
After the sensor recognizes the user's gesture and obtains the gesture data, it actively transmits the data to the control chip, which improves the real-time performance of data transmission.
Mode two: and the control chip sends a data request to the sensor and receives gesture data sent by the sensor based on the data request.
After the control chip detects that the sensor has recognized the user's gesture, the control chip actively sends a data request to the sensor, and the sensor sends the gesture data to the control chip after receiving the request.
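The two acquisition modes can be contrasted in a small sketch; `ControlChip`, `receive`, `request`, and `PullSensor` are illustrative names, not from the patent.

```python
# Sketch of the two ways the control chip obtains gesture data.

class ControlChip:
    def __init__(self):
        self.inbox = []

    def receive(self, data):
        """Mode one: the sensor pushes data as soon as it has it."""
        self.inbox.append(data)

    def request(self, sensor):
        """Mode two: the chip sends a data request and receives the reply."""
        self.inbox.append(sensor.respond())


class PullSensor:
    """Sensor that only replies when asked (mode two)."""
    def __init__(self, data):
        self._data = data

    def respond(self):
        return self._data


chip = ControlChip()
chip.receive({"gesture": "pinch"})               # mode one: sensor-initiated
chip.request(PullSensor({"gesture": "swipe"}))   # mode two: chip-initiated
```

Either mode delivers the same gesture data; the difference is only which side initiates the transfer.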
In an example of the present application, there is also provided a gesture data processing method, as shown in fig. 4, including:
the sensor recognizes the gesture of the user and obtains corresponding gesture data based on the gesture of the user.
The control chip monitors the sensor, and when the sensor is monitored to recognize the gesture of the user, a data request is sent to the sensor, and gesture data is acquired from the sensor based on the data request.
And the control chip acquires gesture data and then sends the gesture data to the gesture processing program.
The gesture processing program identifies gesture data and determines at least one corresponding event.
The gesture handler modifies the corresponding first monitoring node based on each event.
When the driver corresponding to each event monitors that the corresponding first monitoring node is changed, acquiring data of the first monitoring node, determining the event based on the data of the first monitoring node, and controlling corresponding hardware to execute the event.
After the event is executed, the driver returns an event execution result to the gesture processing program.
And after receiving the event execution result, the gesture processing program modifies the second monitoring node based on the event execution result.
When the D-BUS system monitors that the second monitoring node is changed, acquiring data of the second monitoring node and sending the data to a browser, and determining and executing corresponding operation by the browser based on the data of the second monitoring node.
Or when the virtual machine monitors that the second monitoring node is changed, acquiring data of the second monitoring node and sending the data to the android application, and determining and executing corresponding operation by the android application based on the data of the second monitoring node.
Or, when a native application monitors that the second monitoring node has changed, it acquires the data of the second monitoring node and determines and executes the corresponding operation based on that data.
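Tying the steps above together, the following is a compact in-process walk-through of the whole pipeline, with the monitoring-node namespace modeled as a plain dict; every name here is an illustrative assumption.

```python
# End-to-end sketch of the flow above: the handler writes the first node,
# the driver executes the event, and the result reaches the application
# via the second node. The dict-based "namespace" is an assumption.

nodes = {}

def handler_on_gesture(gesture_data, event_table):
    """Gesture handler: map gesture data to events, modify first nodes."""
    for event in event_table[gesture_data["gesture"]]:
        nodes[event["node"]] = event["payload"]

def driver_run(node_name, execute):
    """Driver: act on the changed first node, then return the result,
    which the handler writes into the second node."""
    result = execute(nodes[node_name])
    nodes["second_node"] = result

def app_read_result():
    """Application (native, via the VM, or via D-BUS): read the result."""
    return nodes.get("second_node")

# Walk the pipeline for the brightness example used earlier in the text.
table = {"swipe_up": [{"node": "display", "payload": {"brightness": 60}}]}
handler_on_gesture({"gesture": "swipe_up"}, table)
driver_run("display", lambda p: dict(p, status="ok"))
```

The three reader variants at the end of the flow (D-BUS, virtual machine, native application) all consume the same `second_node` value; they differ only in who performs the watching.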
In order to implement the gesture data processing method described above, as shown in fig. 5, an embodiment of the present application provides a gesture data processing apparatus, including:
the monitoring module 10 is configured to, when an application is turned on, monitor that the sensor recognizes a gesture of a user, obtain gesture data of the sensor, and send the gesture data to the gesture processing program;
the processing module 20 is configured to determine at least one event based on the gesture data by using the gesture processing program, and modify a first monitoring node corresponding to the event, so that a driver corresponding to the event controls corresponding hardware to execute the event when detecting that the first monitoring node is changed;
the processing module 20 is further configured to modify, by the gesture processing program, the second monitoring node based on an event execution result, so that the application determines and executes a corresponding operation upon detecting that the second monitoring node is changed, where the event execution result is returned to the gesture processing program after the driver controls the corresponding hardware to complete the corresponding event.
The processing module 20 is further configured to, if the hardware is a display device, determine, when the driver detects that the first monitoring node is changed, a display type and a display attribute based on the changed first monitoring node, and control the display device to execute the event based on the display type and the display attribute;
the processing module 20 is further configured to, if the hardware is an input device and the event is a first event, determine input data based on the changed first monitoring node when the driver detects that the first monitoring node is changed, and control the input device to execute the first event based on the input data;
the processing module 20 is further configured to, if the hardware is an input device and the event is a second event, determine, when the driver detects that the first monitoring node is changed, a target coordinate based on the changed first monitoring node, and control the input device to execute the second event based on the target coordinate;
the processing module 20 is further configured to, if the hardware is an input device and the event is a third event, determine, when the driver detects that the first monitoring node is changed, a click flag bit based on the changed first monitoring node, and control the input device to execute the third event based on the click flag bit.
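The four per-hardware branches above could be dispatched as in the following sketch. The field names in `node_data` (standing for the content read from the changed first monitoring node) and the event-type labels are hypothetical; the source does not specify concrete data formats.

```python
def driver_dispatch(hardware, event_type, node_data):
    """Illustrative dispatch over the four branches described above.
    All field names in node_data are hypothetical."""
    if hardware == "display":
        # Display device: derive display type and display attribute.
        return ("display", node_data["display_type"], node_data["display_attr"])
    if hardware == "input":
        if event_type == "first":
            # First event: inject the determined input data.
            return ("input_data", node_data["input_data"])
        if event_type == "second":
            # Second event: move to the target coordinates.
            return ("move", node_data["x"], node_data["y"])
        if event_type == "third":
            # Third event: press/release according to the click flag bit.
            return ("click", node_data["click_flag"])
    raise ValueError("unsupported hardware/event combination")
```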
The processing module 20 is further configured to obtain data of a second monitoring node when the virtual machine monitors that the second monitoring node is changed;
the processing module 20 is further configured to send, by the virtual machine, data of the second monitoring node to the android application, so that the android application determines and executes a corresponding operation based on the received data of the second monitoring node.
The processing module 20 is further configured to, when the D-BUS system monitors that the second monitoring node is changed, obtain data of the second monitoring node;
the processing module 20 is further configured to send the data of the second monitoring node to the browser by using the D-BUS system, so that the browser determines and executes a corresponding operation based on the data of the second monitoring node.
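The forwarding path from the second monitoring node to the browser can be modeled as a broadcast, as in the minimal stand-in below. A real implementation would emit a D-Bus signal that the browser subscribes to; here plain Python callbacks play that role, and all names are hypothetical.

```python
class SecondNodeBus:
    """Stand-in for the D-BUS forwarding path: when the second
    monitoring node changes, its data is delivered to every
    subscribed application (e.g. a browser callback)."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        # A real D-Bus client would add a signal match rule instead.
        self._subscribers.append(callback)

    def on_second_node_changed(self, node_data):
        # Broadcast the node data to all subscribers.
        for callback in self._subscribers:
            callback(node_data)
```

For example, a browser-side handler registered via `subscribe` would receive the execution result and decide which page-level operation to perform.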
The processing module 20 is further configured to receive gesture data sent by the sensor by using the control chip;
the processing module 20 is further configured to send a data request to the sensor by using the control chip, and receive gesture data sent by the sensor based on the data request.
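The two acquisition modes above (the sensor pushing gesture data, or the control chip pulling it via a data request) can be sketched as follows; the classes and the `"data:..."` payload format are hypothetical.

```python
class Sensor:
    """Toy sensor holding the latest recognized gesture data."""

    def __init__(self):
        self.latest = None

    def recognize(self, gesture):
        # Hypothetical payload format for illustration only.
        self.latest = f"data:{gesture}"
        return self.latest

    def handle_request(self, request):
        # Pull mode: answer a data request from the control chip.
        return self.latest


class ControlChip:
    """Supports both acquisition modes described above."""

    def receive_push(self, gesture_data):
        # Push mode: the sensor sends gesture data on its own.
        return gesture_data

    def pull(self, sensor):
        # Pull mode: send a data request and receive the reply.
        return sensor.handle_request("data_request")
```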
According to embodiments of the present disclosure, the present disclosure also provides an electronic device and a readable storage medium.
Fig. 6 shows a schematic block diagram of an example electronic device 400 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the device 400 includes a computing unit 401 that can perform various suitable actions and processes according to a computer program stored in a read-only memory (ROM) 402 or a computer program loaded from a storage unit 408 into a random access memory (RAM) 403. In the RAM 403, various programs and data required for the operation of the device 400 may also be stored. The computing unit 401, the ROM 402, and the RAM 403 are connected to each other by a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.
Various components in device 400 are connected to I/O interface 405, including: an input unit 406 such as a keyboard, a mouse, etc.; an output unit 407 such as various types of displays, speakers, and the like; a storage unit 408, such as a magnetic disk, optical disk, etc.; and a communication unit 409 such as a network card, modem, wireless communication transceiver, etc. The communication unit 409 allows the device 400 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The computing unit 401 may be a variety of general purpose and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 401 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 401 performs the respective methods and processes described above, for example, a gesture data processing method. For example, in some embodiments, the gesture data processing method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 408. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 400 via the ROM 402 and/or the communication unit 409. When the computer program is loaded into the RAM 403 and executed by the computing unit 401, one or more steps of the gesture data processing method described above may be performed. Alternatively, in other embodiments, the computing unit 401 may be configured to perform the gesture data processing method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs, the one or more computer programs may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor, that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that steps may be reordered, added, or deleted using the various forms of flows shown above. For example, the steps recited in the present disclosure may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solutions of the present disclosure are achieved; no limitation is imposed herein.
Furthermore, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined by "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present disclosure, "a plurality" means two or more, unless explicitly defined otherwise.
The foregoing is merely specific embodiments of the present disclosure, but the protection scope of the present disclosure is not limited thereto. Any changes or substitutions that would readily occur to a person skilled in the art within the technical scope disclosed herein shall fall within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A gesture data processing method applied to an electronic device, the electronic device comprising a sensor, a control chip and a gesture processing program, the method comprising:
under the condition that an application is started, the control chip monitors that the sensor recognizes gestures of a user, obtains gesture data of the sensor and sends the gesture data to the gesture processing program;
the gesture processing program determines at least one event based on the gesture data, and modifies a first monitoring node corresponding to the event, so that a driver corresponding to the event controls corresponding hardware to execute the event when detecting that the first monitoring node is changed;
the gesture processing program modifies the second monitoring node based on an event execution result so that the application determines and executes corresponding operation when detecting the change of the second monitoring node, and the event execution result is returned to the gesture processing program after the corresponding hardware is controlled by the driver to complete the corresponding event.
2. The method of claim 1, wherein the electronic device further comprises the driver, and the driver corresponding to the event controlling the corresponding hardware to execute the event upon detecting that the first monitoring node is changed comprises at least one of the following:
if the hardware is a display device, when the driver detects that the first monitoring node is changed, the driver determines a display type and a display attribute based on the changed first monitoring node, and controls the display device to execute the event based on the display type and the display attribute;
if the hardware is an input device and the event is a first event, when the driver detects that the first monitoring node is changed, the driver determines input data based on the changed first monitoring node, and controls the input device to execute the first event based on the input data;
if the hardware is an input device and the event is a second event, when the driver detects that the first monitoring node is changed, the driver determines target coordinates based on the changed first monitoring node, and controls the input device to execute the second event based on the target coordinates;
if the hardware is an input device and the event is a third event, when the driver detects that the first monitoring node is changed, the driver determines a click flag bit based on the changed first monitoring node, and controls the input device to execute the third event based on the click flag bit.
3. The method of claim 1, wherein the application is an android application, the electronic device further comprises a virtual machine, and the application determining and executing the corresponding operation upon detecting that the second monitoring node is changed comprises:
when the virtual machine monitors that the second monitoring node is changed, acquiring data of the second monitoring node;
and the virtual machine sends the data of the second monitoring node to the android application, so that the android application determines and executes corresponding operation based on the received data of the second monitoring node.
4. The method of claim 1, wherein the application is a browser, the electronic device further comprises a D-BUS system, and the application determining and executing the corresponding operation upon detecting that the second monitoring node is changed comprises:
the D-BUS system acquires data of a second monitoring node when the second monitoring node is monitored to be changed;
and the D-BUS system sends the data of the second monitoring node to the browser so that the browser determines and executes corresponding operation based on the data of the second monitoring node.
5. The method of claim 1, wherein the obtaining gesture data of the sensor comprises at least one of the following:
the control chip receives gesture data sent by the sensor;
and the control chip sends a data request to the sensor and receives gesture data sent by the sensor based on the data request.
6. A gesture data processing apparatus, the apparatus comprising:
the monitoring module is used for monitoring that the sensor recognizes a gesture of a user under the condition that an application is started, acquiring gesture data of the sensor and sending the gesture data to the gesture processing program;
the processing module is used for determining at least one event based on the gesture data by the gesture processing program and modifying a first monitoring node corresponding to the event so that a driver corresponding to the event controls corresponding hardware to execute the event when the first monitoring node is monitored to be changed;
the processing module is further configured to modify, by the gesture processing program, the second monitoring node based on an event execution result, so that the application determines and executes a corresponding operation upon detecting that the second monitoring node is changed, where the event execution result is returned to the gesture processing program after the driver controls the corresponding hardware to complete the corresponding event.
7. The apparatus of claim 6, comprising:
the processing module is further used for acquiring data of a second monitoring node when the virtual machine monitors that the second monitoring node is changed;
the processing module is further configured to send, by the virtual machine, data of the second monitoring node to the android application, so that the android application determines and executes a corresponding operation based on the received data of the second monitoring node.
8. The apparatus of claim 6, comprising:
the processing module is further used for acquiring data of the second monitoring node when the D-BUS system monitors that the second monitoring node is changed;
the processing module is further configured to send data of the second monitoring node to the browser by using the D-BUS system, so that the browser determines and executes a corresponding operation based on the data of the second monitoring node.
9. An electronic device, comprising:
a processor, a memory for storing instructions executable by the processor;
the processor is configured to read the executable instructions from the memory and execute the instructions to implement the gesture data processing method of any of claims 1-5.
10. A computer-readable storage medium storing a computer program for executing the gesture data processing method of any one of claims 1 to 5.
CN202311598865.8A 2023-11-24 2023-11-24 Gesture data processing method and device, electronic equipment and storage medium Pending CN117631836A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311598865.8A CN117631836A (en) 2023-11-24 2023-11-24 Gesture data processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117631836A true CN117631836A (en) 2024-03-01

Family

ID=90019381



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination