CN117348737A - Data processing system and method based on multi-channel interaction - Google Patents


Info

Publication number
CN117348737A
CN117348737A
Authority
CN
China
Prior art keywords
user
gesture
data processing
processor
wearable sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311665174.5A
Other languages
Chinese (zh)
Inventor
施明君
张璠璠
王永恒
傅四维
金雄男
张涛
杨博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Lab
Original Assignee
Zhejiang Lab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Lab
Priority to CN202311665174.5A
Publication of CN117348737A


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In the data processing system and method based on multi-channel interaction provided herein, a hand wearable sensing device collects gesture information from a user and sends it to a processor. The processor determines the user's gesture action from the collected gesture information, determines the user's gesture operation instruction from that gesture action, and performs data processing on the acquired to-be-processed data according to the instruction, thereby obtaining a data processing result.

Description

Data processing system and method based on multi-channel interaction
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a data processing system and method based on multi-channel interaction.
Background
In recent years, with the rapid development of the computer industry, user demand for analyzing and processing data has grown sharply; such data can be analyzed and processed on a data analysis platform.
An existing data analysis platform collects data input by a user and analyzes and processes that data step by step according to logic processing steps specified by the user, thereby obtaining a data analysis result. Specifically, the user executes the individual data processing steps by clicking logic buttons in sequence until a data processing result is obtained; in general, one logic button corresponds to one data processing step.
However, a data processing task may involve many processing steps, and the number of logic buttons the user must touch in the user interface grows accordingly. In practice, the logic buttons presented in the user interface may be arranged in a disordered manner, and some may even be hidden, so the user spends considerable time searching for each logic button one by one while executing the data processing task, which reduces the user's data processing efficiency.
Disclosure of Invention
The present disclosure provides a data processing system and method based on multi-channel interaction to partially solve the above-mentioned problems in the prior art.
The technical solution adopted in this specification is as follows:
This specification provides a data processing system based on multi-channel interaction, the system comprising a hand wearable sensing device and a processor, wherein:
the hand wearable sensing device is used for collecting gesture information of a user and sending the gesture information to the processor;
the processor is configured to determine the user's gesture action according to the gesture information collected by the hand wearable sensing device, determine the user's gesture operation instruction according to that gesture action, and perform data processing on the acquired to-be-processed data according to the gesture operation instruction to obtain a data processing result.
Optionally, the hand wearable sensing device is further configured to give a feedback response, in the form of vibration, to a gesture operation performed by the user;
the processor is configured to determine whether the user's gesture action matches a gesture action contained in a preset gesture action data set; if so, it sends a first feedback instruction to the hand wearable sensing device so that the device performs a first feedback response to the user; if not, it sends a second feedback instruction so that the device performs a second feedback response, where the vibration intensity of the first feedback response is lower than that of the second feedback response.
Optionally, the system further comprises a head wearable sensing device, wherein,
the head wearable sensing device is configured to present the data processing result through a displayed user interface, and to collect the user's eye-movement track information and send it to the processor;
the processor is configured to determine, from the eye-movement track information, the gaze position and gaze dwell time of the user on the user interface, determine from the gaze position the data the user is watching on the user interface as target data, and, when the gaze dwell time exceeds a dwell threshold, generate a voice prompt instruction for the target data and send it to the head wearable sensing device so that the device gives the user a voice prompt.
Optionally, the system further comprises a head wearable sensing device, wherein,
the processor is specifically configured to determine, according to the user's gesture operation instruction, the task step the user has reached in the data processing task for the to-be-processed data, and, if the user has not executed the operation corresponding to the next task step within a designated time after the current task step is completed, to send a voice operation prompt instruction for the next task step to the head wearable sensing device so that the device gives the user a voice prompt.
Optionally, the head wearable sensing device is configured to collect voice information of the user and send the voice information to the processor;
the processor is specifically configured to parse the voice information to determine a voice operation instruction corresponding to the voice information, and execute data processing according to the voice operation instruction.
The present disclosure provides a data processing method based on multi-channel interaction, where the method is applied to a processor included in the system, and includes:
acquiring gesture information of a user acquired by hand wearable sensing equipment;
determining gesture actions of the user according to the gesture information;
determining a gesture operation instruction of the user according to the gesture action of the user;
and carrying out data processing on the acquired data to be processed according to the gesture operation instruction so as to obtain a data processing result.
Optionally, the method further comprises:
acquiring eye track information of the user acquired by head wearable sensing equipment;
determining the gaze position and gaze dwell time of the user on a user interface according to the eye-movement track information, and determining, from the gaze position, the data the user is watching on the user interface as target data;
and when the gaze dwell time exceeds a dwell threshold, generating a voice prompt instruction for the target data and sending it to the head wearable sensing device so that the device gives the user a voice prompt.
The present specification provides a data processing apparatus based on multi-channel interaction, the apparatus being deployed in a processor included in the above system and comprising:
the acquisition module is used for acquiring gesture information of a user acquired by the hand wearable sensing equipment;
the first determining module is used for determining gesture actions of the user according to the gesture information;
the second determining module is used for determining a gesture operation instruction of the user according to the gesture action of the user;
and the processing module is used for carrying out data processing on the acquired data to be processed according to the gesture operation instruction so as to obtain a data processing result.
The present specification provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the above-described data processing method based on multi-channel interaction.
The present specification provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the above-mentioned data processing method based on multi-channel interaction when executing the program.
The above-mentioned at least one technical scheme that this specification adopted can reach following beneficial effect:
according to the data processing system based on multi-channel interaction, the hand wearable sensing device collects gesture information of a user and sends the gesture information to the processor, then the processor determines gesture actions of the user according to the gesture information of the user collected by the hand wearable sensing device, determines gesture operation instructions of the user according to the gesture actions of the user, and performs data processing on the obtained data to be processed according to the gesture operation instructions to obtain a data processing result.
According to the method, the data processing system based on the multi-channel interaction provided by the specification determines the gesture action of the user by collecting the gesture information of the user, further determines the gesture operation instruction of the user according to the gesture action of the user, and processes the acquired data to be processed according to the gesture operation instruction of the user. Therefore, when the system provided by the specification is used, a user does not need to execute various data processing steps by touching the logic buttons in the user interface one by one, so that the situations of more logic buttons needing touch, disordered logic button arrangement, hidden part of logic buttons and the like caused by more data processing steps are avoided, the situation that the user needs to find each logic button one by one is avoided, and the data processing efficiency of the user is improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of this specification, illustrate and explain its exemplary embodiments together with their description and are not intended to limit the specification unduly. In the drawings:
FIG. 1 is a schematic diagram of a data processing system based on multi-channel interaction provided in the present specification;
FIG. 2 is a schematic diagram of a data processing system based on multi-channel interaction provided in the present specification;
FIG. 3 is a schematic flow chart of a data processing method based on multi-channel interaction provided in the present specification;
FIG. 4 is a schematic diagram of a data processing apparatus based on multi-channel interaction provided in the present specification;
FIG. 5 is a schematic structural diagram of an electronic device corresponding to FIG. 3 provided in the present specification.
Detailed Description
To make the objects, technical solutions, and advantages of this specification clearer, the technical solutions of this specification are described clearly and completely below with reference to specific embodiments and the corresponding drawings. The described embodiments are obviously only some, not all, of the embodiments of this specification. All other embodiments obtained by a person of ordinary skill in the art from this specification without inventive effort fall within its scope of protection.
The following describes in detail the technical solutions provided by the embodiments of the present specification with reference to the accompanying drawings.
An existing data analysis platform requires users to touch logic buttons in the user interface one by one to execute the individual data processing steps. When there are many processing steps, the logic buttons presented in the user interface may be arranged in a disordered manner, and some may even be hidden. As a result, the user spends considerable time searching for each logic button while executing a data processing task, which reduces data processing efficiency. To improve the user's data processing efficiency, this specification therefore provides a data processing system based on multi-channel interaction.
FIG. 1 is a schematic diagram of a data processing system based on multi-channel interaction provided in the present specification.
As can be seen from FIG. 1, the system provided in this specification may include a hand wearable sensing device and a processor, where the hand wearable sensing device may be a device such as a virtual reality digital glove.
Specifically, the user may wear the hand wearable sensing device and make a corresponding gesture, and then the hand wearable sensing device may collect gesture information of the user and send the gesture information of the user to the processor.
The processor can determine gesture actions of the user according to gesture information of the user, further determine gesture operation instructions of the user according to the gesture actions of the user, and then perform data processing on data to be processed acquired by the system according to the gesture operation instructions, and further obtain data processing results.
When determining the user's gesture operation instruction from the gesture action, the processor can match the user's gesture action against the gesture actions contained in a gesture action data set, find the gesture operation instruction corresponding to the matched gesture action, and take that instruction as the user's gesture operation instruction. The gesture action data set is preset by an administrator of the system and represents a one-to-one correspondence between gesture actions and gesture operation instructions; the data set may be as shown in Table 1:
TABLE 1
As can be seen from Table 1, when using the system in this specification, the user only needs to make the corresponding gesture with the hand wearable sensing device to execute the corresponding data processing step. The gesture actions contained in the data set shown in Table 1 are simple and easy to remember, which makes the system more convenient to operate and improves the user's data processing efficiency.
The data processing system may be provided with two hand wearable sensing devices, so that the user can wear both devices for gesture operation; operating either device causes the processor to process the to-be-processed data according to the corresponding gesture operation instruction. The user can also operate both devices at the same time and make different gestures, in which case the processor determines the gesture operation instruction corresponding to the combined gesture (for example, the two gestures together correspond to one gesture operation instruction) and executes data processing accordingly.
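As a rough illustration of the lookup described above, the mapping from single or combined gestures to operation instructions can be sketched as follows. Note that the gesture names, instruction names, and the tuple convention for combined two-hand gestures are all hypothetical, since the contents of Table 1 are not reproduced in this text:

```python
# Hypothetical gesture-action data set; the real contents of Table 1 are
# not reproduced here, so both the gestures and the instructions are made up.
GESTURE_DATASET = {
    ("swipe_left",): "previous_step",
    ("swipe_right",): "next_step",
    ("fist",): "confirm",
    # A combined two-hand gesture mapping to a single operation instruction.
    ("pinch", "pinch"): "zoom_data_view",
}

def resolve_instruction(*gestures):
    """Return the gesture operation instruction matched by one or two hand
    gestures, or None when the gesture(s) are not in the preset data set."""
    return GESTURE_DATASET.get(tuple(gestures))
```

A single-device gesture is looked up as a one-element tuple, while simultaneous gestures from both devices are looked up as a pair, matching the "two gestures correspond to one instruction" behavior described above.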
It should be noted that the user's gesture action may either match or fail to match a gesture action contained in the gesture action data set. When the match succeeds, that is, the user's gesture information is correct, the processor informs the user that the gesture operation succeeded by giving feedback in the form of vibration: it sends a first feedback instruction to the hand wearable sensing device, causing the device to perform a first feedback response to the user.
When the user's gesture action does not match any gesture action in the data set, the user's gesture information is wrong and the gesture operation has failed. To alert the user to the wrong gesture information, the processor sends a second feedback instruction to the hand wearable sensing device, causing the device to perform a second feedback response, also in the form of vibration.
To distinguish the two feedback responses so that the user receives accurate operation feedback, the vibration intensity of the first feedback response is lower than that of the second feedback response.
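The feedback logic above can be sketched as a small decision function. The concrete intensity values are assumptions; the text only requires that the first (success) response vibrate more weakly than the second (failure) response:

```python
def feedback_instruction(gesture_matched):
    """Choose the haptic feedback instruction to send to the hand wearable
    sensing device. A successful match gets the first (weaker) vibration and
    a failed match gets the second (stronger) one, so the two outcomes are
    distinguishable by feel. The intensity values are illustrative only."""
    if gesture_matched:
        return {"feedback": "first", "vibration_intensity": 0.3}
    return {"feedback": "second", "vibration_intensity": 0.8}
```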
In addition, to give the user more comprehensive operation feedback and further improve data processing efficiency, the system in this specification may also include a head wearable sensing device.
FIG. 2 is a schematic diagram of a data processing system based on multi-channel interaction provided in the present specification.
As can be seen from fig. 2, the system provided in this specification may include a hand wearable sensing device, a head wearable sensing device, and a processor, wherein the hand wearable sensing device may be a device such as a virtual reality digital glove, and the head wearable sensing device may be a device such as a virtual reality helmet.
The user interface displayed by the head wearable sensing device can present the data processing results; for example, it can display a data analysis result graph in three-dimensional form, show the data analysis process as a three-dimensional animation, and so on.
In addition, to provide more comprehensive operation feedback, the processor can also give feedback to the user based on the eye-movement track information collected by the head wearable sensing device. Specifically, the head wearable sensing device collects the user's eye-movement track information, which includes physiological parameters such as gaze time and pupil size, and sends it to the processor. The processor then determines from this information the gaze position and gaze dwell time of the user on the user interface, determines from the gaze position the data the user is watching on the user interface, and takes that data as the target data.
When the gaze dwell time exceeds the dwell threshold, that is, when the user's understanding of the target data is likely to be low, the processor generates a voice prompt instruction for the target data and sends it to the head wearable sensing device so that the device gives the user a voice prompt, allowing the user to perform subsequent operations on the target data accordingly. The dwell threshold can be preset by an administrator of the system, and the voice prompt can be, for example, the head wearable sensing device announcing the source of the target data by voice broadcast.
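A minimal sketch of this gaze-based prompt logic follows, assuming rectangular screen regions for the displayed data items and an illustrative 3-second dwell threshold (the text leaves the actual threshold to the system administrator):

```python
def gaze_target(regions, gaze_xy):
    """Return the name of the data item whose on-screen region (x0, y0, x1, y1)
    contains the gaze point, or None if the gaze falls outside all regions."""
    x, y = gaze_xy
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def should_voice_prompt(dwell_seconds, threshold_seconds=3.0):
    """A dwell time above the preset threshold suggests the user's grasp of
    the target data is low, so a voice prompt should be generated for it."""
    return dwell_seconds > threshold_seconds
```

For example, with `regions = {"sales_chart": (0, 0, 100, 50)}`, a gaze at `(40, 20)` resolves to `"sales_chart"`, and a 4-second dwell there would trigger the prompt.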
In addition, while executing the data processing task for the to-be-processed data, the user may be unclear about which task step follows the current one and thus be unable to proceed. The system in this specification therefore provides operation feedback for this situation so that the user can complete the subsequent operations successfully.
Specifically, the processor determines, according to the user's gesture operation instruction, the task step the user has reached in the data processing task. If the user has not executed the operation corresponding to the next task step within a designated time after the current step is completed, the processor sends a voice operation prompt instruction for the next step to the head wearable sensing device so that the device gives the user a voice prompt.
The processor can determine whether the user has executed the next step's operation within the designated time by checking whether gesture information from the hand wearable sensing device has been received within that time; the voice prompt can be the head wearable sensing device announcing, by voice broadcast, the next task step the user may perform.
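The timeout check described above might look like the following sketch; the grace period and the use of a "gesture seen" flag (standing in for "gesture information received within the designated time") are assumptions:

```python
def next_step_prompt_due(step_done_at, now, grace_seconds=10.0, gesture_seen=False):
    """True when the user has not performed the next step's gesture within the
    designated time after completing the current step, meaning a voice
    operation prompt for the next task step should be sent to the head-worn
    device. The 10-second default grace period is illustrative."""
    return (not gesture_seen) and (now - step_done_at) >= grace_seconds
```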
In addition, a user who uses the system in this specification for the first time may be unclear about the overall task flow of a data processing task. The processor can therefore respond to a user's first execution of a data processing task by sending a voice operation prompt instruction for each data processing task to the head wearable sensing device, so that the device gives the user voice prompts.
In addition, to make interaction with the system more convenient, the head wearable sensing device can also collect the user's voice information. Specifically, the device sends the collected voice information to the processor, which parses it to obtain the corresponding voice operation instruction and executes data processing according to that instruction. Thus, when the processor determines that the user is in the middle of a task step of the data processing task for the to-be-processed data, that step can also be completed according to the user's voice operation instruction.
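A toy sketch of the voice-parsing step, using hypothetical keyword-to-instruction pairs (the text does not specify how voice information is parsed into a voice operation instruction):

```python
# Hypothetical keyword-to-instruction table; the actual parsing method and
# command vocabulary are not specified in this text.
VOICE_COMMANDS = {
    "filter": "filter_data",
    "sort": "sort_data",
    "aggregate": "aggregate_data",
}

def parse_voice(utterance):
    """Map a recognized utterance to a voice operation instruction by keyword
    matching; return None for speech that matches no known command."""
    lowered = utterance.lower()
    for keyword, instruction in VOICE_COMMANDS.items():
        if keyword in lowered:
            return instruction
    return None
```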
It should be noted that the system in this specification can collect the user's gesture information and voice information through the hand wearable sensing device and the head wearable sensing device and convert them into the corresponding gesture operation instructions and voice operation instructions, thereby executing the individual data processing steps. When using this system, the user can therefore execute a data processing step simply by making the corresponding gesture or uttering the corresponding voice command, without clicking logic buttons one by one. This avoids the long button-searching time that arises in the prior art when many, disorderly arranged logic buttons must be used, and thus improves the user's data processing efficiency.
In addition to the data processing system based on multi-channel interaction described above, this specification also provides a data processing method based on multi-channel interaction, as shown in FIG. 3. The method is applied to a processor included in the data processing system based on multi-channel interaction.
Fig. 3 is a schematic flow chart of a data processing method based on multi-channel interaction provided in the present specification.
S301: gesture information of a user acquired by hand wearable sensing equipment is acquired.
S302: and determining the gesture action of the user according to the gesture information.
The processor first acquires the gesture information of the user collected by the hand wearable sensing device, and determines the user's gesture action according to that gesture information.
S303: and determining the gesture operation instruction of the user according to the gesture action of the user.
The processor can match the user's gesture action against the gesture actions contained in the gesture action data set, find the gesture operation instruction corresponding to the matched gesture action, and take that instruction as the user's gesture operation instruction.
It should be noted that when the user's gesture action successfully matches a gesture action contained in the gesture action data set, that is, the user's gesture information is correct, the processor may send a first feedback instruction to the hand wearable sensing device, so that the device performs a first feedback response to the user in the form of vibration.
When the user's gesture action does not match any gesture action in the data set, the user's gesture information is wrong and the gesture operation has failed; the processor may then send a second feedback instruction to the hand wearable sensing device, so that the device performs a second feedback response to the user, also in the form of vibration.
To distinguish the two feedback responses so that the user receives accurate operation feedback, the vibration intensity of the first feedback response is lower than that of the second feedback response.
S304: and carrying out data processing on the acquired data to be processed according to the gesture operation instruction so as to obtain a data processing result.
After determining the gesture operation instruction, the processor analyzes the gesture operation instruction, and further performs data processing on the acquired data to be processed to obtain a data processing result.
The present disclosure also provides a data processing apparatus based on multi-channel interaction, as shown in fig. 4.
FIG. 4 is a schematic diagram of a data processing apparatus based on multi-channel interaction provided in the present specification, the apparatus being deployed in a processor included in the above-described data processing system based on multi-channel interaction, comprising:
an acquisition module 401, configured to acquire gesture information of a user acquired by a hand wearable sensing device;
a first determining module 402, configured to determine a gesture of the user according to the gesture information;
a second determining module 403, configured to determine a gesture operation instruction of the user according to the gesture action of the user;
and the processing module 404 is configured to perform data processing on the acquired data to be processed according to the gesture operation instruction, so as to obtain a data processing result.
Optionally, the apparatus may further include a voice prompt module 405, configured to acquire the user's eye-movement track information collected by the head wearable sensing device; determine, from that information, the gaze position and gaze dwell time of the user on a user interface, and determine, from the gaze position, the data the user is watching on the user interface as target data; and, when the gaze dwell time exceeds a dwell threshold, generate a voice prompt instruction for the target data and send it to the head wearable sensing device so that the device gives the user a voice prompt.
The present specification also provides a computer readable storage medium storing a computer program operable to perform a multi-channel interaction based data processing method as provided in fig. 3 above.
The present specification also provides, in FIG. 5, a schematic structural diagram of an electronic device corresponding to FIG. 3. At the hardware level, as shown in FIG. 5, the electronic device includes a processor, an internal bus, a network interface, a memory, and a nonvolatile memory, and may of course also include hardware required by other services. The processor reads the corresponding computer program from the nonvolatile memory into the memory and runs it, thereby implementing the data processing method based on multi-channel interaction described in FIG. 3.
Of course, this specification does not exclude other implementations, such as logic devices or combinations of hardware and software; that is, the execution subject of the processing flows below is not limited to logic units and may also be hardware or logic devices.
In the 1990s, an improvement to a technology could be clearly distinguished as an improvement in hardware (e.g., an improvement to a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement to a method flow). With the development of technology, however, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures, since designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (PLD) (e.g., a field programmable gate array (FPGA)) is an integrated circuit whose logic function is determined by the user's programming of the device. A designer programs a digital system to "integrate" it onto a PLD without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually making integrated circuit chips, such programming is now mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the source code to be compiled must be written in a specific programming language called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing the logic method flow can readily be obtained merely by briefly programming the method flow into an integrated circuit using one of the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor and a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller; examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art will also appreciate that, in addition to implementing the controller in pure computer-readable program code, it is entirely possible to implement the same functionality by logically programming the method steps so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for performing various functions may also be regarded as structures within the hardware component. Or even the means for performing various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function. Of course, when implementing this specification, the functions of the units may be implemented in one or more pieces of software and/or hardware.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present description is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The specification may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, and the like that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media, including memory storage devices.
In this specification, the embodiments are described in a progressive manner; identical or similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the system embodiments are described relatively simply because they are substantially similar to the method embodiments; for relevant details, refer to the corresponding description of the method embodiments.
The foregoing is merely an embodiment of the present specification and is not intended to limit it. Various modifications and variations of the present specification will be apparent to those skilled in the art. Any modification, equivalent substitution, improvement, or the like made within the spirit and principles of the present specification shall be included within the scope of the claims of the present specification.

Claims (10)

1. A data processing system based on multi-channel interaction, the system comprising a hand wearable sensing device and a processor, wherein:
the hand wearable sensing device is used for collecting gesture information of a user and sending the gesture information to the processor;
the processor is used for determining the gesture action of the user according to the gesture information of the user acquired by the hand wearable sensing equipment, determining the gesture operation instruction of the user according to the gesture action of the user, and carrying out data processing on the acquired data to be processed according to the gesture operation instruction so as to obtain a data processing result.
2. The system of claim 1, wherein the hand wearable sensing device is further configured to provide feedback on a gesture operation performed by the user, the feedback response taking the form of vibration;
the processor is configured to determine whether a gesture motion of the user is matched with a gesture motion included in a preset gesture motion data set, if yes, send a first feedback instruction to the hand wearable sensing device, so that the hand wearable sensing device performs a first feedback response on the user, and if not, send a second feedback instruction to the hand wearable sensing device, so that the hand wearable sensing device performs a second feedback response on the user, where vibration intensity corresponding to the first feedback response is smaller than that of the second feedback response.
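The matching-and-feedback behaviour of claim 2 can be sketched as follows; the gesture names, intensity values, and data structures are illustrative assumptions, not part of the claimed system. The only constraint taken from the claim is that the first (matched) response vibrates less strongly than the second (unmatched) response:

```python
from dataclasses import dataclass

# Hypothetical preset gesture data set; a real system would load
# calibrated gesture templates rather than string labels.
PRESET_GESTURES = {"pinch", "swipe_left", "swipe_right", "fist"}

# Illustrative intensities chosen so that the first feedback response
# is weaker than the second, as claim 2 requires.
MATCH_VIBRATION = 0.2      # first feedback response
MISMATCH_VIBRATION = 0.8   # second feedback response

@dataclass
class FeedbackInstruction:
    kind: str         # "first" or "second" feedback instruction
    intensity: float  # normalized vibration intensity, 0..1

def feedback_for(gesture: str) -> FeedbackInstruction:
    """Return the feedback instruction the processor would send to the
    hand wearable sensing device for a recognized or unrecognized gesture."""
    if gesture in PRESET_GESTURES:
        return FeedbackInstruction("first", MATCH_VIBRATION)
    return FeedbackInstruction("second", MISMATCH_VIBRATION)
```

A weaker vibration on success lets the matched case feel like a quiet acknowledgement while the mismatch stands out, which is one plausible reading of the intensity ordering in the claim.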
3. The system of claim 1, further comprising a head wearable sensing device, wherein,
the head wearable sensing device is used for presenting the data processing result through a displayed user interface and collecting eye track information of the user so as to send the eye track information to the processor;
the processor is used for determining a gaze dwell position and a gaze dwell duration of the user on the user interface according to the eye track information, determining the data watched by the user on the user interface according to the gaze dwell position as target data, and, when the gaze dwell duration exceeds a gaze dwell threshold, generating a voice prompt instruction for the target data and sending the voice prompt instruction to the head wearable sensing device, so that the head wearable sensing device gives the user a voice prompt.
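The gaze-dwell logic of claim 3 can be sketched with a minimal fixation detector; the sample format, dwell threshold, and fixation radius are illustrative assumptions:

```python
from collections import namedtuple

GazeSample = namedtuple("GazeSample", "t x y")  # time (s), screen coords (px)

DWELL_THRESHOLD_S = 1.5   # assumed gaze-dwell threshold
RADIUS_PX = 40            # samples within this radius count as one fixation

def dwell_prompt(samples, threshold=DWELL_THRESHOLD_S, radius=RADIUS_PX):
    """Return the (x, y) dwell position if the gaze stayed inside a small
    region longer than the threshold, else None (no voice prompt)."""
    if not samples:
        return None
    anchor = samples[0]
    for s in samples[1:]:
        if ((s.x - anchor.x) ** 2 + (s.y - anchor.y) ** 2) ** 0.5 > radius:
            anchor = s  # gaze moved away: restart dwell timing here
        elif s.t - anchor.t >= threshold:
            return (anchor.x, anchor.y)  # dwell exceeded: prompt target here
    return None
```

The returned position would then be mapped to the data shown at that point of the user interface to pick the target data for the voice prompt.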
4. The system of claim 1 or 3, further comprising a head wearable sensing device, wherein,
the processor is specifically configured to determine, according to the gesture operation instruction of the user, the task step at which the user is located in a data processing task of the data to be processed, and, in response to the user not executing the operation corresponding to the next task step within a designated time after the task step is completed, send a voice operation prompt instruction for the next task step to the head wearable sensing device, so that the head wearable sensing device gives the user a voice prompt.
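The step-timeout prompting of claim 4 amounts to a small state check; the task-step names and timeout below are illustrative assumptions:

```python
# Hypothetical data processing task flow; names and timeout are assumptions.
TASK_STEPS = ["load_data", "filter", "aggregate", "export"]
STEP_TIMEOUT_S = 10.0

def next_step_prompt(completed_step, elapsed_s, steps=TASK_STEPS,
                     timeout=STEP_TIMEOUT_S):
    """After `completed_step` finishes, if the user has not acted within
    `timeout` seconds, return a voice-prompt string for the next step;
    otherwise return None (no prompt yet, or the task is finished)."""
    i = steps.index(completed_step)
    if i + 1 >= len(steps) or elapsed_s < timeout:
        return None
    return f"Next step: {steps[i + 1]}"
```

The processor would call this periodically (or from a timer) after each completed step and forward any non-None prompt to the head wearable sensing device.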
5. The system of claim 4, wherein the head wearable sensing device is configured to collect voice information of the user and send the voice information to the processor;
the processor is specifically configured to parse the voice information to determine a voice operation instruction corresponding to the voice information, and execute data processing according to the voice operation instruction.
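The voice parsing of claim 5 could, in its simplest form, be a keyword lookup on transcribed text; a real system would use a speech recognizer plus a natural-language model, and the command names here are illustrative assumptions:

```python
# Hypothetical keyword-to-instruction table for transcribed voice input.
VOICE_COMMANDS = {
    "filter": "FILTER_DATA",
    "aggregate": "AGGREGATE_DATA",
    "export": "EXPORT_RESULT",
}

def parse_voice(text: str):
    """Map transcribed voice input to a voice operation instruction,
    or None when no known command keyword is present."""
    lowered = text.lower()
    for keyword, instruction in VOICE_COMMANDS.items():
        if keyword in lowered:
            return instruction
    return None
```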
6. A data processing method based on multi-channel interaction, characterized in that the method is applied to a processor included in the system of any one of claims 1-5, and comprises:
acquiring gesture information of a user acquired by hand wearable sensing equipment;
determining gesture actions of the user according to the gesture information;
determining a gesture operation instruction of the user according to the gesture action of the user;
and carrying out data processing on the acquired data to be processed according to the gesture operation instruction so as to obtain a data processing result.
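The four method steps of claim 6 form a simple pipeline: raw gesture information is mapped to a gesture action, the action to an operation instruction, and the instruction is applied to the data to be processed. The gesture encoding, instruction set, and operations below are all illustrative assumptions:

```python
# Hypothetical mappings for the four-step pipeline of claim 6.
GESTURE_TO_ACTION = {(0, 1): "pinch", (1, 0): "spread"}   # raw info -> action
ACTION_TO_INSTRUCTION = {"pinch": "zoom_out", "spread": "zoom_in"}
INSTRUCTIONS = {"zoom_in": lambda d: [x * 2 for x in d],
                "zoom_out": lambda d: [x / 2 for x in d]}

def process(gesture_info, data):
    """Gesture information -> gesture action -> operation instruction
    -> data processing result."""
    action = GESTURE_TO_ACTION[gesture_info]        # step: determine action
    instruction = ACTION_TO_INSTRUCTION[action]     # step: determine instruction
    return INSTRUCTIONS[instruction](data)          # step: process the data
```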
7. The method of claim 6, wherein the method further comprises:
acquiring eye track information of the user acquired by head wearable sensing equipment;
determining a gaze dwell position and a gaze dwell duration of the user on a user interface according to the eye track information, and determining data watched by the user on the user interface according to the gaze dwell position as target data;
and when the gaze dwell duration exceeds a gaze dwell threshold, generating a voice prompt instruction for the target data and sending the voice prompt instruction to the head wearable sensing device, so that the head wearable sensing device gives the user a voice prompt.
8. A data processing apparatus based on multi-channel interaction, characterized in that the apparatus is deployed in a processor included in the system of any one of claims 1-5, and comprises:
the acquisition module is used for acquiring gesture information of a user acquired by the hand wearable sensing equipment;
the first determining module is used for determining gesture actions of the user according to the gesture information;
the second determining module is used for determining a gesture operation instruction of the user according to the gesture action of the user;
and the processing module is used for carrying out data processing on the acquired data to be processed according to the gesture operation instruction so as to obtain a data processing result.
9. A computer readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any of the preceding claims 6-7.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of any of the preceding claims 6-7 when executing the program.
CN202311665174.5A 2023-12-06 2023-12-06 Data processing system and method based on multi-channel interaction Pending CN117348737A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311665174.5A CN117348737A (en) 2023-12-06 2023-12-06 Data processing system and method based on multi-channel interaction

Publications (1)

Publication Number Publication Date
CN117348737A true CN117348737A (en) 2024-01-05

Family

ID=89357981

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311665174.5A Pending CN117348737A (en) 2023-12-06 2023-12-06 Data processing system and method based on multi-channel interaction

Country Status (1)

Country Link
CN (1) CN117348737A (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104145232A (en) * 2012-01-04 2014-11-12 托比技术股份公司 System for gaze interaction
CN106104423A (en) * 2014-03-12 2016-11-09 微软技术许可有限责任公司 Pose parameter is regulated
CN110785688A (en) * 2017-04-19 2020-02-11 奇跃公司 Multi-modal task execution and text editing for wearable systems
CN108255297A (en) * 2017-12-29 2018-07-06 青岛真时科技有限公司 A kind of wearable device application control method and apparatus
CN112424727A (en) * 2018-05-22 2021-02-26 奇跃公司 Cross-modal input fusion for wearable systems
CN109512073A (en) * 2018-11-09 2019-03-26 中国兵器装备集团自动化研究所 A kind of military helmet based on intelligent human-machine interaction design
CN110162178A (en) * 2019-05-22 2019-08-23 努比亚技术有限公司 Illustrate the methods of exhibiting, wearable device and storage medium of information
CN115443445A (en) * 2020-02-26 2022-12-06 奇跃公司 Hand gesture input for wearable systems
CN116097209A (en) * 2020-06-29 2023-05-09 元平台技术有限公司 Integration of artificial reality interaction modes
CN114265498A (en) * 2021-12-16 2022-04-01 中国电子科技集团公司第二十八研究所 Method for combining multi-modal gesture recognition and visual feedback mechanism
CN116339501A (en) * 2021-12-24 2023-06-27 北京字跳网络技术有限公司 Data processing method, device, equipment and computer readable storage medium
WO2023216930A1 (en) * 2022-05-12 2023-11-16 华为技术有限公司 Wearable-device based vibration feedback method, system, wearable device and electronic device
CN117093068A (en) * 2022-05-12 2023-11-21 华为技术有限公司 Vibration feedback method and system based on wearable device, wearable device and electronic device
CN115756161A (en) * 2022-11-15 2023-03-07 华南理工大学 Multi-modal interactive structure mechanics analysis method, system, computer equipment and medium
CN117075726A (en) * 2023-08-11 2023-11-17 东南大学 Cooperative control method and system for mixed interaction of visual gestures and myoelectricity sensing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination