CN113970967A - Gesture synchronous detection control method, system, medium and mobile terminal - Google Patents
- Publication number
- CN113970967A (application CN202111265986.1A)
- Authority
- CN
- China
- Prior art keywords
- information
- gesture
- attitude
- acceleration
- acceleration component
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention provides a gesture and posture synchronous detection control method, system, medium and mobile terminal, comprising the following steps: a motion information acquisition step: acquiring motion information of the mobile terminal; an attitude analysis step: analyzing the motion information to obtain attitude information; a gesture analysis step: analyzing the motion information to obtain gesture information; a control analysis step: obtaining control information according to the gesture information and the attitude information. The invention organically combines the gesture information and the posture information and derives the control instruction from their combination relationship, which greatly expands the number of available instructions and adds more fun to the use of the toy.
Description
Technical Field
The invention relates to the field of motion gesture control, in particular to a gesture synchronous detection control method, a gesture synchronous detection control system, a gesture synchronous detection control medium and a mobile terminal.
Background
Patent document CN104941203A discloses a toy based on gesture track recognition and a recognition and control method thereof. The toy is provided with a gesture track recognition mechanism that recognizes the user's gesture track and controls the corresponding functions of the toy according to the different tracks. No remote controller or contact with the toy is needed; operation requires only gesture actions and is simple and convenient. Moreover, the track drawn by the user's gesture can be distinguished, such as forward, backward, left, right or circle-drawing actions, so the user can realize multiple different function controls with single gestures. However, patent document CN104941203A can only realize single gesture control.
Patent document CN106178538A provides an intelligent toy control system and method based on gesture detection, relating to the field of toys. The system comprises a control end, worn on the human body, for detecting human body actions and generating control commands, and a toy end for operating the toy according to the control commands sent by the control end and according to the recognition result of a voice recognition device. However, patent document CN106178538A can only realize single posture control.
Disclosure of Invention
In view of the defects in the prior art, the object of the present invention is to provide a gesture and posture synchronous detection control method, system, medium and mobile terminal.
The gesture synchronous detection control method provided by the invention comprises the following steps:
a motion information acquisition step: acquiring motion information of the mobile terminal;
an attitude analysis step: analyzing the motion information to obtain attitude information;
a gesture analysis step: analyzing to obtain gesture information according to the motion information;
a control analysis step: obtaining control information according to the gesture information and the attitude information.
Preferably, the motion information includes acceleration sensing information, wherein the acceleration sensing information indicates an acceleration of the mobile terminal;
in the attitude analysis step, extracting an acceleration component related to the attitude from the acceleration sensing information, and recording the acceleration component as an attitude acceleration component; matching the attitude acceleration component with attitude preset information to obtain attitude information;
in the gesture analysis step, extracting an interference acceleration component of the attitude acceleration component from the acceleration sensing information, and recording the interference acceleration component as a gesture acceleration component; and matching the gesture acceleration component with gesture preset information to obtain gesture information.
Preferably, in the gesture analysis step, the gesture acceleration component is matched with gesture preset information to obtain gesture information, then correction information corresponding to the gesture information is found, and the gesture information is corrected by using the correction information;
wherein the correction information is used to correct the acceleration deviation that results from action deviations due to the limited mobility of human joints.
Preferably, in the control analysis step, a current combination relationship of the gesture postures is obtained according to the gesture information and the posture information, and a corresponding control instruction is obtained according to the current combination relationship, wherein the control instruction instructs the actuator to execute a preset action;
the current combination relation comprises a position relation, wherein the position relation is used for indicating the position of the mobile terminal in the gesture action when the gesture posture is generated.
The invention provides a gesture synchronous detection control system, which comprises:
the motion information acquisition module: acquiring motion information of the mobile terminal;
a posture analysis module: analyzing to obtain attitude information according to the motion information;
a gesture analysis module: analyzing to obtain gesture information according to the motion information;
a control analysis module: and obtaining control information according to the gesture information and the posture information.
Preferably, the motion information includes acceleration sensing information, wherein the acceleration sensing information indicates an acceleration of the mobile terminal;
in the attitude analysis module, extracting an acceleration component related to the attitude from the acceleration sensing information, and recording the acceleration component as an attitude acceleration component; matching the attitude acceleration component with attitude preset information to obtain attitude information;
extracting an interference acceleration component of the attitude acceleration component from the acceleration sensing information in the gesture analysis module, and recording the interference acceleration component as a gesture acceleration component; and matching the gesture acceleration component with gesture preset information to obtain gesture information.
Preferably, in the gesture analysis module, the gesture acceleration component is matched with gesture preset information to obtain gesture information, then correction information corresponding to the gesture information is found, and the gesture information is corrected by using the correction information;
wherein the correction information is used to correct the acceleration deviation that results from action deviations due to the limited mobility of human joints.
Preferably, in the control analysis module, a current combination relationship of the gesture postures is obtained according to the gesture information and the posture information, and a corresponding control instruction is obtained according to the current combination relationship, wherein the control instruction instructs the actuator to execute a preset action;
the current combination relation comprises a position relation, wherein the position relation is used for indicating the position of the mobile terminal in the gesture action when the gesture posture is generated.
According to the present invention, a computer-readable storage medium is provided, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the method.
According to the intelligent terminal provided by the invention, the intelligent terminal comprises the computer readable storage medium storing the computer program, or comprises the gesture synchronous detection control system.
Compared with the prior art, the invention has the following beneficial effects:
the invention organically combines the gesture information and the posture information, obtains the control instruction according to the combination relationship of the gesture information and the posture information, greatly expands the number of the instructions and increases more fun for the use of the toy.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
Fig. 1 is a flowchart illustrating the steps of the gesture synchronous detection control method according to the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but do not limit the invention in any way. It should be noted that various changes and modifications can be made by those skilled in the art without departing from the spirit of the invention; all of these fall within the scope of the present invention.
The gesture synchronous detection control method provided by the invention comprises the following steps:
a motion information acquisition step: acquiring motion information of the mobile terminal. The mobile terminal can be a handheld terminal such as a smart phone, a smart watch, a tablet computer, smart glasses, a smart bracelet, a smart toy or a smart remote controller. Holding the terminal, the user can change its posture through actions such as tilting, and can also drive it to make gestures along with the hand through actions such as rotating the wrist and waving the arm. The mobile terminal generates motion information when its posture changes and when a gesture is made; when the posture is kept unchanged, the corresponding readings indicate that the device is static relative to the user's hand and are likewise treated as motion information. In this way, four kinds of motion information arise from the simple combinations: both the gesture and the posture change; neither changes; only the gesture changes; or only the posture changes. Accordingly, the four kinds of motion information allow the user to issue four corresponding kinds of control instructions. In a preferred example, the mobile terminal is an intelligent wearable device, for example one worn on the forearm, whose posture corresponds to the posture of the wearer's forearm; according to the posture of the wearable device, the posture of the wearer's body part can thus be obtained as the posture information for subsequent analysis and control.
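The four change/no-change combinations described above can be sketched as a small lookup. This is an illustrative sketch only; the state names below are assumptions, not identifiers from the patent.

```python
# Hedged sketch: the description enumerates four coarse motion states formed
# by whether the posture changed and whether a gesture was made. The state
# names are illustrative placeholders, not taken from the patent.
def motion_state(posture_changed: bool, gesture_made: bool) -> str:
    """Map the two booleans to one of four combined motion states."""
    return {
        (False, False): "idle",              # neither gesture nor posture changes
        (True,  False): "posture_only",      # e.g. tilting the wrist in place
        (False, True):  "gesture_only",      # e.g. waving with the posture held
        (True,  True):  "posture_and_gesture",
    }[(posture_changed, gesture_made)]

# Each distinct state can be bound to its own control instruction.
states = {motion_state(p, g) for p in (False, True) for g in (False, True)}
print(sorted(states))
```

Since the four states are mutually exclusive, each one can carry a separate control instruction, which is how the combination expands the instruction set.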
An attitude analysis step: analyzing the motion information to obtain attitude information. The attitude information reflects small-amplitude movements of the user, typically wrist movements such as tilting or shaking.
A gesture analysis step: analyzing the motion information to obtain gesture information. The gesture information reflects large-range movements of the user, which usually require the cooperation of the elbow joint and the shoulder joint. In particular, the mobile terminal is regarded as a mass point, and the motion track of this mass point is taken as the gesture.
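As a rough illustration of the mass-point view, the gesture track can be approximated by integrating the gesture acceleration component twice. This is a sketch only; the patent does not specify an integration scheme, and the simple rectangle rule below is an assumption.

```python
def trajectory_from_acceleration(acc, dt=0.01):
    """Integrate a 1-D acceleration sequence twice to get a position track
    for the mass point (simple rectangle-rule integration, sketch only)."""
    vel, pos, track = 0.0, 0.0, []
    for a in acc:
        vel += a * dt          # acceleration -> velocity
        pos += vel * dt        # velocity -> position
        track.append(pos)
    return track

# Constant 1 m/s^2 for 1 s: the discrete sum gives ~0.505 m (analytic: 0.5 m).
track = trajectory_from_acceleration([1.0] * 100)
print(round(track[-1], 3))  # → 0.505
```

In practice the resulting track, rather than the raw acceleration, would be what gets compared against gesture templates such as circles or figure-8 shapes.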
A control analysis step: obtaining control information according to the gesture information and the posture information. In the invention, one piece of control information is not obtained from the gesture information and another from the posture information separately; rather, the control information is obtained from the combination of the gesture information and the posture information at the same time or within the same time period. The same time period may be the period between the start time and the end time of making a complete gesture. For example, when the gesture information is satisfactory (the gesture is considered standard) but the posture information is unsatisfactory (the posture is considered non-standard), the required combination of satisfactory gesture information and satisfactory posture information is not obtained, and the control information desired by the user is not generated.
The present invention will be described in more detail below.
The motion information comprises acceleration sensing information, which indicates the acceleration of the mobile terminal and is acquired through an acceleration sensor in the mobile terminal. In the attitude analysis step, an acceleration component related to the attitude is extracted from the acceleration sensing information and recorded as the attitude acceleration component; the attitude acceleration component is matched with attitude preset information to obtain the attitude information. In the gesture analysis step, the component that acts as interference on the attitude acceleration component is extracted from the acceleration sensing information and recorded as the gesture acceleration component; the gesture acceleration component is matched with gesture preset information to obtain the gesture information. More specifically, there may be one or several acceleration sensors. For a mobile terminal with only one acceleration sensor, patent document CN107014386B discloses a solution for separating acceleration related to the attitude of an aircraft from acceleration unrelated to the attitude, and those skilled in the art can extract and separate the acceleration components corresponding to the attitude information and the gesture information at least by reference to CN107014386B. For a mobile terminal with a plurality of acceleration sensors, for example two, the acceleration information corresponding to the attitude information and the gesture information can be acquired by the two sensors respectively.
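One common way to perform such a separation with a single accelerometer is an exponential low-pass filter: the slowly varying component tracks the posture (gravity direction), and the residual approximates the gesture component. This is a minimal sketch under that assumption, not the method of CN107014386B.

```python
def split_acceleration(samples, alpha=0.9):
    """Split raw accelerometer samples into a slow (posture-related)
    component and a fast residual (gesture-related) component using an
    exponential low-pass filter. `alpha` near 1 makes the posture
    estimate respond more slowly."""
    posture, gesture = [], []
    low = samples[0]
    for a in samples:
        low = alpha * low + (1 - alpha) * a  # low-pass: slow posture/tilt
        posture.append(low)
        gesture.append(a - low)              # high-pass residual: gesture
    return posture, gesture

# A steady reading (device held still) falls entirely into the posture part.
p, g = split_acceleration([9.8] * 50)
print(round(p[-1], 2), round(g[-1], 2))  # → 9.8 0.0
```

With two physical sensors, as the text notes, the two streams could instead be taken directly from separate sensors without this software split.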
Further, in the gesture analysis step, the gesture acceleration component is matched with gesture preset information to obtain gesture information; correction information corresponding to the gesture information is then found, and the gesture information is corrected with it. The correction information corrects the acceleration deviation that results from action deviations due to the limited mobility of human joints. In a preferred embodiment, the attitude preset information is a set of preset acceleration component templates, and the attitude information corresponding to the template that best matches the attitude acceleration component is taken as the matching result. However, because of the limited mobility of human joints, especially the wrist and elbow joints, it is difficult to make the desired movements in all directions; this also depends on the flexibility of the user's arm, and force may be hard to exert at some angles. The invention therefore corrects the acceleration deviation caused by such action deviations, so that slightly non-standard actions made by the user are corrected into standard actions, which facilitates subsequent matching calculation by the computer and avoids misjudgment. The types of action deviation and the correction value for each type are preset in a table, and the computer obtains the correction scheme for a given action deviation by table lookup.
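The table-lookup correction described above can be sketched as follows. The deviation types and correction values are hypothetical placeholders, not values from the patent.

```python
# Hypothetical correction table: each recognized action-deviation type maps
# to a preset correction applied to the matched acceleration component.
# Both the keys and the numeric values are illustrative assumptions.
CORRECTION_TABLE = {
    "wrist_overtilt":   -0.15,  # wrist tends to over-rotate at this angle
    "elbow_undershoot": +0.20,  # limited elbow mobility truncates the motion
}

def correct(component: float, deviation_type: str) -> float:
    """Look up the preset correction for a deviation type and apply it;
    unknown deviation types are left uncorrected."""
    return component + CORRECTION_TABLE.get(deviation_type, 0.0)

print(correct(1.0, "wrist_overtilt"))   # → 0.85
print(correct(1.0, "no_such_type"))     # → 1.0
```

A flat dictionary keeps the lookup O(1), matching the text's point that the computer obtains the correction scheme by a simple table lookup rather than by recomputing a model of joint mobility.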
More specifically, in the control analysis step, the current combination relationship of the gesture and the posture is obtained according to the gesture information and the posture information, and a corresponding control instruction is obtained according to this combination relationship; the control instruction instructs the actuator to execute a preset action. For example, if the current combination relationship is that the gesture draws a circle while the posture is kept horizontal throughout, the corresponding control command instructs the toy car to stop moving forward, and the toy car, as the actuator, executes the preset parking action. For another example, if the current combination is that the gesture draws a figure 8 while the posture shakes left and right, the corresponding control command instructs the toy car to accelerate forward along a figure-8 track. In a preferred example, the current combination relationship includes a position relationship, which indicates the position within the gesture motion at which each posture was produced by the mobile terminal. For example, if the combination relationship including the position relationship is that the gesture draws a triangle, with a forward tilt at the top corner, a leftward tilt at the lower-left corner and a rightward tilt at the lower-right corner, the corresponding control command is to play music.
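The worked examples in this paragraph can be collected into a combination-to-instruction table. This is a sketch: the key and command names are illustrative, and position relationships are omitted for brevity.

```python
from typing import Optional

# Sketch of the combination-relationship lookup described above. The keys
# mirror the worked examples in the text; they are assumptions, not an
# authoritative command set from the patent.
COMMAND_TABLE = {
    ("circle",       "horizontal"): "stop",             # circle + level posture
    ("figure_eight", "shaking"):    "accelerate_in_8",  # figure 8 + shaking
}

def resolve_command(gesture: str, posture: str) -> Optional[str]:
    """Return the control instruction bound to this gesture/posture pair,
    or None when the combination is not a valid command."""
    return COMMAND_TABLE.get((gesture, posture))

print(resolve_command("circle", "horizontal"))  # → stop
print(resolve_command("circle", "shaking"))     # → None (invalid combination)
```

Note how the `None` case captures the earlier point that a standard gesture paired with a non-standard posture yields no control information at all.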
The invention also provides a gesture and posture synchronous detection control system, which can be realized by executing the steps of the gesture synchronous detection control method; those skilled in the art can understand the method as a preferred embodiment of the system.
The invention provides a gesture synchronous detection control system, which comprises:
the motion information acquisition module: acquiring motion information of the mobile terminal;
a posture analysis module: analyzing to obtain attitude information according to the motion information;
a gesture analysis module: analyzing to obtain gesture information according to the motion information;
a control analysis module: and obtaining control information according to the gesture information and the posture information.
Preferably, the motion information includes acceleration sensing information, wherein the acceleration sensing information indicates an acceleration of the mobile terminal;
in the attitude analysis module, extracting an acceleration component related to the attitude from the acceleration sensing information, and recording the acceleration component as an attitude acceleration component; matching the attitude acceleration component with attitude preset information to obtain attitude information;
extracting an interference acceleration component of the attitude acceleration component from the acceleration sensing information in the gesture analysis module, and recording the interference acceleration component as a gesture acceleration component; and matching the gesture acceleration component with gesture preset information to obtain gesture information.
Preferably, in the gesture analysis module, the gesture acceleration component is matched with gesture preset information to obtain gesture information, then correction information corresponding to the gesture information is found, and the gesture information is corrected by using the correction information;
wherein the correction information is used to correct the acceleration deviation that results from action deviations due to the limited mobility of human joints.
Preferably, in the control analysis module, a current combination relationship of the gesture postures is obtained according to the gesture information and the posture information, and a corresponding control instruction is obtained according to the current combination relationship, wherein the control instruction instructs the actuator to execute a preset action;
the current combination relation comprises a position relation, wherein the position relation is used for indicating the position of the mobile terminal in the gesture action when the gesture posture is generated.
There is also provided according to the invention a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the method.
According to the invention, an intelligent terminal is further provided, and the intelligent terminal comprises the computer readable storage medium storing the computer program or the gesture synchronous detection control system.
Those skilled in the art will appreciate that, in addition to implementing the system, apparatus and their modules in purely computer-readable program code, the same procedures can be implemented entirely by logically programming the method steps, so that the system, apparatus and their modules take the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. The system, apparatus and their modules can therefore be regarded as a hardware component; the modules included in them for implementing various programs can be regarded as structures within that hardware component; and the modules for performing various functions can be regarded both as software programs implementing the method and as structures within the hardware component.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.
Claims (10)
1. A gesture posture synchronous detection control method is characterized by comprising the following steps:
a motion information acquisition step: acquiring motion information of the mobile terminal;
an attitude analysis step: analyzing the motion information to obtain attitude information;
a gesture analysis step: analyzing to obtain gesture information according to the motion information;
a control analysis step: obtaining control information according to the gesture information and the posture information.
2. The gesture synchronous detection control method according to claim 1,
the motion information comprises acceleration sensing information, wherein the acceleration sensing information indicates an acceleration of the mobile terminal;
in the attitude analysis step, extracting an acceleration component related to the attitude from the acceleration sensing information, and recording the acceleration component as an attitude acceleration component; matching the attitude acceleration component with attitude preset information to obtain attitude information;
in the gesture analysis step, extracting an interference acceleration component of the attitude acceleration component from the acceleration sensing information, and recording the interference acceleration component as a gesture acceleration component; and matching the gesture acceleration component with gesture preset information to obtain gesture information.
3. The gesture synchronous detection control method according to claim 2,
in the gesture analysis step, matching the gesture acceleration component with gesture preset information to obtain gesture information, then finding correction information corresponding to the gesture information, and correcting the gesture information by using the correction information;
wherein the correction information is used to correct the acceleration deviation that results from action deviations due to the limited mobility of human joints.
4. The gesture synchronous detection control method according to claim 1,
in the control analysis step, obtaining a current combination relation of the gesture postures according to the gesture information and the posture information, and obtaining a corresponding control instruction according to the current combination relation, wherein the control instruction instructs an actuator to execute a preset action;
the current combination relation comprises a position relation, wherein the position relation is used for indicating the position of the mobile terminal in the gesture action when the gesture posture is generated.
5. A gesture synchronous detection control system is characterized by comprising:
the motion information acquisition module: acquiring motion information of the mobile terminal;
a posture analysis module: analyzing to obtain attitude information according to the motion information;
a gesture analysis module: analyzing to obtain gesture information according to the motion information;
a control analysis module: and obtaining control information according to the gesture information and the posture information.
6. The gesture synchronous detection control system of claim 5,
the motion information comprises acceleration sensing information, wherein the acceleration sensing information indicates an acceleration of the mobile terminal;
in the attitude analysis module, extracting an acceleration component related to the attitude from the acceleration sensing information, and recording the acceleration component as an attitude acceleration component; matching the attitude acceleration component with attitude preset information to obtain attitude information;
extracting an interference acceleration component of the attitude acceleration component from the acceleration sensing information in the gesture analysis module, and recording the interference acceleration component as a gesture acceleration component; and matching the gesture acceleration component with gesture preset information to obtain gesture information.
7. The gesture synchronous detection control system according to claim 6,
in the gesture analysis module, matching the gesture acceleration component with gesture preset information to obtain gesture information, then finding correction information corresponding to the gesture information, and correcting the gesture information by using the correction information;
wherein the correction information is used to correct the acceleration deviation that results from action deviations due to the limited mobility of human joints.
8. The gesture synchronous detection control system of claim 7,
in the control analysis module, obtaining a current combination relation of gesture postures according to the gesture information and the posture information, and obtaining a corresponding control instruction according to the current combination relation, wherein the control instruction instructs an actuator to execute a preset action;
the current combination relation comprises a position relation, wherein the position relation is used for indicating the position of the mobile terminal in the gesture action when the gesture posture is generated.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 4.
10. An intelligent terminal, characterized in that the intelligent terminal comprises the computer-readable storage medium storing the computer program of claim 9, or comprises the gesture posture synchronous detection control system of any one of claims 5 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111265986.1A CN113970967A (en) | 2021-10-28 | 2021-10-28 | Gesture synchronous detection control method, system, medium and mobile terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113970967A true CN113970967A (en) | 2022-01-25 |
Family
ID=79588877
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111265986.1A Pending CN113970967A (en) | 2021-10-28 | 2021-10-28 | Gesture synchronous detection control method, system, medium and mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113970967A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110298700A1 (en) * | 2010-06-04 | 2011-12-08 | Sony Corporation | Operation terminal, electronic unit, and electronic unit system |
US20140037139A1 (en) * | 2012-08-01 | 2014-02-06 | Samsung Electronics Co., Ltd. | Device and method for recognizing gesture based on direction of gesture |
CN103984416A (en) * | 2014-06-10 | 2014-08-13 | 北京邮电大学 | Gesture recognition method based on acceleration sensor |
US20150213580A1 (en) * | 2014-01-28 | 2015-07-30 | Sony Corporation | Display control apparatus, display control method, program, and display device |
CN110102044A (en) * | 2019-03-15 | 2019-08-09 | 歌尔科技有限公司 | Game control method, Intelligent bracelet and storage medium based on Intelligent bracelet |
JP2019144955A (en) * | 2018-02-22 | 2019-08-29 | 京セラ株式会社 | Electronic device, control method and program |
- 2021-10-28: CN application CN202111265986.1A filed in China (CN113970967A/en), status Pending
Non-Patent Citations (2)
Title |
---|
Zhang Yanxiang et al.: "Interactive Spatial Augmented Reality Technology for Stage Performances", 31 August 2018, University of Science and Technology of China Press, pages: 108 - 110 *
Miao Minmin; Zhou Zhiping; Wang Jiefeng: "Smartphone user authentication method based on acceleration sensors", Computer Engineering and Science, no. 03, pages 508 - 513 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10545579B2 (en) | Remote control with 3D pointing and gesture recognition capabilities | |
CN110312471B (en) | Adaptive system for deriving control signals from neuromuscular activity measurements | |
CN106896796B (en) | Industrial robot master-slave type teaching programming method based on data gloves | |
Wilson et al. | Formulation of a new gradient descent MARG orientation algorithm: Case study on robot teleoperation | |
US10199008B2 (en) | Systems, devices, and methods for wearable electronic devices as state machines | |
CN103529944B (en) | A kind of human motion recognition method based on Kinect | |
KR102408359B1 (en) | Electronic device and method for controlling using the electronic device | |
US9753545B2 (en) | Input device, input method, and storage medium | |
JP5264844B2 (en) | Gesture recognition apparatus and method | |
WO2007053116A1 (en) | Virtual interface system | |
KR20180020262A (en) | Technologies for micro-motion-based input gesture control of wearable computing devices | |
CN109933191B (en) | Gesture recognition and control method and system | |
JP2017068572A (en) | Wearable device | |
CN109153332B (en) | Sign language input for vehicle user interface | |
Wang et al. | Multimodal Human–Robot Interaction for Human‐Centric Smart Manufacturing: A Survey | |
CN111290574B (en) | Method and device for controlling unmanned aerial vehicle by using gestures and readable storage medium | |
CN106775093A (en) | A kind of contact action control method and device | |
CN113970967A (en) | Gesture synchronous detection control method, system, medium and mobile terminal | |
CN111316283B (en) | Gesture recognition method and device | |
CN109960404A (en) | A kind of data processing method and device | |
JP2016095795A (en) | Recognition device, method, and program | |
Balaji et al. | Smart phone accelerometer sensor based wireless robot for physically disabled people | |
CN110321008B (en) | Interaction method, device, equipment and storage medium based on AR model | |
US11660526B2 (en) | Estimation apparatus, estimation method, and program | |
CN115344111A (en) | Gesture interaction method, system and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||