CN115712384A - Gesture control method, device and system based on attitude sensor - Google Patents

Gesture control method, device and system based on attitude sensor

Info

Publication number
CN115712384A
Authority
CN
China
Prior art keywords
dimensional
gesture
axis
rotation
track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211512181.7A
Other languages
Chinese (zh)
Inventor
刘豪
陈锦彬
穆允翔
李辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Qiming Cloud Technology Co ltd
Original Assignee
Shenzhen Qiming Cloud Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Qiming Cloud Technology Co ltd filed Critical Shenzhen Qiming Cloud Technology Co ltd
Priority to CN202211512181.7A
Publication of CN115712384A
Legal status: Pending

Landscapes

  • Position Input By Displaying (AREA)

Abstract

The invention relates to the technical field of computers, and in particular to a gesture control method, device and system based on an attitude sensor. The gesture control method based on the attitude sensor comprises the following steps: acquiring detection data of an attitude sensor; acquiring motion information of three axes according to the detection data of the attitude sensor; resolving an attitude angle from the motion information of the three axes; carrying out coordinate system transformation according to the resolved attitude angle; performing three-dimensional track resampling on the data after the coordinate system transformation; performing three-dimensional rotation on the three-dimensional track obtained by the resampling; zooming and translating the three-dimensional track after the three-dimensional rotation; carrying out template matching on the resulting three-dimensional track and obtaining a matching confidence coefficient through an evaluation function; and if the confidence coefficient reaches a set value, outputting the control instruction corresponding to the matched template. According to the invention, a control instruction is output through trajectory calculation and gesture recognition, and recognition accuracy is improved through resampling, rotation, scaling and translation.

Description

Gesture control method, device and system based on attitude sensor
Technical Field
The invention relates to the technical field of computers, in particular to a gesture control method, device and system based on an attitude sensor.
Background
Gesture control means that the hand makes a specific action; after an acquisition device captures the hand action, the corresponding instruction is obtained and output according to a preset correspondence between gestures and instructions, and a controller controls an execution component to execute the instruction, thereby realizing control of the object.
Gesture control provides a very simple and convenient control mode: in the process of outputting a gesture, the user can do so without any special equipment. Compared with outputting instructions through keys or a touch screen, gesture control further frees the user's hands and is the direction in which intelligent control is developing.
Gesture control differs from the precise and unambiguous input of keys or a touch screen, and one main problem to be solved is how the system can accurately recognize gestures.
Disclosure of Invention
In view of the above, it is desirable to provide a gesture control method, device and system based on an attitude sensor.
An embodiment of the invention is realized as a gesture control method based on an attitude sensor, comprising the following steps:
acquiring detection data of the attitude sensor;
acquiring motion information of three axes according to detection data of the attitude sensor;
resolving an attitude angle by the motion information of the three axes;
carrying out coordinate system transformation according to the solved attitude angle;
performing three-dimensional track resampling on the data after the coordinate system transformation;
performing three-dimensional rotation on the three-dimensional track obtained by resampling the three-dimensional track;
zooming and translating the three-dimensional track subjected to three-dimensional rotation;
carrying out template matching on the three-dimensional track obtained in the last step, and obtaining a matching confidence coefficient by an evaluation function;
and if the confidence coefficient reaches a set value, outputting a control instruction corresponding to the corresponding template.
In one embodiment, the present invention provides an attitude sensor-based gesture control apparatus, including:
the detection data acquisition module is used for acquiring detection data of the attitude sensor;
the motion information acquisition module is used for acquiring motion information of three axes according to the detection data of the attitude sensor;
the attitude angle resolving module is used for resolving an attitude angle by the motion information of the three axes;
the coordinate transformation module is used for carrying out coordinate system transformation according to the solved attitude angle;
the resampling module is used for carrying out three-dimensional track resampling on the data after the coordinate system transformation;
the three-dimensional rotation module is used for performing three-dimensional rotation on the three-dimensional track obtained by resampling the three-dimensional track;
the adjusting module is used for zooming and translating the three-dimensional track after three-dimensional rotation;
the matching module is used for carrying out template matching on the three-dimensional track obtained in the last step and obtaining the matching confidence coefficient through an evaluation function;
and the output module is used for outputting the control instruction corresponding to the corresponding template if the confidence coefficient reaches a set value.
In one embodiment, the present invention provides an attitude sensor-based gesture control system, comprising:
an attitude sensor, used for acquiring the gesture of the user; and
a processor, connected with the attitude sensor and used for executing the gesture control method based on the attitude sensor.
According to the gesture control method based on the attitude sensor provided by the invention, the attitude sensor is used to obtain detection data, three-axis motion information is obtained from the detection data, the attitude angle is then resolved, coordinate system transformation is carried out according to the resolved attitude angle, three-dimensional track resampling is performed, rotation, scaling and translation processing follow, and finally the matching module evaluates the confidence coefficient through the evaluation function and outputs the corresponding control instruction. In the method provided by the invention, the data processing process is simple, the accuracy of gesture recognition is improved by multiple means, and the accuracy of the output is finally ensured by the evaluation function.
Drawings
FIG. 1 is a flow diagram of an attitude sensor-based gesture control method in one embodiment;
FIG. 2 is a schematic diagram of trajectory resolution in one embodiment;
FIG. 3 is a block diagram of an attitude sensor-based gesture control system in one embodiment;
FIG. 4 is a block diagram of the internal structure of a processor in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms unless otherwise specified. These terms are only used to distinguish one element from another. For example, a first xx script may be referred to as a second xx script, and similarly, a second xx script may be referred to as a first xx script, without departing from the scope of the present disclosure.
As shown in fig. 1, in an embodiment, a gesture control method based on an attitude sensor is provided, which may specifically include the following steps:
step S100, acquiring detection data of an attitude sensor;
step S200, acquiring motion information of three axes according to detection data of the attitude sensor;
step S300, resolving an attitude angle by the motion information of the three axes;
step S400, transforming a coordinate system according to the calculated attitude angle;
step S500, carrying out three-dimensional trajectory resampling on the data after coordinate system transformation;
step S600, performing three-dimensional rotation on a three-dimensional track obtained by resampling the three-dimensional track;
step S700, zooming and translating the three-dimensional track after three-dimensional rotation;
step S800, carrying out template matching on the three-dimensional track obtained in the previous step, and obtaining a matching confidence coefficient through an evaluation function;
and step S900, outputting a control instruction corresponding to the corresponding template if the confidence coefficient reaches a set value.
In this embodiment, the attitude sensor belongs to an existing hardware device, and the structure, principle, and the like of the attitude sensor itself are not specifically limited in the present invention.
In the present embodiment, the three-axis motion information includes one or more of displacement, velocity and acceleration in the X-axis, Y-axis and Z-axis directions. The attitude angles are Euler angles, which carry information such as the orientation of the moving object and are common parameters for describing moving objects in the prior art.
In the present embodiment, the coordinate system of the object can be converted to the world coordinate system by coordinate transformation.
In this embodiment, the trajectory input by the user is often uneven, and usually, the trajectory points at the starting point and the end point are more dense, so that the trajectory points can be more even through resampling, and the accuracy of trajectory identification is improved.
In this embodiment, when ordered tracks are compared, they need to be close in absolute coordinates; three-dimensional rotation brings the tracks to be compared closer in absolute coordinates.
In this embodiment, deformation arising during trajectory drawing can be reduced by zooming and translating, so that the positional deviation between the measured trajectory and the template trajectory is reduced.
In this embodiment, the template refers to a standard trajectory, a confidence level may be calculated by comparing any trajectory obtained through the foregoing processing with the template, and when the confidence level meets a requirement, a control instruction corresponding to the corresponding template is output.
According to the gesture control method based on the attitude sensor provided by the invention, the attitude sensor is used to obtain detection data, three-axis motion information is obtained from the detection data, the attitude angle is then resolved, coordinate system transformation is carried out according to the resolved attitude angle, three-dimensional track resampling is performed, rotation, scaling and translation processing follow, and finally the matching module evaluates the confidence coefficient through the evaluation function and outputs the corresponding control instruction. In the method provided by the invention, the data processing process is simple, the accuracy of gesture recognition is improved by multiple means, and the accuracy of the output is finally ensured by the evaluation function.
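To make the order of operations concrete, the following is a minimal Python sketch of the pipeline; every function name here is a hypothetical stand-in for one of the steps S100 to S900 (the processing sketches later in this description define integrate_acceleration, resample, rotate_to_reference, scale_and_translate and best_template_match; extract_axes, solve_attitude and to_world_frame_all remain placeholders), not a name taken from the patent.

```python
# Hypothetical pipeline skeleton mirroring steps S100-S900 above.
def recognize_gesture(samples, templates, dt, threshold=0.9):
    accel = extract_axes(samples)                 # S200: three-axis motion information
    angles = solve_attitude(accel)                # S300: resolve attitude (Euler) angles
    world = to_world_frame_all(accel, angles)     # S400: coordinate system transformation
    track = integrate_acceleration(world, dt)     # acceleration -> three-dimensional track
    track = resample(track, n=64)                 # S500: equidistant resampling
    track = rotate_to_reference(track)            # S600: three-dimensional rotation
    track = scale_and_translate(track, size=1.0)  # S700: zooming and translation
    name, score = best_template_match(track, templates)  # S800: evaluation function
    return name if score >= threshold else None   # S900: output only above the set value
```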
As an optional embodiment of the present invention, the acquiring motion information of three axes according to the detection data of the attitude sensor includes:
acquiring the triaxial acceleration of the attitude sensor;
and calculating the coordinates of the track point at the current moment from the three-axis acceleration of the attitude sensor according to the following three formulas:

x_n = s_n^x = s_{n-1}^x + v_{n-1}^x·Δt + ((a_{n-1}^x + a_n^x)/4)·Δt²

y_n = s_n^y = s_{n-1}^y + v_{n-1}^y·Δt + ((a_{n-1}^y + a_n^y)/4)·Δt²

z_n = s_n^z = s_{n-1}^z + v_{n-1}^z·Δt + ((a_{n-1}^z + a_n^z)/4)·Δt²

wherein: x_n, y_n and z_n are respectively the coordinates of the track point at the current moment in the X, Y and Z directions; s_n^x, s_n^y and s_n^z are respectively the displacements of the track point at the current moment in the X, Y and Z directions; s_{n-1}^x, s_{n-1}^y and s_{n-1}^z are respectively the displacements of the track point at the previous moment in the X, Y and Z directions; v_{n-1}^x, v_{n-1}^y and v_{n-1}^z are respectively the velocities of the track point at the previous moment in the X, Y and Z directions; a_{n-1}^x, a_{n-1}^y and a_{n-1}^z are respectively the accelerations of the track point at the previous moment in the X, Y and Z directions; a_n^x, a_n^y and a_n^z are respectively the accelerations of the track point at the current moment in the X, Y and Z directions; and Δt is the time difference between the previous moment and the current moment.
In this embodiment, the displacement may be obtained by integrating the three-axis acceleration. As shown in fig. 2, because the sampling interval is constant,

Δt = t_1 − t_0 = t_2 − t_1 = t_3 − t_2 = … = t_n − t_{n-1},

when n > 1, the average acceleration over one sampling interval is (a(t−1) + a(t))/2. From the above, it can be obtained:

v(t) = v(t−1) + ((a(t−1) + a(t))/2)·Δt

s(t) = s(t−1) + ((v(t−1) + v(t))/2)·Δt = s(t−1) + v(t−1)·Δt + ((a(t−1) + a(t))/4)·Δt²

Therefore, from the velocity v(t−1), the displacement s(t−1) and the acceleration a(t−1) calculated at the previous moment, together with the acceleration a(t) at the current moment, the velocity v(t) and the displacement s(t) at the current moment can be obtained.
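A minimal Python sketch of this recurrence, assuming a constant sampling interval dt and accelerations already expressed in the world frame; the function and variable names are ours, not the patent's.

```python
import numpy as np

def integrate_acceleration(accel, dt):
    # accel: (N, 3) array of world-frame accelerations sampled every dt seconds.
    # Implements the recurrence above:
    #   v(t) = v(t-1) + ((a(t-1) + a(t)) / 2) * dt
    #   s(t) = s(t-1) + ((v(t-1) + v(t)) / 2) * dt
    s = np.zeros(3)          # displacement at t0: the track starts at the origin
    v = np.zeros(3)          # velocity at t0: the hand is assumed to start at rest
    points = [s.copy()]
    for k in range(1, len(accel)):
        v_new = v + 0.5 * (accel[k - 1] + accel[k]) * dt
        s = s + 0.5 * (v + v_new) * dt
        v = v_new
        points.append(s.copy())
    return np.array(points)  # (N, 3) track-point coordinates x_n, y_n, z_n
```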
As an optional embodiment of the present invention, the resolving the attitude angle from the motion information of the three axes includes:
and (5) obtaining an attitude angle by fusing the motion information of three axes through Kalman filtering.
In this embodiment, the attitude angle is easily obtained by kalman filtering and fusing the motion information of the three axes, and a specific calculation method may refer to the related prior art, which is not described in detail in the embodiment of the present invention.
In this embodiment, after the Kalman filtering, a window of suitable length is selected for weighted sliding-mean filtering, balancing the lag and the responsiveness of mean filtering and further optimizing the smoothing result.
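As one possible form of that smoothing step, the sketch below applies a weighted sliding mean with linearly increasing weights over a short window; the weighting scheme and the window length are assumptions, since the embodiment does not fix them.

```python
import numpy as np

def weighted_sliding_mean(x, window=5):
    # Smooth a 1-D series (e.g. one attitude angle over time) with a weighted
    # moving average; recent samples get larger weights, which keeps some of
    # the smoothing while reducing the lag of a plain mean filter.
    x = np.asarray(x, dtype=float)
    w = np.arange(1, window + 1, dtype=float)
    w /= w.sum()                       # normalize the weights to sum to 1
    out = x.copy()                     # the first window-1 samples are kept as-is
    for k in range(window - 1, len(x)):
        out[k] = np.dot(w, x[k - window + 1:k + 1])
    return out
```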
As an optional embodiment of the present invention, the performing coordinate system transformation according to the calculated attitude angle includes:
calculating a rotation matrix from the attitude angle:

the X-axis rotation matrix is:

Rx(β) = [[1, 0, 0], [0, cos β, −sin β], [0, sin β, cos β]]

the Y-axis rotation matrix is:

Ry(β) = [[cos β, 0, sin β], [0, 1, 0], [−sin β, 0, cos β]]

the Z-axis rotation matrix is:

Rz(β) = [[cos β, −sin β, 0], [sin β, cos β, 0], [0, 0, 1]]

carrying out coordinate transformation on the acceleration according to the rotation matrix, the rotated coordinates satisfying (x′, y′, z′)ᵀ = R·(x, y, z)ᵀ;

wherein x, y and z are the coordinates before rotation, x′, y′ and z′ are the coordinates after rotation, and β is the corresponding attitude angle.
In this embodiment, the detected acceleration values are expressed in the object coordinate system, and because the object coordinate system changes with the orientation of the object during movement, it is transformed into the world coordinate system. The Euler angles are calculated as in the foregoing embodiment, the rotation matrices are computed from these angles, and the corresponding accelerations are coordinate-transformed accordingly.
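A Python sketch of the transformation follows; the Z-Y-X composition order of the three rotations is an assumption, since the embodiment gives only the three per-axis matrices.

```python
import numpy as np

def rot_x(b):
    c, s = np.cos(b), np.sin(b)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(b):
    c, s = np.cos(b), np.sin(b)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(b):
    c, s = np.cos(b), np.sin(b)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def to_world_frame(accel, roll, pitch, yaw):
    # Rotate one body-frame acceleration vector (shape (3,)) into the world
    # frame using the solved Euler angles, in radians.
    R = rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)   # assumed Z-Y-X composition
    return R @ accel
```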
As an optional embodiment of the present invention, the performing three-dimensional trajectory resampling on the data after coordinate system transformation includes:
calculating the sum of the spatial Euclidean distances between all the original points;
averagely dividing the sum of the calculated space Euclidean distances into n-1 parts to obtain an average distance, wherein n is the number of the resampled points;
and determining n resampling points with equal distance on the original track according to the average distance obtained by division.
In this embodiment, the original trajectory obtained from the user is non-uniform: generally, the closer to the beginning, the end and turns, the denser the trajectory points, and elsewhere the sparser they are. For this situation, equidistant resampling is needed for gesture normalization; existing research shows that good recognition is achieved when the number of resampled points n is roughly 32 to 256.
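A Python sketch of the equidistant resampling, in the style of the well-known $-family resampling routine; the names are ours.

```python
import numpy as np

def resample(points, n=64):
    # Resample an (m, 3) track to n points equally spaced along its path.
    pts = [np.asarray(p, dtype=float) for p in points]
    total = sum(np.linalg.norm(pts[i] - pts[i - 1]) for i in range(1, len(pts)))
    step = total / (n - 1)           # divide the path length into n-1 equal parts
    out = [pts[0]]
    acc = 0.0                        # distance walked since the last emitted point
    i = 1
    while i < len(pts) and len(out) < n:
        d = np.linalg.norm(pts[i] - pts[i - 1])
        if d > 0 and acc + d >= step:
            t = (step - acc) / d
            q = pts[i - 1] + t * (pts[i] - pts[i - 1])  # interpolated point
            out.append(q)
            pts.insert(i, q)         # q becomes the new "previous" point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:              # guard against floating-point shortfall
        out.append(pts[-1])
    return np.array(out)
```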
As an optional embodiment of the present invention, the performing three-dimensional rotation on the three-dimensional trajectory obtained by resampling the three-dimensional trajectory includes:
calculating the mass center of point cloud data formed by all the original points;
connecting the centroid of the point cloud data with the first original point to obtain a line segment, and respectively determining the included angles between the line segment and the X axis, the Y axis and the Z axis;
performing first rotation according to the X-axis rotation matrix and an included angle between the line segment and the X axis;
performing second rotation according to the Y-axis rotation matrix and an included angle between the line segment and the Y axis;
and performing third rotation according to the Z-axis rotation matrix and the included angle between the line segment and the Z axis.
In this embodiment, the calculation of the centroid belongs to the prior art: a unit mass can be assigned to each point, and because of the equation relationship the assigned unit mass cancels on both sides of the equation, finally giving the coordinates of the centroid. The first original point refers to the first point collected in the point cloud data. For each of the first, second and third rotations, the included angle is substituted into the corresponding rotation matrix and the rotated result is calculated.
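A literal Python sketch of this step follows; how each included angle is signed and applied is our reading of the description, so treat this as one plausible realization rather than the definitive one.

```python
import numpy as np

def _axis_rot(axis, b):
    # 3x3 rotation about coordinate axis 0=X, 1=Y, 2=Z by angle b (radians).
    c, s = np.cos(b), np.sin(b)
    if axis == 0:
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == 1:
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rotate_to_reference(points):
    # Rotate the (n, 3) track about X, then Y, then Z, each time by the angle
    # between the corresponding axis and the segment from the centroid to the
    # first track point; the rotations are taken about the centroid.
    points = np.asarray(points, dtype=float)
    centroid = points.mean(axis=0)        # assigned unit masses cancel out
    for axis in (0, 1, 2):
        seg = points[0] - centroid        # centroid -> first original point
        u = seg / np.linalg.norm(seg)
        e = np.zeros(3)
        e[axis] = 1.0
        angle = np.arccos(np.clip(np.dot(u, e), -1.0, 1.0))
        R = _axis_rot(axis, angle)
        points = (points - centroid) @ R.T + centroid
    return points
```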
As an alternative embodiment of the present invention, the zooming and translating the three-dimensional trajectory after three-dimensional rotation includes:
selecting a cube with the side length of size, and zooming the three-dimensional trajectory after three-dimensional rotation so as to enable the three-dimensional trajectory to completely fall into the cube;
and selecting a reference point on the three-dimensional track, and translating the three-dimensional track to enable the reference point to be coincident with the coordinate origin.
In this embodiment, size may be set by the user or set with reference to an empirical value. The centroid may be used as the reference point here.
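A short sketch, assuming uniform scaling (the description does not say whether the scaling is uniform or per-axis) and the centroid as the reference point, as suggested above.

```python
import numpy as np

def scale_and_translate(points, size=1.0):
    # Scale the rotated (n, 3) track so it falls entirely inside a cube of
    # side `size`, then translate it so the centroid lies at the origin.
    points = np.asarray(points, dtype=float)
    extent = points.max(axis=0) - points.min(axis=0)
    points = points * (size / extent.max())   # uniform factor preserves shape
    return points - points.mean(axis=0)       # centroid -> coordinate origin
```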
As an optional embodiment of the present invention, the evaluation function converts the spatial Euclidean distance between the candidate track and each template into a matching confidence coefficient Score in the range [0, 1], the spatial Euclidean distance being:

d_i = (1/N) · Σ_{k=1}^{N} √((C[k]_x − T_i[k]_x)² + (C[k]_y − T_i[k]_y)² + (C[k]_z − T_i[k]_z)²)

wherein C is the point set of the target gesture, T_i is the point set of the i-th template gesture; C[k]_x, C[k]_y and C[k]_z respectively represent the values of the k-th point of the point set C on the x, y and z axes; T_i[k]_x, T_i[k]_y and T_i[k]_z respectively represent the values of the k-th point of the point set T_i on the x, y and z axes; and N is the total number of points.
In this embodiment, the point set C of the target gesture is compared with each template gesture T_i by the above formula to obtain the Euclidean distance d_i; after normalization, this distance is expressed as a confidence Score in the range [0, 1], and the higher the confidence, the higher the matching degree between the candidate track and the template.
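A Python sketch of the matching step: the distance d_i follows the formula above, while the normalizer used to map it into [0, 1] (half the diagonal of the size-cube, 0.5·√3·size) is an assumption consistent with the stated score range, not a mapping taken from the patent.

```python
import numpy as np

def match_confidence(candidate, template, size=1.0):
    # Average point-wise Euclidean distance between two (n, 3) tracks,
    # mapped to a confidence Score in [0, 1] (1 = perfect match).
    d = np.linalg.norm(candidate - template, axis=1).mean()  # d_i from above
    d_max = 0.5 * np.sqrt(3.0) * size   # assumed normalizer: half cube diagonal
    return max(0.0, 1.0 - d / d_max)

def best_template_match(candidate, templates):
    # templates: dict mapping instruction/gesture name -> (n, 3) template track.
    scores = {name: match_confidence(candidate, t) for name, t in templates.items()}
    name = max(scores, key=scores.get)
    return name, scores[name]
```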
The embodiment of the invention also provides a gesture control device based on the attitude sensor, which comprises:
the detection data acquisition module is used for acquiring detection data of the attitude sensor;
the motion information acquisition module is used for acquiring motion information of three axes according to the detection data of the attitude sensor;
the attitude angle resolving module is used for resolving an attitude angle by the motion information of the three axes;
the coordinate transformation module is used for carrying out coordinate system transformation according to the solved attitude angle;
the resampling module is used for resampling the three-dimensional track of the data after the coordinate system transformation;
the three-dimensional rotation module is used for performing three-dimensional rotation on the three-dimensional track obtained by resampling the three-dimensional track;
the adjusting module is used for zooming and translating the three-dimensional track after three-dimensional rotation;
the matching module is used for carrying out template matching on the three-dimensional track obtained in the last step and obtaining the matching confidence coefficient through an evaluation function;
and the output module is used for outputting the control instruction corresponding to the corresponding template if the confidence coefficient reaches a set value.
In this embodiment, each module corresponds to a step of the method part of the present invention; for a specific explanation of each module, please refer to the method part, which is not repeated here.
As shown in fig. 3, an embodiment of the present invention further provides a gesture control system based on an attitude sensor, the gesture control system comprising:
the attitude sensor, used for acquiring the gesture of the user; and
a processor, connected with the attitude sensor and used for executing the gesture control method based on the attitude sensor described above.
In this embodiment, the gesture control system based on the gesture sensor may be applied to various scenarios, such as unmanned aerial vehicle gesture control, robot gesture control, and the like, and the specific application scenario is not specifically limited in the present invention.
According to the gesture control system based on the attitude sensor provided by the invention, the attitude sensor is used to obtain detection data, three-axis motion information is obtained from the detection data, the attitude angle is then resolved, coordinate system transformation is carried out according to the resolved attitude angle, three-dimensional track resampling is performed, rotation, scaling and translation processing follow, and finally the matching module evaluates the confidence coefficient through the evaluation function and outputs the corresponding control instruction. In the system provided by the invention, the data processing process is simple, the accuracy of gesture recognition is improved by multiple means, and the accuracy of the output is finally ensured by the evaluation function.
FIG. 4 is a diagram illustrating the internal structure of the processor in one embodiment. As shown in fig. 4, the processor includes a memory, a network interface, an input device and a display screen connected through a system bus. The memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and may further store a computer program which, when executed, causes the processor to implement the gesture control method based on the attitude sensor provided in the embodiment of the present invention. The internal memory may also store a computer program which, when executed, causes the processor to execute the gesture control method based on the attitude sensor provided in the embodiment of the present invention. The display screen may be a liquid crystal display screen or an electronic ink display screen, and the input device may be a touch layer covering the display screen, a key, a trackball or a touch pad arranged on the housing of the processor, or an external keyboard, touch pad or mouse.
It will be appreciated by those skilled in the art that the architecture shown in fig. 4 is a block diagram of only a portion of the architecture associated with the inventive arrangements and is not intended to limit the processors to which the inventive arrangements may be applied, and that a particular processor may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, the gesture control apparatus based on the attitude sensor provided by the embodiment of the present invention may be implemented in the form of a computer program, and the computer program may be executed on a processor as shown in fig. 4. The memory of the processor may store the program modules constituting the gesture control apparatus based on the attitude sensor, for example the detection data acquisition module, the motion information acquisition module, the attitude angle resolving module, the coordinate transformation module, the resampling module, the three-dimensional rotation module, the adjusting module, the matching module and the output module. These program modules constitute a computer program that causes the processor to execute the steps of the gesture control method based on the attitude sensor of the various embodiments of the present invention described in this specification.
For example, the processor shown in fig. 4 may execute step S100 by a detection data acquisition module in the gesture control apparatus based on the attitude sensor; the processor may execute step S200 through the motion information acquiring module; the processor may execute step S300 through the attitude angle calculation module; the processor may perform step S400 through the coordinate transformation module; the processor may perform step S500 through the resampling module; the processor may perform step S600 through the three-dimensional rotation module; the processor may execute step S700 through the adjusting module; the processor may perform step S800 through the matching module; the processor may perform step S900 through the output module.
In one embodiment, a processor is proposed, which includes a memory and a computer program stored on the memory and executable on the processor, and the processor implements the following steps when executing the computer program:
acquiring detection data of the attitude sensor;
acquiring motion information of three axes according to detection data of the attitude sensor;
resolving an attitude angle by the motion information of the three axes;
carrying out coordinate system transformation according to the solved attitude angle;
performing three-dimensional track resampling on the data after the coordinate system transformation;
performing three-dimensional rotation on the three-dimensional track obtained by resampling the three-dimensional track;
zooming and translating the three-dimensional track after three-dimensional rotation;
carrying out template matching on the three-dimensional track obtained in the last step, and obtaining a matching confidence coefficient through an evaluation function;
and if the confidence coefficient reaches a set value, outputting a control instruction corresponding to the corresponding template.
In one embodiment, a computer-readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, causes the processor to perform the steps of:
acquiring detection data of an attitude sensor;
acquiring motion information of three axes according to detection data of the attitude sensor;
resolving an attitude angle by the motion information of the three axes;
carrying out coordinate system transformation according to the solved attitude angle;
performing three-dimensional track resampling on the data after the coordinate system transformation;
performing three-dimensional rotation on the three-dimensional track obtained by resampling the three-dimensional track;
zooming and translating the three-dimensional track after three-dimensional rotation;
carrying out template matching on the three-dimensional track obtained in the last step, and obtaining a matching confidence coefficient through an evaluation function;
and if the confidence coefficient reaches a set value, outputting a control instruction corresponding to the corresponding template.
It should be understood that, although the steps in the flowcharts of the embodiments of the present invention are shown in sequence as indicated by the arrows, they are not necessarily performed in that sequence. Unless explicitly stated otherwise, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least a portion of the steps in the various embodiments may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above may be implemented by a computer program, which may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, databases or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A gesture control method based on an attitude sensor is characterized by comprising the following steps:
acquiring detection data of the attitude sensor;
acquiring motion information of three axes according to detection data of the attitude sensor;
resolving an attitude angle by the motion information of the three axes;
transforming a coordinate system according to the calculated attitude angle;
carrying out three-dimensional track resampling on the data after the coordinate system transformation;
performing three-dimensional rotation on the three-dimensional track obtained by resampling the three-dimensional track;
zooming and translating the three-dimensional track after three-dimensional rotation;
carrying out template matching on the three-dimensional track obtained in the last step, and obtaining a matching confidence coefficient through an evaluation function;
and if the confidence coefficient reaches a set value, outputting a control instruction corresponding to the corresponding template.
2. The attitude sensor-based gesture control method according to claim 1, wherein the obtaining of motion information of three axes according to detection data of the attitude sensor comprises:
acquiring the triaxial acceleration of the attitude sensor;
and calculating the coordinates of the track point at the current moment from the three-axis acceleration of the attitude sensor according to the following three formulas:

x_n = s_n^x = s_{n-1}^x + v_{n-1}^x·Δt + ((a_{n-1}^x + a_n^x)/4)·Δt²

y_n = s_n^y = s_{n-1}^y + v_{n-1}^y·Δt + ((a_{n-1}^y + a_n^y)/4)·Δt²

z_n = s_n^z = s_{n-1}^z + v_{n-1}^z·Δt + ((a_{n-1}^z + a_n^z)/4)·Δt²

wherein: x_n, y_n and z_n are respectively the coordinates of the track point at the current moment in the X, Y and Z directions; s_n^x, s_n^y and s_n^z are respectively the displacements of the track point at the current moment in the X, Y and Z directions; s_{n-1}^x, s_{n-1}^y and s_{n-1}^z are respectively the displacements of the track point at the previous moment in the X, Y and Z directions; v_{n-1}^x, v_{n-1}^y and v_{n-1}^z are respectively the velocities of the track point at the previous moment in the X, Y and Z directions; a_{n-1}^x, a_{n-1}^y and a_{n-1}^z are respectively the accelerations of the track point at the previous moment in the X, Y and Z directions; a_n^x, a_n^y and a_n^z are respectively the accelerations of the track point at the current moment in the X, Y and Z directions; and Δt is the time difference between the previous moment and the current moment.
3. The attitude sensor-based gesture control method according to claim 1, wherein the resolving of the attitude angle from the motion information of the three axes comprises:
fusing the motion information of the three axes through Kalman filtering to obtain the attitude angle.
4. The attitude sensor-based gesture control method according to claim 1, wherein the performing coordinate system transformation according to the calculated attitude angle comprises:
calculating a rotation matrix from the attitude angle:
the X-axis rotation matrix is:

Rx(β) = [[1, 0, 0], [0, cos β, −sin β], [0, sin β, cos β]]

the Y-axis rotation matrix is:

Ry(β) = [[cos β, 0, sin β], [0, 1, 0], [−sin β, 0, cos β]]

the Z-axis rotation matrix is:

Rz(β) = [[cos β, −sin β, 0], [sin β, cos β, 0], [0, 0, 1]]

and carrying out coordinate transformation on the acceleration according to the rotation matrix, the rotated coordinates satisfying (x′, y′, z′)ᵀ = R·(x, y, z)ᵀ;
wherein x, y and z are the coordinates before rotation, x′, y′ and z′ are the coordinates after rotation, and β is the corresponding attitude angle.
5. The attitude sensor-based gesture control method according to claim 1, wherein the three-dimensional track resampling of the data after the coordinate system transformation comprises:
calculating the sum of the spatial Euclidean distances between all the original points;
averagely dividing the sum of the calculated space Euclidean distances into n-1 parts to obtain an average distance, wherein n is the number of the resampled points;
and determining n resampling points with equal distance on the original track according to the average distance obtained by division.
6. The gesture control method based on the attitude sensor according to claim 1, wherein the three-dimensional trajectory obtained by resampling the three-dimensional trajectory is subjected to three-dimensional rotation, and the method comprises the following steps:
calculating the mass center of point cloud data formed by all the original points;
connecting the centroid of the point cloud data with the first original point to obtain a line segment, and respectively determining the included angles between the line segment and the X axis, the Y axis and the Z axis;
performing first rotation according to the X-axis rotation matrix and an included angle between the line segment and the X axis;
performing second rotation according to the Y-axis rotation matrix and an included angle between the line segment and the Y axis;
and performing third rotation according to the Z-axis rotation matrix and the included angle between the line segment and the Z axis.
7. The attitude sensor-based gesture control method according to claim 1, wherein the zooming and translating of the three-dimensional track after three-dimensional rotation comprises:
selecting a cube with the side length of size, and zooming the three-dimensional trajectory after three-dimensional rotation so as to enable the three-dimensional trajectory to completely fall into the cube;
and selecting a reference point on the three-dimensional track, and translating the three-dimensional track to enable the reference point to be coincident with the coordinate origin.
8. The attitude sensor-based gesture control method according to claim 7, wherein the evaluation function converts the spatial Euclidean distance between the candidate track and each template into a matching confidence coefficient Score in the range [0, 1], the spatial Euclidean distance being:

d_i = (1/N) · Σ_{k=1}^{N} √((C[k]_x − T_i[k]_x)² + (C[k]_y − T_i[k]_y)² + (C[k]_z − T_i[k]_z)²)

wherein C is the point set of the target gesture, T_i is the point set of the i-th template gesture; C[k]_x, C[k]_y and C[k]_z respectively represent the values of the k-th point of the point set C on the x, y and z axes; T_i[k]_x, T_i[k]_y and T_i[k]_z respectively represent the values of the k-th point of the point set T_i on the x, y and z axes; and N is the total number of points.
9. An attitude sensor based gesture control apparatus, comprising:
the detection data acquisition module is used for acquiring detection data of the attitude sensor;
the motion information acquisition module is used for acquiring motion information of three axes according to the detection data of the attitude sensor;
the attitude angle resolving module is used for resolving an attitude angle by the motion information of the three axes;
the coordinate transformation module is used for carrying out coordinate system transformation according to the solved attitude angle;
the resampling module is used for carrying out three-dimensional track resampling on the data after the coordinate system transformation;
the three-dimensional rotation module is used for performing three-dimensional rotation on the three-dimensional track obtained by resampling the three-dimensional track;
the adjusting module is used for zooming and translating the three-dimensional track after three-dimensional rotation;
the matching module is used for carrying out template matching on the three-dimensional track obtained in the last step and obtaining the matching confidence coefficient through an evaluation function;
and the output module is used for outputting the control instruction corresponding to the corresponding template if the confidence coefficient reaches a set value.
10. An attitude sensor-based gesture control system, comprising:
the attitude sensor is used for acquiring the gesture of the user; and
the processor is connected with the attitude sensor and is used for executing the attitude sensor-based gesture control method according to any one of claims 1 to 8.
CN202211512181.7A 2022-11-29 2022-11-29 Gesture control method, device and system based on attitude sensor Pending CN115712384A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211512181.7A CN115712384A (en) 2022-11-29 2022-11-29 Gesture control method, device and system based on attitude sensor


Publications (1)

Publication Number Publication Date
CN115712384A (en) 2023-02-24

Family

ID=85235243

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211512181.7A Pending CN115712384A (en) 2022-11-29 2022-11-29 Gesture control method, device and system based on attitude sensor

Country Status (1)

Country Link
CN (1) CN115712384A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116358562A (en) * 2023-05-31 2023-06-30 氧乐互动(天津)科技有限公司 Disinfection operation track detection method, device, equipment and storage medium
CN116358562B (en) * 2023-05-31 2023-08-01 氧乐互动(天津)科技有限公司 Disinfection operation track detection method, device, equipment and storage medium
CN117260724A (en) * 2023-10-09 2023-12-22 南京迈思物联网科技有限公司 Attitude control method for deep well exploratory robot
CN117260724B (en) * 2023-10-09 2024-06-07 南京迈思物联网科技有限公司 Attitude control method for deep well exploratory robot


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination