CN113190045A - Unmanned aerial vehicle cluster control method and system based on stroke human-computer interaction
- Publication number: CN113190045A
- Application number: CN202110505859.8A
- Authority
- CN
- China
- Prior art keywords
- stroke
- unmanned aerial
- aerial vehicle
- point
- vehicle cluster
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/104—Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention relates to an unmanned aerial vehicle cluster control method and system based on stroke human-computer interaction. The method comprises the following steps: acquiring a stroke track drawn by an operator; determining the pixel coordinates of each point in the stroke track at a set frequency; performing stroke recognition on the pixel coordinates of each point and determining a control instruction; and controlling the unmanned aerial vehicle cluster according to the control instruction. The method and system improve the adaptability of the cluster system to fuzzy and uncertain scenes and realize high-safety formation control with a human-computer interaction mechanism.
Description
Technical Field
The invention relates to the field of intelligent cooperative control of human and unmanned aerial vehicle clusters, in particular to an unmanned aerial vehicle cluster control method and system based on stroke human-computer interaction.
Background
Facing the rapid future development of unmanned, networked, informatized and intelligent warfare, countries must use the related technologies to research combat weapons, combat platforms and combat methods of equivalent or greater capability. As a disruptive technology, the intelligent unmanned swarm has long been regarded as the core of artificial intelligence for unmanned systems, a breakthrough direction for future intelligent unmanned systems, and a decisive technology for building an asymmetric military advantage. Many application projects use artificial intelligence to empower unmanned combat platforms and systems and to raise the level of autonomous swarm capability; each project tackles the key technologies from a different side, among which are human-machine cooperation and human-computer interaction.
Compared with single-aircraft operation, an unmanned aerial vehicle cluster has irreplaceable advantages. With the rapid development of counter-UAV systems and counter-UAV technology in recent years, the kill and interception rates against low, small and slow unmanned aerial vehicles have risen quickly. Facing a counter-UAV system, a single unmanned aerial vehicle, once captured, causes not only equipment loss but also severe impact on the tactical reconnaissance task. A cluster of unmanned aerial vehicles is different: it forms a task formation, so that even if one or two aircraft are shot down by the air-defense or counter-UAV system, the other aircraft in the cluster can still complete the reconnaissance task. Compared with an individual reconnaissance aircraft, a cluster can search a wider area in formation and has stronger detection capability than a single aircraft. It can also perform repeated, carpet-style searches of an area, avoiding omissions or errors caused by a single pass and further raising the reconnaissance success rate. Moreover, for a moving combat-platform target, an individual aircraft is often unable to track and monitor continuously, whereas a cluster can easily carry out relay reconnaissance and tracking of the moving target. Finally, when many aircraft reconnoitre the same target, they observe it from different angles; the observation is more three-dimensional and careful, the reconnaissance effect is better, and the details are more prominent.
However, compared with an individual unmanned aerial vehicle, a "cluster human-machine system" is technically more difficult: bottlenecks such as task cooperation and intelligent formation must be solved so that self-organization of the aircraft and cooperative detection of a common target are completed reasonably and efficiently, and higher requirements are placed on human-computer interaction.
At present, with the rise of emerging human-computer interaction technologies, research on interaction modes based on voice, vision, gesture and combined somatosensory input has developed greatly at home and abroad, but research applying these modes to unmanned aerial vehicle cluster control is relatively scarce. Stroke-interaction technology itself is mature, yet little of this research addresses robot cluster control. Current methods for recognizing stroke tracks include: recognizing the figure from the position, direction, speed and acceleration of the handwriting using fuzzy logic and fuzzy knowledge; treating the figure as a whole, smoothing it, extracting arc segments, identifying nodes and decomposing straight segments, using an experimentally determined threshold on the angle between every three adjacent points as the feature separating arcs from straight segments, and then classifying; and extracting and recognizing interior-angle features of the pixel geometry with a binary synaptic weight (BSW) algorithm, a feedforward network with one hidden layer.
In summary, to fully exploit the operational advantages of a robot cluster together with the command ability of the human operator, efficient coordination between people and the robot cluster must be realized; a method or system for controlling an unmanned aerial vehicle cluster through stroke interaction is therefore urgently needed and of great significance.
Disclosure of Invention
Based on the above problems, the stroke human-computer interaction based unmanned aerial vehicle cluster control method and system provided by the invention introduce artificial intelligence into the robot cluster cooperative control system under conditions where time delay and switching topology exist simultaneously, improve the adaptability of the cluster system to fuzzy and uncertain scenes, and realize high-safety formation control with a human-computer interaction mechanism.
In order to achieve the purpose, the invention provides the following scheme:
an unmanned aerial vehicle cluster control method based on stroke human-computer interaction comprises the following steps:
acquiring a stroke track described by an operator; determining the pixel coordinate of each point in the stroke track according to the set frequency;
stroke recognition is carried out on the pixel coordinate of each point, and a control instruction is determined;
and controlling the unmanned aerial vehicle cluster according to the control instruction.
Optionally, acquiring the stroke track drawn by the operator and determining the pixel coordinates of each point in the stroke track at the set frequency specifically comprises:
an image drawing area of a man-machine interaction control interface based on PyQt5 and ROS network is established;
determining the pixel coordinates of each point in the stroke track in the image drawing area at a set frequency; the set frequency is 100 Hz.
Optionally, the stroke recognition is performed on the pixel coordinate of each point, and the determining of the control instruction specifically includes:
resampling each point in the stroke track to determine a new point set;
rotating each point in the new point set;
zooming each point in the rotated new point set to a set size, and determining a processed stroke track;
acquiring a template stored in a system instruction library; each template corresponds to a control instruction;
performing stroke recognition on the template, and determining the processed template;
determining scores of the processed stroke track and the processed template according to the average distance between the coordinate of each point in the processed template and the coordinate of the corresponding point in the processed stroke track;
and determining a control command according to the scores of the processed stroke track and the processed template.
Optionally, the controlling the unmanned aerial vehicle cluster according to the control instruction specifically includes:
sending the control instruction to the unmanned aerial vehicle cluster by using the ROS network; the communication mode of the ROS network is to send an instruction to the unmanned aerial vehicle cluster by using a topic;
and the unmanned aerial vehicle cluster responds to the control instruction and makes a behavior corresponding to the control instruction.
An unmanned aerial vehicle cluster control system based on stroke human-computer interaction comprises:
the stroke track acquisition module is used for acquiring a stroke track described by an operator; determining the pixel coordinate of each point in the stroke track according to the set frequency;
the control instruction determining module is used for carrying out stroke recognition on the pixel coordinate of each point and determining a control instruction;
and the control command control unmanned aerial vehicle cluster module is used for controlling the unmanned aerial vehicle cluster according to the control command.
Optionally, the stroke track acquiring module specifically includes:
the image drawing area establishing unit is used for establishing an image drawing area of a human-computer interaction control interface based on PyQt5 and an ROS network;
the pixel coordinate determination unit is used for determining the pixel coordinate of each point in the stroke track in the image drawing area at a set frequency; the set frequency is 100 Hz.
Optionally, the control instruction determining module specifically includes:
the new point set determining unit is used for resampling each point in the stroke track to determine a new point set;
a new point set rotating unit, configured to rotate each point in the new point set;
the processed stroke track determining unit is used for zooming each point in the rotated new point set to a set size and determining a processed stroke track;
the template acquisition unit in the system instruction library is used for acquiring the template stored in the system instruction library; each template corresponds to a control instruction;
the processed template determining unit is used for carrying out stroke recognition on the template and determining the processed template;
the score determining unit is used for determining the scores of the processed stroke track and the processed template according to the average distance between the coordinate of each point in the processed template and the coordinate of the corresponding point in the processed stroke track;
and the control instruction determining unit is used for determining a control instruction according to the processed stroke track and the score of the processed template.
Optionally, the control instruction control unmanned aerial vehicle cluster module specifically comprises:
the control instruction sending unit is used for sending the control instruction to the unmanned aerial vehicle cluster by utilizing the ROS network; the communication mode of the ROS network is to send an instruction to the unmanned aerial vehicle cluster by using a topic;
and the control instruction controls an unmanned aerial vehicle cluster unit, and is used for responding to the control instruction by the unmanned aerial vehicle cluster and making a behavior corresponding to the control instruction.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
the unmanned aerial vehicle cluster control method and the unmanned aerial vehicle cluster control system based on stroke human-computer interaction, provided by the invention, are characterized in that a stroke track described by an operator is obtained; determining the pixel coordinate of each point in the stroke track according to the set frequency; stroke recognition is carried out on the pixel coordinate of each point, and a control instruction is determined; and controlling the unmanned aerial vehicle cluster according to the control instruction. Artificial intelligence is introduced into the robot cluster cooperative control system under the condition that time delay and switching topology exist simultaneously. The method introduces the recognition of gesture interaction in artificial intelligence into a robot cluster cooperative control system, further improves the adaptability of the cluster system to fuzzy and uncertain scenes, and realizes high-safety formation control with a human-computer interaction mechanism.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without inventive exercise.
FIG. 1 is a schematic flow chart of an unmanned aerial vehicle cluster control method based on stroke human-computer interaction provided by the invention;
fig. 2 is a PyQt5 and ROS based UI interface for a human and unmanned aerial vehicle cluster platform according to an embodiment of the present invention;
FIG. 3 is a flow diagram of a stroke recognition function provided by an embodiment of the present invention;
FIG. 4 is a flowchart illustrating resampling a point set in a stroke recognition function according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating an effect of rotating a point set in the stroke recognition function according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating a zoom operation performed on a point set in the stroke recognition function according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a comparison score between a stroke track and a template according to an embodiment of the present invention;
fig. 8 is a schematic diagram of a mapping relationship between a template in a system instruction library and an unmanned aerial vehicle behavior set according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an unmanned aerial vehicle cluster control system based on stroke human-computer interaction provided by the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention aims to provide an unmanned aerial vehicle cluster control method and system based on stroke human-computer interaction, which can improve the adaptability of a cluster system to fuzzy and uncertain scenes and realize high-safety formation control with a human-computer interaction mechanism.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Fig. 1 is a schematic flow chart of an unmanned aerial vehicle cluster control method based on stroke human-computer interaction provided by the present invention, and as shown in fig. 1, the unmanned aerial vehicle cluster control method based on stroke human-computer interaction provided by the present invention includes:
s101, acquiring a stroke track drawn by an operator; determining the pixel coordinate of each point in the stroke track according to the set frequency;
s101 specifically comprises the following steps:
an image drawing area of a man-machine interaction control interface based on PyQt5 and ROS network is established;
determining the pixel coordinates of each point in the stroke track in the image drawing area at a set frequency; the set frequency is 100 Hz.
That is, the image drawing area established by the invention, namely the drawing board area in the visual UI (user interface) of the human-robot cluster platform based on PyQt5 and the ROS network, records the coordinates of each point in the image coordinate system at a preset frequency of 100 Hz, with the coordinate origin at the upper-left corner of the image.
As shown in the PyQt5 and ROS network based human-robot cluster platform UI interface of fig. 2, the drawing board area is the 800 pixel × 600 pixel blank area in the lower left corner. When a user runs the platform software on a tablet computer or other device, strokes can be drawn in this area through a touch screen or a mouse. The stroke width is set to 8 px, so that the stroke track is displayed clearly in the area without the stroke occupying too large a share of the frame and wasting space.
S102, stroke recognition is carried out on the pixel coordinate of each point, and a control instruction is determined;
as shown in fig. 3, S102 specifically includes:
resampling each point in the stroke track to determine a new point set; the number of points N satisfies 32 ≤ N ≤ 256, and the invention sets N = 64.
After the operator finishes drawing, the human-robot cluster platform submits the point set to the back end in real time for preprocessing. Because a person draws more slowly than the software acquisition frequency, the raw point set contains many more points than the resampling range of the invention. The number of points is proportional to the time complexity of the algorithm, and too many source points slow down subsequent stroke processing and recognition. The back-end stroke recognition system therefore resamples the point set into a new set of N points with 32 ≤ N ≤ 256; setting N = 64 gives a good recognition effect. The specific algorithm flow is shown in fig. 4.
The resampling process shown in fig. 4 is as follows. First, the total length of the stroke track drawn by the operator is determined in pixels (px) by traversing the acquired point set, computing the distance between each pair of consecutive points in the image coordinate system, and summing these distances. Next, the minimum distance between two adjacent points in the current set is found. An index variable over the point sequence is then defined and set to zero; its maximum value is the length of the point set minus two. At each step, the distance between the current point and the next point is computed according to the index variable; if it exceeds a specified value, a point is inserted at the midpoint between the current point and the next point and added to the new point set, otherwise the next point is added to the new point set directly. After the added point's coordinates are determined, the index variable is incremented and the loop checks whether the point set has been fully traversed, continuing until it has. Finally, the new point set is output.
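The flow above can be sketched in Python. This is a hedged reconstruction, not the patent's code: the midpoint-insertion loop it describes is under-specified, so the sketch uses the equivalent arc-length resampling of the $1 unistroke recognizer, which produces exactly the N = 64 equidistant points the patent requires; all names are our own.

```python
import math

def path_length(points):
    """Total stroke length in pixels: sum of distances between consecutive points."""
    return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

def resample(points, n=64):
    """Resample a stroke to n equidistant points (the patent sets 32 <= n <= 256)."""
    interval = path_length(points) / (n - 1)  # target spacing between new points
    acc = 0.0                                 # distance accumulated since last emit
    pts = list(points)
    new_points = [pts[0]]
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= interval:
            # interpolate a new point on the current segment
            t = (interval - acc) / d
            qx = pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0])
            qy = pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1])
            new_points.append((qx, qy))
            pts.insert(i, (qx, qy))  # the new point starts the next segment
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(new_points) < n:       # guard against floating-point shortfall
        new_points.append(pts[-1])
    return new_points[:n]
```

Because every output point is the same arc-length apart, a slowly drawn stroke and a quickly drawn one yield the same point set, which is what makes the later template comparison meaningful.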
Each point in the new point set is then rotated. Because operators put pen to screen in different positions, the same figure can yield different vectors in its point set and thus be misjudged, so the point set must be rotated for normalization. The invention takes as the indicative angle of the stroke to be recognized the angle between the horizontal x-axis unit vector and the vector from the centroid of the drawn stroke to the first sampled point, and applies a linear transformation to the point set according to this angle.
First, the indicative rotation angle θ is calculated, where (c_x, c_y) is the stroke centroid coordinate, (p_x0, p_y0) is the first sampled point of the stroke, atan is the arctangent function, and both coordinates are in the two-dimensional image coordinate system with origin at the upper-left corner of the image:

θ = atan( (c_y − p_y0) / (c_x − p_x0) )
then, traversing each point set according to the indication angle and performing linear transformation on coordinates of each point, wherein (q)x,qy) The new point coordinate obtained by linear transformation of a certain point in the origin set is as follows:
the effect of the original stroke track after linear transformation and rotation is shown in fig. 5.
Each point in the rotated new point set is then scaled to a set size to determine the processed stroke track: the rotated point set is scaled to a specified standard size of 25 px × 25 px, and the stroke centroid is transformed into the new coordinate origin.
The new point set obtained after rotation is scaled to the specified standard size, and the stroke centroid is taken as the origin of the new coordinate system; that is, the x and y axes of the original coordinate system are translated so that they finally intersect at the centroid. The specific algorithm flow is shown in fig. 6.
First, the centroid of the input point set is calculated: the x and y coordinate values of every point are summed in a traversal and divided by the number of points, i.e. the arithmetic mean of all coordinates is taken as the centroid. Next, the original stroke track is scaled according to the size of the drawing board area of the human-robot cluster platform: the board area is 800 pixels × 600 pixels and the standard area is 25 pixels × 25 pixels, so the input point set is scaled by the ratio of the board's length and width to the standard area's pixels. Finally, the centroid coordinates are subtracted from each point's coordinates, and the resulting new point set is output.
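The scale-and-translate step can be sketched as follows. The 800×600 board size and 25×25 standard size are stated in the description; the function names are our own, and the per-axis scaling (width ratio for x, height ratio for y) is our reading of "the ratio of the length and the width of the panel area to the pixels of the standard area".

```python
def centroid(points):
    """Arithmetic mean of all point coordinates."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def scale_and_translate(points, board=(800, 600), size=25):
    """Scale from the drawing-board frame (800x600 px) to the standard frame
    (25x25 px), then shift so the stroke centroid becomes the new origin."""
    sx, sy = size / board[0], size / board[1]       # per-axis scale factors
    scaled = [(p[0] * sx, p[1] * sy) for p in points]
    cx, cy = centroid(scaled)
    return [(p[0] - cx, p[1] - cy) for p in scaled] # centroid -> origin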
Acquiring a template stored in a system instruction library; each template corresponds to a control instruction.
And performing stroke recognition on the template, and determining the processed template.
And determining scores of the processed stroke track and the processed template according to the average distance between the coordinates of each point in the processed template and the coordinates of the corresponding point in the processed stroke track.
And determining a control command according to the scores of the processed stroke track and the processed template.
The average distance is converted into a score between the stroke and each template, and the corresponding instruction is obtained from the score. The higher the score, the better the real-time drawn stroke matches a template; the instruction of the highest-scoring template is the one sent to the unmanned aerial vehicle cluster.
The templates stored in the system instruction library undergo the same transformations, which keeps the acquired and compared stroke data consistent and improves the extensibility of the stroke recognition system. Finally, the average distance between the processed new point set and the processed points of each template is calculated and converted into the score of the track against that template. In the following formula, score is the comparison score, b is the average distance, and size is the standard region size of 25 pixels described above:

score = 1 − b / (0.5 · √(size² + size²))
The higher the score, the better the real-time drawn stroke matches a template in the library. FIG. 7 shows the scores of a stroke compared with the templates; it can be seen that an upward stroke track scores very differently against the different template tracks in the library. The invention takes the instruction corresponding to the highest-scoring template as the instruction of the drawn stroke.
If the highest score is below a score threshold, the stroke is deemed not to match any template and no command is sent to the unmanned aerial vehicle cluster system. The invention sets the score threshold to 0.7. The system also feeds the recognition result back to the operator, so that subsequent situations can be judged and adjusted by a person.
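The scoring and threshold decision can be sketched as below. The exact published formula is a figure we cannot see; this sketch uses the $1-recognizer normalization, which maps the average distance into a score in (0, 1] and is consistent with the 0.7 acceptance threshold stated above, so treat it as a reconstruction under that assumption.

```python
import math

SIZE = 25          # standard region size in pixels (from the description)
THRESHOLD = 0.7    # minimum score accepted as a template match

def average_distance(stroke, template):
    """Mean point-to-point distance between two equally resampled point sets."""
    return sum(math.dist(p, q) for p, q in zip(stroke, template)) / len(stroke)

def score(stroke, template, size=SIZE):
    """Convert the average distance b into a score; 1.0 means a perfect match.
    Normalized by half the diagonal of the size x size standard region."""
    b = average_distance(stroke, template)
    return 1.0 - b / (0.5 * math.hypot(size, size))

def best_command(stroke, templates):
    """Score the stroke against every template in the instruction library and
    return (command, score); command is None when the best score is below
    the threshold, so no instruction is sent to the cluster."""
    cmd, s = max(((name, score(stroke, tpl)) for name, tpl in templates.items()),
                 key=lambda item: item[1])
    return (cmd, s) if s >= THRESHOLD else (None, s)
```

Returning `None` below the threshold implements the safety behavior described above: an unrecognized stroke produces feedback to the operator instead of a command to the aircraft.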
And S103, controlling the unmanned aerial vehicle cluster according to the control instruction.
S103 specifically comprises the following steps:
sending the control instruction to the unmanned aerial vehicle cluster by using the ROS network; the communication mode of the ROS network is to send an instruction to the unmanned aerial vehicle cluster by using a topic;
and the unmanned aerial vehicle cluster responds to the control instruction and makes a behavior corresponding to the control instruction. And after the unmanned aerial vehicle cluster executes the corresponding behavior, returning a completion response to the controller. Therefore, the unmanned aerial vehicle cluster is controlled through stroke recognition, and the unmanned aerial vehicle cluster control system based on stroke human-computer interaction is realized. The mapping relationship between the stroke template library and the unmanned aerial vehicle behavior set is shown in fig. 8.
As shown in fig. 8, from left to right and top to bottom the templates represent:
- a stroke instruction for unmanned aerial vehicle cluster takeoff;
- a hovering instruction for the cluster to hover in the air and keep its position within a certain range;
- an instruction for the cluster to land from the air to the ground;
- translation of the cluster to the left of its current position in the same plane;
- translation to the right of its current position in the same plane;
- translation forward of its current position in the same plane;
- formation of a rectangular array (the cluster system in the invention is formed by four unmanned aerial vehicles);
- translation backward of its current position in the same plane;
- formation of a straight line in the same horizontal plane;
- formation of a triangle in the same horizontal plane.
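The template-to-behavior mapping of fig. 8 is naturally a lookup table. The stroke names below are hypothetical labels of ours (the patent shows the mapping only as a figure); the behavior descriptions follow the list above.

```python
# Hypothetical stroke labels -> cluster behaviors, per the fig. 8 mapping.
STROKE_TO_BEHAVIOR = {
    "takeoff":   "cluster takes off",
    "hover":     "cluster hovers and holds position within a range",
    "land":      "cluster lands from the air to the ground",
    "left":      "translate left of the current position in the same plane",
    "right":     "translate right of the current position in the same plane",
    "forward":   "translate forward of the current position in the same plane",
    "rectangle": "form a rectangular array (four UAVs in this embodiment)",
    "backward":  "translate backward of the current position in the same plane",
    "line":      "form a straight line in the same horizontal plane",
    "triangle":  "form a triangle in the same horizontal plane",
}

def behavior_for(stroke_name):
    """Look up the cluster behavior for a recognized stroke; None if unknown,
    so an unrecognized stroke never dispatches a command."""
    return STROKE_TO_BEHAVIOR.get(stroke_name)
```

In the full system the recognized label would be published over the ROS topic described above, and each aircraft would subscribe and execute the mapped behavior.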
After the unmanned aerial vehicle cluster completes a formation movement, it sends a completion message to the operator, so that the user can control the cluster through stroke interaction.
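The stroke-to-behavior mapping of fig. 8 and the topic-based dispatch can be sketched as follows. The labels, behavior strings, and topic name are illustrative assumptions, since the patent does not list them:

```python
# Sketch of the fig. 8 stroke-to-behavior mapping and topic-based dispatch.
# All names (stroke labels, behavior strings, topic) are illustrative assumptions.

STROKE_TO_BEHAVIOR = {
    "takeoff":  "TAKEOFF",
    "hover":    "HOVER",
    "land":     "LAND",
    "left":     "TRANSLATE_LEFT",
    "right":    "TRANSLATE_RIGHT",
    "forward":  "TRANSLATE_FORWARD",
    "square":   "FORMATION_RECTANGLE",   # four-UAV rectangular array
    "backward": "TRANSLATE_BACKWARD",
    "line":     "FORMATION_LINE",
    "triangle": "FORMATION_TRIANGLE",
}

def make_command(stroke_label):
    """Return the command string to publish on the cluster control topic."""
    try:
        return STROKE_TO_BEHAVIOR[stroke_label]
    except KeyError:
        raise ValueError("unrecognized stroke: %s" % stroke_label)

# In a live ROS system this string would be wrapped in a std_msgs/String
# message and published on a topic, e.g. (assumed topic name):
#   pub = rospy.Publisher("/swarm/cmd", String, queue_size=1)
#   pub.publish(String(data=make_command(label)))
```

Topic-based publishing decouples the ground-station interface from the cluster: each vehicle subscribes to the same command topic and reacts to the received behavior string.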
Finally, a performance comparison of the method is provided:
Twenty volunteers were randomly invited to test the recognition accuracy and performance of the algorithm. The DTW algorithm was selected for comparison, with recognition time and error rate as the evaluation indexes.
TABLE 1
As the results in the table show, both algorithms consume little time and offer high real-time performance, but Unistroke has a shorter average recognition time, higher recognition accuracy, and a lower error rate. For easily confused strokes, such as the forward and backward strokes, Unistroke outperforms DTW and shows better robustness.
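For context, the DTW baseline used in the comparison computes an elastic alignment cost between two point sequences; a minimal sketch (not the implementation used in the experiment) is:

```python
# Minimal dynamic-time-warping distance between two 2-D point sequences.
# Illustrative baseline sketch only; the patent does not give its DTW details.
import math

def dtw_distance(a, b):
    """DTW cost between sequences of (x, y) points, with Euclidean local cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    # d[i][j]: minimal cumulative cost aligning a[:i] with b[:j]
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.hypot(a[i-1][0] - b[j-1][0], a[i-1][1] - b[j-1][1])
            d[i][j] = cost + min(d[i-1][j],      # insertion
                                 d[i][j-1],      # deletion
                                 d[i-1][j-1])    # match
    return d[n][m]
```

The quadratic dynamic-programming table explains why DTW tends to be slower per comparison than a fixed-length template match such as Unistroke's mean point-to-point distance.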
Fig. 9 is a schematic structural diagram of an unmanned aerial vehicle cluster control system based on stroke human-computer interaction, as shown in fig. 9, the unmanned aerial vehicle cluster control system based on stroke human-computer interaction provided by the present invention includes:
a stroke track obtaining module 901, configured to obtain a stroke track drawn by an operator; determining the pixel coordinate of each point in the stroke track according to the set frequency;
a control instruction determining module 902, configured to perform stroke recognition on the pixel coordinate of each point, and determine a control instruction;
and the control instruction control unmanned aerial vehicle cluster module 903 is used for controlling the unmanned aerial vehicle cluster according to the control instruction.
The stroke track obtaining module 901 specifically includes:
the image drawing area establishing unit is used for establishing an image drawing area of a human-computer interaction control interface based on PyQt5 and an ROS network;
the pixel coordinate determination unit is used for determining the pixel coordinate of each point in the stroke track in the image drawing area at a set frequency; the frequency was set at 100 Hz.
The control instruction determining module 902 specifically includes:
the new point set determining unit is used for resampling each point in the stroke track to determine a new point set;
a new point set rotating unit, configured to rotate each point in the new point set;
the processed stroke track determining unit is used for zooming each point in the rotated new point set to a set size and determining a processed stroke track;
the template acquisition unit in the system instruction library is used for acquiring the template stored in the system instruction library; each template corresponds to a control instruction;
the processed template determining unit is used for carrying out stroke recognition on the template and determining the processed template;
the score determining unit is used for determining the scores of the processed stroke track and the processed template according to the average distance between the coordinate of each point in the processed template and the coordinate of the corresponding point in the processed stroke track;
and the control instruction determining unit is used for determining a control instruction according to the processed stroke track and the score of the processed template.
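The resample, rotate, scale, and mean-distance scoring steps performed by the units above follow the well-known $1 unistroke recognizer template-matching scheme; a minimal sketch is given below. The resampling count, target square size, and function names are illustrative assumptions, not values from the patent:

```python
# Sketch of the recognition pipeline: resample to a fixed point count,
# rotate to a canonical angle, scale to a fixed square, then score by the
# mean point-to-point distance against a template (lower score = better match).
# Parameter values below are assumptions for illustration.
import math

N_POINTS = 64        # assumed resampling count
SQUARE_SIZE = 250.0  # assumed scaling target size (pixels)

def path_length(pts):
    return sum(math.dist(pts[i-1], pts[i]) for i in range(1, len(pts)))

def resample(pts, n=N_POINTS):
    """Resample the stroke track into n equidistant points."""
    interval = path_length(pts) / (n - 1)
    pts, out, acc = list(pts), [pts[0]], 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i-1], pts[i])
        if acc + d >= interval and d > 0:
            t = (interval - acc) / d
            q = (pts[i-1][0] + t * (pts[i][0] - pts[i-1][0]),
                 pts[i-1][1] + t * (pts[i][1] - pts[i-1][1]))
            out.append(q)
            pts.insert(i, q)   # q becomes the start point of the next segment
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:        # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def centroid(pts):
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

def rotate_to_zero(pts):
    """Rotate so the angle from the centroid to the first point is zero."""
    c = centroid(pts)
    theta = math.atan2(pts[0][1] - c[1], pts[0][0] - c[0])
    cos_t, sin_t = math.cos(-theta), math.sin(-theta)
    return [((p[0]-c[0]) * cos_t - (p[1]-c[1]) * sin_t + c[0],
             (p[0]-c[0]) * sin_t + (p[1]-c[1]) * cos_t + c[1]) for p in pts]

def scale_to_square(pts, size=SQUARE_SIZE):
    """Scale the bounding box to a fixed size and translate to the origin."""
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    w, h = (max(xs) - min(xs)) or 1.0, (max(ys) - min(ys)) or 1.0
    return [((p[0] - min(xs)) * size / w, (p[1] - min(ys)) * size / h)
            for p in pts]

def score(stroke, template):
    """Mean point-to-point distance after both tracks are preprocessed."""
    a = scale_to_square(rotate_to_zero(resample(stroke)))
    b = scale_to_square(rotate_to_zero(resample(template)))
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
```

The control instruction determining unit would then select the template with the lowest score over the instruction library.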
The control instruction control unmanned aerial vehicle cluster module 903 specifically includes:
the control instruction sending unit is used for sending the control instruction to the unmanned aerial vehicle cluster by utilizing the ROS network; the communication mode of the ROS network is to send an instruction to the unmanned aerial vehicle cluster by using a topic;
and the control instruction controls an unmanned aerial vehicle cluster unit, and is used for responding to the control instruction by the unmanned aerial vehicle cluster and making a behavior corresponding to the control instruction.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the system disclosed by the embodiment, the description is relatively simple because the system corresponds to the method disclosed by the embodiment, and the relevant points can be referred to the method part for description.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and the core concept of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, the specific embodiments and the application range may be changed. In view of the above, the present disclosure should not be construed as limiting the invention.
Claims (8)
1. An unmanned aerial vehicle cluster control method based on stroke human-computer interaction is characterized by comprising the following steps:
acquiring a stroke track drawn by an operator; determining the pixel coordinate of each point in the stroke track according to the set frequency;
performing stroke recognition on the pixel coordinate of each point, and determining a control instruction;
and controlling the unmanned aerial vehicle cluster according to the control instruction.
2. The stroke human-computer interaction based unmanned aerial vehicle cluster control method of claim 1, wherein the acquiring of the stroke track drawn by an operator and the determining of the pixel coordinate of each point in the stroke track according to the set frequency specifically comprise:
an image drawing area of a man-machine interaction control interface based on PyQt5 and ROS network is established;
determining the pixel coordinates of each point in the stroke track in the image drawing area at a set frequency; the frequency was set at 100 Hz.
3. The unmanned aerial vehicle cluster control method based on stroke human-computer interaction of claim 1, wherein the stroke recognition is performed on the pixel coordinates of each point, and the determination of the control command specifically comprises:
resampling each point in the stroke track to determine a new point set;
rotating each point in the new point set;
zooming each point in the rotated new point set to a set size, and determining a processed stroke track;
acquiring a template stored in a system instruction library; each template corresponds to a control instruction;
performing stroke recognition on the template, and determining the processed template;
determining scores of the processed stroke track and the processed template according to the average distance between the coordinate of each point in the processed template and the coordinate of the corresponding point in the processed stroke track;
and determining a control command according to the scores of the processed stroke track and the processed template.
4. The method for controlling the unmanned aerial vehicle cluster based on stroke human-computer interaction according to claim 1, wherein the controlling the unmanned aerial vehicle cluster according to the control instruction specifically comprises:
sending the control instruction to the unmanned aerial vehicle cluster by using the ROS network; the communication mode of the ROS network is to send an instruction to the unmanned aerial vehicle cluster by using a topic;
and the unmanned aerial vehicle cluster responds to the control instruction and makes a behavior corresponding to the control instruction.
5. An unmanned aerial vehicle cluster control system based on stroke human-computer interaction, characterized by comprising:
the stroke track acquisition module is used for acquiring a stroke track drawn by an operator; determining the pixel coordinate of each point in the stroke track according to the set frequency;
the control instruction determining module is used for carrying out stroke recognition on the pixel coordinate of each point and determining a control instruction;
and the control command control unmanned aerial vehicle cluster module is used for controlling the unmanned aerial vehicle cluster according to the control command.
6. The stroke human-computer interaction based unmanned aerial vehicle cluster control system of claim 5, wherein the stroke track acquisition module specifically comprises:
the image drawing area establishing unit is used for establishing an image drawing area of a human-computer interaction control interface based on PyQt5 and an ROS network;
the pixel coordinate determination unit is used for determining the pixel coordinate of each point in the stroke track in the image drawing area at a set frequency; the frequency was set at 100 Hz.
7. The stroke human-computer interaction based unmanned aerial vehicle cluster control system of claim 5, wherein the control instruction determination module specifically comprises:
the new point set determining unit is used for resampling each point in the stroke track to determine a new point set;
a new point set rotating unit, configured to rotate each point in the new point set;
the processed stroke track determining unit is used for zooming each point in the rotated new point set to a set size and determining a processed stroke track;
the template acquisition unit in the system instruction library is used for acquiring the template stored in the system instruction library; each template corresponds to a control instruction;
the processed template determining unit is used for carrying out stroke recognition on the template and determining the processed template;
the score determining unit is used for determining the scores of the processed stroke track and the processed template according to the average distance between the coordinate of each point in the processed template and the coordinate of the corresponding point in the processed stroke track;
and the control instruction determining unit is used for determining a control instruction according to the processed stroke track and the score of the processed template.
8. The stroke human-computer interaction based unmanned aerial vehicle cluster control system of claim 5, wherein the control instruction control unmanned aerial vehicle cluster module specifically comprises:
the control instruction sending unit is used for sending the control instruction to the unmanned aerial vehicle cluster by utilizing the ROS network; the communication mode of the ROS network is to send an instruction to the unmanned aerial vehicle cluster by using a topic;
and the control instruction controls an unmanned aerial vehicle cluster unit, and is used for responding to the control instruction by the unmanned aerial vehicle cluster and making a behavior corresponding to the control instruction.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110505859.8A CN113190045A (en) | 2021-05-10 | 2021-05-10 | Unmanned aerial vehicle cluster control method and system based on stroke human-computer interaction |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113190045A (en) | 2021-07-30
Family
ID=76988749
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110505859.8A Pending CN113190045A (en) | 2021-05-10 | 2021-05-10 | Unmanned aerial vehicle cluster control method and system based on stroke human-computer interaction |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113190045A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103455262A (en) * | 2012-05-30 | 2013-12-18 | 中兴通讯股份有限公司 | Pen-based interaction method and system based on mobile computing platform |
US20170351324A1 (en) * | 2010-11-22 | 2017-12-07 | Epson Norway Research And Development As | Camera-based multi-touch interaction apparatus, system and method |
CN108303994A (en) * | 2018-02-12 | 2018-07-20 | 华南理工大学 | Team control exchange method towards unmanned plane |
CN111273694A (en) * | 2020-02-28 | 2020-06-12 | 北京京东乾石科技有限公司 | Control method, system and device of unmanned aerial vehicle |
CN112506342A (en) * | 2020-12-04 | 2021-03-16 | 郑州中业科技股份有限公司 | Man-machine interaction method and system based on dynamic gesture recognition |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109800689B (en) | Target tracking method based on space-time feature fusion learning | |
CN106598226B (en) | A kind of unmanned plane man-machine interaction method based on binocular vision and deep learning | |
Dong et al. | Real-time avoidance strategy of dynamic obstacles via half model-free detection and tracking with 2d lidar for mobile robots | |
CN103353935B (en) | A kind of 3D dynamic gesture identification method for intelligent domestic system | |
CN111694428B (en) | Gesture and track remote control robot system based on Kinect | |
CN100442306C (en) | Unmanned machine vision image matching method based on ant colony intelligence | |
CN107765855A (en) | A kind of method and system based on gesture identification control machine people motion | |
CN107357428A (en) | Man-machine interaction method and device based on gesture identification, system | |
CN104407694A (en) | Man-machine interaction method and device combining human face and gesture control | |
Huang et al. | Deepfinger: A cascade convolutional neuron network approach to finger key point detection in egocentric vision with mobile camera | |
Konda et al. | Real time interaction with mobile robots using hand gestures | |
CN111124117B (en) | Augmented reality interaction method and device based on sketch of hand drawing | |
CN112507918B (en) | Gesture recognition method | |
CN114972818A (en) | Target locking system based on deep learning and mixed reality technology | |
Wang et al. | Immersive human–computer interactive virtual environment using large-scale display system | |
Li et al. | Improving autonomous exploration using reduced approximated generalized voronoi graphs | |
Huu et al. | Proposing recognition algorithms for hand gestures based on machine learning model | |
CN108873933A (en) | A kind of unmanned plane gestural control method | |
Bolin et al. | Gesture-based control of autonomous UAVs | |
Osimani et al. | Point Cloud Deep Learning Solution for Hand Gesture Recognition | |
CN113190045A (en) | Unmanned aerial vehicle cluster control method and system based on stroke human-computer interaction | |
CN113221729B (en) | Unmanned aerial vehicle cluster control method and system based on gesture human-computer interaction | |
CN112596659B (en) | Drawing method and device based on intelligent voice and image processing | |
CN112308041A (en) | Unmanned platform gesture control method based on vision | |
CN110766804B (en) | Method for cooperatively grabbing object by human and machine in VR scene |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20210730 |