CN114527869A - Autonomous transfer gesture guiding and identifying method and device - Google Patents
- Publication number
- CN114527869A (application CN202111651077.1A)
- Authority
- CN
- China
- Prior art keywords
- commander
- gesture
- identifying
- deep learning
- recognition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Abstract
The application belongs to the technical field of aircraft control, and particularly relates to an autonomous transfer gesture guidance and recognition method and device. The method comprises: step S1, identifying a commander based on a deep learning detection algorithm; step S2, judging whether the commander is within an effective range and, if so, binding the commander by continuously re-identifying the same commander; step S3, recognizing hand or limb actions of the commander based on a gesture recognition algorithm; step S4, resolving the hand or limb actions into a taxi guidance control instruction based on a given mapping relation; and step S5, controlling the aircraft to move based on the taxi guidance control instruction. The application reduces the need for tractors during transfer, avoids the interference caused by vehicles shuttling back and forth across the flight deck, and increases the number of aircraft movements an aircraft carrier can perform.
Description
Technical Field
The application belongs to the technical field of aircraft control, and particularly relates to an autonomous transfer gesture guidance and recognition method and device.
Background
Deck transfer of carrier-based aircraft is one of the phases of aircraft movement and recovery. Existing carrier-based aircraft transfer is basically realized by tractor towing or by remote control with special equipment; the transfer process is complex and the operation inconvenient, mainly due to the following defects: a. the deck is narrow and the environment complex, so collision can be avoided only by very precise operation, and the driver's obstructed line of sight during tractor towing easily leads to mishandling; b. multiple deck support personnel must be assigned to cooperate to keep the transfer process safe, so moving carrier aircraft in large batches demands considerable manpower; c. when several carrier aircraft are transferred cooperatively, the demand on tractors is high, and their back-and-forth movement occupies the deck tracks longer and reduces operating efficiency; d. commanders need special training to master the remote control equipment; e. manned and unmanned aircraft adopt different transfer command modes, which prolongs deck operation time.
Disclosure of Invention
In order to solve the above problems, the application provides an autonomous transfer gesture guidance and recognition method and device, offering a direct and natural interaction mode for carrier-based aircraft deck transfer, so that the carrier aircraft system becomes more autonomous, efficient and safe.
The first aspect of the application provides an autonomous transfer gesture guidance and recognition method, which mainly includes:
step S1, identifying a commander based on a deep learning detection algorithm;
step S2, judging whether the commander is within an effective range and, if so, binding the commander by continuously re-identifying the same commander;
step S3, recognizing hand or limb actions of the commander based on a gesture recognition algorithm;
step S4, resolving the hand or limb actions into a taxi guidance control instruction based on a given mapping relation;
and step S5, controlling the aircraft to move based on the taxi guidance control instruction.
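Steps S1 to S5 can be sketched as a single perception-to-control loop. Everything below — the function names, the `TaxiCommand` fields, and the control values — is an illustrative assumption for exposition, not part of the disclosed method:

```python
from dataclasses import dataclass

@dataclass
class TaxiCommand:
    steer_deg: float   # nose-wheel steering angle (hypothetical unit choice)
    throttle: float    # 0..1
    brake: float       # 0..1

def guidance_loop(frames, detect_commander, in_effective_range,
                  recognize_gesture, gesture_to_command, apply_command):
    """One pass of the S1-S5 pipeline over a stream of camera frames.

    All five callables are placeholders for the detection, ranging,
    recognition, mapping and control stages named in the method.
    """
    bound_commander = None
    for frame in frames:
        # S1: detect a commander (skip detection once one is bound)
        commander = bound_commander or detect_commander(frame)
        # S2: range check; unbind if the commander leaves the effective range
        if commander is None or not in_effective_range(commander):
            bound_commander = None
            continue
        bound_commander = commander
        # S3: recognize the bound commander's hand or limb action
        gesture = recognize_gesture(frame, commander)
        if gesture is None:
            continue
        # S4: map the gesture to a taxi guidance control instruction
        command = gesture_to_command(gesture)
        # S5: actuate the aircraft
        apply_command(command)
```

A trivial usage with stub callables shows the control flow: every frame in which a bound, in-range commander shows a recognizable gesture produces exactly one applied command.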
Preferably, in step S1, the deep learning detection algorithm is trained on the commander's clothing features, and the commander is identified from those features.
Preferably, step S2 further includes screening out commanders within the effective range through a binocular ranging algorithm.
Preferably, in step S3, recognizing the hand or limb actions includes:
in daytime, identifying the time-sequence features of the commander's limbs and gestures based on a deep learning network; and at night, identifying the relative position and motion-trajectory time-sequence features of the light baton held by the commander based on the deep learning network.
Preferably, step S5 further includes:
controlling an onboard response command lamp based on the taxi guidance control instruction, so as to feed the execution state back to the commander.
The second aspect of the application provides an autonomous transfer gesture guidance and recognition device, which mainly includes:
a commander identification module, used for identifying a commander based on a deep learning detection algorithm;
a binding module, used for judging whether the commander is within an effective range and, if so, binding the commander by continuously re-identifying the same commander;
a gesture recognition module, used for recognizing hand or limb actions of the commander based on a gesture recognition algorithm;
an instruction resolving module, used for resolving the hand or limb actions into a taxi guidance control instruction based on a given mapping relation;
and an aircraft control module, used for controlling the aircraft to move based on the taxi guidance control instruction.
Preferably, in the commander identification module, the deep learning detection algorithm is trained on the commander's clothing features, and the commander is identified from those features.
Preferably, the binding module further comprises a binocular recognition unit for screening out commanders within the effective range through a binocular ranging algorithm.
Preferably, the gesture recognition module includes:
a personnel limb-and-gesture recognition unit, used in daytime for identifying the time-sequence features of the commander's limbs and gestures based on a deep learning network; and a luminous-object trajectory recognition unit, used at night for identifying the relative position and motion-trajectory time-sequence features of the light baton held by the commander based on the deep learning network.
Preferably, the aircraft control module further comprises a feedback unit for controlling an onboard response command lamp based on the taxi guidance control instruction, so as to feed the execution state back to the commander.
The application has the following advantages: 1) It deepens research into carrier-based aircraft deck autonomy and integrates the operation stages of deck allocation, transfer, movement and recovery. Through intelligent technology it gives the carrier aircraft autonomous deck capability, reduces reliance on tractors during transfer, and ensures that, whether unmanned or under human supervision, the aircraft can independently and intelligently plan its own transfer path according to deck conditions and reach the target point quickly and safely while avoiding collisions and other accidents. 2) The deck environment can be sensed and identified autonomously, helping operators maintain full awareness of deck conditions and reducing the workload and fatigue of deck support personnel; transfer tasks can be completed under supervision by only a few people, lightening controllers' workload, cutting manpower demand, and shortening the cycle and cost of training controllers. 3) As a general-purpose technology, autonomous deck transfer can be applied to both unmanned and manned models. It autonomously plans the transfer path, shortens the moving distance, and reduces runway occupation time; at the same time it reduces the carrier aircraft's dependence on transfer vehicles, cuts the types and quantity of deck equipment, avoids the interference caused by vehicles shuttling back and forth across the flight deck, and increases the number of aircraft movements an aircraft carrier can perform.
Drawings
Fig. 1 is a flowchart of an autonomous transfer gesture guided recognition method according to an embodiment of the present disclosure.
Detailed Description
In order to make the implementation objects, technical solutions and advantages of the present application clearer, the technical solutions in the embodiments of the present application will be described in more detail below with reference to the accompanying drawings in the embodiments of the present application. In the drawings, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The described embodiments are some, but not all embodiments of the present application. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application, and should not be construed as limiting the present application. All other embodiments obtained by a person of ordinary skill in the art without any inventive work based on the embodiments in the present application are within the scope of protection of the present application. Embodiments of the present application will be described in detail below with reference to the drawings.
The first aspect of the application provides an autonomous transfer gesture guidance and recognition method, which mainly includes:
step S1, identifying a commander based on a deep learning detection algorithm;
step S2, judging whether the commander is within an effective range and, if so, binding the commander by continuously re-identifying the same commander;
step S3, recognizing hand or limb actions of the commander based on a gesture recognition algorithm;
step S4, resolving the hand or limb actions into a taxi guidance control instruction based on a given mapping relation;
and step S5, controlling the aircraft to move based on the taxi guidance control instruction.
In some alternative embodiments, in step S1, the deep learning detection algorithm is trained on the commander's clothing features, and the commander is identified from those features. In this embodiment, the existing onboard sensors are used effectively, a vision sensor is additionally installed, and after information fusion the system acquires the ship's position, the deck environment, and information about the commander; the commander wears a uniform of a specific colour and carries a light-emitting baton to increase distinguishability from the surrounding environment and support personnel.
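One possible approximation of the clothing-feature screening is to gate a person detector's output by the fraction of pixels that match the uniform colour. The hue band, saturation floor and threshold below are invented placeholders, not values from the patent:

```python
import numpy as np

def looks_like_commander(patch_hsv: np.ndarray,
                         h_range=(100, 130),   # hypothetical "uniform blue" hue band
                         min_fraction=0.3) -> bool:
    """Return True if enough pixels of a detected-person crop fall inside the
    assumed uniform colour band. patch_hsv is an (H, W, 3) HSV image whose
    channels are hue, saturation, value."""
    hue = patch_hsv[..., 0]
    sat = patch_hsv[..., 1]
    # Require both the right hue and enough saturation to exclude grey deck pixels
    mask = (hue >= h_range[0]) & (hue <= h_range[1]) & (sat > 80)
    return float(mask.mean()) >= min_fraction
```

In a full system this check would sit downstream of a trained detector; here it only illustrates why a distinctive uniform colour makes the commander separable from deck crew.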
In some optional embodiments, step S2 further includes screening out the commanders within the effective range through a binocular ranging algorithm and continuously re-identifying the same commander, so as to ensure that a commander within the effective range is tracked.
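The binocular ranging step rests on the standard pinhole stereo relation Z = f·B/d (focal length times baseline over disparity). The 25 m effective range below is an assumed figure; the patent does not state one:

```python
def stereo_distance(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic two-camera range estimate: Z = f * B / d.
    f_px: focal length in pixels; baseline_m: separation of the two cameras;
    disparity_px: horizontal pixel offset of the commander between the views."""
    if disparity_px <= 0:
        return float("inf")  # no measurable disparity -> effectively out of range
    return f_px * baseline_m / disparity_px

def within_effective_range(distance_m: float, max_range_m: float = 25.0) -> bool:
    # max_range_m is an illustrative assumption, not a value from the patent.
    return distance_m <= max_range_m
```

Candidates whose estimated distance exceeds the effective range are screened out before binding, which matches step 3 of the detailed flow (return to detection when the commander is too far away).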
In some alternative embodiments, in step S3, recognizing the hand or limb actions comprises:
in daytime, identifying the time-sequence features of the commander's limbs and gestures based on a deep learning network; and at night, identifying the relative position and motion-trajectory time-sequence features of the light baton held by the commander based on the deep learning network.
In this embodiment, a gesture recognition algorithm based on deep learning is adopted to enhance the accuracy and applicability of gesture recognition. In an alternative embodiment, at night the commander directs the transfer with a light-emitting baton, and the carrier aircraft can also generate taxi instructions by identifying features of the baton such as its geometric shape, spatial position and colour. It should be noted that during the transfer process, control can be handed over between different commanders through a dedicated control-handover gesture.
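The patent specifies an LSTM over limb and baton time-sequence features. As a much simpler stand-in that still shows why temporal aggregation matters, per-frame gesture labels can be smoothed by majority vote over a sliding window; this is a crude filter of my own, not the disclosed network:

```python
from collections import Counter, deque

def smooth_gestures(frame_labels, window=5, min_agreement=0.6):
    """Yield a gesture label only when it dominates the last `window` frames.

    The disclosed method instead learns the time sequence with an LSTM, but
    both approaches suppress single-frame misdetections: a gesture is acted
    on only after it persists across several frames.
    """
    recent = deque(maxlen=window)
    for label in frame_labels:
        recent.append(label)
        top, count = Counter(recent).most_common(1)[0]
        # Emit None until one label reaches the agreement threshold
        yield top if count / window >= min_agreement else None
```

With a window of 5 and 60% agreement, a lone mislabeled frame inside a run of "stop" gestures never reaches the command mapper.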
In step S5, the carrier aircraft combines the surrounding-environment sensing information to resolve the guidance instruction into control commands such as nose-wheel steering, braking and throttle, and controls the aircraft to complete deck taxiing smoothly. In this embodiment, the carrier aircraft can fuse ambient perception information with the commander's instructions, autonomously optimize its taxi path and speed, and improve transfer efficiency and safety. In some alternative embodiments, step S5 further includes: controlling an onboard response command lamp based on the taxi guidance control instruction, so as to feed the execution state back to the commander or nearby ground crew.
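Step S4's "given mapping relation" and the breakdown into nose-wheel steering, brake and throttle could be tabulated along these lines. The gesture names and every numeric value are illustrative inventions; the patent does not publish its mapping table:

```python
# Hypothetical mapping from recognized marshalling gestures to taxi commands,
# expressed as (steer_deg, throttle, brake) triples. None of these numbers
# come from the patent -- they only illustrate the mapping relation of S4.
GESTURE_MAP = {
    "move_ahead": (0.0, 0.3, 0.0),
    "turn_left":  (-20.0, 0.2, 0.0),
    "turn_right": (20.0, 0.2, 0.0),
    "slow_down":  (0.0, 0.1, 0.2),
    "stop":       (0.0, 0.0, 1.0),
}

def resolve_command(gesture: str):
    """Steps S4/S5: look up the taxi command for a recognized gesture.
    Unknown gestures brake to a stop -- a conservative default chosen here,
    not a behaviour specified by the patent."""
    return GESTURE_MAP.get(gesture, (0.0, 0.0, 1.0))
```

Defaulting unrecognized gestures to full brake is one reasonable fail-safe choice for a deck environment where stopping is almost always safer than continuing.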
Fig. 1 shows a flowchart of a specific embodiment of the application. Referring to Fig. 1, the gesture-recognition transfer process of this embodiment comprises:
1) the carrier aircraft enters deck taxi mode and transmits back video images in real time; 2) in daytime the commander wears a uniform of specific clothing, while at night the commander holds a light baton, and an algorithm extracts and identifies the clothing or light-baton features; 3) two cameras mounted on the carrier aircraft mimic human eyes to acquire images of the deck, the distance from the commander on the deck to the cameras (the aircraft) is calculated with a binocular vision model, and if the distance is too great the process returns to step 2; 4) a deep learning network extracts human-posture features and segments the limb and gesture information of the human body from the whole image; time-domain information is then introduced through a long short-term memory network (LSTM) to extract sequence features such as limbs and gestures by day and the relative position and motion trajectory of the light baton by night, thereby identifying limb actions and dynamic gestures; 5) once a command instruction is identified, a transfer trajectory is computed by the path planning module, combining positioning and environment information acquired by the sensing module, and the taxi control module guides the carrier aircraft to track the trajectory and complete the transfer; 6) in an emergency, such as personnel, vehicles or other obstacles appearing on the transfer path, the carrier aircraft brakes automatically to avoid collision.
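Item 6 of the flow above (automatic braking when an obstacle appears on the transfer path) reduces to a clearance check between the planned path and detected obstacles. The 3 m safety radius is an assumed value for illustration:

```python
import math

def must_brake(path_points, obstacles, safety_radius_m=3.0):
    """Return True if any detected obstacle lies within safety_radius_m of
    any upcoming path point. Both inputs are iterables of (x, y) positions
    in metres in a common deck frame; the radius is an illustrative choice,
    not a figure from the patent."""
    return any(
        math.hypot(px - ox, py - oy) < safety_radius_m
        for (px, py) in path_points
        for (ox, oy) in obstacles
    )
```

A real implementation would check the swept volume of the aircraft rather than point-to-point distance, but the structure of the decision — compare perceived obstacles against the planned trajectory, brake on conflict — is the same.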
The second aspect of the application provides an autonomous transfer gesture guidance and recognition device corresponding to the foregoing method, which mainly includes:
a commander identification module, used for identifying a commander based on a deep learning detection algorithm;
a binding module, used for judging whether the commander is within an effective range and, if so, binding the commander by continuously re-identifying the same commander;
a gesture recognition module, used for recognizing hand or limb actions of the commander based on a gesture recognition algorithm;
an instruction resolving module, used for resolving the hand or limb actions into a taxi guidance control instruction based on a given mapping relation;
and an aircraft control module, used for controlling the aircraft to move based on the taxi guidance control instruction.
In some optional embodiments, in the commander identification module, the deep learning detection algorithm is trained on the commander's clothing features, and the commander is identified from those features.
In some optional embodiments, the binding module further comprises a binocular recognition unit for screening out commanders within the effective range through a binocular ranging algorithm.
In some optional embodiments, the gesture recognition module comprises:
a personnel limb-and-gesture recognition unit, used in daytime for identifying the time-sequence features of the commander's limbs and gestures based on a deep learning network; and a luminous-object trajectory recognition unit, used at night for identifying the relative position and motion-trajectory time-sequence features of the light baton held by the commander based on the deep learning network.
In some optional embodiments, the aircraft control module further comprises a feedback unit for controlling an onboard response command lamp based on the taxi guidance control instruction, so as to feed the execution state back to the commander.
Although the present application has been described in detail with respect to the general description and specific embodiments, it will be apparent to those skilled in the art that certain modifications or improvements may be made based on the present application. Accordingly, such modifications and improvements are intended to be within the scope of this invention as claimed.
Claims (10)
1. An autonomous transfer gesture guidance and recognition method, characterized by comprising the following steps:
step S1, identifying a commander based on a deep learning detection algorithm;
step S2, judging whether the commander is within an effective range and, if so, binding the commander by continuously re-identifying the same commander;
step S3, recognizing hand or limb actions of the commander based on a gesture recognition algorithm;
step S4, resolving the hand or limb actions into a taxi guidance control instruction based on a given mapping relation;
and step S5, controlling the aircraft to move based on the taxi guidance control instruction.
2. The autonomous transfer gesture guidance and recognition method according to claim 1, wherein in step S1 the deep learning detection algorithm is trained on the commander's clothing features, and the commander is identified from those features.
3. The autonomous transfer gesture guidance and recognition method according to claim 1, wherein step S2 further comprises screening out commanders within the effective range through a binocular ranging algorithm.
4. The autonomous transfer gesture guidance and recognition method according to claim 1, wherein in step S3 recognizing the hand or limb actions comprises:
in daytime, identifying the time-sequence features of the commander's limbs and gestures based on a deep learning network; and at night, identifying the relative position and motion-trajectory time-sequence features of the light baton held by the commander based on the deep learning network.
5. The autonomous transfer gesture guidance and recognition method according to claim 1, wherein step S5 further comprises:
controlling an onboard response command lamp based on the taxi guidance control instruction, so as to feed the execution state back to the commander.
6. An autonomous transfer gesture guidance and recognition device, characterized by comprising:
a commander identification module, used for identifying a commander based on a deep learning detection algorithm;
a binding module, used for judging whether the commander is within an effective range and, if so, binding the commander by continuously re-identifying the same commander;
a gesture recognition module, used for recognizing hand or limb actions of the commander based on a gesture recognition algorithm;
an instruction resolving module, used for resolving the hand or limb actions into a taxi guidance control instruction based on a given mapping relation;
and an aircraft control module, used for controlling the aircraft to move based on the taxi guidance control instruction.
7. The autonomous transfer gesture guidance and recognition device according to claim 6, wherein in the commander identification module the deep learning detection algorithm is trained on the commander's clothing features and the commander is identified from those features.
8. The autonomous transfer gesture guidance and recognition device according to claim 6, wherein the binding module further comprises a binocular recognition unit for screening out commanders within the effective range through a binocular ranging algorithm.
9. The autonomous transfer gesture guidance and recognition device according to claim 6, wherein the gesture recognition module comprises:
a personnel limb-and-gesture recognition unit, used in daytime for identifying the time-sequence features of the commander's limbs and gestures based on a deep learning network; and a luminous-object trajectory recognition unit, used at night for identifying the relative position and motion-trajectory time-sequence features of the light baton held by the commander based on the deep learning network.
10. The autonomous transfer gesture guidance and recognition device according to claim 6, wherein the aircraft control module further comprises a feedback unit for controlling an onboard response command lamp based on the taxi guidance control instruction, so as to feed the execution state back to the commander.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111651077.1A CN114527869A (en) | 2021-12-30 | 2021-12-30 | Autonomous transfer gesture guiding and identifying method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114527869A | 2022-05-24 |
Family
ID=81620411
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111651077.1A Pending CN114527869A (en) | 2021-12-30 | 2021-12-30 | Autonomous transfer gesture guiding and identifying method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114527869A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR200311459Y1 (en) * | 2003-01-20 | 2003-04-26 | 권혁진 | lighting-glove for hand signal |
CN103777633A (en) * | 2012-10-18 | 2014-05-07 | 霍尼韦尔国际公司 | High-integrity surface guidance system for aircraft electric taxi |
CN104700088A (en) * | 2015-03-23 | 2015-06-10 | 南京航空航天大学 | Gesture track recognition method based on monocular vision motion shooting |
CN108877754A (en) * | 2018-05-14 | 2018-11-23 | 码路(海南)人工智能有限公司 | System and implementation method are played in artificial intelligence music's letter |
CN109343565A (en) * | 2018-10-29 | 2019-02-15 | 中国航空无线电电子研究所 | A kind of UAV Intelligent ground control control method based on gesture perception identification |
CN211108041U (en) * | 2019-11-07 | 2020-07-28 | 吉林大学 | Automatic park quick-witted auxiliary system |
CN113495570A (en) * | 2021-07-26 | 2021-10-12 | 成都飞机工业(集团)有限责任公司 | Ship surface autonomous guiding control system and method for fixed-wing unmanned aerial vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||