WO2018137608A1 - Method of controlling flight device, device, flight device, and system - Google Patents


Info

Publication number
WO2018137608A1
WO2018137608A1 (PCT/CN2018/073783)
Authority
WO
WIPO (PCT)
Prior art keywords
user
gesture
image
aircraft
flight
Prior art date
Application number
PCT/CN2018/073783
Other languages
French (fr)
Chinese (zh)
Inventor
王洁梅
黄盈
周大军
朱传聪
孙涛
康跃腾
张晓明
张力
Original Assignee
腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201710060176.XA external-priority patent/CN106774945A/en
Priority claimed from CN201710060380.1A external-priority patent/CN106843489B/en
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Publication of WO2018137608A1 publication Critical patent/WO2018137608A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • This application relates to the field of aircraft technology.
  • Aircraft such as drones are widely used in surveillance, security, aerial photography, and other fields.
  • Flight control of an aircraft is generally operated by a user; currently, a mainstream flight control method is for the user to control the flight of the aircraft through a remote controller paired with the aircraft.
  • Embodiments of the present application provide an aircraft flight control method, apparatus, aircraft, and system, which can more conveniently implement flight control of an aircraft.
  • the embodiment of the present application provides the following technical solutions:
  • An aircraft flight control method, applied to an aircraft, the method comprising: acquiring a user image; identifying a user gesture in the user image; determining, according to a predefined correspondence between user gestures and flight instructions, a flight instruction corresponding to the user gesture; and controlling the aircraft to fly according to the flight instruction.
  • an embodiment of the present application further provides an aircraft flight control device, which is applied to an aircraft, where the aircraft flight control device includes:
  • an image acquisition module configured to acquire a user image;
  • a gesture recognition module configured to identify a user gesture in the user image;
  • a flight instruction determining module configured to determine a flight instruction corresponding to the user gesture according to a predefined correspondence between user gestures and flight instructions; and
  • a flight control module configured to control the flight of the aircraft according to the flight instruction.
  • an embodiment of the present application further provides an aircraft, including: an image acquisition device and a processing chip; and the processing chip includes the aircraft flight control device.
  • an embodiment of the present application further provides an aircraft flight control system, including: a ground image acquisition device and an aircraft;
  • the ground image acquisition device is configured to collect a user image and transmit it to the aircraft;
  • the aircraft includes a processing chip; the processing chip is configured to acquire the user image transmitted by the ground image acquisition device; identify a user gesture in the user image; determine, according to a predefined correspondence between user gestures and flight instructions, a flight instruction corresponding to the user gesture; and control the flight of the aircraft according to the flight instruction.
  • an embodiment of the present application further provides an aircraft flight control system, including: a ground image acquisition device, a ground processing chip, and an aircraft;
  • the ground image acquisition device is configured to collect a user image and transmit the image to a ground processing chip
  • the ground processing chip is configured to acquire the user image transmitted by the ground image acquisition device; identify a user gesture in the user image; determine, according to a predefined correspondence between user gestures and flight instructions, a flight instruction corresponding to the user gesture; and transmit the flight instruction to the aircraft;
  • the aircraft includes a processing chip; the processing chip is configured to acquire the flight instruction, and control aircraft flight according to the flight instruction.
  • embodiments of the present application provide a computer readable storage medium comprising instructions that, when executed on a computer, perform the method described above.
  • embodiments of the present application provide a computer program product comprising instructions that, when executed on a computer, perform the method described above.
  • In the embodiments of the present application, the aircraft may acquire a user image, identify a user gesture in the user image, determine, according to a predefined correspondence between user gestures and flight instructions, the flight instruction corresponding to the identified gesture, and control the flight of the aircraft according to that flight instruction, thereby implementing flight control of the aircraft.
  • The aircraft flight control method provided by the embodiments of the present application thus allows the user to control the flight of the aircraft by gesture alone, making the flight control operation of the aircraft extremely convenient.
  • FIG. 1 is a schematic diagram of flight control of an aircraft provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a user gesture control aircraft flight according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of another flight control of an aircraft provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of still another flight control of the aircraft provided by the embodiment of the present application.
  • FIG. 5 is a flowchart of an aircraft flight control method according to an embodiment of the present application.
  • FIG. 6 is another flowchart of an aircraft flight control method according to an embodiment of the present application.
  • FIG. 7 is still another flowchart of an aircraft flight control method according to an embodiment of the present application.
  • FIG. 8 is still another flowchart of an aircraft flight control method according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of a flight scenario of an aircraft according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of another flight scenario of an aircraft provided by an embodiment of the present application.
  • FIG. 11 is still another flowchart of an aircraft flight control method according to an embodiment of the present application.
  • FIG. 12 is still another schematic diagram of flight control of an aircraft provided by an embodiment of the present application.
  • FIG. 13 is another flowchart of a flight control method of an aircraft according to an embodiment of the present application.
  • FIG. 14 is a flowchart of a method for determining the adjusted horizontal movement distance of an aircraft.
  • FIG. 15 is a schematic diagram of determining the adjusted horizontal movement distance of the aircraft.
  • FIG. 16 is a flowchart of a method for determining the adjusted vertical movement distance of an aircraft.
  • FIG. 17 is a schematic diagram of determining the adjusted vertical movement distance of the aircraft.
  • FIG. 18 is another flowchart of a flight control method of an aircraft according to an embodiment of the present application.
  • FIG. 19 is a diagram showing an example of flight path control of an aircraft.
  • FIG. 20 is another flowchart of a flight control method of an aircraft according to an embodiment of the present application.
  • FIG. 21 is a structural block diagram of an aircraft flight control apparatus according to an embodiment of the present application.
  • FIG. 22 is another structural block diagram of an aircraft flight control apparatus according to an embodiment of the present application.
  • FIG. 23 is another structural block diagram of an aircraft flight control apparatus according to an embodiment of the present application.
  • FIG. 24 is another structural block diagram of an aircraft flight control apparatus according to an embodiment of the present application.
  • The embodiments of the present application allow the flight of the aircraft to be controlled by user gestures: the aircraft acquires a user image, recognizes the user gesture in the image, and performs flight control with the flight instruction corresponding to that gesture, achieving convenient flight control of the aircraft.
  • The aircraft 1 may be provided with an image acquisition device 11 and a processing chip 12. The user may make gestures near the aircraft; the image acquisition device of the aircraft may collect user images in real time or at regular intervals and transmit them to the processing chip. A user image may include a user portrait and a background image.
  • The processing chip of the aircraft can identify a user gesture in the user image and, according to a predefined correspondence between user gestures and flight instructions, determine the flight instruction corresponding to the identified gesture, thereby performing flight control with the determined flight instruction.
  • Table 1 below shows a correspondence between an optional user gesture and a flight instruction
  • FIG. 2 shows a corresponding schematic diagram of controlling aircraft flight by user gestures. It should be understood that Table 1 and FIG. 2 are optional examples only; the correspondence between user gestures and flight instructions can be defined according to actual needs.
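The predefined gesture-to-instruction correspondence can be sketched as a simple lookup table. The gesture names and commands below are purely illustrative stand-ins for the entries of Table 1, whose specific contents are not reproduced here:

```python
# Hypothetical correspondence between user gestures and flight
# instructions, standing in for Table 1; entries are illustrative only.
GESTURE_TO_INSTRUCTION = {
    "palm_up": "ascend",
    "palm_down": "descend",
    "point_left": "move_left",
    "point_right": "move_right",
    "fist": "hover",
}

def lookup_flight_instruction(gesture):
    """Return the flight instruction for a recognized gesture,
    or None when the gesture has no predefined correspondence."""
    return GESTURE_TO_INSTRUCTION.get(gesture)
```

As the text notes, a gesture with no entry in the correspondence simply yields no instruction, and no flight control is performed.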
  • The flight control scheme shown in FIG. 1 requires the aircraft itself to capture the user image, so that its processing chip can recognize the user gesture and perform flight control according to the corresponding flight instruction; this requires the aircraft to fly near the user, where the user image can be captured, which prevents the aircraft from flying far from the user to perform missions such as aerial photography.
  • FIG. 3 shows another flight control schematic diagram under the flight control idea of the aircraft based on the user gesture.
  • A ground image acquisition device 2 disposed near the user can collect the user image and transmit it to the aircraft 1.
  • The processing chip 12 of the aircraft acquires the user image collected by the ground image acquisition device, identifies the user gesture in the user image, and determines, according to a predefined correspondence between user gestures and flight instructions, the flight instruction corresponding to the recognized gesture, thereby performing flight control according to the determined flight instruction.
  • That is, the embodiments of the present application can also collect user images through a ground image acquisition device, which can transmit the collected user image to the processing chip of the aircraft through a wireless communication technology such as General Packet Radio Service (GPRS) or the Micro Air Vehicle Link (MAVLink) protocol; the processing chip of the aircraft can then recognize the user gesture in the acquired user image and perform flight control according to the corresponding flight instruction.
  • Since the user image is transmitted between the ground image acquisition device and the aircraft, the aircraft can fly away from the user to perform aerial missions and the like. Further, as shown in FIG. 4, the image acquisition device 11 provided on the aircraft itself can collect task images, which can be transmitted to user equipment 3 such as the user's mobile phone for display; the user can then make different gestures based on the displayed task image, so that the aircraft's flight during the mission is controlled.
  • the aircraft flight control method provided by the embodiment of the present application is introduced below from the perspective of the aircraft, and the aircraft flight control method described below can refer to the above description.
  • FIG. 5 is a flowchart of a method for controlling flight of an aircraft according to an embodiment of the present disclosure.
  • the method may be applied to an aircraft, and may be specifically applied to a processing chip of an aircraft. Referring to FIG. 5, the method may include:
  • Step S100 Acquire a user image.
  • Optionally, the user image may be acquired by the aircraft's own image acquisition device; that is, the processing chip of the aircraft may acquire a user image collected by the aircraft's image acquisition device;
  • the user image may also be acquired by a ground image acquisition device, which may transmit the collected user image to the processing chip of the aircraft through wireless communication technology.
  • Step S110 Identify a user gesture in the user image.
  • Optionally, the embodiment of the present application may identify a user gesture from the user image according to a skin color detection algorithm. Specifically, a human skin region in the user image may be identified according to the skin color detection algorithm, a user gesture area extracted from the human skin region, and the contour feature of the user gesture area matched against the preset contour features of each standard user gesture; the standard user gesture with the highest matching degree is then taken as the user gesture identified from the user image, enabling recognition of the user gesture in the user image.
  • Optionally, the embodiment of the present application may also collect a large number of user images containing standard user gestures as image samples corresponding to each standard user gesture, and then train a detector for each standard user gesture from its image samples according to a machine training method such as a Support Vector Machine (SVM);
  • The detector of each standard user gesture can then be used to detect the user image acquired in step S100; according to the detection results of the detectors, the user gesture identified in the user image is determined, implementing recognition of the user gesture in the user image.
  • The above manners of recognizing the user gesture from the user image are only optional; the embodiment of the present application may also adopt other schemes for recognizing the user gesture from the user image.
  • Step S120 Determine, according to a predefined correspondence between each user gesture and a flight instruction, a flight instruction corresponding to the user gesture.
  • The predefined correspondence between user gestures and flight instructions may be as shown in Table 1. After the user gesture in the user image is recognized, the flight instruction corresponding to the identified gesture is determined according to this correspondence, and the aircraft is controlled to fly with the determined flight instruction.
  • Optionally, if the identified user gesture corresponds to a flight instruction in the predefined correspondence, that flight instruction is determined and subsequently used to control the flight of the aircraft; if the identified user gesture does not correspond to any flight instruction in the predefined correspondence, the process can end without flight control of the aircraft.
  • Step S130: Control the flight of the aircraft according to the flight instruction.
  • It can be seen that the aircraft may acquire a user image, identify a user gesture in the user image, determine, according to a predefined correspondence between user gestures and flight instructions, the flight instruction corresponding to the identified gesture, and control the flight of the aircraft according to that flight instruction, thereby achieving flight control of the aircraft.
  • The aircraft flight control method provided by the embodiment of the present application allows the user to control the flight of the aircraft by gesture, making the flight control operation of the aircraft extremely convenient.
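The four steps of FIG. 5 (S100 through S130) can be sketched as a single control iteration. All helper names below (`acquire_user_image`, `identify_gesture`, `send_command`) are hypothetical stand-ins for the aircraft's actual interfaces, which the text does not specify:

```python
def flight_control_step(acquire_user_image, identify_gesture,
                        gesture_to_instruction, send_command):
    """Run one iteration of the FIG. 5 flow: acquire an image (S100),
    identify the gesture (S110), look up its flight instruction (S120),
    and control flight (S130). Returns the instruction sent, or None
    when the gesture has no predefined flight instruction."""
    image = acquire_user_image()
    gesture = identify_gesture(image)
    instruction = gesture_to_instruction.get(gesture)
    if instruction is not None:
        send_command(instruction)
    return instruction
```

When the identified gesture has no corresponding instruction, no command is sent, matching the early-exit behavior the text describes.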
  • FIG. 6 is another flowchart of the aircraft flight control method provided by the embodiment of the present application; the method may be applied to an aircraft, and specifically to the processing chip of an aircraft. Referring to FIG. 6, the method may include:
  • Step S200 Acquire a user image.
  • Optionally, an image capturing device such as the camera of the aircraft can collect video frames in real time to obtain user images and transmit them to the processing chip of the aircraft;
  • the ground image acquisition device can likewise collect video frames in real time and transmit the collected user images to the processing chip of the aircraft through wireless communication technology.
  • Step S210 Identify a human skin area in the user image according to the skin color detection algorithm.
  • Optionally, the human skin area can be identified from the user image according to a Gaussian Mixture Model (GMM) of skin color.
  • Step S220 Removing a face region in the human skin region to obtain a user gesture region.
  • the embodiment of the present application may identify and remove the face region in the human skin region according to the face detection algorithm.
  • The obtained user gesture area may include only the user's hands (e.g., when the user wears tight clothing and only the face and hands are exposed), or may also include the user's arms (e.g., when the user wears a vest or short sleeves), legs (e.g., when the user wears shorts), and so on. However, after the face region is removed from the human skin region of the user image, the remaining human skin region can be considered to be mainly the skin area of the hands; therefore, the human skin region of the user image with the face region removed can be used directly as the user gesture area.
  • step S210 and step S220 illustrate an alternative manner of extracting a user gesture area from the user image by the skin color detection algorithm.
  • Step S230 Extract contour features of the user gesture area.
  • Step S240: Match the contour feature of the user gesture area with the preset contour features of each standard user gesture, and determine the standard user gesture with the highest matching degree, obtaining the user gesture identified from the user image.
  • Optionally, the embodiment of the present application may extract the contour feature of the user gesture area, match it against the preset contour features of each standard user gesture, and take the standard user gesture with the highest matching degree as the user gesture identified from the user image.
  • Steps S230 to S240 thus describe an optional way, after extracting the user gesture area from the user image, of identifying the corresponding user gesture by comparison with the contour features of the standard user gestures, obtaining the user gesture in the user image.
  • Steps S210 to S240 can be considered as an alternative implementation of step S110 shown in FIG. 5.
  • Step S250 Determine, according to a predefined correspondence between each user gesture and a flight instruction, a flight instruction corresponding to the user gesture.
  • Step S260: Control the flight of the aircraft according to the flight instruction.
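Steps S210 through S240 can be sketched as follows, under heavy simplifying assumptions: images are nested lists of (R, G, B) tuples, the face region is supplied as a bounding box, and the "contour feature" is reduced to a crude (aspect ratio, fill ratio) descriptor. A real system would use a trained skin GMM and proper contour descriptors; the per-pixel rule here is a classic RGB heuristic, not the GMM from the text:

```python
def is_skin(pixel):
    # Classic RGB skin heuristic (an assumption, not the GMM of S210).
    r, g, b = pixel
    return r > 95 and g > 40 and b > 20 and r > g and r > b and abs(r - g) > 15

def skin_mask(image):
    # S210: per-pixel skin classification over the whole image.
    return [[is_skin(p) for p in row] for row in image]

def remove_face(mask, face_box):
    # S220: zero out the face region; face_box = (top, left, bottom, right).
    top, left, bottom, right = face_box
    for y in range(top, bottom):
        for x in range(left, right):
            mask[y][x] = False
    return mask

def descriptor(mask):
    # S230: crude stand-in for a contour feature — bounding-box aspect
    # ratio and fill ratio of the remaining skin pixels.
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    if not pts:
        return (0.0, 0.0)
    xs = [x for x, _ in pts]; ys = [y for _, y in pts]
    w = max(xs) - min(xs) + 1; h = max(ys) - min(ys) + 1
    return (w / h, len(pts) / (w * h))

def match_gesture(mask, templates):
    # S240: pick the standard gesture whose descriptor is closest.
    d = descriptor(mask)
    return min(templates, key=lambda g: sum((a - b) ** 2 for a, b in zip(d, templates[g])))
```

`templates` maps each standard gesture name to its preset descriptor; the matching-degree comparison of the text becomes a nearest-descriptor search here.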
  • FIG. 6 illustrates that a user gesture area is identified from the user image according to a skin color detection algorithm, and then a standard user gesture corresponding to the user gesture area is matched with the contour feature to obtain a user gesture in the user image.
  • This method relies on the user's hand being bare; once the user wears gloves, the user gesture area in the user image cannot be identified by the skin color detection algorithm.
  • the embodiment of the present application can identify the connected area from the user image, match the contour feature of each connected area with the preset contour features of each standard user gesture, and identify the user gesture in the user image;
  • FIG. 7 is still another flowchart of the aircraft flight control method provided by the embodiment of the present application.
  • the method may be applied to an aircraft, and may be specifically applied to a processing chip of an aircraft.
  • the method may include:
  • Step S300 Acquire a user image.
  • step S300 may be referred to corresponding to step S200 shown in FIG. 6.
  • Step S310: Extract the connected areas in the user image.
  • Optionally, the embodiment of the present application may extract all connected areas in the user image, or may first remove the face region from the user image and then extract the connected areas in the remaining image.
  • Step S320 Extract contour features of each connected area.
  • Step S330: Match the contour features of each connected area with the preset contour features of each standard user gesture, determine the standard user gesture with the highest matching degree, and take that standard user gesture as the user gesture identified from the user image.
  • That is, the contour features of each connected area are matched against the contour features of each standard user gesture, the matching degrees are obtained, and the standard user gesture corresponding to the highest matching degree is taken as the user gesture identified from the user image.
  • Steps S310 to S330 show another optional implementation of identifying the user gesture in the user image in step S110 of FIG. 5. With steps S310 to S330, the user gesture can be recognized without using the skin color detection algorithm: the connected areas in the user image are extracted, their contour features are matched against the contour features of the standard user gestures, and the standard user gesture with the highest matching degree is selected as the user gesture identified from the user image.
  • Step S340 Determine, according to a predefined correspondence between each user gesture and a flight instruction, a flight instruction corresponding to the user gesture.
  • Step S350: Control the flight of the aircraft according to the flight instruction.
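The connected-area extraction of step S310 can be sketched as a 4-neighbour flood fill over a binary mask. The mask representation (nested lists of truthy cells) is an assumption for illustration; the extracted regions would then feed the contour matching of steps S320 to S330:

```python
from collections import deque

def connected_regions(mask):
    """Return the connected regions of a 2-D binary mask, each as a
    list of (x, y) pixel coordinates, using 4-connectivity (S310)."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                q = deque([(x, y)])
                seen[y][x] = True
                region = []
                while q:  # breadth-first flood fill of one region
                    cx, cy = q.popleft()
                    region.append((cx, cy))
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                        if 0 <= nx < w and 0 <= ny < h and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((nx, ny))
                regions.append(region)
    return regions
```

Because this operates on any binary segmentation rather than a skin mask, it works even when the user wears gloves, which is the motivation the text gives for this variant.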
  • Optionally, the embodiment of the present application may also pre-train a detector for each standard user gesture, detect the user image with each detector, and identify the user gesture based on the detection results of the detectors.
  • Optionally, the embodiment of the present application may pre-collect a plurality of user images containing standard user gestures as image samples corresponding to each standard user gesture, and then, for each standard user gesture's image samples, train that gesture's detector according to a machine training method such as SVM;
  • FIG. 8 is still another flowchart of the aircraft flight control method provided by the embodiment of the present application; the method can be applied to an aircraft, and specifically to the processing chip of the aircraft. Referring to FIG. 8, the method may include:
  • Step S400 Acquire a user image.
  • step S400 may be referred to corresponding to step S200 shown in FIG. 6.
  • Step S410: Detect the user image with the detector of each standard user gesture, obtaining each detector's detection result for the user image.
  • Step S420 Determine a user gesture recognized from the user image according to the detection result of the user image.
  • The detection result of a standard user gesture's detector on the user image is either that the user image contains the standard user gesture corresponding to that detector, or that it does not; by analyzing the detection results of all the standard user gestures' detectors, the embodiment of the present application can determine the user gesture detected in the user image, implementing recognition of the user gesture in the user image.
  • Steps S410 and S420 show another optional implementation of identifying the user gesture in the user image in step S110 of FIG. 5; through the pre-trained detectors of the standard user gestures, the user gesture in the user image can be detected and identified.
  • Step S430 Determine a flight instruction corresponding to the user gesture according to a predefined correspondence between each user gesture and a flight instruction.
  • Step S440: Control the flight of the aircraft according to the flight instruction.
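The per-gesture detectors of steps S410 and S420 can be sketched as follows. Each "detector" here is a nearest-centroid threshold test on a feature vector — a toy stand-in for the SVM detectors the text describes, with all feature representations assumed:

```python
def make_detector(samples, threshold):
    """Build a toy detector from positive feature-vector samples: it
    accepts a vector whose Euclidean distance to the sample centroid
    falls below threshold. (Stand-in for an SVM trained per gesture.)"""
    n = len(samples)
    centroid = [sum(v[i] for v in samples) / n for i in range(len(samples[0]))]

    def detect(vec):
        dist = sum((a - b) ** 2 for a, b in zip(vec, centroid)) ** 0.5
        return dist < threshold

    return detect

def recognize(vec, detectors):
    """S410/S420: run every standard gesture's detector on the image
    features and return the first gesture whose detector fires,
    or None when no detector recognizes the image."""
    for gesture, detect in detectors.items():
        if detect(vec):
            return gesture
    return None
```

A production system would resolve conflicts between multiple firing detectors (e.g., by margin or score) rather than taking the first, but the positive/negative per-detector outcome matches the detection-result analysis described above.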
  • After the aircraft flies according to the flight instruction corresponding to the identified user gesture, its image capturing device may no longer be able to collect the user image. As shown in FIG. 9, after the aircraft flies forward according to the identified gesture, if the user does not move forward in synchronization, the user will leave the image acquisition range of the aircraft's camera; at this time, the aircraft can no longer collect the user image, and flight control can no longer be performed by the user gesture in the user image.
  • Therefore, so that the image capturing device can still collect the user image when the user does not move synchronously with the aircraft, the aircraft can adjust the image acquisition angle of the image capturing device after flying according to the flight instruction;
  • that is, the processing chip of the aircraft can control the adjustment of the image acquisition angle of the image capturing device so that the user remains within its image capturing range;
  • Optionally, the embodiment of the present application can adjust the image acquisition angle of the image acquisition device according to the flight direction and flight distance of the aircraft; the specific relation between the angle adjustment and the flight direction and distance may be set according to the actual configuration of the image acquisition device;
  • Optionally, the image capturing device of the aircraft may have an angle adjusting mechanism, and the processing chip may control the angle adjusting mechanism to adjust the image capturing angle of the image capturing device;
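As a minimal sketch of relating the flight distance to the adjusted acquisition angle, assume flat ground, a user directly behind the flight path, and a camera pitch measured downward from the horizon; the real mapping would be calibrated per device, as the text notes:

```python
import math

def adjusted_pitch_deg(altitude_m, horizontal_offset_m):
    """Camera pitch below the horizon (degrees) needed to keep a ground
    user centered in view, given the aircraft's altitude and the
    horizontal distance it has flown away from the user.
    Simple flat-ground geometry — an illustrative assumption only."""
    return math.degrees(math.atan2(altitude_m, horizontal_offset_m))
```

For example, at 10 m altitude and 10 m of horizontal flight away from the user, the camera would tilt 45° below the horizon; directly overhead it would point straight down (90°).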
  • Optionally, the image capturing angle of the image capturing device of the aircraft may instead be kept unchanged, with the user moving so as to remain within the image capturing range; in this way the image acquisition device can still acquire the user image, and flight control can be performed based on the user gesture in the image.
  • This is suitable for situations where the image acquisition device of the aircraft is collecting task images such as aerial images, in which case the aircraft need not adjust the image collection angle of the image acquisition device after flying according to the flight instruction.
  • There may also be multiple user portraits in the user image acquired by the aircraft; as shown in FIG. 10, multiple users on the ground are making gestures at the same time, and the aircraft needs to determine which user's gesture to base flight control on. For this, the embodiment of the present application can designate a legitimate user to control the flight of the aircraft, and the facial features of the legitimate user can be preset in the aircraft.
  • After acquiring the user image (which may be collected by the image acquisition device of the aircraft or by the ground image acquisition device), the aircraft may identify the user portrait region of the user image that matches the facial features of the legitimate user, and identify the user gesture based on that user portrait region, thereby ensuring that the aircraft performs flight control only on the gestures of the legitimate user in the user image.
  • FIG. 11 is still another flowchart of the aircraft flight control method provided by the embodiment of the present application.
  • the method may be applied to an aircraft, and may be specifically applied to a processing chip of an aircraft.
  • the method may include:
  • Step S500 Acquire a user image.
  • step S500 may be referred to corresponding to step S200 shown in FIG. 6.
  • Step S510: Determine whether there is a face region in the user image matching the facial features of the legal user; if no, perform step S520, and if yes, perform step S530.
  • Optionally, the embodiment of the present application may identify the face regions in the user image according to a face detection algorithm, obtain at least one face region, and match the facial features of each face region against the preset facial features of the legal user, thereby determining whether a face region in the user image matches the facial features of the legitimate user.
  • Step S520 ending the process.
  • Optionally, after the process ends, step S510 may be performed again on the user image acquired in the next frame.
  • Step S530 Extract a user portrait corresponding to a face region of the user image that matches a facial feature of the legal user.
  • the extracted user portrait may be a portrait of a legitimate user in the user image (ie, a user corresponding to a face region in the user image that matches the facial feature of the legitimate user), and includes a body image of the legitimate user.
• Step S540: Identify a user gesture in the user portrait.
• For implementations of identifying a user gesture in the user portrait, refer to the corresponding description above.
• Optionally, as shown in FIG. 6, the embodiment of the present application may identify the user gesture in the user portrait according to a skin color detection algorithm: identify the human skin region in the user portrait, extract the user gesture area from the human skin region, match the contour feature of the user gesture area against the preset contour features of the standard user gestures, and take the standard user gesture with the highest matching degree as the user gesture recognized from the user portrait.
• The embodiment of the present application may also identify the user gesture by matching contour features of connected areas in the user portrait: extract the connected areas in the user portrait, match the contour feature of each connected area against the preset contour features of the standard user gestures, and take the standard user gesture with the highest matching degree as the user gesture recognized from the user portrait.
• As shown in FIG. 8, the embodiment of the present application may also identify the user gesture by using a detector for each standard user gesture: detect the user portrait with the detector of each standard user gesture, obtain each detector's detection result, and determine from these results the user gesture recognized from the user portrait.
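The skin color detection step of the first manner above can be sketched as follows. The YCrCb color space and the Cr/Cb thresholds below are common choices but are assumptions here, since the embodiment does not fix a particular skin color model.

```python
import numpy as np

# Commonly used Cr/Cb skin thresholds (an assumption; the embodiment does not
# specify which skin color model is used).
CR_RANGE = (133, 173)
CB_RANGE = (77, 127)

def rgb_to_ycrcb(rgb):
    """Convert an (..., 3) RGB array to YCrCb (ITU-R BT.601 weights)."""
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = (r - y) * 0.713 + 128.0
    cb = (b - y) * 0.564 + 128.0
    return np.stack([y, cr, cb], axis=-1)

def skin_mask(image_rgb):
    """Boolean mask of pixels falling inside the Cr/Cb skin range."""
    ycrcb = rgb_to_ycrcb(image_rgb)
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]
    return ((CR_RANGE[0] <= cr) & (cr <= CR_RANGE[1]) &
            (CB_RANGE[0] <= cb) & (cb <= CB_RANGE[1]))
```

The resulting mask would then be split into connected regions, from which the user gesture area is extracted for contour matching.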
• Step S550: Determine, according to a predefined correspondence between user gestures and flight instructions, the flight instruction corresponding to the identified user gesture.
• Step S560: Control the flight of the aircraft according to the flight instruction.
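Step S550 can be sketched as a table lookup; the gesture names and flight instructions below are hypothetical placeholders for the predefined correspondence.

```python
# Hypothetical correspondence table; the actual user gestures and flight
# instructions are defined by the application.
GESTURE_TO_INSTRUCTION = {
    "palm_up": "ascend",
    "palm_down": "descend",
    "fist": "hover",
    "wave_left": "turn_left",
}

def flight_instruction_for(gesture):
    """Step S550: look up the flight instruction for an identified gesture.
    Returns None when the gesture has no predefined correspondence."""
    return GESTURE_TO_INSTRUCTION.get(gesture)
```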
• The method shown in FIG. 11 uses a face detection algorithm to identify the user portrait of the legitimate user in the user image, then identifies the user gesture from that portrait so as to control the aircraft according to the corresponding flight instruction; this is only a preferred way of applying the embodiment to the flight control of an aircraft.
• Alternatively, the embodiment of the present application may restrict the ground image acquisition device so that only the legitimate user can operate it (for example, by setting a password on the ground image acquisition device); the ground image acquisition device then collects user images of the legitimate user alone to control the flight of the aircraft, and the aircraft can omit the step of identifying the legitimate user based on the face detection algorithm.
• The embodiment of the present application may also keep only the legitimate user at the flight site of the aircraft (for example, by dispersing personnel or choosing a sparsely populated place), so that the aircraft can recognize the user gesture directly from the collected user image and likewise omit the step of identifying the legitimate user.
• The present application may further provide a ground processing chip in communication with the ground image acquisition device. The ground processing chip identifies the user gesture in the user image and determines the flight instruction corresponding to the gesture, then transmits the flight instruction to the processing chip of the aircraft via wireless communication, and the processing chip of the aircraft controls the flight of the aircraft according to the flight instruction.
• The ground processing chip 4 may identify the user gesture in the user image in the manners shown in FIG. 6, FIG. 8, and FIG. 11.
• The ground processing chip 4 determines the flight instruction corresponding to the user gesture according to the predefined correspondence between user gestures and flight instructions, and transmits the flight instruction via wireless communication to the processing chip of the aircraft 1, which controls the flight of the aircraft according to the flight instruction.
• The aircraft flight control method provided by the embodiment of the present application can control the flight of the aircraft by user gestures, making the flight control operation of the aircraft extremely convenient.
• The user may also wave the hand while making an agreed first gesture (the agreed first gesture being one of the predefined user gestures described above), producing a gesture trajectory with the first gesture for the aircraft to follow.
  • FIG. 13 is a flowchart of a flight control method for an aircraft provided by an embodiment of the present application. The method is applicable to an aircraft, and is specifically applicable to a processing chip of an aircraft. Referring to FIG. 13, the method may include:
• Step S600: Acquire a user image.
• The user image may be collected by an image acquisition device carried by the aircraft; that is, the processing chip of the aircraft may acquire the user image collected by the aircraft's image acquisition device.
• The user image may also be collected by the ground image acquisition device, which transmits the collected user image to the processing chip of the aircraft via wireless communication.
• For ease of description, the following assumes that the user image is collected by the image acquisition device carried by the aircraft.
• Step S610: Identify a user gesture in the user image.
• For implementations of identifying a user gesture in the user image, refer to the corresponding description above.
• Optionally, as shown in FIG. 6, the embodiment of the present application may identify the user gesture in the user image according to a skin color detection algorithm: identify the human skin region in the user image, extract the user gesture area from the human skin region, match the contour feature of the user gesture area against the preset contour features of the standard user gestures, and take the standard user gesture with the highest matching degree as the user gesture recognized from the user image.
• The embodiment of the present application may also identify the user gesture by matching contour features of connected areas in the user image: extract the connected areas in the user image, match the contour feature of each connected area against the preset contour features of the standard user gestures, and take the standard user gesture with the highest matching degree as the user gesture recognized from the user image.
• As shown in FIG. 8, the embodiment of the present application may also identify the user gesture by using a detector for each standard user gesture: detect the user image with the detector of each standard user gesture, obtain each detector's detection result, and determine from these results the user gesture recognized from the user image.
• Step S620: If the identified user gesture is the agreed first gesture, determine the position of the first gesture in the user image.
• Optionally, the embodiment of the present application may detect the user image with a pre-trained detector of the first gesture to determine whether the first gesture exists in the user image, that is, whether the user gesture in the user image is the first gesture. When the detector recognizes that the first gesture exists in the user image, the position of the first gesture in the user image may be determined; optionally, the position of the center point of the area of the first gesture recognized by the detector may be taken as the position of the first gesture in the user image.
• The embodiment of the present application may also identify the human skin area in the user image according to a skin color detection algorithm and remove the human face area from it to obtain the user gesture area (since the exposed skin of the human body is generally the face and the hands, the human skin area with the face area removed can serve as the user gesture area); the contour feature of the user gesture area is then matched against the contour feature of the agreed first gesture, and the matching degree is used to determine whether the first gesture exists in the user image, that is, whether the user gesture in the user image is the first gesture.
• In this case, the embodiment of the present application may take the position of the user gesture area in the user image (optionally, the position of the center point of the user gesture area) as the position of the first gesture in the user image.
• The embodiment of the present application may also extract the connected areas in the user image (preferably, each connected area of the user image after the face area is removed), match the contour feature of each connected area against the contour feature of the agreed first gesture, and use the matching degree to determine whether the first gesture exists in the user image, that is, whether the user gesture in the user image is the first gesture.
• The matching-degree thresholds used in these manners (the first matching degree and the second matching degree referred to below) may be the same or different, and may be set according to the actual situation.
• That is, the embodiment of the present application may first determine whether a user gesture exists in the user image and whether that gesture is the first gesture (either by the detector of the first gesture, or by the matching degree between the contour feature of the user gesture area, or of a connected area, and the contour feature of the first gesture); after it is determined that a user gesture exists and is the first gesture, the position of the first gesture in the user image may be determined.
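Determining the position of the first gesture from a detected region, as described above, reduces to taking the center point of the region. A minimal sketch, with bounding boxes assumed to be `(x, y, w, h)` tuples:

```python
def gesture_position(bbox):
    """Center of the detected gesture's bounding box (x, y, w, h), used as the
    position of the first gesture in the user image."""
    x, y, w, h = bbox
    return (x + w / 2.0, y + h / 2.0)
```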
• Step S630: Adjust the flight attitude of the aircraft according to the position of the first gesture in the user image, so that the aircraft follows the gesture trajectory of the first gesture.
• Optionally, the embodiment of the present application may determine, according to that position, the horizontal movement distance by which the aircraft should move in the same horizontal motion direction as the gesture trajectory of the first gesture, and the vertical movement distance by which it should move in the same vertical motion direction as that trajectory; the flight attitude of the aircraft is then adjusted by the determined horizontal and vertical movement distances, so that the first gesture always remains within the image capture field of view of the image acquisition device and the aircraft follows the gesture trajectory of the first gesture.
• In this way, the aircraft adjusts its flight attitude in real time according to the gesture trajectory of the user's first gesture, so that the aircraft flies following that trajectory, enabling control of the flight path of the aircraft.
• Optionally, the processing chip of the aircraft may acquire the user image collected by the image acquisition device of the aircraft and identify the user gesture in the user image; if the identified user gesture is the agreed first gesture, the processing chip determines the position of the first gesture in the user image and adjusts the flight attitude of the aircraft according to that position, so that the aircraft follows the gesture trajectory of the first gesture, realizing control of the flight path of the aircraft.
• It can be seen that, in the embodiment of the present application, the user can operate the first gesture so that the aircraft adjusts its flight attitude according to the position of the first gesture in the collected user image and follows the gesture trajectory of the user's first gesture.
• The embodiment of the present application can thus control the flight path of the aircraft by the gesture trajectory of the user's first gesture, conveniently realizing flight path control of the aircraft.
• FIG. 14 is a flowchart of a method for determining the horizontal movement distance to be adjusted by the aircraft according to the position of the first gesture in the user image. The method is applicable to an aircraft, and specifically to a processing chip of the aircraft.
• Referring to FIG. 14, the method may include:
• Step S700: Construct a horizontal axis coordinate from the line of sight range of the image acquisition device of the aircraft in the horizontal axis direction, where the origin of the horizontal axis coordinate is the midpoint of the line of sight of the image acquisition device in the horizontal axis direction.
• As shown in FIG. 15, BC is the horizontal axis coordinate constructed from the line of sight of the camera in the horizontal axis direction, and each point on BC falls uniformly on the horizontal axis of the image captured by the camera; AM is the camera center line, and M, the midpoint of the camera's line of sight in the horizontal axis direction, is the origin of the horizontal axis coordinate, that is, the center of segment BC.
• Step S710: Determine the projection point of the position of the first gesture in the user image on the horizontal axis coordinate, and determine the coordinate of the projection point on the horizontal axis coordinate.
• Optionally, the embodiment of the present application may determine the projection point, in the horizontal direction, of the position of the first gesture in the user image; as shown in FIG. 15, this projection point is point P, and the coordinate of P on the horizontal axis BC is the coordinate of the projection point on the horizontal axis coordinate.
• Step S720: Determine the horizontal movement distance of the aircraft according to the length of the horizontal axis coordinate, the vertical height of the aircraft above the ground, the angle between the center line of the image acquisition device of the aircraft and the vertical direction, the half view angle of the image acquisition device in the horizontal axis direction, and the coordinate of the projection point on the horizontal axis coordinate.
• As shown in FIG. 15, OA is the vertical height of the aircraft (such as a drone) above the ground, ∠OAM is the angle between the center line of the camera and the vertical direction, and ∠BAM is the half view angle of the camera in the horizontal axis direction. For the projection point P of the first gesture in the horizontal direction to fall on the center point M of the image captured by the camera, the aircraft needs to move the horizontal distance MP; that is, the embodiment of the present application may adjust the flight attitude of the aircraft so that the first gesture is located at the center of the image capture field of view of the image acquisition device.
• Let ∠OAM be α, ∠BAM be β, the vertical height of the aircraft above the ground be H, the horizontal-axis coordinate of the projection point of the position of the first gesture in the user image be x, the length of the horizontal axis coordinate (the length of the camera's line of sight in the horizontal axis direction) be Lx, and the horizontal movement distance MP to be adjusted be Sx; the horizontal movement distance that the aircraft needs to adjust can then be determined according to the following formula:
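The formula referred to above appears as an image in the published document and does not survive in this text. Under the geometry just described, with AM = H/cos α and BM = AM · tan β, and with the stated assumption that the points of BC fall uniformly on the horizontal image axis, one reconstruction consistent with that geometry is Sx = 2·H·x·tan β / (Lx·cos α). The sketch below implements this reconstruction; it is an inference from the described geometry, not the published formula.

```python
import math

def horizontal_move_distance(x, H, alpha, beta, Lx):
    """Reconstructed horizontal movement distance Sx = MP.

    x     : horizontal-axis coordinate of the projection point (origin at M)
    H     : vertical height of the aircraft above the ground
    alpha : angle OAM between the camera center line and the vertical (radians)
    beta  : half view angle BAM in the horizontal axis direction (radians)
    Lx    : length of the horizontal axis coordinate (image width in pixels)
    """
    # AM = H / cos(alpha); BM = AM * tan(beta); x in [-Lx/2, Lx/2] maps
    # linearly onto the ground segment of half-length BM.
    return (2.0 * H * x * math.tan(beta)) / (Lx * math.cos(alpha))
```

At x = 0 (gesture already at the image center) the required movement is zero, and at x = ±Lx/2 it equals the half-span BM, as the geometry requires.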
• Optionally, the height of the aircraft can be obtained by an ultrasonic sensor or a barometer, and the angle data can be set to fixed values as needed.
• The processing chip of the aircraft can acquire each frame of the user image collected in real time, determine the horizontal movement distance of the aircraft in real time based on the position of the first gesture in each frame, and output a flight control instruction to the flight mechanism of the aircraft, so that the aircraft moves the determined horizontal movement distance in the same horizontal motion direction as the gesture trajectory of the first gesture and thereby follows that trajectory in the horizontal direction.
• FIG. 16 is a flowchart of a method for determining the vertical movement distance to be adjusted by the aircraft according to the position of the first gesture in the user image. The method is applicable to an aircraft, and specifically to a processing chip of the aircraft.
• Referring to FIG. 16, the method may include:
• Step S800: Construct a vertical axis coordinate from the line of sight range of the image acquisition device of the aircraft in the longitudinal axis direction, where the origin of the vertical axis coordinate is the midpoint of the line of sight of the image acquisition device in the longitudinal axis direction.
• As shown in FIG. 17, BC is the vertical axis coordinate constructed from the line of sight of the camera in the longitudinal axis direction; the dotted line AD is the center line of the camera, and D, the midpoint of the camera's line of sight in the longitudinal axis direction, is the origin of the vertical axis coordinate.
• Step S810: Determine the projection point of the position of the first gesture in the user image on the vertical axis coordinate, and determine the coordinate of the projection point on the vertical axis coordinate.
• Optionally, the embodiment of the present application may determine the projection point, in the vertical direction, of the position of the first gesture in the user image, that is, its projection point on the vertical axis coordinate; as shown in FIG. 17, this projection point is point P, and the coordinate of P on the vertical axis BC is the coordinate of the projection point on the vertical axis coordinate.
• Step S820: Determine the vertical movement distance of the aircraft according to the height of the vertical axis coordinate, the vertical height of the aircraft above the ground, the half view angle of the image acquisition device in the longitudinal axis direction, the angle difference between the inclination angle of the image acquisition device and the half view angle, and the coordinate of the projection point on the vertical axis coordinate.
• As shown in FIG. 17, AO is the vertical height of the aircraft above the ground, ∠OAD is the inclination angle of the camera, ∠CAD is the half view angle of the camera in the longitudinal axis direction, and ∠OAC is the angle difference between ∠OAD and ∠CAD.
• The height of the vertical axis coordinate can be determined according to the height of the image interface; for example, if the image is collected at 640×360 resolution, the height of the vertical axis coordinate can be 360, that is, the height of the vertical axis coordinate is determined by the vertical height of the interface.
• For the projection point P to fall on the center point D of the image captured by the camera, the aircraft needs to move the vertical distance PD.
• Let AO be H, ∠CAD be θ, ∠OAC be β, the height of the vertical axis coordinate be Ly, the vertical-axis coordinate of the projection point of the position of the first gesture in the user image be y, and the vertical movement distance to be adjusted be Sy; the vertical movement distance that the aircraft needs to adjust can then be determined according to the following formula:
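As with the horizontal case, the formula referred to above appears as an image in the published document. One reconstruction consistent with the described geometry, under the added assumption that the vertical image axis maps uniformly in angle across the half view angle θ, is Sy = H·(tan(β + θ + 2θ·y/Ly) − tan(β + θ)): the angle between the vertical AO and the ray through the projection point is β + θ + 2θ·y/Ly, while OD = H·tan(β + θ). Again, this is an inference from the geometry, not the published formula.

```python
import math

def vertical_move_distance(y, H, theta, beta, Ly):
    """Reconstructed vertical movement distance Sy = PD.

    y     : vertical-axis coordinate of the projection point (origin at D)
    H     : vertical height AO of the aircraft above the ground
    theta : half view angle CAD in the longitudinal axis direction (radians)
    beta  : angle OAC, the difference between the inclination OAD and CAD (radians)
    Ly    : height of the vertical axis coordinate (image height in pixels)
    """
    # Assumes the image's vertical axis maps uniformly in angle across 2*theta.
    angle_to_p = beta + theta + (2.0 * theta * y) / Ly
    od = H * math.tan(beta + theta)   # ground distance to the image center D
    op = H * math.tan(angle_to_p)     # ground distance to the projection point P
    return op - od
```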
• The processing chip of the aircraft can acquire each frame of the user image collected in real time, determine the vertical movement distance of the aircraft in real time based on the position of the first gesture in each frame, and output a flight control instruction to the flight mechanism of the aircraft, so that the aircraft moves the determined vertical movement distance in the same vertical motion direction as the gesture trajectory of the first gesture.
• Optionally, the horizontal movement distance and the vertical movement distance determined by the processing chip from each frame may be output together in one flight control instruction, so that the aircraft adjusts its flight attitude by moving the determined horizontal movement distance in the same horizontal motion direction as the gesture trajectory of the first gesture and the determined vertical movement distance in the same vertical motion direction; the aircraft can thus follow the gesture trajectory of the user's first gesture in real time, realizing control of the flight route of the aircraft.
• Further, the embodiment of the present application may use a second gesture of the user to notify the aircraft to start or cancel following the user's first gesture. That is, when the aircraft is not following the user's first gesture, if the user's second gesture is detected in the user image, the aircraft can start to follow the user's first gesture; after operating the second gesture, the user can switch to moving the first gesture along a gesture trajectory, so that the aircraft adjusts its flight attitude based on the position of the first gesture in each frame of the user image and follows the gesture trajectory of the first gesture. When the user wants the aircraft to cancel following the first gesture, the user can switch from the gesture trajectory of the first gesture back to the second gesture; after the aircraft detects the user's second gesture in the user image, it cancels following the user's first gesture.
• FIG. 18 is another flowchart of a method for controlling the flight path of an aircraft provided by an embodiment of the present application. The method is applicable to an aircraft, and specifically to a processing chip of the aircraft. Referring to FIG. 18, the method may include:
• Step S900: Acquire the user image collected by the image acquisition device in real time.
• Step S910: Identify a user gesture in the user image.
• Optionally, the embodiment of the present application may identify whether the user gesture in the user image is the agreed first gesture or the agreed second gesture, and perform different processing flows according to the recognition result; for the different processing flows corresponding to the different user gestures identified in the user image, refer to steps S920 to S940 below.
• Optionally, the embodiment of the present application may detect the user image with a pre-trained detector of the first gesture and a pre-trained detector of the second gesture, respectively, to determine whether the first gesture or the second gesture exists in the user image, or whether neither exists.
• The embodiment of the present application may also identify the human skin area in the user image by a skin color detection algorithm and take the human skin area with the face area removed as the user gesture area, then match the contour feature of the first gesture and the contour feature of the second gesture against the contour feature of the user gesture area to determine whether the first gesture or the second gesture exists in the user image, or whether neither exists. Optionally, if the matching degree between the contour feature of the user gesture area and the contour feature of the first gesture exceeds a predetermined first matching degree, it may be determined that the first gesture exists in the user image; otherwise, it is determined that the first gesture does not exist. Likewise, if the matching degree between the contour feature of the user gesture area and the contour feature of the second gesture exceeds the predetermined first matching degree, it may be determined that the second gesture exists in the user image; otherwise, it is determined that the second gesture does not exist.
• The embodiment of the present application may further extract the connected areas in the user image and match the contour feature of the first gesture and the contour feature of the second gesture against the contour feature of each connected area to determine whether the first gesture or the second gesture exists in the user image, or whether neither exists. Optionally, if there is a connected area whose matching degree with the contour feature of the first gesture is higher than a predetermined second matching degree, it may be determined that the user gesture represented by that connected area is the first gesture and that the first gesture exists in the user image; otherwise, it is determined that the first gesture does not exist. Likewise, if there is a connected area whose matching degree with the contour feature of the second gesture is higher than the predetermined second matching degree, it is determined that the user gesture represented by that connected area is the second gesture and that the second gesture exists in the user image; otherwise, it is determined that the second gesture does not exist.
• Optionally, the embodiment of the present application may first detect whether the first gesture exists in the user image and, if not, detect whether the second gesture exists; or first detect whether the second gesture exists and, if not, detect whether the first gesture exists; or detect the first gesture and the second gesture in the user image at the same time.
• Step S920: If the identified user gesture is the agreed second gesture and the aircraft has not currently entered the first mode, trigger the aircraft to enter the first mode, where the first mode indicates that the aircraft follows the gesture trajectory of the user's first gesture.
• Step S930: If the identified user gesture is the agreed first gesture and the aircraft has entered the first mode, determine the position of the first gesture in the user image and adjust the flight attitude of the aircraft according to that position, so that the aircraft follows the gesture trajectory of the first gesture.
• Optionally, step S620 and step S630 shown in FIG. 13 may be performed on the premise that the user gesture identified in the user image is the first gesture and the aircraft has entered the first mode.
• Step S940: If the identified user gesture is the agreed second gesture and the aircraft has entered the first mode, trigger the aircraft to exit the first mode, instructing the aircraft to cancel following the gesture trajectory of the user's first gesture.
• The embodiment of the present application defines the flight mode in which the aircraft follows the gesture trajectory of the user's first gesture as the first mode. After the aircraft enters the first mode, it may adjust its flight attitude based on the position of the first gesture in the user image, achieving the purpose of following the gesture trajectory of the first gesture; while the aircraft has not entered the first mode, even if the first gesture exists in the captured user image, the aircraft does not adjust its flight attitude based on the position of the first gesture in the user image. Whether the aircraft has entered the first mode is therefore the precondition for whether the aircraft follows the gesture trajectory of the first gesture.
• The aircraft's entry into and exit from the first mode are controlled by the user's second gesture: if the aircraft has not currently entered the first mode, the user's second gesture may trigger the aircraft to enter the first mode, so that the aircraft can adjust its flight attitude based on the position of the first gesture in subsequently acquired user images; if the aircraft has currently entered the first mode, the user's second gesture may trigger the aircraft to exit the first mode, so that the aircraft cancels following the gesture trajectory of the user's first gesture.
• Optionally, the manner in which the user controls the flight path of the aircraft may be as follows:
• The user makes the second gesture; after the aircraft recognizes the second gesture in the collected user image, the aircraft enters the first mode.
• The user then switches to the first gesture and swings the arm with the first gesture; after entering the first mode, the aircraft recognizes the first gesture in the collected user images and, according to the position of the first gesture in each collected user image, adjusts its flight attitude, achieving the purpose of following the gesture trajectory of the first gesture.
• When the user wants the aircraft to stop following, the user may switch back to the second gesture; after the aircraft recognizes the second gesture in the collected user image, it exits the first mode and no longer follows the gesture trajectory of the user's first gesture.
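The control flow above, where the second gesture toggles the first mode and the first gesture is followed only inside it, can be sketched as a small state machine (the gesture labels are placeholders):

```python
class GestureFollowController:
    """Minimal sketch of steps S920-S940: the second gesture toggles the first
    mode; while in the first mode, a detected first gesture yields a position
    for the flight controller to track."""

    def __init__(self):
        self.first_mode = False

    def on_gesture(self, gesture, position=None):
        """Process one recognized gesture; return the position to follow, if any.

        gesture  : "first", "second", or None
        position : location of the first gesture in the user image
        """
        if gesture == "second":
            self.first_mode = not self.first_mode   # enter or exit the first mode
            return None
        if gesture == "first" and self.first_mode:
            return position                          # adjust flight attitude toward it
        return None                                  # ignored outside the first mode
```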
• Taking a fist gesture as the first gesture and a five-finger-open gesture as the second gesture, FIG. 19 shows an example of the corresponding flight path control of the aircraft. As shown in FIG. 19:
• In the initial state, in which the aircraft has not entered the first mode, if the aircraft detects a five-finger-open gesture in the captured user image, the aircraft enters the first mode.
• After the aircraft enters the first mode, if the aircraft detects a fist gesture in the captured user image, it may determine the position of the fist gesture in the user image and adjust its flight attitude, so that the aircraft follows the gesture trajectory of the user's fist gesture.
• After the aircraft has entered the first mode, if the aircraft again detects a five-finger-open gesture in the user image, the aircraft exits the first mode; optionally, the aircraft can then hover at its current position.
• Using the user's second gesture to trigger the aircraft to enter and exit the first mode, so that the aircraft performs or cancels flight attitude adjustment according to the position of the user's first gesture in the user image, is only optional.
• The aircraft may also adjust its flight attitude according to the position of the first gesture in the user image, and thereby follow the gesture trajectory of the first gesture, without introducing the second gesture to start or cancel the following: when the user wants the aircraft to fly along the gesture trajectory of the first gesture, the user can simply swing the arm with the first gesture so that the aircraft follows it, without first making the second gesture; when the user wants the aircraft to cancel following the first gesture, the user can simply stop operating the first gesture.
  • Optionally, the embodiment of the present application may use a pre-trained detector of the first gesture and a pre-trained detector of the second gesture to recognize user gestures in the user image;
  • Optionally, the embodiment of the present application may collect a large number of gesture images of the first gesture and background images of the first gesture, extract features such as Haar features from each gesture image of the first gesture and from each background image of the first gesture, and, according to the Haar features of the gesture images and of the background images of the first gesture, generate the detector of the first gesture using a machine training method such as cascade training; the detector of the first gesture may identify whether there is a first gesture in the collected user image, and determine the position of the first gesture in the user image when the first gesture exists;
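As an illustration of the Haar features mentioned here, the sketch below computes a two-rectangle Haar-like feature over an integral image in plain NumPy. It covers only the feature-extraction step; actual cascade training (e.g. AdaBoost over many such features, as in OpenCV's cascade tooling) is out of scope, and all function names are hypothetical.

```python
import numpy as np

def integral_image(img):
    """Summed-area table: ii[y, x] equals the sum of img[:y, :x]."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the w x h rectangle with top-left corner (x, y),
    computed in O(1) from the integral image."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def haar_two_rect_horizontal(ii, x, y, w, h):
    """Two-rectangle Haar-like feature: left half minus right half."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)
```

A cascade would evaluate thousands of such features at many window positions and scales, which is why the O(1) integral-image lookup matters.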
  • Optionally, the embodiment of the present application may collect a large number of gesture images of the second gesture and background images of the second gesture, extract features such as Histogram of Oriented Gradients (HOG) features from each gesture image of the second gesture and from each background image of the second gesture, and, according to the HOG features of the gesture images and of the background images of the second gesture, generate the detector of the second gesture using a support vector machine (SVM) training method; the detector of the second gesture may identify whether there is a second gesture in the collected user image, and determine the position of the second gesture in the user image when the second gesture exists.
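The HOG feature named here can be illustrated with a single-cell simplification: compute gradient orientations, accumulate gradient magnitudes into orientation bins, and normalize. Real HOG adds cells and block normalization, and the resulting vectors would then be fed to SVM training; the helper below is a hypothetical sketch, not the patent's implementation.

```python
import numpy as np

def hog_orientation_histogram(img, n_bins=9):
    """Unsigned-gradient orientation histogram over a whole grayscale patch
    (a single-cell simplification of HOG)."""
    img = img.astype(np.float64)
    gy, gx = np.gradient(img)                    # row (y) and column (x) gradients
    magnitude = np.hypot(gx, gy)
    angle = np.degrees(np.arctan2(gy, gx)) % 180.0   # unsigned orientation [0, 180)
    bins = np.minimum((angle / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.zeros(n_bins)
    for b in range(n_bins):
        hist[b] = magnitude[bins == b].sum()
    norm = np.linalg.norm(hist)                  # L2 normalization for contrast invariance
    return hist / norm if norm > 0 else hist
```

For a patch that is a horizontal intensity ramp, all gradient energy falls into the 0-degree bin, so the normalized histogram concentrates there.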
  • Optionally, after the detector identifies the area of the first gesture in the user image, the position of the center point of that area in the user image may be used as the position of the first gesture in the user image; the position of the second gesture in the user image may be determined in the same manner. Optionally, the manner of determining the position of a gesture in the user image described in this paragraph is not limited to the case of using detectors to recognize the user's gesture, and may also be applied to the case where the user's gesture is recognized through the skin area in the user image, or through connected areas.
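Taking the center point of the detected region as the gesture position, as described above, can be expressed as a small helper (the `(x, y, w, h)` box format is an assumed convention, as is returning a normalized position alongside the pixel one):

```python
def gesture_position(box, image_size):
    """Center of a detection box (x, y, w, h), returned both in pixels
    and normalized to [0, 1] relative to the image size (width, height)."""
    x, y, w, h = box
    cx, cy = x + w / 2.0, y + h / 2.0
    img_w, img_h = image_size
    return (cx, cy), (cx / img_w, cy / img_h)
```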
  • Optionally, since there may be multiple users on the ground at the same time, after the aircraft acquires the user image there may be multiple users in the user image simultaneously making the first gesture or the second gesture; at this time, the aircraft needs to determine which user's gestures to use for flight control. Based on this, the embodiment of the present application may designate a legal user allowed to control the flight of the aircraft, and implement flight control based on the gestures of that legal user.
  • Optionally, the aircraft may determine whether there is a user face in the user image matching the facial features of the legal user, so that when such a face exists, flight control is performed based on the first gesture or second gesture of the legal user in the image (the user whose face area matches the facial features of the legal user);
  • Optionally, the embodiment of the present application may first extract the face regions in the user image, and determine whether there is, among the extracted face regions, a region matching the facial features of the legal user, thereby identifying the user gesture of the legal user corresponding to the matching face region in the user image;
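One hedged sketch of the legal-user selection described above: assuming some face-recognition model yields a feature vector per detected face, the gesture of the user whose face feature is most similar to the stored legal-user feature (above a threshold) is selected. The feature source, the cosine-similarity measure, and the threshold are all illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def select_legal_user_gesture(detections, legal_feature, threshold=0.8):
    """detections: list of (face_feature_vector, gesture) pairs, one per
    user found in the image.  Returns the gesture of the user whose face
    feature best matches the legal user's stored feature, or None when no
    face clears the similarity threshold."""
    best, best_sim = None, threshold
    for feature, gesture in detections:
        sim = cosine_similarity(feature, legal_feature)
        if sim >= best_sim:
            best, best_sim = gesture, sim
    return best
```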
  • FIG. 20 is still another flowchart of a method for controlling a flight path of an aircraft provided by an embodiment of the present application.
  • the method may be applied to an aircraft, and may be specifically applied to a processing chip of an aircraft.
  • the method may include:
  • Step S1000: Acquire a user image collected by the image acquisition device.
  • Step S1010: Determine whether there is a face area in the user image matching the facial features of the legal user; if not, perform step S1020; if yes, perform step S1030.
  • the embodiment of the present application may determine whether the user image has a face area of a legitimate user.
  • Step S1020: End the process.
  • Optionally, after ending the current process, the aircraft waits for the user image acquired in the next frame to arrive, and performs the processing of step S1010 on that user image.
  • Step S1030: Among the user gestures in the user image, identify the user gesture of the legal user corresponding to the face matching the facial features of the legal user.
  • Optionally, the embodiment of the present application may extract the user portrait corresponding to the matching face area in the user image, and identify the user gesture of that user portrait, thereby recognizing the user gesture of the legal user in the user image.
  • Step S1040: If the identified user gesture is the predetermined first gesture, determine the position of the first gesture in the user image.
  • Step S1050: Adjust the flight attitude of the aircraft according to the position of the first gesture in the user image, so that the aircraft follows the gesture trajectory of the first gesture.
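Steps S1000 to S1050 can be condensed into a per-frame pipeline. The callables below stand in for the face-matching and gesture-detection stages, which the text leaves abstract:

```python
def process_frame(frame, is_legal_face, detect_first_gesture):
    """One iteration of steps S1000-S1050 (illustrative skeleton).

    is_legal_face(frame) -> bool              # step S1010 face matching
    detect_first_gesture(frame) -> (x, y) | None  # steps S1030/S1040

    Returns the gesture position to use for attitude adjustment (S1050),
    or None when the frame is skipped (S1020) or no first gesture is found.
    """
    if not is_legal_face(frame):        # S1010 / S1020: no legal user, end
        return None
    return detect_first_gesture(frame)  # S1030 / S1040: locate first gesture
```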
  • The method shown in FIG. 20 of using face detection technology to verify whether there is a legal user in the user image can also be applied to the method shown in FIG. 18: for each acquired user image, it is determined whether there is a face region matching the facial features of the legal user, and when the determination result is yes, the user gesture of the legal user corresponding to that face region is identified in the user image, and subsequent processing is performed.
  • The flight path control method of an aircraft provided by the embodiment of the present application can control the flight path of the aircraft through the gesture trajectory of the user's first gesture, conveniently realizing flight path control of the aircraft.
  • The aircraft flight control device provided by the embodiment of the present application is introduced below from the perspective of controlling the flight of the aircraft by the user gesture in the user image.
  • The aircraft flight control device described below can be considered as the functional module architecture that the processing chip of the aircraft needs to be provided with in order to implement the aircraft flight control method provided by the embodiment of the present application; the description below and the description above may be cross-referenced.
  • FIG. 21 is a structural block diagram of an aircraft flight control device according to an embodiment of the present application.
  • the aircraft flight control device is applicable to an aircraft, and is specifically applicable to a processing chip of an aircraft.
  • the aircraft flight control device may include:
  • the image obtaining module 100 is configured to acquire a user image.
  • a gesture recognition module 200, configured to identify a user gesture in the user image;
  • the flight instruction determining module 300 is configured to determine a flight instruction corresponding to the user gesture according to a predefined correspondence between each user gesture and a flight instruction;
  • the flight control module 400 is configured to control the flight of the aircraft according to the flight instruction.
  • Optionally, the gesture recognition module 200 being configured to identify a user gesture in the user image specifically includes: identifying the human skin area in the user image, extracting a user gesture area from the human skin area, matching the contour features of the user gesture area with the preset contour features of each standard user gesture, and determining the standard user gesture with the highest contour feature matching degree;
  • the determined standard user gesture is taken as the user gesture identified from the user image.
  • the gesture recognition module 200 is configured to extract a user gesture area from the human skin area, and specifically includes:
  • the face area in the human skin area is removed to obtain a user gesture area.
  • the gesture recognition module 200 is configured to identify a user gesture in the user image, and specifically includes:
  • the gesture recognition module 200 is configured to extract the connected area in the user image, and specifically includes:
  • All connected areas in the user image are extracted, or connected areas in the user image after the face area is removed are extracted.
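Connected-area extraction, as mentioned in this bullet, can be sketched as 4-connected component labeling over a binary mask (for example a skin mask), here with a plain flood fill; library routines would normally be used instead:

```python
from collections import deque

def connected_areas(mask):
    """Label 4-connected regions of a binary mask (list of lists of 0/1).
    Returns a list of regions, each a list of (row, col) pixels."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                region, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:  # breadth-first flood fill of one region
                    y, x = queue.popleft()
                    region.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                regions.append(region)
    return regions
```

Each returned region could then be matched against the preset contour features of the standard user gestures, as the surrounding text describes.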
  • FIG. 22 is another structural block diagram of the aircraft flight control device provided by the embodiment of the present application.
  • the aircraft flight control device may further include:
  • the training module 500 is configured to, for each standard user gesture, pre-collect a plurality of user images containing that standard user gesture as the image samples corresponding to it; and to generate a detector for each standard user gesture from the corresponding image samples according to a machine training method.
  • the gesture recognition module 200 is configured to identify a user gesture in the user image, and specifically includes:
  • using the detectors of the standard user gestures to respectively detect the user image, and obtaining the detection results of each detector on the user image;
  • a user gesture recognized from the user image is determined based on a detection result of the user image.
  • the image obtaining module 100 is configured to acquire a user image, and specifically includes:
  • the user image collected by the ground image acquisition device is acquired.
  • Optionally, when the image acquisition module 100 acquires the user image collected by the image acquisition device of the aircraft, FIG. 23 shows another structural block diagram of the aircraft flight control device; as shown in FIG. 21 and FIG. 23, the aircraft flight control device may further include:
  • the angle adjustment module 600 is configured to adjust an image acquisition angle of the image capturing device of the aircraft after the aircraft is controlled to fly according to the flight instruction, so that the user is within the image collection range of the image capturing device.
  • the embodiment of the present application needs to identify a user portrait of a legitimate user, thereby implementing flight control of the aircraft based on a user gesture of the user portrait of the legitimate user;
  • the gesture recognition module 200 being configured to identify a user gesture in the user image specifically includes: extracting the user portrait of the legal user from the user image;
  • a user gesture in the user portrait is identified.
  • Optionally, for the manner in which the gesture recognition module 200 identifies the user gesture in the user portrait, reference may be made to the above description.
  • the gesture recognition module 200 is configured to identify the user gesture in the user portrait, and specifically includes:
  • identifying the human skin area in the user portrait, extracting a user gesture area from the human skin area, matching the contour features of the user gesture area with the preset contour features of each standard user gesture, and determining the standard user gesture with the highest contour feature matching degree, so as to obtain the user gesture recognized from the user portrait;
  • FIG. 24 is another structural block diagram of the aircraft flight control device provided by the embodiment of the present application.
  • the aircraft flight control device may further include:
  • a gesture location determining module 700, configured to determine the position of the first gesture in the user image if the identified user gesture is a predetermined first gesture;
  • the flight control module 400 is further configured to adjust a flight attitude of the aircraft according to a position of the first gesture in the user image, so that the aircraft follows the gesture trajectory of the first gesture to fly.
  • Optionally, the flight control module 400 being configured to adjust the flight attitude of the aircraft according to the position of the first gesture in the user image specifically includes: determining, according to the position of the first gesture in the user image, the horizontal movement distance and the vertical movement distance by which the aircraft is to be adjusted; and adjusting the flight attitude of the aircraft with the determined horizontal movement distance and vertical movement distance, such that the first gesture is always within the image acquisition field of view of the image acquisition device.
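The text does not fix a formula for these movement distances; one plausible, purely illustrative reading is to make them proportional to the gesture's pixel offset from the image center, so that adjusting the attitude re-centers the gesture in the camera view:

```python
def movement_distances(gesture_pos, image_size, k_h=0.01, k_v=0.01):
    """Horizontal and vertical movement distances (sign encodes direction)
    that would re-center the first gesture in the camera view.

    The proportional gains k_h and k_v are illustrative assumptions; a real
    system would calibrate them from camera geometry and the distance to
    the user.
    """
    gx, gy = gesture_pos
    w, h = image_size
    dx_pixels = gx - w / 2.0   # positive: gesture is right of center
    dy_pixels = gy - h / 2.0   # positive: gesture is below center
    # Move horizontally toward the gesture; move up when the gesture is
    # above center (image y grows downward).
    return k_h * dx_pixels, -k_v * dy_pixels
```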
  • Optionally, the flight control module 400 is further configured to: if the recognized user gesture is the predetermined second gesture and the aircraft has not currently entered the first mode, trigger the aircraft to enter the first mode, where the first mode is used to instruct the aircraft to follow the gesture trajectory of the user's first gesture;
  • if the identified user gesture is the predetermined second gesture and the aircraft has entered the first mode, trigger the aircraft to exit the first mode, instructing the aircraft to cancel following the gesture trajectory of the user's first gesture;
  • the flight control module 400 is configured to: if the identified user gesture is a predetermined first gesture, determine a location of the first gesture in the user image, specifically:
  • if the identified user gesture is the predetermined first gesture and the aircraft has currently entered the first mode, determining the position of the first gesture in the user image.
  • the gesture recognition module 200 is further configured to: before identifying the user gesture in the user image, determining whether there is a face region in the user image that matches a facial feature of the legal user;
  • the gesture recognition module 200 is configured to identify a user gesture in the user image, and specifically includes:
  • identifying, in the user image, the user gesture of the legal user corresponding to the face region that matches the facial features of the legal user.
  • the embodiment of the present application further provides an aircraft, the aircraft may include: an image capturing device and a processing chip; wherein the processing chip may include: the aircraft flight control device described above.
  • the image capturing device of the aircraft can collect the image of the user, and correspondingly, the image acquiring module of the processing chip can acquire the image of the user collected by the image capturing device of the aircraft;
  • the image acquisition module of the processing chip may also acquire the user image collected by the ground image acquisition device.
  • the embodiment of the present application further provides an aircraft flight control system.
  • the aircraft flight control system may include: a ground image acquisition device and an aircraft;
  • the ground image acquisition device is configured to collect a user image and transmit it to the aircraft;
  • the aircraft includes a processing chip; the processing chip is configured to acquire the user image transmitted by the ground image acquisition device; identify a user gesture in the user image; determine, according to a predefined correspondence between each user gesture and flight instructions, the flight instruction corresponding to the user gesture; and control the flight of the aircraft according to the flight instruction.
  • the embodiment of the present application further provides another aircraft flight control system.
  • the aircraft flight control system may include: a ground image acquisition device, a ground processing chip, and an aircraft;
  • the ground image acquisition device is configured to collect user images and transmit them to the ground processing chip;
  • a ground processing chip configured to acquire a user image transmitted by the ground image capturing device; identify a user gesture in the user image; and determine a flight instruction corresponding to the user gesture according to a predefined correspondence between each user gesture and a flight instruction; Transmitting the flight instruction to the aircraft;
  • for the specific manner in which the ground processing chip recognizes the user gesture and determines the flight instruction corresponding to it, reference may be made to the description above of how the processing chip of the aircraft recognizes the user gesture and determines the corresponding flight instruction.
  • the aircraft includes a processing chip; the processing chip is configured to acquire the flight instruction, and control aircraft flight according to the flight instruction.
  • the embodiment of the present application can control the flight of the aircraft through user gestures; the flight control operation of the aircraft is thus extremely convenient, and flight control of the aircraft can be achieved easily.
  • the steps of a method or algorithm described in connection with the embodiments disclosed herein can be implemented directly in hardware, a software module executed by a processor, or a combination of both.
  • the software module can be placed in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method of controlling a flight device, a device, a flight device, and a system. The method comprises: acquiring a user image (S100); identifying a user gesture in the user image (S110); determining, according to correspondences between respective predetermined user gestures and flight commands, a flight command corresponding to the user gesture (S120); and controlling according to the flight command a flight device to fly (S130). The solution can be utilized to control flight of a flight device using a user gesture, providing simple and easy flight control operations for the flight device.

Description

Aircraft flight control method, device, aircraft and system
This application claims priority to the Chinese patent application No. 201710060380.1, filed with the China Patent Office on January 24, 2017 and entitled "Method for controlling a flight path of an aircraft, and aircraft", and to the Chinese patent application No. 201710060176.X, filed with the China Patent Office on January 24, 2017 and entitled "Aircraft flight control method, device, aircraft and system", the entire contents of which are incorporated herein by reference.
Technical field
This application relates to the technical field of aircraft.
Background
Aircraft such as drones are widely used in surveillance, security, aerial photography and other fields. The flight control of an aircraft is generally operated by a user; currently, a mainstream flight control method is that the user controls the flight of the aircraft through a remote controller paired with the aircraft.
However, using a remote controller for flight control of the aircraft requires the user to be familiar with the use of the remote controller in order to control the flight of the aircraft skillfully and precisely, for example controlling the flight direction of the aircraft through direction buttons on the remote controller or by operating its joystick. In this case, the user needs to be skilled in using the direction buttons or operating the joystick before skilled and precise flight control is possible. As a result, flight control of the aircraft is not convenient for most people.
Summary of the invention
Embodiments of the present application provide an aircraft flight control method, device, aircraft, and system, which can implement flight control of an aircraft more conveniently.
In one aspect, an embodiment of the present application provides the following technical solution:
An aircraft flight control method, applied to an aircraft, the method including:
acquiring a user image;
identifying a user gesture in the user image;
determining, according to a predefined correspondence between each user gesture and flight instructions, a flight instruction corresponding to the user gesture; and
controlling the flight of the aircraft according to the flight instruction.
In another aspect, an embodiment of the present application further provides an aircraft flight control device, applied to an aircraft, the aircraft flight control device including:
an image acquisition module, configured to acquire a user image;
a gesture recognition module, configured to identify a user gesture in the user image;
a flight instruction determining module, configured to determine, according to a predefined correspondence between each user gesture and flight instructions, a flight instruction corresponding to the user gesture; and
a flight control module, configured to control the flight of the aircraft according to the flight instruction.
In another aspect, an embodiment of the present application further provides an aircraft, including an image acquisition device and a processing chip; the processing chip includes the above aircraft flight control device.
In another aspect, an embodiment of the present application further provides an aircraft flight control system, including a ground image acquisition device and an aircraft;
the ground image acquisition device is configured to collect a user image and transmit it to the aircraft;
the aircraft includes a processing chip, configured to acquire the user image transmitted by the ground image acquisition device; identify a user gesture in the user image; determine, according to a predefined correspondence between each user gesture and flight instructions, a flight instruction corresponding to the user gesture; and control the flight of the aircraft according to the flight instruction.
In another aspect, an embodiment of the present application further provides an aircraft flight control system, including a ground image acquisition device, a ground processing chip, and an aircraft;
the ground image acquisition device is configured to collect a user image and transmit it to the ground processing chip;
the ground processing chip is configured to acquire the user image transmitted by the ground image acquisition device; identify a user gesture in the user image; determine, according to a predefined correspondence between each user gesture and flight instructions, a flight instruction corresponding to the user gesture; and transmit the flight instruction to the aircraft;
the aircraft includes a processing chip, configured to acquire the flight instruction and control the flight of the aircraft according to the flight instruction.
In another aspect, an embodiment of the present application provides a computer-readable storage medium including instructions which, when run on a computer, cause the computer to perform the method described above.
In another aspect, an embodiment of the present application provides a computer program product containing instructions which, when run on a computer, cause the computer to perform the method described above.
Based on the above technical solutions, in the aircraft flight control method provided by the embodiments of the present application, the aircraft can acquire a user image and identify the user gesture in the user image, so as to determine, according to the predefined correspondence between each user gesture and flight instructions, the flight instruction corresponding to the identified user gesture, and control the flight of the aircraft according to that flight instruction, realizing flight control of the aircraft. With this method, the flight of the aircraft can be controlled through user gestures; the flight control operation of the aircraft is extremely convenient, achieving convenient flight control of the aircraft.
Brief description of the drawings
FIG. 1 is a schematic diagram of flight control of an aircraft according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a user gesture controlling aircraft flight according to an embodiment of the present application;
FIG. 3 is another schematic diagram of flight control of an aircraft according to an embodiment of the present application;
FIG. 4 is still another schematic diagram of flight control of an aircraft according to an embodiment of the present application;
FIG. 5 is a flowchart of an aircraft flight control method according to an embodiment of the present application;
FIG. 6 is another flowchart of an aircraft flight control method according to an embodiment of the present application;
FIG. 7 is still another flowchart of an aircraft flight control method according to an embodiment of the present application;
FIG. 8 is yet another flowchart of an aircraft flight control method according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a flight scenario of an aircraft according to an embodiment of the present application;
FIG. 10 is another schematic diagram of a flight scenario of an aircraft according to an embodiment of the present application;
FIG. 11 is yet another flowchart of an aircraft flight control method according to an embodiment of the present application;
FIG. 12 is yet another schematic diagram of flight control of an aircraft according to an embodiment of the present application;
FIG. 13 is another flowchart of a flight control method of an aircraft according to an embodiment of the present application;
FIG. 14 is a flowchart of a method for determining the horizontal movement distance by which the aircraft is adjusted;
FIG. 15 is a schematic diagram of determining the horizontal movement distance by which the aircraft is adjusted;
FIG. 16 is a flowchart of a method for determining the vertical movement distance by which the aircraft is adjusted;
FIG. 17 is a schematic diagram of determining the vertical movement distance by which the aircraft is adjusted;
FIG. 18 is another flowchart of a flight control method of an aircraft according to an embodiment of the present application;
FIG. 19 is a diagram of an example of flight path control of an aircraft;
FIG. 20 is another flowchart of a flight control method of an aircraft according to an embodiment of the present application;
FIG. 21 is a structural block diagram of an aircraft flight control device according to an embodiment of the present application;
FIG. 22 is another structural block diagram of an aircraft flight control device according to an embodiment of the present application;
FIG. 23 is another structural block diagram of an aircraft flight control device according to an embodiment of the present application;
FIG. 24 is another structural block diagram of an aircraft flight control device according to an embodiment of the present application.
Detailed description
The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them.
Different from the existing way of controlling the flight of an aircraft with a remote controller, the embodiments of the present application can control the flight of the aircraft through user gestures: the aircraft can acquire a user image and identify the user gesture in the user image, thereby performing flight control with the flight instruction corresponding to the user gesture, achieving convenient flight control of the aircraft.
As shown in the flight control diagram of FIG. 1, the aircraft 1 may be provided with an image acquisition device 11 and a processing chip 12; the user may make gestures around the aircraft, and the image acquisition device of the aircraft may collect user images in real time or at regular intervals and transmit them to the processing chip; a user image may include a user portrait and a background image;
The processing chip of the aircraft can identify the user gesture in the user image and, according to the predetermined correspondence between each user gesture and flight instructions, determine the flight instruction corresponding to the identified user gesture, thereby performing flight control with the determined flight instruction;
Table 1 below shows an optional correspondence between user gestures and flight instructions, and FIG. 2 shows a schematic diagram of the corresponding user gestures controlling aircraft flight, which may be referred to; it can be understood that Table 1 and FIG. 2 are only optional examples, and the correspondence between user gestures and flight instructions may be defined according to actual needs;
Gesture: Flight instruction
Palm outward, five fingers upright, pointing left: fly left for 1 second
Palm outward, five fingers upright, pointing right: fly right for 1 second
Palm outward, five fingers upright, pointing forward: fly up for 1 second
Palm outward, five fingers upright, pointing backward: fly down for 1 second
Thumb-up hand: swing left and right twice in succession
V sign with palm outward: rotate clockwise one full turn
OK sign with palm outward: hover
Table 1
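The lookup behind Table 1 can be sketched as a simple mapping from recognized gestures to flight instructions. The gesture keys and the (action, parameter) encoding below are illustrative assumptions, not names from the patent:

```python
# Illustrative mapping from recognized gestures to flight instructions,
# following Table 1; names and encoding are assumptions for this sketch.
GESTURE_TO_INSTRUCTION = {
    "palm_out_fingers_left":    ("fly_left", "1s"),
    "palm_out_fingers_right":   ("fly_right", "1s"),
    "palm_out_fingers_forward": ("fly_up", "1s"),
    "palm_out_fingers_back":    ("fly_down", "1s"),
    "thumb_up":                 ("swing_left_right", "2x"),
    "v_sign_palm_out":          ("rotate_clockwise", "1 turn"),
    "ok_sign_palm_out":         ("hover", None),
}

def instruction_for(gesture):
    # Returns None when the recognized gesture has no predefined mapping;
    # in that case no flight control is performed.
    return GESTURE_TO_INSTRUCTION.get(gesture)

hover = instruction_for("ok_sign_palm_out")
```

Since the correspondence is defined according to actual needs, such a table can be freely extended or replaced without touching the recognition pipeline.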
The flight control scheme of FIG. 1 requires the aircraft to be able to capture an image of the user so that its processing chip can recognize the user gesture in the image and perform flight control according to the corresponding flight instruction. This requires the aircraft to fly near the user, where it can capture user images, which rules out cases where the aircraft flies far from the user to carry out tasks such as aerial photography.
Based on this, FIG. 3 shows another flight control scheme built on the same idea of controlling the aircraft with user gestures. Referring to FIG. 3, a ground image acquisition device 2 placed near the user can capture user images and transmit them to the aircraft 1. The processing chip 12 of the aircraft acquires the user image captured by the ground image acquisition device, recognizes the user gesture in it, determines the flight instruction corresponding to the recognized gesture according to the predetermined correspondence between user gestures and flight instructions, and performs flight control according to the determined instruction.
It can be seen that embodiments of the present application can also capture user images with a ground image acquisition device, which transmits the captured images to the processing chip of the aircraft over a wireless communication technology such as General Packet Radio Service (GPRS) or the Micro Air Vehicle Link communication protocol (MAVLink); the processing chip can then recognize the user gesture in the acquired image and perform flight control according to the corresponding flight instruction.
In this way, with user images transmitted between the ground image acquisition device and the aircraft over wireless communication, the aircraft can fly far away from the user to carry out tasks such as aerial photography. Further, as shown in FIG. 4, the aircraft's own image acquisition device 11 can capture task images while performing such tasks and transmit them to a user device 3 such as the user's mobile phone, so that the task images captured by the aircraft are displayed to the user; meanwhile, based on the displayed task images, the user can make different gestures to control the flight of the aircraft while it performs the task.
The aircraft flight control method provided by the embodiments of the present application is introduced below from the perspective of the aircraft; the method described below and the description above may be cross-referenced.
FIG. 5 is a flowchart of the aircraft flight control method provided by an embodiment of the present application. The method can be applied to an aircraft, and specifically to the processing chip of an aircraft. Referring to FIG. 5, the method may include:
Step S100: acquire a user image.
Optionally, the user image may be captured by the aircraft's own image acquisition device; that is, the processing chip of the aircraft may obtain the user image captured by the aircraft's image acquisition device.
Optionally, the user image may also be captured by a ground image acquisition device, which transmits the captured user image to the processing chip of the aircraft over wireless communication.
Step S110: recognize the user gesture in the user image.
In one possible implementation, the user gesture can be recognized from the user image with a skin color detection algorithm. Specifically, a skin color detection algorithm identifies the human skin region in the user image; the user gesture region is extracted from the skin region; the contour features of the gesture region are matched against the preset contour features of the standard user gestures; and the standard gesture whose contour features match the gesture region best is determined and taken as the user gesture recognized from the image.
In another possible implementation, for each standard user gesture, a large number of user images containing that gesture can be collected as image samples, and a detector for each standard gesture can then be trained on these samples with a machine learning method such as a support vector machine (SVM).
The detectors of the standard user gestures are then applied to the user image acquired in step S100; their detection results on the image are collected, and the user gesture recognized from the image is determined from these results.
Clearly, the approaches described above for recognizing the user gesture from the user image are only optional; embodiments of the present application may also adopt other schemes capable of recognizing user gestures from user images.
Step S120: determine the flight instruction corresponding to the user gesture according to the predefined correspondence between user gestures and flight instructions.
An optional example of the correspondence between user gestures and flight instructions is shown in Table 1. After the user gesture in the user image is recognized, the flight instruction corresponding to the recognized gesture can be determined from the predefined correspondence, and the aircraft can then be controlled to fly with the determined instruction.
Optionally, if the recognized user gesture has a corresponding flight instruction in the predefined correspondence, that instruction is determined and the flight of the aircraft is subsequently controlled according to it; if the recognized gesture has no corresponding flight instruction in the predefined correspondence, the flow can end without performing any flight control.
Step S130: control the flight of the aircraft according to the flight instruction.
In the aircraft flight control method provided by the embodiments of the present application, the aircraft acquires a user image, recognizes the user gesture in it, determines the flight instruction corresponding to the recognized gesture according to the predefined correspondence between user gestures and flight instructions, and controls the flight of the aircraft according to that instruction. The flight of the aircraft is thus controlled by user gestures, which makes the control operation extremely convenient and achieves the purpose of convenient flight control of the aircraft.
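One iteration of the S100 to S130 loop can be sketched as below, with the image source, gesture recognizer, and flight controller passed in as stand-ins; all function names here are illustrative assumptions, not part of the patent:

```python
# Schematic of one S100-S130 iteration (FIG. 5); components are stubbed.
def control_step(image, recognize_gesture, gesture_to_instruction, execute):
    """Acquire image -> recognize gesture -> map to instruction -> control
    flight. Returns the executed instruction, or None when no gesture or
    no mapping was found (the flow ends without flight control)."""
    gesture = recognize_gesture(image)                  # step S110
    if gesture is None:
        return None
    instruction = gesture_to_instruction.get(gesture)   # step S120
    if instruction is None:
        return None
    execute(instruction)                                # step S130
    return instruction

# Tiny usage example with stand-in components:
log = []
result = control_step(
    image="frame0",
    recognize_gesture=lambda img: "ok_sign",
    gesture_to_instruction={"ok_sign": "hover"},
    execute=log.append,
)
```

The three recognition variants described below (skin color plus contours, connected regions, per-gesture detectors) would all slot into the `recognize_gesture` role of this loop.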
Optionally, the user gesture may be recognized from the user image with a skin color detection algorithm. FIG. 6 shows another flowchart of the aircraft flight control method provided by an embodiment of the present application. The method can be applied to an aircraft, and specifically to the processing chip of an aircraft. Referring to FIG. 6, the method may include:
Step S200: acquire a user image.
Optionally, an image acquisition device of the aircraft, such as a camera, can capture video frames in real time as user images and transmit them to the processing chip of the aircraft.
Optionally, the ground image acquisition device can also capture video frames in real time as user images and transmit them to the processing chip of the aircraft over wireless communication.
Step S210: identify the human skin region in the user image with a skin color detection algorithm.
Optionally, the human skin region can be identified from the user image with a Gaussian Mixture Model (GMM) of skin color.
Step S220: remove the face region from the human skin region to obtain the user gesture region.
Optionally, the face region within the human skin region can be identified with a face detection algorithm and then removed.
Optionally, after the face region is removed from the human skin region of the user image, the resulting user gesture region may contain only the user's hands (e.g., when the user is fully clothed and only the face and hands are exposed), but it may also contain the user's arms (e.g., when the user wears a vest or short sleeves) or legs (e.g., when the user wears shorts). Nevertheless, once the face region has been removed, the remaining skin region can be assumed to consist mainly of the skin of the hands, so the embodiments of the present application can directly use the human skin region of the user image, with the face region removed, as the user gesture region.
Optionally, steps S210 and S220 show an optional way of extracting the user gesture region from the user image with a skin color detection algorithm.
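Steps S210 and S220 amount to a set difference between two detected pixel regions. A toy sketch, representing regions as sets of (row, col) pixels rather than the binary masks a real GMM skin model and face detector would produce:

```python
# Toy sketch of S210-S220: gesture region = skin region minus face region.
# Regions are sets of (row, col) pixels; real systems use binary masks.
def gesture_region(skin_pixels, face_pixels):
    return skin_pixels - face_pixels

skin = {(r, c) for r in range(4) for c in range(4)}   # detected skin area
face = {(r, c) for r in range(2) for c in range(2)}   # detected face area
hand = gesture_region(skin, face)                     # remaining hand pixels
```

The same subtraction applies whether the face pixels come from a cascade detector or any other face localization method.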
Step S230: extract the contour features of the user gesture region.
Step S240: match the contour features of the user gesture region against the preset contour features of the standard user gestures, and determine the standard gesture whose contour features match best, obtaining the user gesture recognized from the user image.
After the user gesture region is obtained, its contour features are extracted and matched against the preset contour features of the standard user gestures; the standard gesture with the highest matching degree is determined and taken as the user gesture recognized from the user image.
Optionally, steps S230 and S240 show an optional way of recognizing, after the user gesture region has been extracted from the user image, the user gesture corresponding to the extracted region by comparison with the contour features of the standard user gestures.
Steps S210 to S240 can be regarded as an optional implementation of step S110 shown in FIG. 5.
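The best-match selection of step S240 can be sketched as an argmax over similarity scores. Contour feature extraction is abstracted away here; features are plain number tuples and similarity is negative Euclidean distance, both illustrative choices rather than the patent's method:

```python
# Sketch of S240: pick the standard gesture whose preset contour features
# are most similar to the extracted gesture region's features.
import math

def best_matching_gesture(region_features, standard_features):
    def similarity(a, b):
        return -math.dist(a, b)  # closer features -> higher similarity
    return max(standard_features,
               key=lambda name: similarity(region_features,
                                           standard_features[name]))

standards = {
    "palm_out_five_fingers": (5.0, 1.0),  # e.g. (finger count, aspect ratio)
    "thumb_up":              (1.0, 2.5),
    "ok_sign":               (3.0, 1.2),
}
recognized = best_matching_gesture((4.8, 1.1), standards)
```

A production system would instead compare contour descriptors (e.g., Hu moments) extracted from the gesture region's outline.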
Step S250: determine the flight instruction corresponding to the user gesture according to the predefined correspondence between user gestures and flight instructions.
Step S260: control the flight of the aircraft according to the flight instruction.
Optionally, FIG. 6 shows how a skin color detection algorithm identifies the user gesture region in the user image, after which contour features match that region to a standard user gesture; this approach, however, relies on the user's hands being bare. Once the user wears gloves, the user gesture region can no longer be identified with a skin color detection algorithm.
Based on this, embodiments of the present application can instead identify connected regions in the user image and match the contour features of each connected region against the preset contour features of the standard user gestures to recognize the user gesture in the image.
Optionally, FIG. 7 shows a further flowchart of the aircraft flight control method provided by an embodiment of the present application. The method can be applied to an aircraft, and specifically to the processing chip of an aircraft. Referring to FIG. 7, the method may include:
Step S300: acquire a user image.
Optionally, step S300 can be implemented as described for step S200 in FIG. 6.
Step S310: extract the connected regions in the user image.
Optionally, all connected regions in the user image may be extracted; alternatively, the face region may first be removed from the user image and the connected regions extracted from the image with the face region removed.
Step S320: extract the contour features of each connected region.
Step S330: match the contour features of each connected region against the preset contour features of the standard user gestures, determine the standard gesture with the highest matching degree, and take it as the user gesture recognized from the user image.
The contour features of each connected region are matched against the contour features of each standard user gesture, yielding matching degrees between them; the standard gesture corresponding to the highest matching degree is then selected as the user gesture recognized from the user image.
Optionally, steps S310 to S330 show another optional implementation of recognizing the user gesture in step S110 of FIG. 5. Instead of using a skin color detection algorithm, they extract the connected regions in the user image, match the contour features of those regions against the contour features of the standard user gestures, and select the best-matching standard gesture as the user gesture recognized from the image.
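The connected-region extraction of step S310 can be sketched with a 4-neighbour flood fill over a binary image; an optimized connected-component labelling routine would replace this in practice:

```python
# Toy sketch of S310: extract 4-connected regions of 1-pixels from a
# binary mask via breadth-first flood fill.
from collections import deque

def connected_regions(mask):
    """mask: list of lists of 0/1. Returns a list of pixel sets,
    one per 4-connected region of 1-pixels."""
    rows, cols = len(mask), len(mask[0])
    seen, regions = set(), []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                region, queue = set(), deque([(r, c)])
                seen.add((r, c))
                while queue:
                    y, x = queue.popleft()
                    region.add((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                regions.append(region)
    return regions

mask = [[1, 1, 0],
        [0, 0, 0],
        [0, 1, 1]]
regions = connected_regions(mask)  # two separate regions
```

Each returned region would then feed the same contour-feature matching used in step S330, which also works when the hands are gloved.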
Step S340: determine the flight instruction corresponding to the user gesture according to the predefined correspondence between user gestures and flight instructions.
Step S350: control the flight of the aircraft according to the flight instruction.
Optionally, embodiments of the present application can also train a detector for each standard user gesture in advance, apply these detectors to the user image, and recognize the user gesture in the image from the detectors' results.
Optionally, for each standard user gesture, multiple user images containing that gesture can be collected in advance as image samples, and a detector for each standard gesture can then be trained on the corresponding samples with a machine learning method (such as SVM).
After the detectors of the standard user gestures are obtained, the flight control of the aircraft can be implemented with the method shown in FIG. 8, a further flowchart of the aircraft flight control method provided by an embodiment of the present application. The method can be applied to an aircraft, and specifically to the processing chip of an aircraft. Referring to FIG. 8, the method may include:
Step S400: acquire a user image.
Optionally, step S400 can be implemented as described for step S200 in FIG. 6.
Step S410: apply the detector of each standard user gesture to the user image and obtain each detector's result for the image.
Step S420: determine the user gesture recognized from the user image according to the detection results.
The result of a standard gesture's detector on the user image may be that the image shows that detector's standard gesture, or that it does not; by analyzing the results of all the detectors, the embodiments of the present application can determine the user gesture detected in the user image and thereby recognize the gesture in it.
Optionally, steps S410 and S420 show another optional implementation of recognizing the user gesture in step S110 of FIG. 5: the user gesture in the image is detected with the pre-trained detectors of the standard user gestures.
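Steps S410 and S420 can be sketched as running one detector per standard gesture and keeping the most confident positive result. The detectors here are stand-in callables returning a confidence score; in the patent they would be SVM-style classifiers trained on image samples of each gesture:

```python
# Schematic of S410-S420: one detector per standard gesture; the gesture
# is the detector with the highest score above a threshold, else None.
def recognize_with_detectors(image, detectors, threshold=0.5):
    """detectors: {gesture_name: callable(image) -> score in [0, 1]}."""
    scores = {name: det(image) for name, det in detectors.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

detectors = {
    "thumb_up": lambda img: 0.9 if img == "thumb_frame" else 0.1,
    "v_sign":   lambda img: 0.2,
}
found = recognize_with_detectors("thumb_frame", detectors)
```

Returning None when no detector fires matches the behavior described for step S120: an unrecognized gesture simply produces no flight instruction.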
Step S430: determine the flight instruction corresponding to the user gesture according to the predefined correspondence between user gestures and flight instructions.
Step S440: control the flight of the aircraft according to the flight instruction.
Optionally, if the user images are captured by the aircraft's own image acquisition device, the device may no longer be able to capture the user after the aircraft flies according to the flight instruction of the recognized gesture. As shown in FIG. 9, after the aircraft flies forward according to the recognized gesture, if the user does not move forward in step with it, the user leaves the acquisition range of the aircraft's camera; the camera then cannot capture user images, and subsequent flight control through user gestures becomes impossible. Therefore, when the user does not move along with the aircraft, the aircraft can adjust the image acquisition angle of its image acquisition device after flying according to the instruction, so that the device can still capture the user.
Specifically, after controlling the aircraft to fly with the flight instruction of the recognized gesture, the processing chip can adjust the image acquisition angle of the image acquisition device so that the user stays within its acquisition range. Optionally, the angle can be adjusted according to the flight direction and flight distance of the aircraft; the specific ratio between the angle adjustment and the flight direction and distance can be set according to the actual mounting of the image acquisition device.
Optionally, the image acquisition device of the aircraft may have an angle adjustment mechanism, and the processing chip can adjust the angle of this mechanism to adjust the image acquisition angle of the device.
Optionally, if the user moves along with the aircraft, the image acquisition angle need not be adjusted: with the angle unchanged, the user's own movement keeps the user within the acquisition range of the device, so the device can continue to capture user images and flight control based on the gestures in them remains possible.
Obviously, if the user images are captured by a ground image acquisition device, the aircraft's own image acquisition device can capture task images such as aerial photographs, and the aircraft need not adjust its image acquisition angle after flying according to the flight instruction.
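One way the angle adjustment above could relate to flight direction and distance is simple geometry: re-aim the camera at the user's last position after the aircraft displaces. The flat-ground, stationary-user geometry and the tilt convention (0 degrees = straight down) are simplifying assumptions of this sketch; the patent only states that the adjustment ratio depends on the actual mounting of the device:

```python
# Geometric sketch (assumed, not from the patent): tilt the camera back
# toward the user after the aircraft flies away horizontally.
import math

def camera_tilt_deg(horizontal_offset_m, altitude_m):
    """Tilt angle from straight down toward the user, in degrees."""
    return math.degrees(math.atan2(horizontal_offset_m, altitude_m))

# Directly overhead: camera points straight down (0 degrees).
tilt_before = camera_tilt_deg(0.0, 10.0)
# After flying 10 m forward at 10 m altitude: tilt back 45 degrees.
tilt_after = camera_tilt_deg(10.0, 10.0)
```

A gimbal-style angle adjustment mechanism would receive such a target angle after each commanded displacement.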
Optionally, there may be multiple users on the ground, so a captured user image may contain multiple user portraits. As shown in FIG. 10, when several users on the ground make gestures at the same time, the aircraft must determine whose gesture to use for flight control. For this purpose, embodiments of the present application can designate a legitimate user authorized to control the flight of the aircraft and preset the facial features of that user. After a user image is acquired (whether by the aircraft's image acquisition device or by a ground image acquisition device), the user portrait region whose face matches the legitimate user's facial features can be identified, and gesture recognition can be performed on that region, ensuring that the aircraft performs flight control according to the gestures of the legitimate user in the image.
Optionally, FIG. 11 shows yet another flowchart of the aircraft flight control method provided by an embodiment of the present application. The method can be applied to an aircraft, and specifically to the processing chip of an aircraft. Referring to FIG. 11, the method may include:
Step S500: acquire a user image.
Optionally, step S500 can be implemented as described for step S200 in FIG. 6.
Step S510: determine whether the user image contains a face region matching the facial features of the legitimate user; if not, go to step S520; if so, go to step S530.
Optionally, a face detection algorithm can identify the face regions in the user image, yielding at least one face region; the facial features of each detected region are then matched against the preset facial features of the legitimate user to determine whether the image contains a matching face region.
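The matching decision of step S510 can be sketched as a nearest-feature test against the preset legitimate-user feature. Features here are plain vectors and the distance threshold is arbitrary; a real system would compare embeddings from a face recognition model:

```python
# Toy sketch of S510: find the detected face region matching the
# legitimate user's preset facial feature, if any.
import math

def find_legal_face(face_features, legal_feature, max_distance=0.6):
    """Return the index of the first face region whose feature is within
    max_distance of the legitimate user's feature, or None (flow ends)."""
    for i, feat in enumerate(face_features):
        if math.dist(feat, legal_feature) <= max_distance:
            return i
    return None

legal = (0.1, 0.9, 0.3)           # preset feature of the legitimate user
faces = [(0.8, 0.2, 0.5),         # bystander
         (0.12, 0.88, 0.31)]      # legitimate user
idx = find_legal_face(faces, legal)
```

A None result corresponds to step S520 below: the flow ends and the next frame is awaited.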
Step S520: end the flow.
If no face region in the user image matches the facial features of the legitimate user, the image contains no portrait of the legitimate user and cannot be used for gesture-based flight control; the current flow can end, and the user image acquired in the next frame is awaited and processed as in step S510.
Step S530: extract the user portrait corresponding to the face region of the user image that matches the facial features of the legitimate user.
The extracted user portrait may be the portrait of the legitimate user in the user image (i.e., of the user corresponding to the face region matching the legitimate user's facial features), including the body image of the legitimate user.
Step S540: recognize the user gesture in the user portrait.
Optionally, the user gesture in the user portrait can be recognized as described in the corresponding sections above.
Optionally, based on FIG. 6, the user gesture in the user portrait can be recognized with a skin color detection algorithm: the human skin region in the portrait is identified, the user gesture region is extracted from it, the contour features of the gesture region are matched against the preset contour features of the standard user gestures, and the best-matching standard gesture is taken as the user gesture recognized from the portrait.
Optionally, based on FIG. 7, the user gesture in the user portrait can also be recognized by matching the contour features of the connected regions in the portrait against those of the standard user gestures: the connected regions in the portrait are extracted, the contour features of each region are matched against the preset contour features of the standard gestures, and the standard gesture with the highest matching degree is taken as the user gesture recognized from the portrait.
Optionally, based on FIG. 8, the user gesture in the user portrait can also be recognized with the detectors of the standard user gestures: each detector is applied to the portrait, the detection results are collected, and the user gesture recognized from the portrait is determined from those results.
Step S550: Determine, according to the predefined correspondence between user gestures and flight instructions, the flight instruction corresponding to the recognized user gesture.

Step S560: Control the flight of the aircraft according to the flight instruction.
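Steps S550 and S560 reduce to a lookup in the predefined gesture-to-instruction table followed by dispatch; a minimal sketch, in which the gesture names and instruction values are purely hypothetical examples of such a correspondence:

```python
# Hypothetical predefined correspondence between user gestures
# and flight instructions (step S550).
GESTURE_TO_INSTRUCTION = {
    "palm_up": "ascend",
    "palm_down": "descend",
    "fist": "hover",
    "v_sign": "return_home",
}

def instruction_for(gesture):
    """Return the flight instruction for a recognized gesture,
    or None when the gesture has no predefined instruction."""
    return GESTURE_TO_INSTRUCTION.get(gesture)
```

The returned instruction would then be passed to the flight mechanism (step S560).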
Clearly, the method shown in FIG. 11 uses a face detection algorithm to identify the user portrait of a legitimate user in the user image, and then recognizes the user gesture of that legitimate user's portrait so as to control the aircraft to fly according to the corresponding flight instruction; this is merely one preferred solution for flight control of the aircraft in the embodiments of the present application.
Optionally, if the user image is collected by a ground image acquisition device, the embodiments of the present application may instead ensure that the ground image acquisition device collects user images of legitimate users only, by restricting the device so that it can be activated only by a legitimate user (for example, by setting an activation password on the ground image acquisition device); in this case, the aircraft may omit the step of identifying the legitimate user based on the face detection algorithm.

Optionally, the embodiments of the present application may also keep only legitimate users at the flight site, by moving other people away or by choosing a sparsely populated site, so that the aircraft can recognize user gestures directly from the collected user images, again omitting the step of identifying the legitimate user based on the face detection algorithm.
Optionally, if the user image is collected by a ground image acquisition device, the present application may further provide a ground processing chip in communication with the ground image acquisition device; the ground processing chip identifies the user gesture in the user image, determines the flight instruction corresponding to the user gesture, and transmits the flight instruction to the processing chip of the aircraft through a wireless communication technology, and the processing chip of the aircraft controls the flight of the aircraft according to the flight instruction.

As shown in FIG. 12, after the ground image acquisition device 2 collects the user image, it may transmit the image to the ground processing chip 4; the ground processing chip 4 may identify the user gesture in the user image, in any of the ways shown in FIG. 6, FIG. 7, FIG. 8, and FIG. 11; the ground processing chip 4 then determines the flight instruction corresponding to the user gesture according to the predefined correspondence between user gestures and flight instructions, and transmits the flight instruction to the processing chip of the aircraft 1 through a wireless communication technology; the processing chip of the aircraft 1 controls the flight of the aircraft according to the flight instruction.
With the aircraft flight control method provided by the embodiments of the present application, the flight of the aircraft can be controlled by user gestures; the flight control operation is extremely convenient, achieving the purpose of convenient flight control of the aircraft.

In the embodiments of the present application, the user may also wave a hand while holding an agreed first gesture (the agreed first gesture being one of the predefined user gestures described above), thereby producing a gesture trajectory traced by the moving first gesture.
FIG. 13 shows a flowchart of an aircraft flight control method provided by an embodiment of the present application. The method is applicable to an aircraft, and specifically to a processing chip of the aircraft. Referring to FIG. 13, the method may include:

Step S600: Acquire a user image.

Optionally, the user image may be collected by an image acquisition device carried on the aircraft; that is, the processing chip of the aircraft may obtain the user image collected by the aircraft's image acquisition device, thereby acquiring the user image.

Optionally, the user image may also be collected by a ground image acquisition device, which may transmit the collected user image to the processing chip of the aircraft through a wireless communication technology, thereby acquiring the user image.

In this embodiment, the description proceeds on the basis that the user image is collected by the image acquisition device carried on the aircraft.
Step S610: Identify the user gesture in the user image.

Optionally, for implementations of identifying the user gesture in the user portrait, reference may be made to the corresponding description above.
Optionally, based on FIG. 6, an embodiment of the present application may identify the user gesture in the user portrait according to a skin color detection algorithm. Specifically, the human skin regions in the user portrait may be identified according to the skin color detection algorithm, a user gesture region may be extracted from the human skin regions, the contour features of the user gesture region may be matched against the preset contour features of each standard user gesture, and the standard user gesture whose contour features have the highest matching degree with those of the user gesture region may be determined, thereby obtaining the user gesture recognized from the user portrait.

Optionally, based on FIG. 7, an embodiment of the present application may also identify the user gesture in the user portrait by matching the contour features of the connected regions in the user portrait against the contour features of each standard user gesture. Specifically, the connected regions in the user portrait may be extracted, the contour features of each connected region may be matched against the preset contour features of each standard user gesture, the standard user gesture with the highest matching degree may be determined, and that standard user gesture may be taken as the user gesture recognized from the user portrait.

Optionally, based on FIG. 8, an embodiment of the present application may also identify the user gesture in the user portrait by using a detector for each standard user gesture. Specifically, the detector of each standard user gesture may be applied to the user portrait, the detection result of each detector on the user portrait may be obtained, and the user gesture recognized from the user portrait may be determined according to those detection results.
Step S620: If the recognized user gesture is the predetermined first gesture, determine the position of the first gesture in the user image.

Optionally, an embodiment of the present application may detect the user image with a pre-trained detector for the first gesture and judge whether the first gesture is present in the user image, so as to recognize whether the user gesture in the user image is the first gesture. When the detector for the first gesture recognizes that the first gesture is present in the user image (that is, the user gesture in the user image is the first gesture), the position of the first gesture in the user image may be determined; optionally, the region of the first gesture in the user image recognized by the detector may be determined, and the position of the center point of that region in the user image may be taken as the position of the first gesture in the user image.
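Taking the center point of the detected region as the gesture position can be sketched as follows, assuming the detector reports the region as an axis-aligned bounding box `(x, y, w, h)` in pixel coordinates (a common detector output format, not one mandated by the text):

```python
def gesture_position(box):
    """box: (x, y, w, h) bounding box of the detected first gesture.
    Returns the region's center point, used as the gesture's position
    in the user image."""
    x, y, w, h = box
    return (x + w / 2.0, y + h / 2.0)
```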
Optionally, an embodiment of the present application may also identify the human skin regions in the user image according to a skin detection algorithm, remove the face region from the human skin regions to obtain the user gesture region (since the exposed skin of the human body is generally the face and the hands, the human skin region with the face region removed may be used as the user gesture region), match the contour features of the user gesture region against the contour features of the predetermined first gesture, and judge from the matching degree whether the first gesture is present in the user image, so as to recognize whether the user gesture in the user image is the first gesture.

Optionally, if the matching degree between the contour features of the user gesture region and the contour features of the predetermined first gesture is higher than a predetermined first matching degree, it may be determined that the user gesture in the user gesture region is the first gesture, that is, that the first gesture is present in the user image; optionally, in the embodiments of the present application, the position of the user gesture region in the image (optionally, the position of the center point of the user gesture region in the image) may be taken as the position of the first gesture in the user image.
Optionally, an embodiment of the present application may also extract the connected regions in the user image (preferably, the connected regions of the user image after the face region has been removed), match the contour features of each connected region against the contour features of the predetermined first gesture, and judge from the matching degree whether the first gesture is present in the user image, so as to recognize whether the user gesture in the user image is the first gesture.

If there is a connected region whose matching degree with the contour features of the first gesture is higher than a predetermined second matching degree, it may be determined that the first gesture is present in the user image, and the position of that connected region in the image (optionally, the position of the center point of the connected region in the image) may be taken as the position of the first gesture in the user image; optionally, the first matching degree and the second matching degree may be the same or different, and may be set according to the actual situation.

It can be seen that the embodiments of the present application may first judge whether a user gesture is present in the user image and whether that user gesture is the first gesture (either with the detector for the first gesture, or from the matching degree between the user gesture region, or a connected region, and the contour features of the first gesture); after judging that a user gesture is present in the user image and that it is the first gesture, the position of the first gesture in the user image may be determined.
Step S630: Adjust the flight attitude of the aircraft according to the position of the first gesture in the user image, so that the aircraft follows the gesture trajectory of the first gesture.

After the position of the first gesture in the user image is obtained, the embodiments of the present application may determine, according to that position, the horizontal movement distance by which the aircraft is adjusted in the same horizontal motion direction as the gesture trajectory of the first gesture, and the vertical movement distance by which the aircraft is adjusted in the same vertical motion direction as the gesture trajectory of the first gesture; the flight attitude of the aircraft is then adjusted by the determined horizontal and vertical movement distances, so that the first gesture always remains within the image acquisition field of view of the image acquisition device. Optionally, by adjusting the flight attitude of the aircraft so that the first gesture always remains within the image acquisition field of view, the aircraft is made to follow the gesture trajectory of the first gesture.
It can be seen that, for each user image containing the first gesture collected by the image acquisition device, if the flight attitude of the aircraft is adjusted according to the position of the first gesture in that user image, the aircraft adjusts its flight attitude in real time according to the gesture trajectory of the user's first gesture, so that the aircraft can follow the gesture trajectory of the user's first gesture and the flight route of the aircraft is thereby controlled.

In the flight route control method of the aircraft provided by the embodiments of the present application, the processing chip of the aircraft may acquire the user image collected by the image acquisition device of the aircraft and identify the user gesture in the user image; if the recognized user gesture is the predetermined first gesture, the position of the first gesture in the user image may be determined, and the flight attitude of the aircraft may then be adjusted according to that position, so that the aircraft follows the gesture trajectory of the first gesture and the flight route of the aircraft is controlled. It can be seen that, in the embodiments of the present application, by operating the first gesture, the user enables the aircraft to adjust its flight attitude according to the position of the first gesture in the collected user images, so that the aircraft can follow the gesture trajectory of the user's first gesture. The embodiments of the present application thus control the flight route of the aircraft through the gesture trajectory of the user's first gesture, achieving convenient flight route control of the aircraft.
Optionally, FIG. 14 shows a flowchart of a method for determining the horizontal movement distance by which the aircraft is adjusted, according to the position of the first gesture in the user image. The method is applicable to an aircraft, and specifically to a processing chip of the aircraft. Referring to FIG. 14, the method may include:

Step S700: Construct a horizontal axis coordinate from the line-of-sight range of the image acquisition device of the aircraft in the horizontal axis direction, the origin of the horizontal axis coordinate being the midpoint of the line of sight of the image acquisition device in the horizontal axis direction.

As shown in FIG. 15, taking a camera as the image acquisition device, suppose that point A is the position of the camera, AB and AC are the limits of the camera's line of sight along the horizontal axis (that is, the line-of-sight range of the camera in the horizontal axis direction), and BMC is the ground; then BC is the horizontal axis coordinate constructed from the line-of-sight range of the camera in the horizontal axis direction, and every point on BC falls uniformly on the horizontal axis coordinate of the image collected by the camera; AM is the center line of the camera, and M, the midpoint of the camera's line of sight in the horizontal axis direction, is the origin of the horizontal axis coordinate, that is, the center of segment BC.
Step S710: Determine the projection point, on the horizontal axis coordinate, of the position of the first gesture in the user image, and determine the coordinate of the projection point on the horizontal axis coordinate.

After the position of the first gesture in the image is determined, the embodiments of the present application may determine the projection point, in the horizontal direction, of the position of the first gesture in the image; as shown in FIG. 15, this projection point is point P, and the coordinate of P on the horizontal axis BC is the coordinate of the projection point on the horizontal axis.
Step S720: Determine the horizontal movement distance of the aircraft according to the length of the horizontal axis coordinate, the vertical height of the aircraft above the ground, the angle between the center line of the image acquisition device of the aircraft and the vertical direction, the half-angle of the viewing angle of the image acquisition device in the horizontal axis direction, and the coordinate of the projection point on the horizontal axis coordinate.

As shown in FIG. 15, OA is the vertical height of the aircraft (such as a drone) above the ground; angle OAM is the angle between the camera's center line and the vertical direction, and angle BAM is the half-angle of the camera's viewing angle in the horizontal axis direction. For the projection point P of the first gesture in the horizontal direction to fall on the center point M of the image collected by the camera, the aircraft needs to move the horizontal distance MP; that is, the embodiments of the present application may adjust the flight attitude of the aircraft so that the first gesture lies at the center of the image acquisition field of view of the image acquisition device.

Correspondingly, let angle OAM be β, angle BAM be α, the vertical height of the aircraft above the ground be H, the horizontal axis coordinate of the projection point of the first gesture's position in the user image be x, the length of the horizontal axis coordinate (the length of the camera's line-of-sight range in the horizontal axis direction) be Lx, and the horizontal movement distance MP to be adjusted be Sx; then the horizontal movement distance the aircraft needs to adjust may be determined by the following formula:
Sx = (2*x*H*tanα) / (Lx*cosβ).
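The horizontal-distance formula translates directly into code; the sketch below uses radians and the symbol names from the text (α the horizontal half viewing angle, β the angle between the camera center line and the vertical), with the sign convention that x is measured from the image midline M, an assumption consistent with M being the origin:

```python
import math

def horizontal_move(x, H, alpha, beta, Lx):
    """Sx = (2*x*H*tan(alpha)) / (Lx*cos(beta)).
    x: horizontal-axis coordinate of the first gesture's projection point
       (origin at the midline M, so x may be negative);
    H: vertical height of the aircraft above the ground;
    alpha: half-angle of the horizontal viewing angle, in radians;
    beta: angle between the camera center line and the vertical, in radians;
    Lx: length of the horizontal axis coordinate (e.g. image width in pixels)."""
    return (2.0 * x * H * math.tan(alpha)) / (Lx * math.cos(beta))
```

A gesture on the midline yields zero adjustment, and the sign of Sx follows the sign of x, matching the requirement that the aircraft move in the same horizontal direction as the gesture trajectory.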
Optionally, the height data of the aircraft may be obtained by ultrasonic sensing or a barometer; the angle data may be set to fixed values as needed.

Optionally, the processing chip of the aircraft may acquire each frame of user image collected in real time, determine the horizontal movement distance of the aircraft in real time based on the position of the first gesture in each frame, and then output a flight control instruction to the flight mechanism of the aircraft, so that the aircraft adjusts by the determined horizontal movement distance in the same horizontal motion direction as the gesture trajectory of the first gesture; the aircraft can thus follow the gesture trajectory of the first gesture in the same horizontal motion direction.
Optionally, FIG. 16 shows a flowchart of a method for determining the vertical movement distance by which the aircraft is adjusted, according to the position of the first gesture in the user image. The method is applicable to an aircraft, and specifically to a processing chip of the aircraft. Referring to FIG. 16, the method may include:

Step S800: Construct a vertical axis coordinate from the line-of-sight range of the image acquisition device of the aircraft in the vertical axis direction, the origin of the vertical axis coordinate being the midpoint of the line of sight of the image acquisition device in the vertical axis direction.

As shown in FIG. 17, taking a camera as the image acquisition device, suppose that point A is the position of the camera, and AB and AC are the limits of the camera's line of sight along the vertical axis (that is, the line-of-sight range of the camera in the vertical axis direction); then BC is the vertical axis coordinate constructed from the line-of-sight range of the camera in the vertical axis direction; the dashed line AD is the center line of the camera, and D, the midpoint of the camera's line of sight in the vertical axis direction, is the origin of the vertical axis coordinate.
Step S810: Determine the projection point, on the vertical axis coordinate, of the position of the first gesture in the user image, and determine the coordinate of the projection point on the vertical axis coordinate.

After the position of the first gesture in the user image is determined, the embodiments of the present application may determine the projection point, in the vertical direction, of the position of the first gesture in the user image, that is, the projection point of that position on the vertical axis coordinate; as shown in FIG. 17, this projection point is point P, and the coordinate of P on the vertical axis BC is the coordinate of the projection point on the vertical axis.
Step S820: Determine the vertical movement distance of the aircraft according to the height of the vertical axis coordinate, the vertical height of the aircraft above the ground, the half viewing angle of the image acquisition device in the vertical axis direction, the angle difference between the tilt angle of the image acquisition device and the half viewing angle, and the coordinate of the projection point on the vertical axis coordinate.

As shown in FIG. 17, AO is the vertical height of the aircraft above the ground, angle OAD is the tilt angle of the camera, and angle CAD is the half viewing angle of the camera in the vertical axis direction, which may be the half-angle of the camera's viewing angle in that direction; angle OAC is the angle difference between angle OAD and angle CAD. The height of the vertical axis coordinate may be determined according to the height of the image interface; for example, if images are collected at a resolution of 640*360, the height of the vertical axis coordinate may be 360, that is, the height of the vertical axis coordinate may be determined according to the vertical height of the interface.

For the projection point P to fall on the center point D of the image collected by the camera, the aircraft needs to move the vertical distance PD.
Correspondingly, let AO be H, angle CAD be θ, angle OAC be δ, the height of the vertical axis coordinate be Ly, the vertical axis coordinate of the projection point of the first gesture's position in the user image be y, and the vertical movement distance the aircraft needs to adjust be Sy; then the vertical movement distance the aircraft needs to adjust may be determined by the following formula:
Sy = H * (tan(δ+θ) - tan(δ+θ - arctan(2*y*tanθ/Ly))).
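The vertical-distance formula can likewise be implemented directly, using the symbols defined above (θ the vertical half viewing angle, δ the difference between the camera tilt angle and θ), again with the assumption that y is measured from the image midline D:

```python
import math

def vertical_move(y, H, theta, delta, Ly):
    """Sy = H*(tan(delta+theta) - tan(delta+theta - atan(2*y*tan(theta)/Ly))).
    y: vertical-axis coordinate of the first gesture's projection point
       (origin at the midline D);
    H: vertical height of the aircraft above the ground;
    theta: half viewing angle in the vertical axis direction, in radians;
    delta: angle difference between the camera tilt angle and theta, in radians;
    Ly: height of the vertical axis coordinate (e.g. 360 for a 640*360 image)."""
    inner = delta + theta - math.atan(2.0 * y * math.tan(theta) / Ly)
    return H * (math.tan(delta + theta) - math.tan(inner))
```

As with the horizontal case, a gesture on the midline (y = 0) produces zero vertical adjustment.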
Optionally, the processing chip of the aircraft may acquire each frame of user image collected in real time, determine the vertical movement distance of the aircraft in real time based on the position of the first gesture in each frame, and then output a flight control instruction to the flight mechanism of the aircraft, so that the aircraft adjusts by the determined vertical movement distance in the same vertical motion direction as the gesture trajectory of the first gesture.

Optionally, the horizontal movement distance and the vertical movement distance determined by the processing chip on the basis of each frame of image may be output through flight control instructions, so that the aircraft adjusts its flight attitude by the determined horizontal movement distance in the same horizontal motion direction as the gesture trajectory of the first gesture, and by the determined vertical movement distance in the same vertical motion direction as the gesture trajectory of the first gesture; the aircraft can thus follow the gesture trajectory of the user's first gesture in real time, achieving control of the flight route of the aircraft.
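The per-frame adjustment described above can be sketched as one update function that converts the first gesture's pixel position in the current frame into the (Sx, Sy) pair output to the flight mechanism. Taking the image center as the origin of both axes and treating the camera parameters as fixed are illustrative assumptions:

```python
import math

def frame_update(gesture_px, image_w, image_h, H, alpha, beta, theta, delta):
    """One control step: map the first gesture's (col, row) pixel position
    in the current frame to (Sx, Sy) movement distances.
    alpha/beta: horizontal half viewing angle and camera-centerline-to-vertical
    angle; theta/delta: vertical half viewing angle and tilt-angle difference;
    all angles in radians, H the aircraft height above ground."""
    col, row = gesture_px
    x = col - image_w / 2.0          # horizontal offset from the midline M
    y = image_h / 2.0 - row          # vertical offset from the midline D (up positive)
    sx = (2.0 * x * H * math.tan(alpha)) / (image_w * math.cos(beta))
    sy = H * (math.tan(delta + theta)
              - math.tan(delta + theta - math.atan(2.0 * y * math.tan(theta) / image_h)))
    return sx, sy
```

Running this on every frame containing the first gesture keeps the gesture centered in the field of view, which is what makes the aircraft track the gesture trajectory.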
Optionally, in the embodiments of the present application, the user may notify the aircraft to start and to cancel following the user's first gesture through a second gesture; that is, when the aircraft is not following the user's first gesture, if the user's second gesture is detected in a user image, the aircraft may start following the user's first gesture. Correspondingly, after operating the second gesture, the user may switch to tracing the gesture trajectory with the first gesture, so that the aircraft adjusts its flight attitude based on the position of the first gesture in each frame of user image and follows the gesture trajectory of the first gesture; when the user wishes the aircraft to cancel following the first gesture, the user may switch from the gesture trajectory operation of the first gesture to operating the second gesture, and after the aircraft detects the user's second gesture in a user image, it may cancel following the user's first gesture.

Optionally, FIG. 18 shows another flowchart of the flight route control method of the aircraft provided by the embodiments of the present application. The method is applicable to an aircraft, and specifically to a processing chip of the aircraft. Referring to FIG. 18, the method may include:
Step S900: Acquire, in real time, the user images collected by the image acquisition device.

Step S910: Identify the user gesture in the user image.

Optionally, for each collected user image, the embodiments of the present application may identify whether the user gesture in the user image is the predetermined first gesture or the predetermined second gesture, and execute different processing flows according to the different recognition results; for an illustration of the different processing flows executed according to the different user gestures identified in the user image, reference may be made to steps S920 to S940 below.
Optionally, for each collected user image, the embodiments of the present application may detect the user image with the pre-trained detector for the first gesture and the pre-trained detector for the second gesture respectively, to judge whether the first gesture is present, the second gesture is present, or neither gesture is present in the user image.
Optionally, for each captured user image, an embodiment of the present application may instead use a skin detection algorithm to identify the human skin regions in the user image, and take the skin region remaining after the face region is removed as the user gesture region. The contour features of the first gesture and of the second gesture are then matched against the contour features of the user gesture region to determine whether the first gesture, the second gesture, or neither is present in the user image. Optionally, if the match between the contour features of the user gesture region and those of the first gesture exceeds a predetermined first matching degree, it may be determined that the first gesture is present in the user image; otherwise, it is determined that the first gesture is not present. Likewise, if the match between the contour features of the user gesture region and those of the second gesture exceeds the predetermined first matching degree, it may be determined that the second gesture is present in the user image; otherwise, it is determined that the second gesture is not present.
Optionally, for each captured user image, an embodiment of the present application may also extract the connected regions in the user image and match the contour features of the first gesture and of the second gesture against the contour features of each connected region, to determine whether the first gesture, the second gesture, or neither is present in the user image. Optionally, if there is a connected region whose contour features match those of the first gesture to a degree higher than a predetermined second matching degree, the user gesture represented by that connected region may be determined to be the first gesture, and the first gesture is determined to be present in the user image; otherwise, the first gesture is determined to be absent. Similarly, if there is a connected region whose contour features match those of the second gesture to a degree higher than the predetermined second matching degree, the user gesture represented by that connected region may be determined to be the second gesture, and the second gesture is determined to be present in the user image; otherwise, the second gesture is determined to be absent.
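The contour-matching decision described in the two preceding paragraphs can be sketched as follows. This is a minimal illustration, not the patented implementation: the "contour features" here are plain numeric vectors standing in for real contour descriptors (such as Hu moments), and the template vectors and matching-degree threshold are invented placeholders.

```python
# Minimal sketch of matching a gesture region's contour features against
# the first- and second-gesture templates. All vectors and thresholds are
# hypothetical placeholders, not values from this application.

def matching_degree(features_a, features_b):
    """Similarity in (0, 1]; 1.0 means identical feature vectors."""
    dist = sum((a - b) ** 2 for a, b in zip(features_a, features_b)) ** 0.5
    return 1.0 / (1.0 + dist)

FIRST_GESTURE_TEMPLATE = [0.20, 0.55, 0.10]   # e.g. fist contour features
SECOND_GESTURE_TEMPLATE = [0.80, 0.15, 0.40]  # e.g. open-hand contour features
FIRST_MATCHING_DEGREE = 0.8                   # the "predetermined matching degree"

def classify_gesture_region(region_features):
    """Return 'first', 'second', or None for one gesture region."""
    if matching_degree(region_features, FIRST_GESTURE_TEMPLATE) > FIRST_MATCHING_DEGREE:
        return "first"
    if matching_degree(region_features, SECOND_GESTURE_TEMPLATE) > FIRST_MATCHING_DEGREE:
        return "second"
    return None
```

When connected regions are used instead of a skin-derived gesture region, the same `classify_gesture_region` check would simply be applied to each connected region in turn.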
Optionally, an embodiment of the present application may first detect whether the first gesture is present in the user image and, only if it is not, detect whether the second gesture is present; alternatively, the second gesture may be detected first and the first gesture only if the second is absent; or the presence of the first gesture and the second gesture may be detected simultaneously.
Step S920: If the identified user gesture is the predetermined second gesture and the aircraft has not yet entered the first mode, trigger the aircraft to enter the first mode, where the first mode instructs the aircraft to follow the gesture trajectory of the user's first gesture.
Step S930: If the identified user gesture is the predetermined first gesture and the aircraft has already entered the first mode, determine the position of the first gesture in the user image, and adjust the flight attitude of the aircraft according to that position, so that the aircraft follows the gesture trajectory of the first gesture.
Optionally, steps S620 and S630 shown in FIG. 13 may be executed on the condition that the user gesture identified in the user image is the first gesture and the aircraft has already entered the first mode.
Step S940: If the identified user gesture is the predetermined second gesture and the aircraft has already entered the first mode, trigger the aircraft to exit the first mode, instructing the aircraft to stop following the gesture trajectory of the user's first gesture.
An embodiment of the present application may define the flight mode in which the aircraft follows the gesture trajectory of the user's first gesture as the first mode. After entering the first mode, the aircraft may adjust its flight attitude based on the position of the first gesture in the user image, thereby following the gesture trajectory of the first gesture. When the aircraft has not entered the first mode, the aircraft does not adjust its flight attitude based on the position of the first gesture, even if the first gesture is present in the captured user image. Whether the aircraft has entered the first mode is therefore the precondition for whether the aircraft follows the gesture trajectory of the first gesture.
In an embodiment of the present application, entering and exiting the first mode is controlled by the user's second gesture. If the aircraft has not entered the first mode, the user's second gesture may trigger the aircraft to enter it, so that the aircraft can adjust its flight attitude based on the position of the first gesture in subsequently captured user images. If the aircraft is currently in the first mode, the user's second gesture may trigger the aircraft to exit it, so that the aircraft stops following the gesture trajectory of the user's first gesture.
Based on FIG. 18, the user may control the flight path of the aircraft as follows:
In the initial state, the user makes the second gesture; after the aircraft recognizes the second gesture from a captured user image, the aircraft enters the first mode.
After making the second gesture, the user switches to the first gesture and waves the arm while holding the first gesture. Having entered the first mode, the aircraft recognizes the first gesture from the captured user images and may adjust its flight attitude according to the position of the first gesture in each captured user image, so that the aircraft follows the gesture trajectory of the first gesture.
When the user wants the aircraft to stop following the first gesture, the user may switch back to the second gesture; after the aircraft recognizes the second gesture from a captured user image, it exits the first mode and no longer follows the gesture trajectory of the user's first gesture.
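The interaction above amounts to a small state machine: the second gesture toggles the first mode, and the first gesture drives the flight attitude only while the first mode is active. A minimal sketch, with the gesture names and position tuples as placeholders:

```python
# Toy state machine for the FIG. 18 interaction. "second" toggles the first
# mode (steps S920/S940); "first" is followed only inside the first mode
# (step S930). Gesture labels and positions are illustrative placeholders.

class GestureModeController:
    def __init__(self):
        self.in_first_mode = False
        self.followed_positions = []  # positions the aircraft actually followed

    def on_gesture(self, gesture, position=None):
        if gesture == "second":
            # Steps S920 / S940: toggle the first mode on or off.
            self.in_first_mode = not self.in_first_mode
        elif gesture == "first" and self.in_first_mode:
            # Step S930: follow the first gesture's trajectory.
            self.followed_positions.append(position)
        # A first gesture seen outside the first mode is ignored.
```

For example, a first gesture made before any second gesture is ignored, while the same first gesture made after a second gesture is followed, matching the flow of FIG. 18.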
Taking a five-finger open gesture as the second gesture and a fist gesture as the first gesture, FIG. 19 shows an example of the corresponding flight path control of the aircraft. As shown in FIG. 19:
In the initial state, before entering the first mode, the aircraft enters the first mode if it detects a five-finger open gesture in a captured user image.
After the aircraft has entered the first mode, if it detects a fist gesture in a captured user image, it may adjust its flight attitude according to the position of the fist gesture in the user image, so that the aircraft follows the gesture trajectory of the user's fist gesture.
After the aircraft has entered the first mode, if it again detects a five-finger open gesture in a user image, the aircraft exits the first mode; optionally, the aircraft may then hover at its current position.
It should be noted that the scheme described above, in which the user's second gesture triggers the aircraft to enter and exit the first mode so that the aircraft starts or stops adjusting its flight attitude according to the position of the user's first gesture in the user image, is merely optional.
An embodiment of the present application may also adjust the flight attitude directly according to the position of the first gesture in the user image whenever the first gesture is detected, so that the aircraft follows the gesture trajectory of the first gesture without a second gesture being used to start or stop the following. That is, when the user wants the aircraft to fly along the gesture trajectory of the first gesture, the user may simply wave the arm while holding the first gesture, without first making the second gesture; when the user wants the aircraft to stop following the first gesture, the user may simply stop making the first gesture.
Optionally, an embodiment of the present application may use a pre-trained detector for the first gesture and a pre-trained detector for the second gesture to identify the user gesture in the user image.
Optionally, for a first gesture such as a fist, an embodiment of the present application may collect a large number of gesture images of the first gesture and background images of the first gesture, extract features such as Haar features from each gesture image and each background image, and train a detector for the first gesture from these features using a machine training method such as cascade training. The detector for the first gesture can identify whether the first gesture is present in a captured user image and, if so, determine the position of the first gesture in the user image.
Optionally, for a second gesture such as a five-finger open hand, an embodiment of the present application may collect a large number of gesture images of the second gesture and background images of the second gesture, extract features such as Histogram of Oriented Gradients (HOG) features from each gesture image and each background image, and train a detector for the second gesture from these features using a machine training method such as a Support Vector Machine (SVM). The detector for the second gesture can identify whether the second gesture is present in a captured user image and, if so, determine the position of the second gesture in the user image.
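The train-then-detect flow described above can be sketched as follows. A real implementation would extract Haar or HOG features and train a cascade or SVM detector (for example with OpenCV); here a toy nearest-centroid classifier over hypothetical feature vectors stands in for the trained detector so that the overall data flow stays visible.

```python
# Sketch of training a per-gesture detector from gesture samples and
# background samples. The feature vectors are hypothetical stand-ins for
# Haar/HOG features; the classifier is a toy substitute for cascade/SVM.

def train_detector(gesture_samples, background_samples):
    """Return a detect(features) -> bool closure from labeled feature vectors."""
    def centroid(samples):
        n = len(samples)
        return [sum(col) / n for col in zip(*samples)]

    g_center = centroid(gesture_samples)
    b_center = centroid(background_samples)

    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def detect(features):
        # "Gesture present" if the features lie closer to the gesture centroid.
        return sq_dist(features, g_center) < sq_dist(features, b_center)

    return detect

# Hypothetical HOG-like feature vectors for the open-hand (second) gesture.
second_detector = train_detector(
    gesture_samples=[[0.9, 0.1], [0.8, 0.2]],
    background_samples=[[0.1, 0.9], [0.2, 0.8]],
)
```

One detector would be trained per standard gesture, and each captured user image would be run through all detectors, as in the module description later in this application.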
Optionally, after the first gesture is recognized in a captured user image and its region in the user image is determined, the position of the center point of that region in the user image may be taken as the position of the first gesture; alternatively, a rectangular bounding box enclosing the region may be defined in the user image, and the position of the center point of that bounding box taken as the position of the first gesture. The position of the second gesture in the user image may be determined in the same way. Optionally, the way of determining a gesture's position described in this paragraph is not limited to the case where a detector is used to recognize the user gesture; it is also applicable when the user gesture is recognized from skin regions or connected regions of the user image.
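Taking the center of the gesture's bounding box as its position, as described above, is a one-line computation; the (x, y, width, height) box format assumed here is illustrative.

```python
# Gesture position as the center of its bounding box, per the description
# above. The box is (x, y, width, height) in image pixels, origin top-left.

def gesture_position(bbox):
    x, y, w, h = bbox
    return (x + w / 2.0, y + h / 2.0)
```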
Optionally, since multiple users may be present on the ground at the same time, a captured user image may contain multiple users simultaneously making the first or second gesture, in which case the aircraft must determine which user's gesture to obey. Accordingly, an embodiment of the present application may designate a legitimate user who is authorized to control the flight of the aircraft. To perform flight control based on the gestures of the legitimate user, the facial features of the legitimate user may be preset; after acquiring a user image, the aircraft may determine whether the user image contains a face that matches the facial features of the legitimate user, and if so, perform flight control based on the first or second gesture of that legitimate user (the user whose face region in the user image matches the preset facial features).
Accordingly, before identifying the user gesture in a user image, an embodiment of the present application may first extract the face regions in the user image and determine whether any extracted face region matches the facial features of the legitimate user, and then identify the gesture of the legitimate user corresponding to the matching face region in the user image.
Optionally, FIG. 20 is a further flowchart of the flight path control method for an aircraft provided by an embodiment of the present application. The method may be applied to an aircraft, specifically to a processing chip of the aircraft. Referring to FIG. 20, the method may include:
Step S1000: Acquire a user image captured by the image capture device.
Step S1010: Determine whether the user image contains a face region that matches the facial features of the legitimate user; if not, perform step S1020; if so, perform step S1030.
Optionally, for each acquired user image, an embodiment of the present application may determine whether the user image contains the face region of the legitimate user.
Step S1020: End the flow.
If the current user image contains no face region matching the facial features of the legitimate user, it can be concluded that the current user image contains no portrait of the legitimate user, and flight path control of the aircraft cannot be performed based on the current user image. The current flow may then end, and the aircraft waits for the user image of the next frame, on which the processing of step S1010 is performed.
Step S1030: Identify the user gesture corresponding, in the user image, to the face region that matches the facial features of the legitimate user.
Optionally, after determining that the user image contains a face region matching the facial features of the legitimate user, an embodiment of the present application may extract the user portrait corresponding to that face region in the user image and identify the user gesture of that portrait, thereby identifying the gesture of the legitimate user in the user image.
Step S1040: If the identified user gesture is the predetermined first gesture, determine the position of the first gesture in the user image.
Step S1050: Adjust the flight attitude of the aircraft according to the position of the first gesture in the user image, so that the aircraft follows the gesture trajectory of the first gesture.
Obviously, the scheme shown in FIG. 20 of verifying, by face detection, whether the user image contains the legitimate user may also be applied to the method shown in FIG. 18. For each user image acquired in FIG. 18, it may be determined whether the image contains a face region matching the facial features of the legitimate user and, if so, the user gesture corresponding to that face region in the user image is identified and the subsequent processing is performed.
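The legitimate-user filtering of FIG. 20 can be sketched as follows. The "facial features" here are stand-in numeric vectors and the match function and threshold are illustrative only; a real system would use a trained face recognition model.

```python
# Sketch of the FIG. 20 flow: only the gesture belonging to the face that
# matches the legitimate user's preset facial features is acted on. The
# feature vectors, match metric, and threshold are hypothetical.

LEGIT_FACE_FEATURES = [0.3, 0.7, 0.5]
FACE_MATCH_THRESHOLD = 0.9

def face_match_degree(a, b):
    dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return 1.0 / (1.0 + dist)

def select_controlling_gesture(detected_people):
    """detected_people: list of (face_features, gesture) per person in the image.
    Returns the legitimate user's gesture, or None (step S1020: end the flow)."""
    for face_features, gesture in detected_people:
        if face_match_degree(face_features, LEGIT_FACE_FEATURES) > FACE_MATCH_THRESHOLD:
            return gesture  # step S1030: identify this user's gesture
    return None
```

When `select_controlling_gesture` returns None, the frame is discarded and the next frame is processed, matching steps S1010 and S1020.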
The flight path control method for an aircraft provided by the embodiments of the present application can control the flight path of the aircraft through the gesture trajectory of the user's first gesture, thereby achieving convenient flight path control of the aircraft.
The aircraft provided by the embodiments of the present application is introduced below; the description of the aircraft below and the description above may be cross-referenced.
The aircraft flight control device provided by the embodiments of the present application is introduced below from the perspective of the aircraft identifying user gestures in user images. The aircraft flight control device described below may be regarded as the architecture of functional modules that the processing chip of the aircraft needs in order to implement the aircraft flight control method provided by the embodiments of the present application; the description below and the description above may be cross-referenced.
FIG. 21 is a structural block diagram of an aircraft flight control device according to an embodiment of the present application. The device is applicable to an aircraft, specifically to a processing chip of the aircraft. Referring to FIG. 21, the aircraft flight control device may include:
an image acquisition module 100, configured to acquire a user image;
a gesture recognition module 200, configured to identify a user gesture in the user image;
a flight instruction determining module 300, configured to determine the flight instruction corresponding to the user gesture according to a predefined correspondence between user gestures and flight instructions; and
a flight control module 400, configured to control the flight of the aircraft according to the flight instruction.
Optionally, the gesture recognition module 200, configured to identify the user gesture in the user image, is specifically configured to:
identify the human skin regions in the user image according to a skin color detection algorithm;
extract the user gesture region from the human skin regions;
match the contour features of the user gesture region against preset contour features of each standard user gesture, and determine the standard user gesture whose contour features match those of the user gesture region most closely; and
take the determined standard user gesture as the user gesture identified from the user image.
Optionally, the gesture recognition module 200, configured to extract the user gesture region from the human skin regions, is specifically configured to:
remove the face region from the human skin regions to obtain the user gesture region.
Optionally, the gesture recognition module 200, configured to identify the user gesture in the user image, is specifically configured to:
extract the connected regions in the user image;
extract the contour features of each connected region; and
match the contour features of each connected region against preset contour features of each standard user gesture, determine the standard user gesture with the highest matching degree, and take that standard user gesture as the user gesture identified from the user image.
Optionally, the gesture recognition module 200, configured to extract the connected regions in the user image, is specifically configured to:
extract all connected regions in the user image, or extract the connected regions in the user image after the face region has been removed.
Optionally, FIG. 22 is another structural block diagram of the aircraft flight control device provided by an embodiment of the present application. As shown in FIG. 21 and FIG. 22, the aircraft flight control device may further include:
a training module 500, configured to: for each standard user gesture, pre-collect multiple user images containing the standard user gesture as image samples corresponding to that standard user gesture; and, for the image samples corresponding to each standard user gesture, train a detector for that standard user gesture according to a machine training method.
Correspondingly, the gesture recognition module 200, configured to identify the user gesture in the user image, is specifically configured to:
detect the user image with the detector of each standard user gesture, obtaining each detector's detection result for the user image; and
determine the user gesture identified from the user image according to the detection results for the user image.
Optionally, the image acquisition module 100, configured to acquire a user image, is specifically configured to:
acquire a user image captured by the image capture device of the aircraft;
or acquire a user image captured by a ground image capture device.
Optionally, if the image acquisition module 100 acquires user images captured by the image capture device of the aircraft, FIG. 23 shows a further structural block diagram of the aircraft flight control device. As shown in FIG. 21 and FIG. 23, the aircraft flight control device may further include:
an angle adjustment module 600, configured to adjust the image capture angle of the image capture device of the aircraft after the aircraft is controlled to fly according to the flight instruction, so that the user remains within the image capture range of the image capture device.
Optionally, if the acquired user image contains multiple user portraits, an embodiment of the present application needs to identify the portrait of the legitimate user, so as to perform flight control of the aircraft based on the user gestures of the legitimate user's portrait.
Correspondingly, the gesture recognition module 200, configured to extract the user gesture region from the human skin regions, is specifically configured to:
determine whether the user image contains a face region that matches the facial features of the legitimate user;
if the user image contains a face region matching the facial features of the legitimate user, extract the user portrait corresponding to that face region in the user image; and
identify the user gesture in the user portrait.
Optionally, the manner in which the gesture recognition module 200 identifies the user gesture in the user portrait may be as described above. Specifically, the gesture recognition module 200, configured to identify the user gesture in the user portrait, is specifically configured to:
identify the human skin regions in the user portrait, extract the user gesture region from the human skin regions, match the contour features of the user gesture region against preset contour features of each standard user gesture, and determine the standard user gesture whose contour features match those of the user gesture region most closely, obtaining the user gesture identified from the user portrait;
or extract the connected regions in the user portrait, match the contour features of each connected region against preset contour features of each standard user gesture, determine the standard user gesture with the highest matching degree, and take that standard user gesture as the user gesture identified from the user portrait;
or detect the user portrait with the detector of each standard user gesture, obtain each detector's detection result for the user portrait, and determine the user gesture identified from the user portrait according to the detection results.
Optionally, FIG. 24 is another structural block diagram of the aircraft flight control device provided by an embodiment of the present application. As shown in FIG. 21 and FIG. 24, the aircraft flight control device may further include:
a gesture position determining module 700, configured to determine the position of the first gesture in the user image if the identified user gesture is the predetermined first gesture;
where the flight control module 400 is further configured to adjust the flight attitude of the aircraft according to the position of the first gesture in the user image, so that the aircraft follows the gesture trajectory of the first gesture.
Optionally, if the image acquisition module 100 acquires user images captured by the image capture device of the aircraft, the flight control module 400, configured to adjust the flight attitude of the aircraft according to the position of the first gesture in the user image, is specifically configured to:
determine, according to the position, the horizontal movement distance by which the aircraft should move in the same horizontal direction as the gesture trajectory of the first gesture, and the vertical movement distance by which the aircraft should move in the same vertical direction as the gesture trajectory of the first gesture; and
adjust the flight attitude of the aircraft by the determined horizontal and vertical movement distances, so that the first gesture always remains within the image capture field of view of the image capture device.
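One way to realize the adjustment just described is to map the offset of the first gesture from the image center to horizontal and vertical movement distances that re-center the gesture in the camera's field of view. The image size and the pixel-to-meters gain below are hypothetical placeholders, not parameters from this application.

```python
# Sketch: convert the first gesture's image position into horizontal and
# vertical movement distances that keep it in the camera's field of view.
# IMG_W/IMG_H and METERS_PER_PIXEL are illustrative constants.

IMG_W, IMG_H = 640, 480
METERS_PER_PIXEL = 0.01  # hypothetical scale between pixel offset and movement

def movement_for_gesture(position):
    """position: (x, y) of the first gesture in the image, origin top-left.
    Returns (horizontal, vertical) movement in meters; positive horizontal
    means move right, positive vertical means climb."""
    x, y = position
    dx_px = x - IMG_W / 2.0
    dy_px = IMG_H / 2.0 - y  # image y grows downward; altitude grows upward
    return (dx_px * METERS_PER_PIXEL, dy_px * METERS_PER_PIXEL)
```

A gesture at the image center yields zero movement; a gesture offset toward the upper right yields a rightward and upward adjustment, so the aircraft tracks the gesture trajectory.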
The flight control module 400 is further configured to: if the identified user gesture is the predetermined second gesture and the aircraft has not entered the first mode, trigger the aircraft to enter the first mode, where the first mode instructs the aircraft to follow the gesture trajectory of the user's first gesture;
and, if the identified user gesture is the predetermined second gesture and the aircraft has already entered the first mode, trigger the aircraft to exit the first mode, instructing the aircraft to stop following the gesture trajectory of the user's first gesture.
The flight control module 400, configured to determine the position of the first gesture in the user image if the identified user gesture is the predetermined first gesture, is specifically configured to:
determine the position of the first gesture in the user image if the identified user gesture is the predetermined first gesture and the aircraft has already entered the first mode.
The gesture recognition module 200 is further configured to determine, before identifying the user gesture in the user image, whether the user image contains a face region that matches the facial features of the legitimate user.
The gesture recognition module 200, configured to identify the user gesture in the user image, is specifically configured to:
if the user image contains a face region matching the facial features of the legitimate user, identify the user gesture corresponding to that face region in the user image.
An embodiment of the present application further provides an aircraft, which may include an image capture device and a processing chip, where the processing chip may include the aircraft flight control device described above.
Optionally, the image capture device of the aircraft may capture user images; correspondingly, the image acquisition module of the processing chip may acquire the user images captured by the image capture device of the aircraft.
Optionally, the image acquisition module of the processing chip may also acquire user images captured by a ground image capture device.
可选的,本申请实施例还提供一种飞行器飞行控制系统,如图3所示,该飞行器飞行控制系统可以包括:地面图像采集装置和飞行器;Optionally, the embodiment of the present application further provides an aircraft flight control system. As shown in FIG. 3, the aircraft flight control system may include: a ground image acquisition device and an aircraft;
其中,地面图像采集装置,用于采集用户图像,并传输给飞行器;Wherein, the ground image acquisition device is configured to collect a user image and transmit it to the aircraft;
所述飞行器包括处理芯片；所述处理芯片，用于获取地面图像采集装置传输的用户图像；识别所述用户图像中的用户手势；根据预定义的各用户手势与飞行指令的对应关系，确定所述用户手势对应的飞行指令；根据所述飞行指令控制飞行器飞行。The aircraft includes a processing chip; the processing chip is configured to: acquire the user image transmitted by the ground image acquisition device; recognize the user gesture in the user image; determine the flight instruction corresponding to the user gesture according to the predefined correspondence between user gestures and flight instructions; and control the flight of the aircraft according to the flight instruction.
飞行器的处理芯片的具体功能实现可参照上文相应部分描述。For the specific functional implementation of the processing chip of the aircraft, refer to the corresponding description above.
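As an illustration of the control flow just described (acquire the user image, recognize the gesture, look up the corresponding flight instruction, control the flight), a minimal sketch follows. The gesture names, instruction strings, and the `recognize_gesture` stub are hypothetical placeholders, not part of this application:

```python
# Minimal sketch of the described control loop; all names below are
# illustrative placeholders rather than the application's actual values.

# Predefined correspondence between user gestures and flight instructions.
GESTURE_TO_INSTRUCTION = {
    "palm_up": "ascend",
    "palm_down": "descend",
    "fist": "hover",
}

def recognize_gesture(user_image):
    # Stand-in for the recognition step (skin-color, contour, or
    # detector-based, as described above).
    return user_image.get("gesture")

def control_flight(user_image):
    gesture = recognize_gesture(user_image)
    # Look up the flight instruction for the recognized gesture;
    # an unrecognized gesture produces no instruction.
    return GESTURE_TO_INSTRUCTION.get(gesture, "no_op")

print(control_flight({"gesture": "fist"}))  # prints "hover"
```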
可选的,本申请实施例还提供另一种飞行器飞行控制系统,如图12所示,该飞行器飞行控制系统可以包括:地面图像采集装置,地面处理芯片和飞行器;Optionally, the embodiment of the present application further provides another aircraft flight control system. As shown in FIG. 12, the aircraft flight control system may include: a ground image acquisition device, a ground processing chip, and an aircraft;
其中,地面图像采集装置,用于采集用户图像,并传输给地面处理芯片;Wherein, the ground image acquisition device is configured to collect user images and transmit to the ground processing chip;
地面处理芯片,用于获取地面图像采集装置传输的用户图像;识别所述用户图像中的用户手势;根据预定义的各用户手势与飞行指令的对应关系,确定所述用户手势对应的飞行指令;将所述飞行指令传输给飞行器;a ground processing chip, configured to acquire a user image transmitted by the ground image capturing device; identify a user gesture in the user image; and determine a flight instruction corresponding to the user gesture according to a predefined correspondence between each user gesture and a flight instruction; Transmitting the flight instruction to the aircraft;
可选的，地面处理芯片实现用户手势识别，及用户手势对应的飞行指令确定的具体实现方式，可参照上文描述的飞行器的处理芯片识别用户手势，及确定用户手势对应的飞行指令的具体内容。Optionally, for the specific way in which the ground processing chip recognizes user gestures and determines the corresponding flight instructions, refer to the description above of how the aircraft's processing chip recognizes user gestures and determines the flight instructions corresponding to them.
所述飞行器包括处理芯片;所述处理芯片,用于获取所述飞行指令,根据所述飞行指令控制飞行器飞行。The aircraft includes a processing chip; the processing chip is configured to acquire the flight instruction, and control aircraft flight according to the flight instruction.
本申请实施例可通过用户手势控制飞行器的飞行，飞行器的飞行控制操作极为便捷，可达到便捷的实现飞行器的飞行控制的目的。The embodiments of the present application can control the flight of an aircraft through user gestures, making flight control operations extremely convenient and thereby achieving the goal of convenient flight control of the aircraft.
本说明书中各个实施例采用递进的方式描述,每个实施例重点说明的都是与其他实施例的不同之处,各个实施例之间相同相似部分互相参见即可。对于实施例公开的装置而言,由于其与实施例公开的方法相对应,所以描述的比较简单,相关之处参见方法部分说明即可。The various embodiments in the present specification are described in a progressive manner, and each embodiment focuses on differences from other embodiments, and the same similar parts between the various embodiments may be referred to each other. For the device disclosed in the embodiment, since it corresponds to the method disclosed in the embodiment, the description is relatively simple, and the relevant parts can be referred to the method part.
专业人员还可以进一步意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、计算机软件或者二者的结合来实现,为了清楚地说明硬件和软件的可互换性,在上述说明中已经按照功能一般性地描述了各示例的组成及步骤。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。A person skilled in the art will further appreciate that the elements and algorithm steps of the various examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software or a combination of both, in order to clearly illustrate the hardware and software. Interchangeability, the composition and steps of the various examples have been generally described in terms of function in the above description. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the solution. A person skilled in the art can use different methods to implement the described functions for each particular application, but such implementation should not be considered to be beyond the scope of the present application.
结合本文中所公开的实施例描述的方法或算法的步骤可以直接用硬件、处理器执行的软件模块,或者二者的结合来实施。软件模块可以置于随机存储器(RAM)、内存、只读存储器(ROM)、电可编程ROM、电可擦除可编程ROM、寄存器、硬盘、可移动磁盘、CD-ROM、或技术领域内所公知的任意其它形式的存储介质中。The steps of a method or algorithm described in connection with the embodiments disclosed herein can be implemented directly in hardware, a software module executed by a processor, or a combination of both. The software module can be placed in random access memory (RAM), memory, read only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, removable disk, CD-ROM, or technical field. Any other form of storage medium known.
对所公开的实施例的上述说明，使本领域专业技术人员能够实现或使用本申请。对这些实施例的多种修改对本领域的专业技术人员来说将是显而易见的，本文中所定义的一般原理可以在不脱离本申请的核心思想或范围的情况下，在其它实施例中实现。因此，本申请将不会被限制于本文所示的这些实施例，而是要符合与本文所公开的原理和新颖特点相一致的最宽的范围。The above description of the disclosed embodiments enables those skilled in the art to make or use the present application. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the application. Therefore, the present application is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (32)

  1. 一种飞行器飞行控制方法,包括:An aircraft flight control method includes:
    获取用户图像;Obtaining a user image;
    识别所述用户图像中的用户手势;Identifying a user gesture in the user image;
    根据预定义的各用户手势与飞行指令的对应关系,确定所述用户手势对应的飞行指令;Determining, according to a predefined correspondence between each user gesture and a flight instruction, a flight instruction corresponding to the user gesture;
    根据所述飞行指令控制飞行器飞行。The aircraft is controlled to fly according to the flight instructions.
  2. 根据权利要求1所述的方法,所述识别所述用户图像中的用户手势包括:The method of claim 1, the identifying a user gesture in the user image comprising:
    根据肤色检测算法,识别所述用户图像中的人体皮肤区域;Identifying a human skin region in the user image according to a skin color detection algorithm;
    从人体皮肤区域中提取用户手势区域;Extracting a user gesture area from a human skin area;
    将用户手势区域的轮廓特征,与预置的各标准用户手势的轮廓特征进行匹配,确定与所述用户手势区域的轮廓特征匹配度最高的标准用户手势;Matching a contour feature of the user gesture area with a preset contour feature of each standard user gesture to determine a standard user gesture with the highest degree of matching with the contour feature of the user gesture area;
    将所确定的标准用户手势作为从所述用户图像中识别的用户手势。The determined standard user gesture is taken as the user gesture identified from the user image.
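The contour-matching step of claim 2 can be sketched minimally, under the assumption that each gesture contour has already been reduced to a small numeric feature vector; the feature values and gesture names below are hypothetical, and a real system would extract such features from the segmented skin region:

```python
import math

# Hypothetical contour features: each standard gesture is summarized as a
# small feature vector (e.g. Hu-moment-like values). The numbers are
# illustrative placeholders only.
STANDARD_GESTURES = {
    "open_palm": [0.9, 0.1, 0.4],
    "fist":      [0.2, 0.8, 0.5],
    "v_sign":    [0.6, 0.3, 0.9],
}

def match_gesture(features):
    # Take the standard gesture whose feature vector is closest to the
    # extracted features, i.e. the one with the highest matching degree.
    return min(
        STANDARD_GESTURES,
        key=lambda name: math.dist(features, STANDARD_GESTURES[name]),
    )

print(match_gesture([0.85, 0.15, 0.42]))  # prints "open_palm"
```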
  3. 根据权利要求2所述的方法,所述从人体皮肤区域中提取用户手势区域包括:The method of claim 2, the extracting the user gesture area from the human skin area comprises:
    去除所述人体皮肤区域中的人脸区域,得到用户手势区域。The face area in the human skin area is removed to obtain a user gesture area.
  4. 根据权利要求1所述的方法,所述识别所述用户图像中的用户手势包括:The method of claim 1, the identifying a user gesture in the user image comprising:
    提取所述用户图像中的连通区域;Extracting a connected area in the user image;
    提取各连通区域的轮廓特征;Extracting contour features of each connected area;
    将各连通区域的轮廓特征，与预置的各标准用户手势的轮廓特征进行匹配，确定匹配度最高的标准用户手势，将匹配度最高的标准用户手势，作为从所述用户图像中识别的用户手势。Matching the contour features of each connected region against the preset contour features of each standard user gesture, determining the standard user gesture with the highest matching degree, and taking that standard user gesture as the user gesture recognized from the user image.
  5. 根据权利要求4所述的方法,所述提取所述用户图像中的连通区域包括:The method of claim 4, the extracting the connected area in the user image comprises:
    提取用户图像中的所有连通区域,或,提取去除人脸区域后的用户图像中的连通区域。Extract all connected areas in the user image, or extract connected areas in the user image after removing the face area.
  6. 根据权利要求1所述的方法,所述方法还包括:The method of claim 1 further comprising:
    对于各标准用户手势,预先采集含有标准用户手势的多个用户图像,作为各标准用户手势对应的图像样本;For each standard user gesture, multiple user images containing standard user gestures are pre-acquired as image samples corresponding to each standard user gesture;
    对于各标准用户手势对应的图像样本,根据机器训练方法,训练各标准用户手势的检测器;For each image sample corresponding to the standard user gesture, according to the machine training method, the detector of each standard user gesture is trained;
    所述识别所述用户图像中的用户手势包括:The identifying a user gesture in the user image includes:
    使用各标准用户手势的检测器,分别对所述用户图像进行检测,得到各标准用户手势的检测器对所述用户图像的检测结果;Using the detectors of the standard user gestures, respectively detecting the user images, and obtaining detection results of the user images by the detectors of the standard user gestures;
    根据所述用户图像的检测结果,确定从所述用户图像中识别的用户手势。A user gesture recognized from the user image is determined based on a detection result of the user image.
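The detector bank of claim 6 can be sketched as follows. The detectors here are stubbed scoring functions, and the 0.5 acceptance threshold is an assumed parameter; the claim fixes neither, only that one trained detector per standard gesture is run over the image and the results combined:

```python
# Sketch of the claim-6 detector bank: run one detector per standard user
# gesture and keep the highest-scoring hit. Detector internals are stubbed;
# in practice each would be trained on image samples of its gesture.

def run_detectors(image, detectors):
    results = {name: det(image) for name, det in detectors.items()}
    best = max(results, key=results.get)
    # 0.5 is an assumed acceptance threshold, not specified by the claim.
    return best if results[best] > 0.5 else None

# Stub detectors returning a confidence score for a toy "image" value.
detectors = {
    "wave":  lambda img: 0.9 if img == "wave_img" else 0.1,
    "point": lambda img: 0.8 if img == "point_img" else 0.2,
}
print(run_detectors("wave_img", detectors))  # prints "wave"
```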
  7. 根据权利要求1至6中任一项所述的方法,所述获取用户图像包括:The method according to any one of claims 1 to 6, wherein the obtaining a user image comprises:
    获取所述飞行器的图像采集装置所采集的用户图像;Obtaining a user image collected by the image acquisition device of the aircraft;
    或者,获取地面图像采集装置所采集的用户图像。Alternatively, the user image collected by the ground image acquisition device is acquired.
  8. 根据权利要求7所述的方法,若获取用户图像包括:获取所述飞行器的图像采集装置所采集的用户图像;所述方法还包括:The method of claim 7, wherein the acquiring the user image comprises: acquiring a user image collected by the image capturing device of the aircraft; the method further comprising:
    在根据所述飞行指令控制飞行器飞行后,调整所述飞行器的图像采集装置的图像采集角度,使得用户处于所述图像采集装置的图像采集范围内。After controlling the flight of the aircraft according to the flight instruction, the image acquisition angle of the image acquisition device of the aircraft is adjusted such that the user is within the image acquisition range of the image acquisition device.
  9. 根据权利要求1所述的方法,所述识别所述用户图像中的用户手势包括:The method of claim 1, the identifying a user gesture in the user image comprising:
    判断所述用户图像中是否存在与合法用户的人脸特征相匹配的人脸区域;Determining whether there is a face area in the user image that matches a facial feature of the legal user;
    若所述用户图像中存在与合法用户的人脸特征相匹配的人脸区域,提取所述用户图像中与合法用户的人脸特征相匹配的人脸区域所对应的用户人像;And if there is a face region in the user image that matches a face feature of the legal user, extracting a user portrait corresponding to the face region of the user image that matches the face feature of the legal user;
    识别所述用户人像中的用户手势。A user gesture in the user portrait is identified.
  10. 根据权利要求9所述的方法,其特征在于,所述识别所述用户人像中的用户手势包括:The method according to claim 9, wherein said identifying a user gesture in said user portrait comprises:
    识别所述用户人像中的人体皮肤区域，从人体皮肤区域中提取用户手势区域，将所述用户手势区域的轮廓特征，与预置的各标准用户手势的轮廓特征进行匹配，确定与所述用户手势区域的轮廓特征匹配度最高的标准用户手势，得到从所述用户人像中识别的用户手势；Recognizing the human skin region in the user portrait, extracting the user gesture region from the human skin region, matching the contour features of the user gesture region against the preset contour features of each standard user gesture, and determining the standard user gesture whose contour features have the highest matching degree with those of the user gesture region, to obtain the user gesture recognized from the user portrait;
    或，提取所述用户人像中的连通区域，将各连通区域的轮廓特征，与预置的各标准用户手势的轮廓特征进行匹配，确定匹配度最高的标准用户手势，将匹配度最高的标准用户手势，作为从所述用户人像中识别的用户手势；Or, extracting the connected regions in the user portrait, matching the contour features of each connected region against the preset contour features of each standard user gesture, determining the standard user gesture with the highest matching degree, and taking it as the user gesture recognized from the user portrait;
    或，使用各标准用户手势的检测器，分别对所述用户人像进行检测，得到各标准用户手势的检测器对所述用户人像的检测结果，根据所述用户人像的检测结果，确定从所述用户人像中识别的用户手势。Or, using the detector of each standard user gesture to detect the user portrait separately, obtaining each detector's detection result for the user portrait, and determining, according to those detection results, the user gesture recognized from the user portrait.
  11. 根据权利要求1所述的方法,所述识别所述用户图像中的用户手势之后,所述方法还包括:The method of claim 1, after the identifying a user gesture in the user image, the method further comprising:
    若所述用户手势为预定的第一手势,确定所述第一手势在所述用户图像中的位置;Determining a position of the first gesture in the user image if the user gesture is a predetermined first gesture;
    根据所述第一手势在所述用户图像中的位置,调整飞行器的飞行姿态,以使飞行器跟随所述第一手势的手势轨迹飞行。Adjusting the flight attitude of the aircraft according to the position of the first gesture in the user image to cause the aircraft to follow the gesture trajectory of the first gesture.
  12. 根据权利要求11所述的方法，若所述获取用户图像包括：获取所述飞行器的图像采集装置所采集的用户图像；所述根据所述第一手势在所述用户图像中的位置，调整飞行器的飞行姿态包括：The method according to claim 11, wherein, if acquiring the user image comprises acquiring the user image collected by the image acquisition device of the aircraft, adjusting the flight attitude of the aircraft according to the position of the first gesture in the user image comprises:
    根据所述位置，确定飞行器在与第一手势的手势轨迹相同的水平运动方向上，调整的水平移动距离；及根据所述位置，确定飞行器在与第一手势的手势轨迹相同的垂直运动方向上，调整的垂直移动距离；determining, according to the position, the horizontal movement distance by which the aircraft is adjusted in the horizontal direction of motion matching the gesture trajectory of the first gesture; and determining, according to the position, the vertical movement distance by which the aircraft is adjusted in the vertical direction of motion matching the gesture trajectory of the first gesture;
    以所确定的水平移动距离和垂直移动距离调整飞行器的飞行姿态,使得第一手势始终位于所述图像采集装置的图像采集视野范围内。The flight attitude of the aircraft is adjusted with the determined horizontal movement distance and vertical movement distance such that the first gesture is always within the image acquisition field of view of the image acquisition device.
  13. 根据权利要求12所述的方法,若所述获取用户图像包括:获取所述飞行器的图像采集装置所采集的用户图像;The method according to claim 12, wherein the acquiring the user image comprises: acquiring a user image collected by the image capturing device of the aircraft;
    则所述根据所述位置，确定飞行器在与第一手势的手势轨迹相同的水平运动方向上，调整的水平移动距离包括：then determining, according to the position, the horizontal movement distance by which the aircraft is adjusted in the horizontal direction of motion matching the gesture trajectory of the first gesture comprises:
    以所述图像采集装置在横轴方向上的视线范围构建横轴坐标,所述横轴坐标的原点为所述图像采集装置在横轴方向上的视线中点;Constructing a horizontal axis coordinate with a line of sight range of the image capturing device in a horizontal axis direction, where an origin of the horizontal axis coordinate is a midpoint of a line of sight of the image capturing device in a horizontal axis direction;
    确定所述位置在横轴坐标上的投影点,并确定所述投影点在所述横轴坐标上的坐标;Determining a projection point of the position on a horizontal axis coordinate, and determining a coordinate of the projection point on the horizontal axis coordinate;
    根据所述横轴坐标的长度，飞行器与地面的垂直高度，所述图像采集装置的中心线和垂直方向的角度，所述图像采集装置的横轴方向视角的半角，及所述投影点在所述横轴坐标上的坐标，确定飞行器的水平移动距离。determining the horizontal movement distance of the aircraft according to the length of the horizontal-axis coordinate, the vertical height of the aircraft above the ground, the angle between the centerline of the image acquisition device and the vertical direction, the half-angle of the horizontal-axis field of view of the image acquisition device, and the coordinate of the projection point on the horizontal-axis coordinate.
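One plausible reading of the claim-13 geometry, offered as a hedged sketch rather than the patent's exact formula: the gesture's horizontal-axis coordinate (origin at the view midpoint) is scaled by the ground half-width seen by a camera at the aircraft's height with the stated centerline tilt and half view angle. All parameter values in the example are hypothetical:

```python
import math

def horizontal_move(x, half_len, height, tilt_rad, half_fov_rad):
    """One plausible interpretation of the claim-13 quantities (an
    assumption, not the patent's closed form): scale the projection
    coordinate x (origin at the view midpoint, half-length half_len) by
    the ground half-width seen from `height` with centerline tilted
    `tilt_rad` from vertical and horizontal half view angle half_fov_rad."""
    slant = height / math.cos(tilt_rad)        # distance along centerline to the ground
    ground_half_width = slant * math.tan(half_fov_rad)
    return (x / half_len) * ground_half_width

# Gesture at the right edge of view: move by the full ground half-width.
print(horizontal_move(320, 320, 10.0, math.radians(30), math.radians(35)))
```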
  14. 根据权利要求12所述的方法,若所述获取用户图像包括:获取所述飞行器的图像采集装置所采集的用户图像;The method according to claim 12, wherein the acquiring the user image comprises: acquiring a user image collected by the image capturing device of the aircraft;
    则所述根据所述位置，确定飞行器在与第一手势的手势轨迹相同的垂直运动方向上，调整的垂直移动距离包括：then determining, according to the position, the vertical movement distance by which the aircraft is adjusted in the vertical direction of motion matching the gesture trajectory of the first gesture comprises:
    以所述图像采集装置在纵轴方向上的视线范围构建纵轴坐标,所述纵轴坐标的原点为所述图像采集装置在纵轴方向上的视线中点;Constructing a longitudinal axis coordinate with a line of sight range of the image capturing device in a longitudinal axis direction, the origin of the vertical axis coordinate being a midpoint of a line of sight of the image capturing device in a longitudinal axis direction;
    确定所述位置在纵轴坐标上的投影点,并确定该投影点在所述纵轴坐标上的坐标;Determining a projection point of the position on a coordinate of the longitudinal axis, and determining a coordinate of the projection point on the coordinate of the longitudinal axis;
    根据所述纵轴坐标的高度，飞行器与地面的垂直高度，所述图像采集装置纵轴方向的半视角，所述图像采集装置的倾角与所述半视角的角度差，及该投影点在所述纵轴坐标上的坐标，确定飞行器的垂直移动距离。determining the vertical movement distance of the aircraft according to the height of the vertical-axis coordinate, the vertical height of the aircraft above the ground, the half view angle of the image acquisition device in the vertical-axis direction, the angle difference between the tilt angle of the image acquisition device and the half view angle, and the coordinate of the projection point on the vertical-axis coordinate.
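Claim 14 lists the quantities involved but not the closed form; the sketch below is one hedged interpretation (an assumption, not the patent's formula): interpolate the viewing-ray angle across the vertical field of view from the gesture's vertical-axis coordinate, then convert the angular offset from the centerline into a displacement at the aircraft's height:

```python
import math

def vertical_move(y, half_len, height, half_fov_rad, tilt_minus_half_fov_rad):
    """Hedged sketch of the claim-14 quantities. y is the projection
    coordinate on the vertical axis (origin at the view midpoint,
    half-length half_len); tilt_minus_half_fov_rad is the stated angle
    difference between the camera tilt and the half view angle."""
    # Ray angle from vertical: lower view edge plus the fraction of the
    # vertical field of view indicated by y (y in [-half_len, +half_len]).
    ray_angle = tilt_minus_half_fov_rad + (y / half_len + 1.0) * half_fov_rad
    centerline_angle = tilt_minus_half_fov_rad + half_fov_rad
    # Displacement that would re-center the gesture at the given height.
    return height * (math.tan(ray_angle) - math.tan(centerline_angle))
```

With y = 0 (gesture at the view center) the displacement is zero, as expected under this interpretation.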
  15. 根据权利要求11所述的方法,所述识别所述用户图像中的用户手势包括:The method of claim 11, the identifying a user gesture in the user image comprising:
    通过预先训练的第一手势的检测器,对所述用户图像进行检测,判断所述用户图像中是否存在第一手势;And detecting, by the detector of the first gesture that is pre-trained, the user image, and determining whether the first gesture exists in the user image;
    或，根据皮肤检测算法，识别用户图像中的人体皮肤区域，从人体皮肤区域中去除人脸区域，得到用户手势区域，将用户手势区域的轮廓特征，与预定的第一手势的轮廓特征进行匹配，通过匹配度判断所述用户图像中是否存在第一手势；Or, according to the skin detection algorithm, recognizing the human skin region in the user image, removing the face region from the human skin region to obtain the user gesture region, matching the contour features of the user gesture region against the contour features of the predetermined first gesture, and judging from the matching degree whether the first gesture exists in the user image;
    或,提取用户图像中的连通区域,将各连通区域的轮廓特征与预定的第一手势的轮廓特征进行匹配,通过匹配度判断所述用户图像中是否存在第一手势。Or, extracting the connected area in the user image, matching the contour feature of each connected area with the contour feature of the predetermined first gesture, and determining whether the first gesture exists in the user image by the matching degree.
  16. 根据权利要求15所述的方法，所述所识别的用户手势为预定的第一手势包括：The method of claim 15, wherein the identified user gesture being the predetermined first gesture comprises:
    通过预先训练的第一手势的检测器,识别到用户图像中存在第一手势;Identifying, by the detector of the pre-trained first gesture, that the first gesture exists in the user image;
    或，用户图像中用户手势区域的轮廓特征，与预定的第一手势的轮廓特征的匹配度高于预定第一匹配度，则识别到用户图像中存在第一手势；Or, recognizing that the first gesture exists in the user image if the matching degree between the contour features of the user gesture region in the user image and the contour features of the predetermined first gesture is higher than a predetermined first matching degree;
    或，用户图像中存在与第一手势的轮廓特征的匹配度高于预定第二匹配度的连通区域，则识别到用户图像中存在第一手势。Or, recognizing that the first gesture exists in the user image if the user image contains a connected region whose matching degree with the contour features of the first gesture is higher than a predetermined second matching degree.
  17. 根据权利要求15或16所述的方法,所述确定所述第一手势在所述用户图像中的位置包括:The method according to claim 15 or 16, wherein determining the location of the first gesture in the user image comprises:
    确定所述用户图像中第一手势对应的区域,以该区域的中心点在所述用户图像中的位置,作为第一手势在所述用户图像中的位置;Determining, in the user image, an area corresponding to the first gesture, where the center point of the area is in the user image, as a position of the first gesture in the user image;
    或，确定所述用户图像中第一手势的区域，定义边缘与该区域对应的矩形框，以该矩形框的中心点在所述用户图像中的位置，作为第一手势在所述用户图像中的位置。Or, determining the region of the first gesture in the user image, defining a rectangular box whose edges correspond to that region, and taking the position of the center point of the rectangular box in the user image as the position of the first gesture in the user image.
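The second option in claim 17 (take the center point of a rectangular box bounding the gesture region as the gesture's position) can be sketched directly; here the region is represented, for illustration, as a list of pixel coordinates:

```python
def gesture_position(region_pixels):
    """Claim-17 style position: the center of the axis-aligned bounding
    box of the gesture region's pixels, taken as the gesture's position
    in the user image. region_pixels is a list of (x, y) tuples."""
    xs = [x for x, _ in region_pixels]
    ys = [y for _, y in region_pixels]
    return ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)

print(gesture_position([(10, 20), (30, 40), (20, 25)]))  # prints (20.0, 30.0)
```

Tracking this center point across successive user images yields the gesture trajectory that the aircraft follows in the first mode.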
  18. 根据权利要求11所述的方法,所述识别所述用户图像中的用户手势之后,所述方法还包括:The method of claim 11, after the identifying a user gesture in the user image, the method further comprises:
    若所识别的用户手势为预定的第二手势，且飞行器当前未进入第一模式，触发所述飞行器进入第一模式，所述第一模式用于指示飞行器跟随用户的第一手势的手势轨迹飞行；If the identified user gesture is the predetermined second gesture and the aircraft has not currently entered the first mode, triggering the aircraft to enter the first mode, where the first mode is used to instruct the aircraft to fly following the gesture trajectory of the user's first gesture;
    若所识别的用户手势为预定的第二手势，且飞行器当前已进入第一模式，触发所述飞行器退出第一模式，指示所述飞行器取消跟随用户的第一手势的手势轨迹飞行；If the identified user gesture is the predetermined second gesture and the aircraft has currently entered the first mode, triggering the aircraft to exit the first mode, instructing the aircraft to stop following the gesture trajectory of the user's first gesture;
    所述若所识别的用户手势为预定的第一手势,确定所述第一手势在所述用户图像中的位置包括:If the identified user gesture is a predetermined first gesture, determining the location of the first gesture in the user image includes:
    若所识别的用户手势为预定的第一手势,且飞行器当前已进入第一模式,确定所述第一手势在所述用户图像中的位置。If the identified user gesture is a predetermined first gesture and the aircraft has currently entered the first mode, determining the location of the first gesture in the user image.
  19. 根据权利要求18所述的方法,所述识别所述用户图像中的用户手势包括:The method of claim 18, the identifying a user gesture in the user image comprising:
    分别通过预先训练的第一手势的检测器和第二手势的检测器,对所述用户图像进行检测,以识别所述用户图像中的用户手势;The user image is detected by a detector of the first gesture that is pre-trained and a detector of the second gesture to identify a user gesture in the user image;
    或，根据皮肤检测算法，识别用户图像中的人体皮肤区域，从人体皮肤区域中去除人脸区域，得到用户手势区域，将用户手势区域的轮廓特征，分别与预定的第一手势的轮廓特征，和预定的第二手势的轮廓特征进行匹配，以识别所述用户图像中的用户手势；Or, according to the skin detection algorithm, recognizing the human skin region in the user image, removing the face region from the human skin region to obtain the user gesture region, and matching the contour features of the user gesture region against the contour features of the predetermined first gesture and the contour features of the predetermined second gesture respectively, to recognize the user gesture in the user image;
    或，提取用户图像中的连通区域，将各连通区域的轮廓特征，分别与预定的第一手势的轮廓特征，和预定的第二手势的轮廓特征进行匹配，以识别所述用户图像中的用户手势。Or, extracting the connected regions in the user image, and matching the contour features of each connected region against the contour features of the predetermined first gesture and the contour features of the predetermined second gesture respectively, to recognize the user gesture in the user image.
  20. 根据权利要求11或18所述的方法,所述方法还包括:The method of claim 11 or 18, further comprising:
    判断所述用户图像中是否存在与合法用户的人脸特征相匹配的人脸区域;Determining whether there is a face area in the user image that matches a facial feature of the legal user;
    所述识别所述用户图像中的用户手势包括:The identifying a user gesture in the user image includes:
    若所述用户图像中存在与合法用户的人脸特征相匹配的人脸区域，对与合法用户的人脸特征相匹配的人脸区域在用户图像中对应的用户手势进行识别。If the user image contains a face region matching the facial features of the legitimate user, recognizing the user gesture that corresponds, in the user image, to that matching face region.
  21. 一种飞行器飞行控制装置,应用于飞行器,所述飞行器飞行控制装置包括:An aircraft flight control device is applied to an aircraft, the aircraft flight control device comprising:
    图像获取模块,用于获取用户图像;An image acquisition module, configured to acquire a user image;
    手势识别模块,用于识别所述用户图像中的用户手势;a gesture recognition module, configured to identify a user gesture in the user image;
    飞行指令确定模块,用于根据预定义的各用户手势与飞行指令的对应关系,确定所述用户手势对应的飞行指令;a flight instruction determining module, configured to determine a flight instruction corresponding to the user gesture according to a predefined correspondence between each user gesture and a flight instruction;
    飞行控制模块,用于根据所述飞行指令控制飞行器飞行。A flight control module is configured to control aircraft flight according to the flight instruction.
  22. 根据权利要求21所述的飞行器飞行控制装置,还包括:The aircraft flight control device according to claim 21, further comprising:
    训练模块，用于对于各标准用户手势，预先采集含有标准用户手势的多个用户图像，作为各标准用户手势对应的图像样本；对于各标准用户手势对应的图像样本，根据机器训练方法，训练各标准用户手势的检测器；a training module, configured to pre-collect, for each standard user gesture, a plurality of user images containing the standard user gesture as image samples corresponding to that standard user gesture, and to train, from the image samples corresponding to each standard user gesture, a detector for that gesture according to a machine training method;
    所述手势识别模块,用于识别所述用户图像中的用户手势,具体包括:The gesture recognition module is configured to identify a user gesture in the user image, and specifically includes:
    使用各标准用户手势的检测器,分别对所述用户图像进行检测,得到各标准用户手势的检测器对所述用户图像的检测结果;Using the detectors of the standard user gestures, respectively detecting the user images, and obtaining detection results of the user images by the detectors of the standard user gestures;
    根据所述用户图像的检测结果,确定从所述用户图像中识别的用户手势。A user gesture recognized from the user image is determined based on a detection result of the user image.
  23. 根据权利要求21所述的飞行器飞行控制装置,所述手势识别模块,用于识别所述用户图像中的用户手势,包括:The aircraft flight control device according to claim 21, wherein the gesture recognition module is configured to identify a user gesture in the user image, including:
    判断所述用户图像中是否存在与合法用户的人脸特征相匹配的人脸区域;Determining whether there is a face area in the user image that matches a facial feature of the legal user;
    若所述用户图像中存在与合法用户的人脸特征相匹配的人脸区域,提取所述用户图像中与合法用户的人脸特征相匹配的人脸区域所对应的用户人像;And if there is a face region in the user image that matches a face feature of the legal user, extracting a user portrait corresponding to the face region of the user image that matches the face feature of the legal user;
    识别所述用户人像中的用户手势。A user gesture in the user portrait is identified.
  24. 根据权利要求21所述的飞行器飞行控制装置，所述飞行器飞行控制装置还包括手势位置确定模块，用于若所识别的用户手势为预定的第一手势，确定所述第一手势在所述用户图像中的位置；The aircraft flight control device according to claim 21, further comprising a gesture position determining module, configured to determine the position of the first gesture in the user image if the identified user gesture is a predetermined first gesture;
    所述飞行控制模块,还用于根据所述第一手势在所述用户图像中的位置,调整飞行器的飞行姿态,以使飞行器跟随所述第一手势的手势轨迹飞行。The flight control module is further configured to adjust a flight attitude of the aircraft according to a position of the first gesture in the user image, so that the aircraft follows the gesture trajectory of the first gesture to fly.
  25. 根据权利要求24所述的飞行器飞行控制装置,若所述图像获取模块,用于获取所述飞行器的图像采集装置所采集的用户图像;The aircraft flight control device according to claim 24, wherein the image acquisition module is configured to acquire a user image collected by an image acquisition device of the aircraft;
    所述飞行控制模块,用于根据所述第一手势在所述用户图像中的位置,调整飞行器的飞行姿态,具体包括:The flight control module is configured to adjust a flight attitude of the aircraft according to a position of the first gesture in the user image, specifically:
    根据所述位置，确定飞行器在与第一手势的手势轨迹相同的水平运动方向上，调整的水平移动距离；及根据所述位置，确定飞行器在与第一手势的手势轨迹相同的垂直运动方向，调整的垂直移动距离；determining, according to the position, the horizontal movement distance by which the aircraft is adjusted in the horizontal direction of motion matching the gesture trajectory of the first gesture; and determining, according to the position, the vertical movement distance by which the aircraft is adjusted in the vertical direction of motion matching the gesture trajectory of the first gesture;
    以所确定的水平移动距离和垂直移动距离调整飞行器的飞行姿态,使得第一手势始终位于所述图像采集装置的图像采集视野范围内。The flight attitude of the aircraft is adjusted with the determined horizontal movement distance and vertical movement distance such that the first gesture is always within the image acquisition field of view of the image acquisition device.
  26. 根据权利要求24所述的飞行器飞行控制装置,所述飞行控制模块,还用于:The aircraft flight control device according to claim 24, wherein the flight control module is further configured to:
    若所识别的用户手势为预定的第二手势，且飞行器当前未进入第一模式，触发所述飞行器进入第一模式，所述第一模式用于指示飞行器跟随用户的第一手势的手势轨迹飞行；If the identified user gesture is the predetermined second gesture and the aircraft has not currently entered the first mode, triggering the aircraft to enter the first mode, where the first mode is used to instruct the aircraft to fly following the gesture trajectory of the user's first gesture;
    若所识别的用户手势为预定的第二手势，且飞行器当前已进入第一模式，触发所述飞行器退出第一模式，指示所述飞行器取消跟随用户的第一手势的手势轨迹飞行；If the identified user gesture is the predetermined second gesture and the aircraft has currently entered the first mode, triggering the aircraft to exit the first mode, instructing the aircraft to stop following the gesture trajectory of the user's first gesture;
    所述飞行控制模块,用于若所识别的用户手势为预定的第一手势,确定所述第一手势在所述用户图像中的位置,包括:The flight control module is configured to determine a location of the first gesture in the user image if the identified user gesture is a predetermined first gesture, including:
    若所识别的用户手势为预定的第一手势,且飞行器当前已进入第一模式,确定所述第一手势在所述用户图像中的位置。If the identified user gesture is a predetermined first gesture and the aircraft has currently entered the first mode, determining the location of the first gesture in the user image.
  27. 根据权利要求24所述的飞行器飞行控制装置，所述手势识别模块，还用于：在识别所述用户图像中的用户手势之前，判断所述用户图像中是否存在与合法用户的人脸特征相匹配的人脸区域；The aircraft flight control device according to claim 24, wherein the gesture recognition module is further configured to: before recognizing the user gesture in the user image, determine whether the user image contains a face region matching the facial features of the legitimate user;
    所述手势识别模块,用于识别所述用户图像中的用户手势,包括:The gesture recognition module is configured to identify a user gesture in the user image, including:
    若所述用户图像中存在与合法用户的人脸特征相匹配的人脸区域，对与合法用户的人脸特征相匹配的人脸区域在用户图像中对应的用户手势进行识别。If the user image contains a face region matching the facial features of the legitimate user, recognizing the user gesture that corresponds, in the user image, to that matching face region.
  28. 一种飞行器,其特征在于,包括:图像采集装置和处理芯片;所述处理芯片执行如权利要求1至20项中任一项所述的飞行器飞行控制方法。An aircraft characterized by comprising: an image acquisition device and a processing chip; the processing chip performing the aircraft flight control method according to any one of claims 1 to 20.
  29. 一种飞行器飞行控制系统,其特征在于,包括:An aircraft flight control system, comprising:
    地面图像采集装置和飞行器;Ground image acquisition device and aircraft;
    所述地面图像采集装置,用于采集用户图像,并传输给所述飞行器;The ground image acquisition device is configured to collect a user image and transmit it to the aircraft;
    所述飞行器包括处理芯片;所述处理芯片用于:The aircraft includes a processing chip; the processing chip is used to:
    获取地面图像采集装置传输的用户图像；识别所述用户图像中的用户手势；根据预定义的各用户手势与飞行指令的对应关系，确定所述用户手势对应的飞行指令；根据所述飞行指令控制飞行器飞行。acquire the user image transmitted by the ground image acquisition device; recognize the user gesture in the user image; determine the flight instruction corresponding to the user gesture according to the predefined correspondence between user gestures and flight instructions; and control the flight of the aircraft according to the flight instruction.
  30. 一种飞行器飞行控制系统,其特征在于,包括:地面图像采集装置,地面处理芯片和飞行器;An aircraft flight control system, comprising: a ground image acquisition device, a ground processing chip and an aircraft;
    所述地面图像采集装置,用于采集用户图像,并传输给地面处理芯片;The ground image acquisition device is configured to collect a user image and transmit the image to a ground processing chip;
    所述地面处理芯片用于:The ground processing chip is used to:
    获取地面图像采集装置传输的用户图像；识别所述用户图像中的用户手势；根据预定义的各用户手势与飞行指令的对应关系，确定所述用户手势对应的飞行指令；将所述飞行指令传输给所述飞行器；acquire the user image transmitted by the ground image acquisition device; recognize the user gesture in the user image; determine the flight instruction corresponding to the user gesture according to the predefined correspondence between user gestures and flight instructions; and transmit the flight instruction to the aircraft;
    所述飞行器包括处理芯片;所述处理芯片,用于获取所述飞行指令,根据所述飞行指令控制飞行器飞行。The aircraft includes a processing chip; the processing chip is configured to acquire the flight instruction, and control aircraft flight according to the flight instruction.
  31. 一种计算机可读存储介质,包括指令,当所述指令在计算机上运行时,所述计算机执行上述权利要求1至权利要求20所述的方法。A computer readable storage medium comprising instructions that, when executed on a computer, perform the method of claims 1 to 20.
  32. 一种包含指令的计算机程序产品,当所述计算机程序产品在计算机上运行时,所述计算机执行上述权利要求1至权利要求20所述的方法。A computer program product comprising instructions for performing the method of claims 1 to 20 when said computer program product is run on a computer.
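The control loop shared by claims 29 and 30 can be sketched as a single step in Python: recognize the gesture in the user image, look up its flight instruction in a predefined correspondence table, and hand the instruction to the flight controller. The table entries and function names below are illustrative assumptions; the claims do not fix any particular gestures or instructions.

```python
# Hypothetical sketch of one iteration of the claimed control loop.
# Predefined correspondence between user gestures and flight
# instructions (illustrative values only).
GESTURE_TO_INSTRUCTION = {
    "palm_up": "ascend",
    "palm_down": "descend",
    "fist": "hover",
    "wave_left": "turn_left",
}

def control_step(user_image, recognize_gesture, send_instruction):
    """Identify the gesture in the image, look up its flight
    instruction, and pass it on; return the instruction (or None
    if the gesture is unrecognized)."""
    gesture = recognize_gesture(user_image)
    instruction = GESTURE_TO_INSTRUCTION.get(gesture)
    if instruction is not None:
        send_instruction(instruction)
    return instruction
```

In the claim-29 arrangement `send_instruction` would drive the aircraft's own flight control; in the claim-30 arrangement the same step runs on the ground processing chip and `send_instruction` transmits the instruction to the aircraft.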
PCT/CN2018/073783 2017-01-24 2018-01-23 Method of controlling flight device, device, flight device, and system WO2018137608A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201710060176.XA CN106774945A (en) 2017-01-24 2017-01-24 A kind of aircraft flight control method, device, aircraft and system
CN201710060380.1 2017-01-24
CN201710060176.X 2017-01-24
CN201710060380.1A CN106843489B (en) 2017-01-24 2017-01-24 A kind of the flight path control method and aircraft of aircraft

Publications (1)

Publication Number Publication Date
WO2018137608A1 true WO2018137608A1 (en) 2018-08-02

Family

ID=62979149

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/073783 WO2018137608A1 (en) 2017-01-24 2018-01-23 Method of controlling flight device, device, flight device, and system

Country Status (1)

Country Link
WO (1) WO2018137608A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808799A (en) * 2015-05-20 2015-07-29 成都通甲优博科技有限责任公司 Unmanned aerial vehicle capable of indentifying gesture and identifying method thereof
CN105955308A (en) * 2016-05-20 2016-09-21 腾讯科技(深圳)有限公司 Aircraft control method and device
CN106020227A (en) * 2016-08-12 2016-10-12 北京奇虎科技有限公司 Control method and device for unmanned aerial vehicle
CN106339006A (en) * 2016-09-09 2017-01-18 腾讯科技(深圳)有限公司 Object tracking method of aircraft and apparatus thereof
CN106774945A (en) * 2017-01-24 2017-05-31 腾讯科技(深圳)有限公司 A kind of aircraft flight control method, device, aircraft and system
CN106843489A (en) * 2017-01-24 2017-06-13 腾讯科技(深圳)有限公司 The flight path control method and aircraft of a kind of aircraft


Similar Documents

Publication Publication Date Title
CN106843489B (en) A kind of the flight path control method and aircraft of aircraft
CN105528573B (en) Subscriber terminal equipment and its iris identification method
CN106598071B (en) Flight control method and device, the unmanned plane of trailing type
CN106131413B (en) Shooting equipment and control method thereof
EP3128454B1 (en) Authentication apparatus and processing apparatus
CN106774945A (en) A kind of aircraft flight control method, device, aircraft and system
US11006864B2 (en) Face detection device, face detection system, and face detection method
US10564712B2 (en) Information processing device, information processing method, and program
WO2017166725A1 (en) Photographing control method, device, and system
JP2006259931A (en) Face authentication apparatus and its control method, electronic device equipped with face authentication apparatus, face authentication apparatus control program and recording medium recorded with the program
WO2017113742A1 (en) Facial recognition-based playback control method and terminal
WO2016187985A1 (en) Photographing device, tracking photographing method and system, and computer storage medium
CN106557744A (en) Wearable face identification device and implementation method
JP4968922B2 (en) Device control apparatus and control method
WO2020095350A1 (en) Information processing device, information processing method, and recording medium
CN107256027A (en) The helmet and its control method for unmanned plane
CN106529500A (en) Information processing method and system
CN111626240B (en) Face image recognition method, device and equipment and readable storage medium
JPH1115979A (en) Face detection and method and device for tracing face
CN109729268B (en) Face shooting method, device, equipment and medium
WO2022082440A1 (en) Method, apparatus and system for determining target following strategy, and device and storage medium
JP2007249298A (en) Face authentication apparatus and face authentication method
WO2018137608A1 (en) Method of controlling flight device, device, flight device, and system
JP7327923B2 (en) Information processing device, information processing method, system and program
US12046075B2 (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18744841

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18744841

Country of ref document: EP

Kind code of ref document: A1