WO2020078217A1 - Unmanned aerial vehicle-based tracking method and system, unmanned aerial vehicle and terminal - Google Patents

Unmanned aerial vehicle-based tracking method and system, unmanned aerial vehicle and terminal Download PDF

Info

Publication number
WO2020078217A1
Authority
WO
WIPO (PCT)
Prior art keywords
tracking
drone
terminal
instruction
information
Prior art date
Application number
PCT/CN2019/109558
Other languages
French (fr)
Chinese (zh)
Inventor
汤鹏程
Original Assignee
深圳市道通智能航空技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市道通智能航空技术有限公司
Publication of WO2020078217A1 publication Critical patent/WO2020078217A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12 Target-seeking control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/166 Detection; Localisation; Normalisation using acquisition arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 Classification, e.g. identification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08 Detecting or categorising vehicles

Definitions

  • The invention relates to the technical field of unmanned aerial vehicles (UAVs), and in particular to a UAV-based tracking method and system, a UAV, and a terminal.
  • Embodiments of the present invention provide a drone-based tracking method, and a system and terminal applying the method, which can be applied in a wider range of scenarios.
  • a first aspect of the present invention provides a drone-based tracking method.
  • the method may include: the drone flies according to the obtained search instruction and obtains a real-time image;
  • the drone extracts the information of the object to be tracked from the currently acquired real-time image according to the target information from the terminal and the currently acquired real-time image;
  • the UAV calculates the matching degree between the target information and the information of the object to be tracked;
  • if the calculated matching degree is greater than a preset matching degree, the drone sends feedback information to the terminal for the terminal to generate a tracking instruction according to the feedback information;
  • if a tracking instruction from the terminal is received, the drone sends the current tracking information of the object to be tracked to the terminal.
  • a second aspect of the present invention provides a drone, which may include:
  • a first memory for storing a first computer-readable program
  • a first processor is configured to execute the first computer-readable program to implement a tracking method, and the method includes:
  • according to the target information from the terminal and the currently acquired real-time image, the information of the object to be tracked is extracted from the currently acquired real-time image;
  • a tracking instruction from the terminal is received, the corresponding object is tracked, and corresponding tracking information is sent to the terminal.
  • a third aspect of the present invention also provides a terminal, the terminal includes:
  • a second memory for storing a second computer-readable program
  • a second processor configured to execute the second computer-readable program to implement a tracking method, the method including:
  • the tracking instruction is sent to the drone to control the drone to return the current tracking information of the object to be tracked.
  • a fourth aspect of the present invention provides a drone-based tracking system including a drone and a terminal; the drone includes a first memory for storing a first computer-readable program; and
  • a first processor is configured to execute the first computer-readable program to implement a tracking method, and the method includes:
  • according to the target information from the terminal and the currently acquired real-time image, the information of the object to be tracked is extracted from the currently acquired real-time image;
  • the terminal includes:
  • a second memory for storing a second computer-readable program
  • a second processor configured to execute the second computer-readable program to implement a tracking method, the method including:
  • the tracking instruction is sent to the drone.
  • embodiments of the present invention provide a drone-based tracking method, and a system and terminal applying the method.
  • a user can directly send tracking target information to the drone through the terminal, and the drone can identify and track the target based on that information; there is no need to select the tracking target in a real-time image returned by the drone, so the method can be applied more widely.
  • FIG. 1 is a schematic flowchart of a first example of the tracking method according to the first embodiment of the present invention
  • FIG. 2 is a schematic diagram of a first example of the tracking system according to the first embodiment of the present invention.
  • FIG. 3 is a schematic flowchart of a second example of the tracking method according to the first embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of a third example of the tracking method according to the first embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of a fourth example of the tracking method according to the first embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a third example of the tracking method according to the first embodiment of the present invention.
  • FIG. 7 is a schematic flowchart of a fourth example of the tracking method according to the first embodiment of the present invention.
  • FIG. 8 is a schematic flowchart of a fifth example of the tracking method according to the first embodiment of the present invention.
  • FIG. 9 is a schematic flowchart of a first example of a tracking method according to a second embodiment of the invention.
  • FIG. 10 is a schematic diagram of a first example of a tracking system according to a second embodiment of the invention.
  • FIG. 11 is a schematic flowchart of a second example of the tracking method according to the second embodiment of the present invention.
  • FIG. 12 is a schematic flowchart of a third example of the tracking method according to the second embodiment of the present invention.
  • FIG. 13 is a schematic flowchart of a fourth example of the tracking method according to the second embodiment of the present invention.
  • FIG. 14 is a schematic flowchart of a fifth example of the tracking method according to the second embodiment of the present invention.
  • FIG. 15 is a schematic diagram of a first example of a sub-flow of a tracking method according to a second embodiment of the invention.
  • FIG. 16 is a schematic diagram of a second example of the sub-flow of the tracking method according to the second embodiment of the present invention.
  • FIG. 17 is a schematic diagram of a third example of the sub-flow of the tracking method according to the second embodiment of the present invention.
  • FIG. 18 is a schematic flowchart of a sixth example of the tracking method according to the second embodiment of the present invention.
  • FIG. 19 is a schematic flowchart of a seventh example of the tracking method according to the second embodiment of the present invention.
  • FIG. 20 is a schematic diagram of functional modules of the tracking system according to the first embodiment of the present invention.
  • FIG. 21 is a schematic diagram of functional modules of a tracking system according to a second embodiment of the invention.
  • FIG. 22 is a schematic diagram of a first example of a sub-function module of a tracking system according to a second embodiment of the invention.
  • FIG. 23 is a schematic diagram of a second example of the sub-function module of the tracking system according to the second embodiment of the present invention.
  • FIG. 24 is a schematic structural diagram of a first embodiment of a terminal according to an embodiment of the present invention.
  • FIG. 25 is a schematic structural diagram of a second embodiment of a terminal according to an embodiment of the present invention.
  • Embodiments of the present invention provide a tracking method based on a drone, and a system and terminal applying the method.
  • a user can directly send tracking target information to the drone through the terminal, and the drone can identify and track the target based on that information; there is no need to select the tracking target in a real-time image returned by the drone, so the method can be applied more widely.
  • the terminal 10 communicates with the drone 20 through the ground station 30.
  • the user can control the drone 20 through the terminal 10 to realize the tracking method.
  • the tracking method includes:
  • step S101 the UAV 20 flies according to the obtained search instruction and obtains a real-time image.
  • the search command is sent from the terminal 10 to the drone 20.
  • the terminal 10 sends the search instruction to the drone 20 in response to user operation.
  • search strategies include flight strategies, such as the UAV search path, attitude, bearing, and airspeed.
  • the search strategy also includes image acquisition strategies, such as the shooting mode, lighting, and framing strategy.
  • the user sets a corresponding search strategy through the terminal to generate a search instruction and sends it to the drone. After the drone obtains the search instruction, it flies according to the search instruction (a minimal sketch of such an instruction follows this item).
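The search instruction above bundles a flight strategy with an image-acquisition strategy. Below is a minimal Python sketch of such an instruction; all field names and example values are hypothetical, since the patent does not define a concrete message format.

```python
# Minimal sketch of a search instruction: flight strategy plus image-acquisition
# strategy. Field names and values are hypothetical, not defined by the patent.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class SearchInstruction:
    # Flight strategy: search path, attitude, bearing, airspeed
    search_path: List[Tuple[float, float]]   # waypoints as (lat, lon)
    attitude_deg: float
    bearing_deg: float
    airspeed_mps: float
    # Image-acquisition strategy: shooting mode, lighting, framing
    shooting_mode: str = "video"
    lighting: str = "auto"
    framing: str = "wide"


# Example: the terminal builds an instruction to be sent to a drone
instruction = SearchInstruction(
    search_path=[(22.54, 114.05), (22.55, 114.06)],
    attitude_deg=-30.0,
    bearing_deg=90.0,
    airspeed_mps=8.0,
)
```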
  • the drone 20 extracts the information of the object to be tracked from the currently acquired real-time image according to the tracking target information from the terminal 10 and the currently acquired real-time image.
  • the tracking target information may be text information or image information.
  • the user sends a picture of tracking target information to the drone through the terminal 10.
  • the tracking target is a vehicle
  • the user sends the vehicle license plate number to the drone 20 through the terminal 10.
  • the tracking target is a person
  • the user sends a face image of the tracking target to the drone 20 through the terminal 10.
  • the face image may be extracted from a picture that did not come from the drone 20, or from a picture returned by the drone 20.
  • the user can outline the face contour of the tracking target in the picture through the terminal 10, or select the face area of the tracking target with a rectangular frame on the terminal 10, and send it to the drone 20.
  • if the only person in the picture is the tracking target, the user can send the picture directly to the drone 20 through the terminal 10 (a minimal face-region extraction sketch follows this item).
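The patent does not specify how the face region is obtained from the picture. The sketch below illustrates the two cases described above, a user-drawn rectangular frame and a picture containing only the tracking target, using OpenCV purely as an example; the Haar-cascade detector is an assumption, not part of the patent.

```python
# Minimal sketch of preparing a face image as tracking target information.
# The rectangle (x, y, w, h) stands for the frame the user drew on the terminal.
import cv2  # pip install opencv-python


def crop_user_rectangle(picture_path, rect):
    """Crop the face area the user selected with a rectangular frame."""
    img = cv2.imread(picture_path)
    x, y, w, h = rect
    return img[y:y + h, x:x + w]


def detect_single_face(picture_path):
    """If the picture contains only the tracking target, detect its face."""
    img = cv2.imread(picture_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    return img[y:y + h, x:x + w]
```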
  • the UAV 20 extracts the information of the object to be tracked according to the tracking target information.
  • the tracking target information acquired by the drone 20 is text information
  • the drone 20 extracts text information from the currently acquired real-time image.
  • the tracking target information acquired by the drone 20 is face image information
  • the drone 20 extracts face image information from the currently acquired real-time image.
  • the UAV 20 calculates the matching degree between the tracking target information and the information of the object to be tracked. Specifically, the UAV 20 compares the acquired tracking target information with the extracted information of the object to be tracked to obtain a matching degree.
  • for text information, the matching degree can be set according to the proportion of matching characters; for example, if 60% of the characters are the same, the matching degree is 60%.
  • for image information, the matching degree can be computed by comparing feature values of the images; for example, if the face contour is used as the feature value and the feature values are 70% consistent, the matching degree is 70%.
  • step S107 if the calculated matching degree is greater than the preset matching degree, the UAV 20 sends feedback information to the terminal 10 for the terminal 10 to generate a tracking instruction according to the feedback information.
  • the preset matching degree for text information is greater than the preset matching degree for image information. Understandably, since text recognition is more accurate than image recognition, the corresponding threshold is raised to further reduce errors.
  • in this embodiment, the feedback information is a picture. How the terminal generates the tracking instruction based on the feedback information is described in detail below and is not repeated here (a minimal sketch of the matching and feedback decision follows this item).
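A minimal sketch of the matching rules and the feedback decision described above, assuming character-by-character comparison for text and element-wise comparison of numeric feature values for images; the threshold values are illustrative assumptions, not values from the patent.

```python
# Matching-degree rules: character-match proportion for text, feature-value
# consistency for images, and a higher preset threshold for text than images.
def text_matching_degree(target_text, extracted_text):
    """Proportion of characters that match, position by position."""
    if not target_text:
        return 0.0
    matches = sum(1 for a, b in zip(target_text, extracted_text) if a == b)
    return matches / len(target_text)


def image_matching_degree(target_features, extracted_features):
    """Proportion of numeric feature values (e.g. contour points) that agree."""
    if not target_features:
        return 0.0
    consistent = sum(1 for a, b in zip(target_features, extracted_features)
                     if abs(a - b) < 1e-3)
    return consistent / len(target_features)


# Text is held to a higher threshold than images because text recognition is
# more accurate; the concrete values are assumptions.
TEXT_THRESHOLD, IMAGE_THRESHOLD = 0.9, 0.7


def should_send_feedback(matching_degree, is_text):
    threshold = TEXT_THRESHOLD if is_text else IMAGE_THRESHOLD
    return matching_degree > threshold
```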
  • step S109 if a tracking instruction from the terminal 10 is received, the drone 20 sends corresponding tracking information to the terminal 10.
  • the drone 20 also tracks the object to be tracked and sends real-time tracking information to the terminal 10.
  • the tracking information includes the GPS positioning information of the tracking target, the tracking path, the time, the matching degree, and the type of matching algorithm used (a minimal sketch of such a payload follows this item).
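A minimal sketch of the tracking information payload listed above; the field names are hypothetical, as the patent only enumerates the kinds of information carried.

```python
# Minimal sketch of the tracking information a drone reports to the terminal.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Tuple


@dataclass
class TrackingInfo:
    gps_position: Tuple[float, float]          # (latitude, longitude) of target
    tracking_path: List[Tuple[float, float]]   # positions visited so far
    timestamp: datetime
    matching_degree: float                     # e.g. 0.73
    matching_algorithm: str                    # e.g. "face_contour" or "text"


info = TrackingInfo(
    gps_position=(22.543, 114.057),
    tracking_path=[(22.541, 114.055), (22.543, 114.057)],
    timestamp=datetime.now(),
    matching_degree=0.73,
    matching_algorithm="face_contour",
)
```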
  • the UAV 20 starts tracking only after receiving the tracking instruction from the terminal 10; that is, the UAV 20 first sends its initial determination of the object to be tracked to the terminal 10 so that the terminal can further confirm whether that determination is accurate, which further avoids misjudging the tracking target.
  • the UAV 20 only sends feedback information to the terminal 10 when the matching degree is high, which reduces the probability of sending wrong feedback information to the terminal 10, avoids burdening the terminal 10 with unnecessary data, and greatly improves efficiency and processing speed.
  • the tracking target information (such as pictures or text information) sent by the terminal 10 to the drone 20, and the feedback information (such as pictures or text information) that the terminal 10 receives from the drone 20, need not be real-time images returned by the drone 20. That is, the terminal 10 and the UAV 20 can use non-real-time information to track the tracking target, so the requirement on communication transmission quality is low. In other words, even when the communication transmission quality between the terminal 10 and the drone 20 is poor, the terminal 10 and the drone 20 can still successfully track the tracking target. Therefore, the tracking method of the above embodiment greatly reduces the dependence on communication transmission quality, can also be applied where the communication transmission quality is poor, and covers more extensive application scenarios.
  • the method further includes the following steps.
  • step S301 the drone 20 detects whether an adjustment instruction from the terminal 10 is received, where the adjustment instruction is sent to the drone 20 when the terminal 10 does not receive feedback information from the drone 20 within a preset time. If the adjustment instruction is received, the drone executes step S302, otherwise executes step S103.
  • Step S302 the drone 20 flies according to the adjustment instruction.
  • if no feedback information is received within the preset time, the terminal 10 can determine that the search strategy of the drone 20 needs to be adjusted, and thus sends an adjustment instruction to the corresponding drone to adjust the search path, thereby improving tracking efficiency.
  • the method further includes the following steps.
  • step S401 the drone 20 tracks the object to be tracked.
  • step S402 the UAV 20 determines whether the tracking instruction is received within a preset time. If the tracking instruction is received, step S109 is executed; otherwise, step S103 is executed.
  • since the UAV 20 initially recognizes the object to be tracked and tracks it in time, it can quickly follow a suspicious tracking target and avoid losing it. In addition, if the UAV 20 has not received the tracking instruction for a long time after sending the feedback information, this indicates that the feedback information does not describe the tracking target but is invalid feedback information, so the search is repeated; the method is therefore highly flexible (a minimal sketch of this wait-and-resume behaviour follows this item).
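A minimal sketch of the drone-side wait-and-resume behaviour described above. The methods called on the drone object (track_object, receive_tracking_instruction, send_tracking_info, resume_search) are hypothetical placeholders, not APIs defined by the patent.

```python
# After sending feedback, provisionally track the suspicious object while
# waiting for the terminal's tracking instruction; if none arrives within a
# preset time, treat the feedback as invalid and resume searching.
import time

PRESET_WAIT_S = 30.0  # assumed preset time


def await_tracking_instruction(drone):
    deadline = time.monotonic() + PRESET_WAIT_S
    while time.monotonic() < deadline:
        drone.track_object()                       # keep following provisionally
        instruction = drone.receive_tracking_instruction(timeout=1.0)
        if instruction is not None:
            drone.send_tracking_info(instruction)  # corresponds to step S109
            return True
    drone.resume_search()                          # back to step S103
    return False
```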
  • FIG. 5 shows the specific steps of the terminal 10 controlling the UAV 20. Specifically, the tracking method further includes the following steps.
  • step S500 the terminal 10 sends a search instruction and tracking target information to the drone 20.
  • the search command and tracking target information can be sent at the same time or in a certain order.
  • the terminal 10 displays the received feedback information for the user to identify whether the drone 20 needs the tracking instruction and for the user to input a tracking request based on the feedback information.
  • the feedback information includes a picture or feature information including the object to be tracked.
  • the feedback information being a picture is used as an example for illustration. If the picture contains an image of the tracking target, the user can recognize that the drone 20 needs the tracking instruction.
  • the terminal 10 also displays a corresponding confirmation button. If the user recognizes the image containing the tracking target in the picture, the user clicks the corresponding confirmation button to generate the tracking request.
  • the feedback information may also be feature information of the object to be tracked. If the tracking target is a vehicle, the feature information may be, for example, vehicle damage information or information that the license plate is blocked. If the tracking target is a person, the feature information may be the person's height, body shape, age, physical defects, face contour information, and so on.
  • step S504 if the terminal 10 receives the tracking request input by the user, the tracking instruction is generated.
  • step S506 the terminal 10 sends the tracking instruction to the drone 20.
  • the user can identify and track the target according to the picture sent by the drone 20, to avoid tracking the wrong target, and improve the tracking efficiency and success rate.
  • the method further includes step S508: if the terminal 10 does not receive the feedback information within a preset time, it generates an adjustment instruction to control the drone 20 to adjust its flight path according to the adjustment instruction (a minimal sketch of the terminal-side flow follows this item).
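A minimal sketch of the terminal-side flow of steps S500 to S508, assuming hypothetical link and ui helpers for communication and user interaction; the patent does not prescribe these interfaces or message formats.

```python
# Terminal-side flow: send search instruction and target information, wait for
# feedback, send an adjustment instruction on timeout, and turn a user
# confirmation into a tracking instruction.
FEEDBACK_TIMEOUT_S = 60.0  # assumed preset time


def terminal_flow(link, ui, search_instruction, target_info):
    link.send(search_instruction)                        # step S500
    link.send(target_info)
    feedback = link.receive_feedback(timeout=FEEDBACK_TIMEOUT_S)
    if feedback is None:                                 # step S508
        link.send(ui.build_adjustment_instruction())
        return
    ui.display(feedback)                                 # step S502
    if ui.wait_for_confirmation():                       # step S504
        link.send({"type": "tracking_instruction"})      # step S506
```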
  • the tracking method includes the following steps.
  • step S500 the terminal 10 sends a search instruction and tracking target information to the drone 20.
  • step S101 the UAV 20 flies according to the obtained search instruction and obtains a real-time image.
  • step S103 the drone 20 extracts the information of the object to be tracked based on the acquired tracking target information and the currently acquired real-time image.
  • step S105 the drone 20 calculates the degree of matching between the tracking target information and the information of the target to be tracked.
  • step S107 if the calculated matching degree is greater than the preset matching degree, the drone 20 sends feedback information to the terminal 10. Among them, the feedback information is a picture.
  • step S502 the terminal 10 displays the received feedback information for the user to identify whether the drone needs a tracking instruction, and for the user to input a tracking request according to the picture.
  • Step S504 if a tracking request input by the user is received, the terminal 10 generates a tracking instruction.
  • step S506 the terminal 10 sends the tracking instruction to the drone 20.
  • step S109 in response to the tracking instruction, the UAV 20 sends the tracking information of the object to be tracked to the terminal 10.
  • the tracking method further includes the following steps. Step S508, if the feedback information of the drone 20 is not received within the preset time, the terminal 10 sends an adjustment instruction.
  • step S302 the drone 20 flies according to the adjustment instruction.
  • the tracking method further includes the following steps.
  • step S401 the drone 20 tracks the object to be tracked.
  • step S403 if the tracking instruction is not received within the preset time, the drone 20 executes step S103 and the subsequent steps again.
  • FIG. 9 and FIG. 10 are schematic flowcharts of another drone-based tracking method and an operating environment diagram of the tracking method according to an embodiment of the present invention.
  • the terminal 10' communicates with several drones 20' through the ground station 30'.
  • the user can control the several drones 20' through the terminal 10' to realize the tracking method.
  • the tracking method includes the following steps.
  • step S601 several unmanned aerial vehicles 20' respectively fly according to the obtained search instructions and obtain real-time images, wherein the search instructions obtained by the UAVs 20' are different, so as to control each UAV 20' to fly according to a different search path.
  • the search instructions are sent from the terminal 10' to the several drones 20'.
  • the terminal 10' sends the search instructions to the several drones 20' in response to user operations.
  • the tracking target is a vehicle or a person
  • the user can set a corresponding search strategy according to the possible range or track of the tracking target.
  • Search strategies include flight strategies, such as UAV search path, attitude, bearing, airspeed, etc.
  • the search strategy also includes image acquisition strategies, such as shooting methods, shooting light, and framing strategies.
  • the user sets a corresponding search strategy through the terminal to generate a search instruction and sends it to the drone 20'. After acquiring the search instruction, the UAV 20' flies according to it.
  • the user can set a variety of possible search strategies, thereby generating a variety of different search instructions to control the several drones 20' to search simultaneously, so as to quickly locate and track the target.
  • step S603 the several drones 20' each extract the information of the object to be tracked from the currently acquired real-time image according to the tracking target information from the terminal 10' and the currently acquired real-time image.
  • this is similar to step S103 and will not be repeated here.
  • step S605 the several drones 20' each calculate the matching degree between the tracking target information and the information of the object to be tracked.
  • this is similar to step S105 and will not be repeated here.
  • step S607 one or more drones 20' send feedback information to the terminal 10' for the terminal 10' to generate a tracking instruction according to the feedback information, where the drones 20' that send feedback information are those whose calculated matching degree is greater than the preset matching degree.
  • step S609 if one or more UAVs 20' receive a tracking instruction from the terminal 10', the UAV 20' that received the tracking instruction sends corresponding tracking information to the terminal 10'.
  • the UAVs 20' that receive the tracking instruction from the terminal 10' also track the object to be tracked.
  • the tracking information includes position information and tracking path information of the tracking object.
  • step S611 if one or more drones 20' receive a return instruction from the terminal 10', they return in response to the return instruction. Once the drone that needs the tracking instruction has been identified, it can be understood that the remaining drones 20' that are still searching no longer need to continue, so the terminal 10' generates a return instruction to control those remaining drones 20' to end the search.
  • the tracking target information (such as pictures or text information) sent by the terminal 10' to the drones 20', and the feedback information (such as pictures or text information) that the terminal 10' receives from the drones 20', need not be real-time images returned by the UAVs 20'; that is, the terminal 10' and the UAVs 20' can use non-real-time information to track the tracking target, so the requirement on communication transmission quality is low. In other words, even when the communication transmission quality between the terminal 10' and the drones 20' is poor, the terminal 10' and the drones 20' can still successfully complete the tracking of the tracking target. Therefore, the tracking method of the above embodiment greatly reduces the dependence on communication transmission quality, can also be applied where the communication transmission quality is poor, and covers more extensive application scenarios.
  • after step S601, the method further includes the following steps.
  • Step S801 each drone 20' detects whether an adjustment instruction from the terminal 10' is received, where the adjustment instruction is generated and sent to the corresponding UAV 20' when the terminal 10' does not receive feedback information from one or more drones within a preset time. If the adjustment instruction is received, step S802 is executed; otherwise, step S603 is executed.
  • step S802 the drone 20' that has received the adjustment instruction flies according to the adjustment instruction. Since no feedback information was received within the preset time, the terminal 10' can recognize that the search strategy of that drone 20' needs to be adjusted, and so sends an adjustment instruction to the corresponding drone 20' to adjust its search path, which can improve tracking efficiency.
  • the method further includes the following steps.
  • step S901 the unmanned aerial vehicles 20' that sent feedback information track their respective objects to be tracked.
  • step S902 an unmanned aerial vehicle 20' that sent feedback information judges whether the tracking instruction is received within a preset time. If the tracking instruction is received, step S609 is executed; otherwise, step S603 is executed. If a UAV has sent feedback information but has not received the tracking instruction for a long time, this indicates that the feedback information does not describe the tracking target but is invalid feedback information, so the search is restarted. Understandably, in this embodiment suspicious tracking targets can be followed quickly and are not missed. In addition, if the tracking instruction is not received within the preset time, it can be determined that the suspicious object is not the tracking target, and the search is performed again according to the original search strategy, which provides greater flexibility.
  • the tracking method further includes the following steps.
  • step S1000 the terminal 10' sends several search instructions and the tracking target information to the several drones 20'.
  • the terminal 10' may send the search instructions and tracking target information to the drones 20' simultaneously, or send a search instruction and the tracking target information to each drone 20' in a certain order.
  • the search instructions sent to different drones 20' are different.
  • step S1002 if the terminal 10' receives the feedback information, the drone 20' that needs the tracking instruction is identified based on the feedback information.
  • step S1004 the terminal 10' sends the tracking instruction to the drone 20' that needs the tracking instruction.
  • step S1006 the terminal 10' also generates a return instruction according to the drone that needs the tracking instruction, to control the drones 20' to which no tracking instruction needs to be sent to return. Specifically, once a drone that needs a tracking instruction is identified, it is understandable that the remaining UAVs 20' do not need to continue searching; therefore, the terminal 10' generates a return instruction to control the remaining searching UAVs 20' to end the search.
  • identifying the drone 20' that needs the tracking instruction according to the feedback information specifically includes the following steps.
  • the terminal 10' displays the received feedback information for the user to identify the drone 20' requiring the tracking instruction, and for the user to input the corresponding tracking request according to the picture.
  • the feedback information is a picture containing the object to be tracked. Specifically, if the user identifies the tracking target in the picture, the user can recognize that the corresponding drone 20' needs the tracking instruction.
  • the feedback information may also be characteristic information of the object to be tracked. For example, if the tracking target is a vehicle, the characteristic information may be vehicle damage information or information that the license plate is blocked. If the tracking target is a person, the characteristic information may be the person's height, body shape, age, and physical defect information.
  • the method further includes the following steps.
  • Step S10022 if the terminal 10' receives the tracking request input by the user, it identifies the drone 20' that needs the tracking instruction according to the tracking request, and generates a corresponding tracking instruction. Specifically, the terminal 10' also displays confirmation buttons corresponding one-to-one to the pictures. If the user recognizes a tracking target image in a picture displayed on the terminal 10', the user clicks the corresponding confirmation button to generate the tracking request. In some other feasible embodiments, if the user recognizes a tracking target image in a picture displayed by the terminal 10', the user may also input through the terminal 10' a unique identification code corresponding to the drone 20' to generate the tracking request. The identification code may be the number of the drone or its IP address.
  • the feedback information includes the matching degree and/or the type of matching algorithm used.
  • the following uses feedback information that includes the matching degree as an example.
  • the method further includes the following steps.
  • step S1101 the terminal 10' identifies the unmanned aerial vehicle 20' requiring the tracking instruction according to the received matching degree and a preset condition.
  • Step S1103 if the terminal recognizes the drone that needs the tracking instruction, it generates the tracking instruction and sends it to that drone.
  • FIG. 16 is a flowchart of steps in the first implementation example of step S1101.
  • step S1202 the terminal 10' counts the matching degrees received within a preset time. If the count is 1, step S1204 is executed; if the count is greater than 1, step S1206 is executed.
  • step S1204 if the terminal 10' receives the matching degree sent by only one drone within the preset time, it recognizes that the drone that sent the matching degree needs the tracking instruction.
  • step S1206 if the terminal 10' receives matching degrees sent by multiple drones 20' within the preset time, the terminal 10' compares the received matching degrees to obtain the highest matching degree.
  • step S1208 the terminal 10' recognizes that the UAV that sent the highest matching degree needs the tracking instruction.
  • FIG. 17 is a flowchart of steps in the second implementation example of step S1101.
  • step S1302 the terminal 10' counts the matching degrees received within a preset time. If the count is 1, step S1304 is executed; if the count is greater than 1, step S1306 is executed.
  • step S1304 if the terminal 10' receives the matching degree sent by only one drone 20' within the preset time, it recognizes that the drone 20' that sent the matching degree needs the tracking instruction.
  • step S1306 if the terminal 10' receives matching degrees sent by multiple drones 20' within the preset time, the terminal 10' compares the received matching degrees to obtain a preset number of highest matching degrees.
  • step S1308 the terminal 10' recognizes that the preset number of drones 20' that sent the highest matching degrees need the tracking instruction.
  • when analyzing the highest matching degree, the analysis unit compares matching degrees of the same type.
  • for the comparison process, refer to the above steps; it is not repeated here (a minimal sketch of this selection logic follows this item).
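A minimal sketch of the selection logic of FIG. 16 and FIG. 17: a single reporting drone is selected directly, while several reporting drones are ranked by matching degree and either the single highest or a preset number of highest are selected. The reports dictionary keyed by unique identification code is an assumed data structure, not one defined by the patent.

```python
# Select the drone(s) that need the tracking instruction from the matching
# degrees received within the preset time.
from typing import Dict, List


def drones_needing_tracking(reports: Dict[str, float], top_n: int = 1) -> List[str]:
    if not reports:
        return []                 # nothing received within the preset time
    if len(reports) == 1:
        return list(reports)      # count == 1: that drone needs the instruction
    ranked = sorted(reports, key=reports.get, reverse=True)
    return ranked[:top_n]         # highest (FIG. 16) or preset number of highest (FIG. 17)


# Example: three drones report; top_n=1 mirrors FIG. 16, top_n=2 mirrors FIG. 17
print(drones_needing_tracking({"UAV-01": 0.62, "UAV-02": 0.88, "UAV-03": 0.71}, top_n=2))
```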
  • the tracking method includes the following steps.
  • Step S1000 The terminal 10' sends several search instructions and the tracking target information to the several drones 20'.
  • Step S601 The several drones 20' fly according to the obtained search instructions and obtain real-time images.
  • step S603 the several drones 20' each extract the information of the object to be tracked from the currently acquired real-time image according to the tracking target information from the terminal 10' and the currently acquired real-time image.
  • step S605 the several drones 20' each calculate the matching degree between the tracking target information and the information of the object to be tracked.
  • step S607 one or more drones 20' send feedback information to the terminal 10' for the terminal 10' to generate a tracking instruction according to the feedback information, where the drones 20' that send feedback information are those whose calculated matching degree is greater than the preset matching degree.
  • step S1002 the terminal 10' identifies the unmanned aerial vehicle 20' requiring the tracking instruction based on the feedback information.
  • step S1004 the terminal 10' sends the tracking instruction to the drone 20' that needs the tracking instruction.
  • step S1006 the terminal 10' also generates a return instruction according to the drone that needs the tracking instruction, to control the drones 20' to which no tracking instruction needs to be sent to return. Specifically, if the user identifies the drone that needs the tracking instruction, it can be understood that the remaining drones 20' do not need to continue the search, so the terminal 10' generates a return instruction to control the remaining UAVs 20' performing the search to end the search.
  • step S609 the drone 20' that has received the tracking instruction sends corresponding tracking information to the terminal 10'.
  • Step S1008 The terminal 10' generates a return instruction according to the drone that needs the tracking instruction.
  • step S611 the drone 20' that has received the return instruction returns in response to the return instruction.
  • the tracking method further includes the following steps.
  • step S901 the unmanned aerial vehicles 20' that sent feedback information track their respective objects to be tracked.
  • step S403 if the tracking instruction is not received within the preset time, the UAV 20' that sent the feedback information re-executes step S603 and the subsequent steps.
  • FIG. 2 and FIG. 20 are schematic diagrams of the tracking system 100 and the terminal 10 of the first embodiment, respectively.
  • the tracking system 100 includes a terminal 10, a drone 20, and a ground station 30.
  • the terminal 10 communicates with the drone 20 through the ground station 30 to control the drone 20 to track the tracking target.
  • the terminal 10 includes a setting unit 11, a first communication unit 13, and a tracking management unit 14.
  • the setting unit 11 is used to set a search strategy in response to user operation. Specifically, for example, if the tracking target is a vehicle or a person, the user may set a corresponding search strategy according to the possible range or track of the tracking target. Search strategies include flight strategies, such as UAV search path, attitude, bearing, airspeed, etc. The search strategy also includes image acquisition strategies, such as shooting methods, shooting light, and framing strategies. The user sets the corresponding search strategy through the setting unit 11 and generates a search instruction. The setting unit 11 is also used to send the search instruction to the drone 20 through the first communication unit 13 to control the drone 20 to fly according to the search strategy and obtain real-time images.
  • the setting unit 11 is further configured to set tracking target information in response to the user's operation, and send the tracking target information to the drone 20 through the first communication unit 13.
  • the tracking target is a vehicle
  • the user sets the vehicle license plate number as the tracking target information through the setting unit 11.
  • the tracking target is a person
  • the user sets the face image of the tracking target as tracking target information through the setting unit 11.
  • the face image may be extracted from a picture that did not come from the drone 20, or from a picture returned by the drone 20.
  • the user may outline the face contour of the tracking target in the picture through the setting unit 11, or select the face area of the tracking target with a rectangular frame through the setting unit 11, and set it as the tracking target information.
  • if the only person in the picture is the tracking target, the user can directly send the picture to the drone 20 through the setting unit 11.
  • the tracking management unit 14 is also used to receive feedback information from the UAV 20 through the first communication unit 13 and to identify, according to the feedback information, whether a tracking instruction needs to be generated.
  • the tracking management unit 14 is also used to receive tracking information from the UAV 20 through the first communication unit 13.
  • the feedback information includes a picture or feature information. In this embodiment, the feedback information being a picture is taken as an example for description. See below for the detailed description of the generation and sending of the feedback information and tracking information.
  • the tracking management unit 14 specifically includes a display unit 140, an interface providing unit 142, and an instruction generating unit 144.
  • the display unit 140 is used to display the received picture on the terminal 10 for the user to recognize whether the drone 20 needs the tracking instruction. Specifically, if the picture contains an image of the tracking target, the user can recognize that the drone needs the tracking instruction.
  • the interface providing unit 142 is used to provide a button and display it through the display unit 140 for the user to select in order to generate a tracking request. Specifically, the user can select the button through an input device such as a mouse or a keyboard.
  • the instruction generating unit 144 is used to generate the tracking instruction and send it to the drone 20 through the first communication unit 13 if a tracking request input by the user is received.
  • the display unit 140 is also used to display the received tracking information.
  • the UAV 20 includes a second communication unit 22, a shooting unit 23, a search unit 24, a feedback unit 26, and a tracking unit 27.
  • the shooting unit 23 is used to acquire an image.
  • the search unit 24 obtains the search instruction and tracking target information sent by the terminal 10 through the second communication unit 22, and controls the drone 20 to fly and obtain real-time images according to the search instruction.
  • the search unit 24 is also used to determine, according to the tracking target information and the currently acquired real-time image, whether it is necessary to send feedback information to the terminal 10.
  • the search unit 24 includes a flight control unit 240, a shooting control unit 242, and an image processing unit 244.
  • the flight control unit 240 is used to control the UAV 20 to fly according to the search instruction.
  • the search instruction includes flight strategies, such as the UAV search path, attitude, bearing, and airspeed.
  • the shooting control unit 242 is used to control the shooting unit 23 to acquire a real-time image according to the search instruction.
  • the search instruction also includes image acquisition strategies such as the shooting mode, lighting, and framing strategy.
  • the image processing unit 244 is used to extract the information of the object to be tracked from the currently acquired real-time image according to the tracking target information from the terminal and the currently acquired real-time image.
  • for example, if the tracking target information acquired by the drone is text information, the drone extracts text information from the currently acquired real-time image; if the tracking target information acquired by the drone is a face image, the drone extracts face information from the currently acquired real-time image.
  • the image processing unit 244 is also used to calculate the matching degree between the tracking target information and the information of the object to be tracked. Specifically, the image processing unit 244 compares the two to calculate a matching degree. In this embodiment, for text information, the matching degree can be set according to the proportion of matching characters; for example, if 60% of the characters are the same, the matching degree is 60%. For image information, the matching degree can be computed by comparing feature values of the images; for example, if the face contour is used as the feature value and the feature values are 70% consistent, the matching degree is 70%.
  • the feedback unit 26 is used to determine whether the calculated matching degree is greater than the preset matching degree, and if so, to send feedback information to the terminal 10 for the terminal 10 to generate a tracking instruction according to the feedback information.
  • the preset matching degree for text information is greater than the preset matching degree for image information. Understandably, since character recognition is more accurate than image recognition, the corresponding threshold is raised to further reduce errors.
  • the feedback information is preferably picture information.
  • the tracking unit 27 is configured to send the information of the object to be tracked in response to the tracking instruction sent from the terminal 10.
  • the tracking unit 27 is also used to track the object to be tracked and send real-time tracking information to the terminal 10.
  • the tracking information includes the GPS positioning information of the tracking target, the tracking path, time, matching degree, and the type of matching algorithm used.
  • the flight control unit 240 is also used to adjust the flight strategy of the drone 20 in response to the adjustment instruction sent from the terminal 10.
  • FIG. 10 and FIG. 21 are schematic diagrams of the tracking system 200 according to the second embodiment provided by the present invention.
  • the tracking system 200 includes a terminal 10', a ground station 30', and a number of drones 20'.
  • the terminal 10' is connected to the several unmanned aerial vehicles 20' through the ground station 30'.
  • the terminal 10' includes a setting unit 11', an instruction generating unit 12', a first communication unit 13', and a tracking management unit 14'.
  • the setting unit 11' is used to set a search strategy in response to user operation. Specifically, for example, if the tracking target is a vehicle or a person, the user may set a corresponding search strategy according to the possible range or track of the tracking target. Search strategies include flight strategies, such as the UAV search path, attitude, bearing, and airspeed. The search strategy also includes image acquisition strategies, such as the shooting mode, lighting, and framing strategy. The user sets corresponding search strategies through the terminal to generate several different search instructions, which are sent to the several drones 20' to control them to fly and acquire real-time images according to the different search strategies. The setting unit 11' differs from the setting unit 11 in that the setting unit 11' is used to set a different search strategy for each of the unmanned aerial vehicles 20'.
  • the setting unit 11' is also used to set tracking target information in response to the user's operation, and to send the tracking target information to the several drones 20' through the first communication unit 13'.
  • the tracking target information sent to the several UAVs 20' is the same.
  • the function of the setting unit 11' is basically the same as that of the setting unit 11 and will not be repeated here.
  • the tracking management unit 14' is also used to receive feedback information from the unmanned aerial vehicles 20' through the first communication unit 13', and to identify whether a tracking instruction needs to be generated based on the feedback information.
  • the tracking management unit 14' is also used to receive tracking information from the drones 20' through the first communication unit 13'.
  • the feedback information includes a picture. See above for the detailed description of the generation and sending of the feedback information and tracking information.
  • FIG. 22 is a functional block diagram of the tracking management unit 14' according to the first embodiment.
  • the tracking management unit 14' specifically includes a display unit 14001, an interface providing unit 14002, and an instruction generating unit 14004.
  • each functional module of the tracking management unit 14' is basically the same as the corresponding module of the tracking management unit 14.
  • the interface providing unit 14002 is used to provide several buttons for the user to select in order to generate a tracking request.
  • the several buttons correspond one-to-one to the displayed pictures, so that a tracking request corresponding to a picture can be generated.
  • each picture is associated with the unique identification code of a drone 20' to distinguish the drone 20' that sent the picture.
  • the interface providing unit 14002 generates the tracking request according to the unique identification code; that is, the tracking request includes the unique identification code of the drone 20'.
  • the user may also input the unique identification code of the drone 20' requiring the tracking instruction through the interface providing unit 14002 to generate the tracking request.
  • the instruction generating unit 14004 is used to generate a corresponding tracking instruction according to the tracking request input by the user, and to send it to the drone 20' through the first communication unit 13'. Accordingly, the tracking instruction includes the unique identification code of the drone 20' so that the corresponding drone 20' can obtain it.
  • the instruction generating unit 14004 is also used to generate a return instruction according to the drone 20' that needs the tracking instruction, to control the return of the drones 20' to which no tracking instruction needs to be sent. Accordingly, the return instruction contains the unique identification code of the drone 20'. Understandably, once the user identifies the drone that needs the tracking instruction, the remaining drones that are still searching do not need to continue; therefore, the instruction generating unit 14004 sends a return instruction through the first communication unit 13' to control the remaining unmanned aerial vehicles 20' to end the search (a minimal sketch of addressing drones by their identification codes follows this item).
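A minimal sketch of addressing drones by their unique identification codes, as the instruction generating unit does above: the selected drone receives a tracking instruction and the remaining searching drones receive a return instruction. The message fields are hypothetical; the patent only requires that both instructions carry the drone's unique identification code.

```python
# Build per-drone instructions keyed by unique identification code.
from typing import Dict, List


def build_instructions(selected_ids: List[str], all_ids: List[str]) -> List[Dict[str, str]]:
    instructions = []
    for drone_id in all_ids:
        kind = "tracking" if drone_id in selected_ids else "return"
        instructions.append({"type": kind, "drone_id": drone_id})
    return instructions


# Example: UAV-02 was confirmed by the user; the remaining drones are recalled
for msg in build_instructions(["UAV-02"], ["UAV-01", "UAV-02", "UAV-03"]):
    print(msg)
```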
  • FIG. 23 is a functional block diagram of the tracking management unit 14' according to the second embodiment.
  • the tracking management unit 14' is used to receive feedback information from the unmanned aerial vehicles 20' through the first communication unit 13', and to identify whether a tracking instruction needs to be generated based on the feedback information.
  • the feedback information includes the matching degree.
  • the tracking management unit 14' includes an identification unit 14011 and an instruction generating unit 14012.
  • the identification unit 14011 is used to identify the drone 20' requiring the tracking instruction according to the received matching degree and preset conditions.
  • the identification unit 14011 includes a counting unit 14013 and an analysis unit 14015.
  • the counting unit 14013 is used to count the matching degrees received within a preset time.
  • the analysis unit 14015 is used to recognize, when the matching degree is sent by only one drone 20' within the preset time (that is, the count is 1), that this drone needs the tracking instruction.
  • the analysis unit 14015 is also used, when matching degrees are sent by multiple drones 20' within the preset time (that is, the count is greater than 1), to compare the received matching degrees to obtain the highest matching degree, and to recognize that the UAV 20' that sent the highest matching degree needs the tracking instruction.
  • in another implementation, the analysis unit 14015 is configured, when matching degrees are sent by multiple drones 20' within the preset time, to compare the received matching degrees to obtain a preset number of highest matching degrees, and to recognize that the UAVs 20' that sent the preset number of highest matching degrees need the tracking instruction.
  • the instruction generating unit 14012 is used to generate a corresponding tracking instruction according to the tracking request input by the user, and to send it to the drone 20' through the first communication unit 13'. Accordingly, the tracking instruction includes the unique identification code of the drone 20' so that the corresponding drone 20' can obtain it.
  • the instruction generating unit 14012 is also used to generate a return instruction according to the drone 20' that needs the tracking instruction, to control the return of the drones 20' to which no tracking instruction needs to be sent. Accordingly, the return instruction contains the unique identification code of the drone 20'. Understandably, once the user identifies the drone that needs the tracking instruction, the remaining drones that are still searching do not need to continue; therefore, the instruction generating unit 14012 sends a return instruction through the first communication unit 13' to control the remaining unmanned aerial vehicles 20' to end the search.
  • the functions of the UAV 20' and the UAV 20 are basically the same; the difference is that the UAV 20' is also used to return in response to a return instruction.
  • for the specific functional modules of the UAV 20', refer to the UAV 20; they will not be repeated here.
  • the terminal 1000 may include: a first memory 1005 and a first processor 1006, where:
  • the first memory 1005 is used to store a first computer-readable program.
  • the first memory 1005 of the embodiment of the present invention may be a system memory, for example, volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.), or a combination of both.
  • the first memory 1005 in this embodiment of the present invention may also be an external memory outside the system, such as a magnetic disk, an optical disk, or a magnetic tape.
  • the first processor 1006 is configured to call the first computer-readable program stored in the first memory 1005 and perform the following operations:
  • the first processor 1006 also performs the following operations:
  • the received picture is displayed for the user to identify whether the drone needs the tracking instruction. Specifically, if the picture contains an image of the tracking target, the user can recognize that the drone needs the tracking instruction.
  • if a tracking request input by the user is received, the tracking instruction is generated. Specifically, the terminal also displays a corresponding confirmation button; if the user recognizes that there is a tracking target in the picture displayed on the terminal, the user clicks the corresponding confirmation button to generate the tracking request.
  • the first processor 1006 performs the following operations:
  • the first processor 1006 also performs the following operations:
  • the tracking instruction is generated according to the tracking request
  • a return instruction is generated according to the unmanned aerial vehicle that needs the tracking instruction, to control the unmanned aerial vehicles to which no tracking instruction needs to be sent to return.
  • the first processor 1006 also performs the following operations:
  • the drone that needs the tracking instruction is identified.
  • the tracking instruction is generated and sent to the drone that needs the tracking instruction.
  • the first processor 1006 also performs the following operations:
  • if the terminal receives the matching degree sent by only one drone within a preset time, it is recognized that the drone that sent the matching degree needs the tracking instruction;
  • if the terminal receives matching degrees sent by multiple drones within the preset time, the terminal compares the received matching degrees to obtain the highest matching degree.
  • the first processor 1006 also performs the following operations:
  • the received matching degrees are compared to obtain a preset number of the highest matching degrees.
  • FIG. 25 is a schematic structural composition diagram of another embodiment of the drone of the present invention.
  • the drone 2000 may include: a second memory 2005 and a second processor 2006, where:
  • the second memory 2005 is used to store the first computer readable program.
  • the second memory 2005 of the embodiment of the present invention may be a system memory, for example, volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.), or a combination of both.
  • the second memory 2005 in this embodiment of the present invention may also be an external memory outside the system, such as a magnetic disk, an optical disk, or a magnetic tape.
  • the second processor 2006 is configured to call the first computer-readable program stored in the second memory 2005 and perform the following operations:
  • the object to be tracked is tracked, and corresponding tracking information is sent to the terminal.
  • the second processor 2006 also performs the following steps:
  • the drone flies according to the adjustment instruction.
  • the second processor 2006 can also perform the following steps:
  • an embodiment of the present invention further provides a computer storage medium.
  • the computer storage medium may store a program, and when the program is executed, part or all of the steps of the method described in the embodiment of the present invention may be executed.
  • the computer storage medium in the embodiments of the present invention includes: RAM, ROM, EEPROM, flash memory, CD-ROM, DVD or other optical storage, magnetic tape, magnetic disk or other magnetic storage, or any other medium that can be used to store the required information and that can be accessed by a computer device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Automation & Control Theory (AREA)
  • Astronomy & Astrophysics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

Provided are an unmanned aerial vehicle (20, 20')-based tracking method and system, an unmanned aerial vehicle (20, 20') and a terminal (10, 10'). The tracking method comprises: an unmanned aerial vehicle (20, 20') flying according to an acquired search instruction and acquiring real-time images (S101); the unmanned aerial vehicle (20, 20') extracting information of an object needing to be tracked from a currently acquired real-time image according to target information from the terminal (10, 10') and the currently acquired real-time image (S103); the unmanned aerial vehicle (20, 20') calculating a matching degree of the target information and the information of the object needing to be tracked (S105); if the calculated matching degree is greater than a preset matching degree, the unmanned aerial vehicle (20, 20') sending feedback information to the terminal (10, 10'), so that the terminal (10, 10') generates a tracking instruction according to the feedback information (S107); and if the tracking instruction from the terminal (10, 10') is received, the unmanned aerial vehicle (20, 20') sending the current tracking information of the object needing to be tracked to the terminal (10, 10') (S109). Accordingly, tracking target information can be directly sent to the unmanned aerial vehicle (20, 20') through the terminal (10, 10'), with no need to select a tracking target from real-time images returned by the unmanned aerial vehicle (20, 20'). The present invention can be applied to a wider range of scenarios.

Description

UAV-based tracking method and system, UAV and terminal
Cross-reference to related applications
This application claims priority to the Chinese patent application filed on October 17, 2018 with application number 201811211036.9 and entitled "UAV-based tracking method, system, UAV and terminal", the entire contents of which are incorporated herein by reference.
Technical field
The present invention relates to the technical field of unmanned aerial vehicles (UAVs), and in particular to a UAV-based tracking method, system, UAV and terminal.
Background
Many existing UAVs achieve target tracking by having the user select a target area in a real-time image returned by the UAV; the UAV or the terminal then performs target recognition according to the area set by the user; finally, the UAV tracks the target. However, in many situations, for example when the UAV cannot return real-time images in time due to poor communication transmission quality, the user cannot select a target from the images returned by the UAV, so the UAV cannot achieve target tracking.
Summary of the invention
Embodiments of the present invention provide a UAV-based tracking method, and a system and terminal applying the method, which can be applied in a wider range of scenarios.
A first aspect of the present invention provides a UAV-based tracking method, which may include: the UAV flying according to an obtained search instruction and acquiring real-time images;
the UAV extracting, according to target information from a terminal and a currently acquired real-time image, information of an object to be tracked from the currently acquired real-time image;
the UAV calculating a matching degree between the target information and the information of the object to be tracked;
if the calculated matching degree is greater than a preset matching degree, the UAV sending feedback information to the terminal so that the terminal generates a tracking instruction according to the feedback information; and
if a tracking instruction from the terminal is received, the UAV sending the current tracking information of the object to be tracked to the terminal.
A second aspect of the present invention provides a UAV, which may include:
a first memory for storing a first computer-readable program; and
a first processor configured to execute the first computer-readable program to implement a tracking method, the method including:
flying according to an obtained search instruction and acquiring real-time images;
extracting, according to target information from a terminal and a currently acquired real-time image, information of an object to be tracked from the currently acquired real-time image;
calculating a matching degree between the target information and the information of the object to be tracked;
if the calculated matching degree is greater than a preset matching degree, sending feedback information to the terminal so that the terminal generates a tracking instruction according to the feedback information; and
if a tracking instruction from the terminal is received, tracking the corresponding object and sending corresponding tracking information to the terminal.
A third aspect of the present invention further provides a terminal, the terminal including:
a second memory for storing a second computer-readable program; and
a second processor configured to execute the second computer-readable program to implement a tracking method, the method including:
sending a search instruction to a UAV to control the UAV to fly according to the search instruction and acquire real-time images;
if feedback information from the UAV is received, identifying, according to the feedback information, whether a tracking instruction needs to be sent to the UAV; and
if it is identified that the tracking instruction needs to be sent to the UAV, sending the tracking instruction to the UAV to control the UAV to return the current tracking information of the object to be tracked.
A fourth aspect of the present invention provides a UAV-based tracking system, the tracking system including a UAV and a terminal; the UAV includes a first memory for storing a first computer-readable program; and
a first processor configured to execute the first computer-readable program to implement a tracking method, the method including:
flying according to an obtained search instruction and acquiring real-time images;
extracting, according to target information from the terminal and a currently acquired real-time image, information of an object to be tracked from the currently acquired real-time image;
calculating a matching degree between the target information and the information of the object to be tracked;
if the calculated matching degree is greater than a preset matching degree, sending feedback information to the terminal so that the terminal generates a tracking instruction according to the feedback information; and
if a tracking instruction from the terminal is received, tracking the corresponding object and sending corresponding tracking information to the terminal;
the terminal includes:
a second memory for storing a second computer-readable program; and
a second processor configured to execute the second computer-readable program to implement a tracking method, the method including:
sending the search instruction to the UAV to control the UAV to fly according to the search instruction and acquire real-time images;
if feedback information from the UAV is received, determining, according to the feedback information, whether the tracking instruction needs to be sent to the UAV; and
if it is determined that the tracking instruction needs to be sent to the UAV, sending the tracking instruction to the UAV.
As can be seen from the above, embodiments of the present invention provide a UAV-based tracking method, and a system and terminal applying the method. A user can directly send tracking target information to the UAV through the terminal, and the UAV can identify and track the target according to the tracking target information, without the need to select the tracking target in a real-time image returned by the UAV, so the method can be applied in a wider range of scenarios.
Brief description of the drawings
FIG. 1 is a schematic flowchart of a first example of the tracking method according to the first embodiment of the present invention;
FIG. 2 is a schematic diagram of a first example of the tracking system according to the first embodiment of the present invention;
FIG. 3 is a schematic flowchart of a second example of the tracking method according to the first embodiment of the present invention;
FIG. 4 is a schematic flowchart of a third example of the tracking method according to the first embodiment of the present invention;
FIG. 5 is a schematic flowchart of a fourth example of the tracking method according to the first embodiment of the present invention;
FIG. 6 is a schematic diagram of a third example of the tracking method according to the first embodiment of the present invention;
FIG. 7 is a schematic flowchart of a fourth example of the tracking method according to the first embodiment of the present invention;
FIG. 8 is a schematic flowchart of a fifth example of the tracking method according to the first embodiment of the present invention;
FIG. 9 is a schematic flowchart of a first example of the tracking method according to the second embodiment of the present invention;
FIG. 10 is a schematic diagram of a first example of the tracking system according to the second embodiment of the present invention;
FIG. 11 is a schematic flowchart of a second example of the tracking method according to the second embodiment of the present invention;
FIG. 12 is a schematic flowchart of a third example of the tracking method according to the second embodiment of the present invention;
FIG. 13 is a schematic flowchart of a fourth example of the tracking method according to the second embodiment of the present invention;
FIG. 14 is a schematic flowchart of a fifth example of the tracking method according to the second embodiment of the present invention;
FIG. 15 is a schematic diagram of a first example of a sub-flow of the tracking method according to the second embodiment of the present invention;
FIG. 16 is a schematic diagram of a second example of the sub-flow of the tracking method according to the second embodiment of the present invention;
FIG. 17 is a schematic diagram of a third example of the sub-flow of the tracking method according to the second embodiment of the present invention;
FIG. 18 is a schematic flowchart of a sixth example of the tracking method according to the second embodiment of the present invention;
FIG. 19 is a schematic flowchart of a seventh example of the tracking method according to the second embodiment of the present invention;
FIG. 20 is a schematic diagram of the functional modules of the tracking system according to the first embodiment of the present invention;
FIG. 21 is a schematic diagram of the functional modules of the tracking system according to the second embodiment of the present invention;
FIG. 22 is a schematic diagram of a first example of the sub-function modules of the tracking system according to the second embodiment of the present invention;
FIG. 23 is a schematic diagram of a second example of the sub-function modules of the tracking system according to the second embodiment of the present invention;
FIG. 24 is a schematic structural diagram of a first embodiment of the terminal according to an embodiment of the present invention;
FIG. 25 is a schematic structural diagram of a second embodiment of the terminal according to an embodiment of the present invention.
Detailed description
Embodiments of the present invention provide a UAV-based tracking method, and a system and terminal applying the method. A user can directly send tracking target information to the UAV through the terminal, and the UAV can identify and track the target according to the tracking target information, without the need to select the tracking target in a real-time image returned by the UAV, so the method can be applied in a wider range of scenarios.
FIG. 1 and FIG. 2 are respectively a schematic flowchart of a UAV-based tracking method and a schematic diagram of the operating environment of the tracking method according to an embodiment of the present invention. In this embodiment, the terminal 10 communicates with the UAV 20 through a ground station 30. The user can control the UAV 20 through the terminal 10 to implement the tracking method. As shown in FIG. 1, the tracking method includes the following steps.
Step S101: the UAV 20 flies according to the obtained search instruction and acquires real-time images. In this embodiment, the search instruction is sent by the terminal 10 to the UAV 20. Specifically, the terminal 10 sends the search instruction to the UAV 20 in response to a user operation. For example, if the tracking target is a vehicle or a person, the user can set a corresponding search strategy according to the possible range of activity or the trail of the tracking target. The search strategy includes a flight strategy, such as the UAV's search path, attitude, bearing and airspeed. The search strategy also includes an image acquisition strategy, such as the shooting mode, shooting light and framing strategy. The user sets the corresponding search strategy through the terminal to generate a search instruction, which is sent to the UAV. After obtaining the search instruction, the UAV flies according to it.
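As a rough illustration of how such a search instruction might be represented in software, the following minimal Python sketch groups a flight strategy and an image acquisition strategy into one message. The structure, field names and value types are assumptions made for illustration and are not defined by this disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FlightStrategy:
    # Waypoints of the search path as (latitude, longitude, altitude_m) tuples.
    search_path: List[Tuple[float, float, float]]
    attitude_deg: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # roll, pitch, yaw
    bearing_deg: float = 0.0
    airspeed_mps: float = 5.0

@dataclass
class ImageAcquisitionStrategy:
    shooting_mode: str = "continuous"   # shooting mode, e.g. continuous or interval capture
    exposure: str = "auto"              # shooting-light setting
    framing: str = "wide"               # framing strategy

@dataclass
class SearchInstruction:
    drone_id: str                       # unique identification code of the target UAV
    flight: FlightStrategy
    imaging: ImageAcquisitionStrategy = field(default_factory=ImageAcquisitionStrategy)

# Example: the terminal builds one search instruction for one UAV.
instruction = SearchInstruction(
    drone_id="UAV-20",
    flight=FlightStrategy(search_path=[(22.54, 114.05, 80.0), (22.55, 114.06, 80.0)]),
)
```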
Step S103: the UAV 20 extracts, according to the tracking target information from the terminal 10 and the currently acquired real-time image, the information of the object to be tracked from the currently acquired real-time image. The tracking target information may be text information or image information. Specifically, the user sends a picture containing the tracking target information to the UAV through the terminal 10. For example, if the tracking target is a vehicle, the user sends the vehicle's license plate number to the UAV 20 through the terminal 10. If the tracking target is a person, the user sends a face image of the tracking target to the UAV 20 through the terminal 10. The face image may be extracted from a picture that was not sent by the UAV 20, or from a picture sent by the UAV 20. Specifically, the user may select the face contour of the tracking target in the picture through the terminal 10, or select the face area of the tracking target with a rectangular frame through the terminal 10, and send it to the UAV 20. Alternatively, if the only person contained in the picture is the tracking target, the user may directly send the picture to the UAV 20 through the terminal 10. After obtaining the tracking target information, the UAV 20 extracts the information of the object to be tracked according to the tracking target information. For example, if the tracking target information obtained by the UAV 20 is text information, the UAV 20 extracts text information from the currently acquired real-time image; if it is face image information, the UAV 20 extracts face image information from the currently acquired real-time image.
Step S105: the UAV 20 calculates the matching degree between the tracking target information and the information of the object to be tracked. Specifically, the UAV 20 compares the obtained tracking target information with the extracted information of the object to be tracked to obtain a matching degree. In this embodiment, for text information, the matching degree can be set according to the proportion of matching characters; for example, if 60% of the characters are the same, the matching degree is 60%. For image information, the matching degree can be obtained by comparing feature values of the image information; for example, if the face contour is set as the feature value and 70% of the feature values are consistent, the matching degree is 70%.
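A minimal sketch of the two matching-degree calculations described above, assuming text targets are compared character by character and image targets are compared as pre-extracted feature vectors; the feature extraction itself, and helper names such as text_matching_degree, are illustrative assumptions rather than anything defined by this disclosure.

```python
from typing import Sequence

def text_matching_degree(target_text: str, extracted_text: str) -> float:
    """Proportion of positions at which the characters match, e.g. for a license plate number."""
    if not target_text:
        return 0.0
    matches = sum(1 for a, b in zip(target_text, extracted_text) if a == b)
    return matches / len(target_text)

def image_matching_degree(target_features: Sequence[float],
                          extracted_features: Sequence[float],
                          tolerance: float = 0.1) -> float:
    """Proportion of feature values (e.g. face-contour features) that agree within a tolerance."""
    if not target_features:
        return 0.0
    consistent = sum(1 for a, b in zip(target_features, extracted_features) if abs(a - b) <= tolerance)
    return consistent / len(target_features)

# Example: a plate with 6 of 7 characters matching gives a matching degree of about 0.86.
print(text_matching_degree("AB12345", "AB12845"))
```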
Step S107: if the calculated matching degree is greater than the preset matching degree, the UAV 20 sends feedback information to the terminal 10 for the terminal 10 to generate a tracking instruction according to the feedback information. Specifically, in this embodiment, the preset matching degree for text information is greater than the preset matching degree for image information. Understandably, since text information recognition is more accurate than image information recognition, its matching-degree threshold is raised accordingly to further reduce errors. In this embodiment, the feedback information is a picture. How the terminal generates the tracking instruction according to the feedback information is described in detail below and is not repeated here.
Step S109: if a tracking instruction from the terminal 10 is received, the UAV 20 sends the corresponding tracking information to the terminal 10. Preferably, the UAV 20 also tracks the object to be tracked and sends real-time tracking information to the terminal 10. The tracking information includes the GPS positioning information of the tracking target, the tracking path, the time, the matching degree, the type of matching algorithm used, and so on. The UAV 20 starts tracking only after receiving the tracking instruction from the terminal 10; that is, after the UAV 20 preliminarily determines the object to be tracked, it reports to the terminal 10 so that the terminal can further confirm whether the object to be tracked is correct, which further avoids misjudging the tracking target. In addition, the UAV 20 sends feedback information to the terminal 10 only when the matching degree is relatively high, which reduces the probability of sending wrong feedback information to the terminal 10, avoids burdening the terminal 10 with unnecessary data, and greatly improves efficiency and processing speed.
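As an illustration of the kind of tracking information the UAV might report back in step S109, the sketch below defines a simple record type. The field names and types are assumptions; the disclosure only lists the GPS position of the target, the tracking path, the time, the matching degree and the type of matching algorithm used.

```python
from dataclasses import dataclass, field
from typing import List, Tuple
import time

@dataclass
class TrackingInfo:
    drone_id: str                                    # which UAV produced this report
    target_position: Tuple[float, float, float]      # GPS fix of the tracked object (lat, lon, alt_m)
    tracking_path: List[Tuple[float, float, float]]  # positions recorded along the tracking path
    timestamp: float = field(default_factory=time.time)
    matching_degree: float = 0.0                     # e.g. 0.78 for a 78% match
    matching_algorithm: str = "face_contour"         # type of matching algorithm used

report = TrackingInfo(
    drone_id="UAV-20",
    target_position=(22.551, 114.062, 0.0),
    tracking_path=[(22.550, 114.060, 0.0), (22.551, 114.062, 0.0)],
    matching_degree=0.78,
)
```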
In the above embodiment, information such as the tracking target information (for example, a picture or text) sent by the terminal 10 to the UAV 20, or the feedback information (for example, a picture or text) received by the terminal 10 from the UAV 20, does not need to be a real-time image returned by the UAV 20. In other words, the terminal 10 and the UAV 20 can track the tracking target using non-real-time information, so the requirements on communication transmission quality are low. That is, even when the communication transmission quality between the terminal 10 and the UAV 20 is poor, the terminal 10 and the UAV 20 can still successfully track the tracking target. Therefore, the tracking method of the above embodiment depends far less on communication transmission quality, can also be applied in situations where the communication transmission quality is poor, and is applicable to a wider range of scenarios.
Referring to FIG. 3, the method further includes the following steps.
Step S301: the UAV 20 detects whether an adjustment instruction from the terminal 10 is received, where the adjustment instruction is sent to the UAV 20 when the terminal 10 does not receive feedback information from the UAV 20 within a preset time. If the adjustment instruction is received, the UAV executes step S302; otherwise, it executes step S103.
Step S302: the UAV flies according to the adjustment instruction.
Since no feedback information is received within the preset time, the terminal 10 can determine that the search strategy of the UAV 20 needs to be adjusted, and therefore sends an adjustment instruction to the corresponding UAV to adjust the search path, which can improve the efficiency of tracking.
Referring to FIG. 4, after the UAV 20 sends the feedback information, the method further includes the following steps.
Step S401: the UAV 20 tracks the object to be tracked.
Step S402: the UAV 20 determines whether the tracking instruction is received within a preset time. If the tracking instruction is received, step S109 is executed; otherwise, step S103 is executed.
Since the UAV 20 starts tracking as soon as it preliminarily identifies an object to be tracked, it can quickly track a suspicious tracking target and avoid missing the tracking target. In addition, if the UAV 20 has not received the tracking instruction for a long time after sending the feedback information, this indicates that the feedback information does not describe the tracking target and is invalid, so the UAV searches again, which provides greater flexibility.
Referring to FIG. 5, which shows the specific steps by which the terminal 10 controls the UAV 20. Specifically, the tracking method further includes the following steps.
Step S500: the terminal 10 sends the search instruction and the tracking target information to the UAV 20. The search instruction and the tracking target information may be sent at the same time or in a certain order.
Step S502: if feedback information is received, the terminal 10 displays the received feedback information for the user to identify whether the UAV 20 needs the tracking instruction and to input a tracking request according to the feedback information. Specifically, the feedback information includes a picture containing the object to be tracked or feature information of the object. In this embodiment, the feedback information is a picture, which is used as an example for description. If the picture contains an image of the tracking target, the user can identify that the UAV 20 needs the tracking instruction. The terminal 10 also displays a corresponding confirmation button. If the user recognizes that the picture contains an image of the tracking target, the user clicks the corresponding confirmation button to generate the tracking request. In some other feasible embodiments, the feedback information may also be feature information of the object to be tracked: if the tracking target is a vehicle, the feature information may be, for example, vehicle damage information or information that the license plate is blocked; if the tracking target is a person, the feature information may be the person's height, body shape, age, physical defects, face contour information, and so on.
Step S504: if the terminal 10 receives the tracking request input by the user, it generates the tracking instruction.
Step S506: the terminal 10 sends the tracking instruction to the UAV 20.
In this embodiment, the user can identify the tracking target according to the picture sent by the UAV 20, which avoids tracking a wrong target and improves the efficiency and success rate of tracking.
In some feasible embodiments, the method further includes step S508: if the terminal 10 does not receive feedback information within a preset time, it generates an adjustment instruction to control the UAV 20 to adjust its flight path according to the adjustment instruction.
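A minimal sketch of the terminal-side timeout behaviour in steps S502 and S508, assuming a polling-style receive_feedback callable and a hypothetical send function for the communication link; both are illustrative placeholders rather than APIs defined by this disclosure.

```python
import time
from typing import Callable, Optional

def wait_for_feedback(receive_feedback: Callable[[], Optional[dict]],
                      send: Callable[[dict], None],
                      timeout_s: float = 60.0,
                      poll_s: float = 1.0) -> Optional[dict]:
    """Wait up to timeout_s for feedback; otherwise ask the UAV to adjust its search path."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        feedback = receive_feedback()
        if feedback is not None:
            return feedback            # step S502: display it and let the user confirm
        time.sleep(poll_s)
    # Step S508: no feedback within the preset time, so send an adjustment instruction.
    send({"type": "adjust", "reason": "no feedback within preset time"})
    return None
```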
The above embodiments describe the corresponding steps of the tracking method as performed by the UAV 20 and the terminal 10 separately, to show the processing performed by the UAV 20 or the terminal 10 alone. The following describes the corresponding steps of the tracking method as performed by the UAV 20 and the terminal 10 in cooperation, to show the interaction between the two when they execute the tracking method.
Referring to FIG. 6, the tracking method includes the following steps.
Step S500: the terminal 10 sends the search instruction and the tracking target information to the UAV 20.
Step S101: the UAV 20 flies according to the obtained search instruction and acquires real-time images.
Step S103: the UAV 20 extracts the information of the object to be tracked according to the obtained tracking target information and the currently acquired real-time image.
Step S105: the UAV 20 calculates the matching degree between the tracking target information and the information of the object to be tracked.
Step S107: if the calculated matching degree is greater than the preset matching degree, the UAV 20 sends feedback information to the terminal 10, where the feedback information is a picture.
Step S502: the terminal 10 displays the received feedback information for the user to identify whether the UAV needs the tracking instruction, and to input a tracking request according to the picture.
Step S504: if a tracking request input by the user is received, the terminal 10 generates the tracking instruction.
Step S506: the terminal 10 sends the tracking instruction to the UAV 20.
Step S109: in response to the tracking instruction, the UAV 20 sends the tracking information of the object to be tracked to the terminal 10.
Referring to FIG. 7, in some feasible embodiments, the tracking method further includes the following steps. Step S508: if the feedback information from the UAV 20 is not received within the preset time, the terminal 10 sends an adjustment instruction.
Step S301: the UAV 20 flies according to the adjustment instruction.
Referring to FIG. 8, in some feasible embodiments, after the UAV 20 sends the feedback information, the tracking method further includes the following steps.
Step S401: the UAV 20 tracks the object to be tracked.
Step S403: if the tracking instruction is not received within the preset time, the UAV 20 executes step S103 and the subsequent steps again.
Referring to FIG. 9 and FIG. 10, which are respectively a schematic flowchart of another UAV-based tracking method and a schematic diagram of the operating environment of the tracking method according to an embodiment of the present invention. In this embodiment, the terminal 10' communicates with several UAVs 20' through a ground station 30'. The user can control the several UAVs 20' through the terminal 10' to implement the tracking method. As shown in FIG. 9, the tracking method includes the following steps.
Step S601: the several UAVs 20' each fly according to the search instruction they obtained and acquire real-time images, where the search instructions obtained by the respective UAVs 20' are different, so as to control the UAVs 20' to fly along different search paths. The search instructions are sent by the terminal 10' to the several UAVs 20'.
Specifically, the terminal 10' sends the search instructions to the several UAVs 20' in response to user operations. For example, if the tracking target is a vehicle or a person, the user can set corresponding search strategies according to the possible range of activity or the trail of the tracking target. A search strategy includes a flight strategy, such as the UAV's search path, attitude, bearing and airspeed, and an image acquisition strategy, such as the shooting mode, shooting light and framing strategy. The user sets the corresponding search strategies through the terminal to generate search instructions, which are sent to the UAVs 20'. After obtaining a search instruction, each UAV 20' flies according to it. Since the exact position of the target is uncertain, in order to speed up the search, the user can set multiple possible search strategies and thereby generate multiple different search instructions to control the several UAVs 20' to search simultaneously, so as to locate the tracking target quickly.
Step S603: the several UAVs 20' each extract, according to the tracking target information from the terminal 10' and the currently acquired real-time image, the information of the object to be tracked from the currently acquired real-time image. For a detailed description of this step, refer to step S103, which is not repeated here.
Step S605: the several UAVs 20' each calculate the matching degree between the tracking target information and the information of the object to be tracked. For a detailed description of this step, refer to step S105, which is not repeated here.
Step S607: one or more UAVs 20' send feedback information to the terminal 10' for the terminal 10' to generate a tracking instruction according to the feedback information, where the UAVs 20' sending the feedback information are those whose calculated matching degree is greater than the preset matching degree. For a detailed description of this step, refer to step S107.
Step S609: if one or more UAVs 20' receive a tracking instruction from the terminal 10', the UAVs 20' that received the tracking instruction send corresponding tracking information to the terminal 10'. In some feasible embodiments, a UAV 20' that receives a tracking instruction from the terminal 10' also tracks the object to be tracked. The tracking information includes the position information and the tracking path information of the tracked object.
Step S611: if one or more UAVs 20' receive a return instruction from the terminal 10', those UAVs 20' return in response to the return instruction. Once the UAV that needs the tracking instruction has been identified, it is understood that the remaining UAVs 20' that are still searching no longer need to continue the search; therefore, the terminal 10' generates a return instruction to control the remaining UAVs 20' that are still searching to end the search.
In the above embodiment, information such as the tracking target information (for example, a picture or text) sent by the terminal 10' to the UAVs 20', or the feedback information (for example, a picture or text) received by the terminal 10' from the UAVs 20', does not need to be a real-time image returned by the UAVs 20'. In other words, the terminal 10' and the UAVs 20' can track the tracking target using non-real-time information, so the requirements on communication transmission quality are low. That is, even when the communication transmission quality between the terminal 10' and the UAVs 20' is poor, the terminal 10' and the UAVs 20' can still successfully track the tracking target. Therefore, the tracking method of the above embodiment depends far less on communication transmission quality, can also be applied in situations where the communication transmission quality is poor, and is applicable to a wider range of scenarios.
Referring to FIG. 11, after step S601 is performed, the method further includes the following steps.
Step S801: each UAV 20' detects whether an adjustment instruction from the terminal 10' is received, where the adjustment instruction is generated and sent to the corresponding UAV 20' when the terminal 10' does not receive feedback information from one or more UAVs within a preset time. If the adjustment instruction is received, step S802 is executed; otherwise, step S603 is executed.
Step S802: the UAV 20' that received the adjustment instruction flies according to the adjustment instruction. Since no feedback information is received within the preset time, the terminal 10' can recognize that the search strategy of the UAV 20' needs to be adjusted, and therefore sends an adjustment instruction to the corresponding UAV 20' to adjust the search path, which can improve the efficiency of tracking.
Referring to FIG. 12, after one or more UAVs 20' send feedback information, the method further includes the following steps.
Step S901: the UAVs 20' that sent feedback information each track their respective object to be tracked.
Step S902: a UAV 20' that sent feedback information determines whether the tracking instruction is received within a preset time. If the tracking instruction is received, step S609 is executed; otherwise, step S603 is executed. If a UAV has sent the feedback information but has not received the tracking instruction for a long time, this indicates that the feedback information does not describe the tracking target and is invalid, so the search is performed again. Understandably, this embodiment can quickly track a suspicious tracking target and avoid missing the tracking target. In addition, if the tracking instruction is not received within the preset time, it can be determined that the suspicious object is not the tracking target, and the search is performed again according to the original search strategy, which provides greater flexibility.
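A minimal sketch of the UAV-side behaviour in steps S901 and S902: after sending feedback, the UAV provisionally tracks the candidate, and if no tracking instruction arrives within the preset time it resumes searching. The callables track_one_step, receive_instruction and resume_search are illustrative placeholders rather than APIs defined by this disclosure.

```python
import time
from typing import Callable, Optional

def track_until_confirmed(track_one_step: Callable[[], None],
                          receive_instruction: Callable[[], Optional[dict]],
                          resume_search: Callable[[], None],
                          timeout_s: float = 30.0) -> bool:
    """Return True if a tracking instruction confirmed the candidate, False if the UAV resumed searching."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        track_one_step()                    # step S901: keep following the candidate for now
        instruction = receive_instruction()
        if instruction is not None and instruction.get("type") == "track":
            return True                     # step S609: report tracking information to the terminal
    resume_search()                         # no confirmation: candidate treated as invalid, back to S603
    return False
```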
Referring to FIG. 13, which shows the specific steps by which the terminal 10' controls the UAVs 20'. Specifically, the tracking method further includes the following steps.
Step S1000: the terminal 10' sends several search instructions and the tracking target information to the several UAVs 20'. The terminal 10' may send the search instructions and the tracking target information to the UAVs 20' simultaneously, or send a search instruction and the tracking target information to each UAV 20' in a certain order. The several search instructions are different from one another.
Step S1002: if the terminal 10' receives feedback information, it identifies, according to the feedback information, the UAV 20' that needs the tracking instruction.
Step S1004: the terminal 10' sends the tracking instruction to the UAV 20' that needs the tracking instruction.
Step S1006: the terminal 10' also generates a return instruction according to the UAV that needs the tracking instruction, so as to control the UAVs 20' that do not need the tracking instruction to return. Specifically, once the UAV that needs the tracking instruction has been identified, it is understood that the remaining UAVs 20' that are still searching no longer need to continue the search; therefore, the terminal 10' generates a return instruction to control the remaining UAVs 20' that are still searching to end the search.
Referring to FIG. 14, identifying the UAV 20' that needs the tracking instruction according to the feedback information specifically includes the following steps.
Step S10021: the terminal 10' displays the received feedback information for the user to identify the UAV 20' that needs the tracking instruction, and for the user to input the corresponding tracking request according to the picture. In this embodiment, the feedback information is a picture containing the object to be tracked. Specifically, if the user identifies the tracking target in the picture, the user can recognize that the corresponding UAV 20' needs the tracking instruction. In some other feasible embodiments, the feedback information may also be feature information of the object to be tracked. For example, if the tracking target is a vehicle, the feature information may be vehicle damage information or information that the license plate is blocked; if the tracking target is a person, the feature information may be the person's height, body shape, age, physical defect information, and so on. Specifically, the method further includes the following step.
Step S10022: if the terminal 10' receives a tracking request input by the user, it identifies, according to the tracking request, the UAV 20' that needs the tracking instruction and generates the corresponding tracking instruction. Specifically, the terminal 10' also displays confirmation buttons in one-to-one correspondence with the pictures. If the user recognizes that a tracking target appears in a picture displayed on the terminal 10', the user clicks the corresponding confirmation button to generate the tracking request. In some other feasible embodiments, if the user recognizes that a tracking target appears in a picture displayed by the terminal 10', the user may also input the unique identification code of the corresponding UAV 20' into the terminal 10' to generate the tracking request. The identification code may be the number of the UAV or its IP address.
Referring to FIG. 15, which is a flowchart of a second example of the method in which the terminal sends the tracking instruction to the UAV according to the feedback information, provided by an embodiment of the present invention. In this embodiment, the feedback information includes the matching degree and/or the type of matching algorithm used. The following description takes the matching degree as the feedback information as an example. Specifically, the method further includes the following steps.
Step S1101: the terminal 10' identifies, according to the received matching degree and a preset condition, the UAV 20' that needs the tracking instruction.
Step S1103: if the terminal identifies a UAV that needs the tracking instruction, it generates the tracking instruction and sends it to the UAV that needs the tracking instruction.
In this embodiment, whether a UAV needs the tracking instruction can be identified according to the matching degree, and the tracking instruction is sent to the UAV that needs it, which reduces the chance of tracking a wrong target and improves the efficiency and success rate of tracking.
Referring to FIG. 16, which is a flowchart of a first example of step S1101.
Step S1202: the terminal 10' counts the matching degrees received within a preset time. If the count is 1, step S1204 is executed; if the count is greater than 1, step S1206 is executed.
Step S1204: if the terminal 10' receives a matching degree sent by only one UAV within the preset time, it identifies that the UAV that sent the matching degree needs the tracking instruction.
Step S1206: if the terminal 10' receives matching degrees sent by multiple UAVs 20' within the preset time, the terminal 10' compares the received matching degrees to obtain the highest matching degree.
Step S1208: the terminal 10' identifies that the UAV that sent the highest matching degree needs the tracking instruction.
Referring to FIG. 17, which is a flowchart of a second example of step S1101.
Step S1302: the terminal 10' counts the matching degrees received within a preset time. If the count is 1, step S1304 is executed; if the count is greater than 1, step S1306 is executed.
Step S1304: if the terminal 10' receives a matching degree sent by only one UAV 20' within the preset time, it identifies that the UAV 20' that sent the matching degree needs the tracking instruction.
Step S1306: if the terminal 10' receives matching degrees sent by multiple UAVs 20' within the preset time, the terminal 10' compares the received matching degrees to obtain a preset number of the highest matching degrees.
Step S1308: the terminal 10' identifies that the UAVs 20' that sent the preset number of highest matching degrees need the tracking instruction.
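A minimal sketch of the two selection variants in steps S1202-S1208 and S1302-S1308, assuming the matching degrees collected within the preset time are given as a mapping from UAV identifier to matching degree; the function and parameter names are illustrative assumptions.

```python
from typing import Dict, List

def select_drones(matching_degrees: Dict[str, float], top_n: int = 1) -> List[str]:
    """Return the UAV identifiers that should receive the tracking instruction."""
    if not matching_degrees:
        return []                                   # no feedback within the preset time
    if len(matching_degrees) == 1:
        return list(matching_degrees)               # S1204 / S1304: only one UAV reported
    ranked = sorted(matching_degrees, key=matching_degrees.get, reverse=True)
    return ranked[:top_n]                           # S1206-S1208: top_n=1; S1306-S1308: preset number

# Example: three UAVs report matching degrees; the terminal picks the best one.
print(select_drones({"UAV-A": 0.72, "UAV-B": 0.91, "UAV-C": 0.66}))  # ['UAV-B']
```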
In some feasible embodiments, if the UAVs 20' use different types of matching algorithms, the analysis unit compares matching degrees of the same type when determining the highest matching degree; the comparison process is as described in the steps above and is not repeated here.
The above embodiments describe the corresponding steps of the tracking method as performed by the UAVs 20' and the terminal 10' separately, to show the processing performed by the UAVs 20' or the terminal 10' alone. The following describes the corresponding steps of the tracking method as performed by the UAVs 20' and the terminal 10' in cooperation, to show the interaction between the two when they execute the tracking method.
Referring to FIG. 18, the tracking method includes the following steps.
Step S1000: the terminal 10' sends several search instructions and the tracking target information to the several UAVs 20'.
Step S601: the several UAVs 20' fly according to the obtained search instructions and acquire real-time images.
Step S603: the several UAVs 20' each extract, according to the tracking target information from the terminal 10' and the currently acquired real-time image, the information of the object to be tracked from the currently acquired real-time image.
Step S605: the several UAVs 20' each calculate the matching degree between the tracking target information and the information of the object to be tracked.
Step S607: one or more UAVs 20' send feedback information to the terminal 10' for the terminal 10' to generate a tracking instruction according to the feedback information, where the UAVs 20' sending the feedback information are those whose calculated matching degree is greater than the preset matching degree.
Step S1002: the terminal 10' identifies, according to the feedback information, the UAV 20' that needs the tracking instruction.
Step S1004: the terminal 10' sends the tracking instruction to the UAV 20' that needs the tracking instruction.
Step S1006: the terminal 10' also generates a return instruction according to the UAV that needs the tracking instruction, so as to control the UAVs 20' that do not need the tracking instruction to return. Specifically, once the user has identified the UAV that needs the tracking instruction, it is understood that the remaining UAVs 20' that are still searching no longer need to continue the search; therefore, the terminal 10' generates a return instruction to control the remaining UAVs 20' that are still searching to end the search.
Step S609: the UAV 20' that received the tracking instruction sends corresponding tracking information to the terminal 10'.
Step S1008: the terminal 10' generates a return instruction according to the UAV that needs the tracking instruction.
Step S611: the UAV 20' that received the return instruction returns in response to the return instruction.
Referring to FIG. 19, in some feasible embodiments, the tracking method further includes the following steps.
Step S901: the UAVs 20' that sent feedback information each track their respective object to be tracked.
Step S403: if the tracking instruction is not received within the preset time, the UAVs 20' that sent the feedback information execute step S603 and the subsequent steps again.
请结合参看图2和图20,其分别为第一实施例的追踪系统100和终端10的示意图。追踪系统100包括终端10、无人机20、以及地面站30。终端10通过地面站30与无人机20通讯连接,以控制无人机20对追踪目标进行追踪。Please refer to FIG. 2 and FIG. 20, which are schematic diagrams of the tracking system 100 and the terminal 10 of the first embodiment, respectively. The tracking system 100 includes a terminal 10, a drone 20, and a ground station 30. The terminal 10 communicates with the drone 20 through the ground station 30 to control the drone 20 to track the tracking target.
终端10包括设置单元11、第一通讯单元13、以及追踪管理单元14。The terminal 10 includes a setting unit 11, a first communication unit 13, and a tracking management unit 14.
设置单元11用于响应用户操作设置搜寻策略。具体地,例如,追踪目标是车辆或者人物,用户可以根据追踪目标可能的活动范围或者踪迹设置相应的搜寻策略。搜寻策略包括飞行策略,如无人机搜寻路径、姿态、方位、空速等。 搜寻策略还包括图像获取策略,如拍摄方式、拍摄光线、取景策略等。用户通过设置单元11设置相应的搜寻策略,并产生搜寻指令。设置单元11还用于通过第一通讯单元13发送该搜寻指令给无人机20,以控制无人机20按照搜寻策略进行飞行及获取实时图像。The setting unit 11 is used to set a search strategy in response to user operation. Specifically, for example, if the tracking target is a vehicle or a person, the user may set a corresponding search strategy according to the possible range or track of the tracking target. Search strategies include flight strategies, such as UAV search path, attitude, bearing, airspeed, etc. The search strategy also includes image acquisition strategies, such as shooting methods, shooting light, and framing strategies. The user sets the corresponding search strategy through the setting unit 11 and generates a search instruction. The setting unit 11 is also used to send the search instruction to the drone 20 through the first communication unit 13 to control the drone 20 to fly according to the search strategy and obtain real-time images.
设置单元11还用于响应用户的操作设置追踪目标信息,并通过第一通讯单元13发送该追踪目标信息给无人机20。例如,追踪目标是车辆,用户通过设置单元11设置车辆的车牌号码作为追踪目标信息。例如,追踪目标是人物,用户通过设置单元11设置追踪目标的人脸图像为追踪目标信息。其中,人脸图像可以从非无人机20发送过来的图片中提取,也可以是从无人机20发送过来的图片中提取。具体地,用户可以通过设置单元11在图片中将追踪目标的人脸轮廓进行选取或者是通过设置单元11利用矩形框将追踪对象的人脸区域进行选取,并设置为跟踪目标信息。或者图片中包含的人物图像仅是追踪目标,用户可以直接通过设置单元11将图片发送给无人机20。The setting unit 11 is further configured to set tracking target information in response to the user's operation, and send the tracking target information to the drone 20 through the first communication unit 13. For example, the tracking target is a vehicle, and the user sets the vehicle license plate number as the tracking target information through the setting unit 11. For example, the tracking target is a person, and the user sets the face image of the tracking target as tracking target information through the setting unit 11. The face image may be extracted from a picture sent by a non-drone 20, or may be extracted from a picture sent by the drone 20. Specifically, the user may select the face contour of the tracking target in the picture through the setting unit 11 or use the rectangular frame to select the face area of the tracking target through the setting unit 11 and set it as tracking target information. Or the image of the person included in the picture is only the tracking target, and the user can directly send the picture to the drone 20 through the setting unit 11.
The tracking management unit 14 is configured to receive feedback information from the drone 20 through the first communication unit 13 and to identify, according to the feedback information, whether a tracking instruction needs to be generated. The tracking management unit 14 is further configured to receive tracking information from the drone 20 through the first communication unit 13. The feedback information includes picture information, and the picture information includes feature information, a picture, or the like. In this embodiment, the feedback information is described by taking a picture as an example. The generation and sending of the feedback information and the tracking information are described in detail below. The tracking management unit 14 specifically includes a display unit 140, an interface providing unit 142, and an instruction generating unit 144.
The display unit 140 is configured to display the received picture on the terminal 10 so that the user can identify whether the drone 20 needs the tracking instruction. Specifically, if the picture contains an image of the tracking target, the user can identify that the drone needs the tracking instruction.
The interface providing unit 142 is configured to provide a button and display it through the display unit 140 for the user to select in order to generate a tracking request. Specifically, the user may select the button through an input device such as a mouse or a keyboard.
The instruction generating unit 144 is configured to, upon receiving a tracking request input by the user, generate the tracking instruction and send it to the drone 20 through the first communication unit 13.
The display unit 140 is further configured to display the received tracking information.
Referring to FIG. 20, the drone 20 includes a second communication unit 22, a shooting unit 23, a search unit 24, a feedback unit 26, and a tracking unit 27.
The shooting unit 23 is configured to acquire images. The shooting unit 23 is provided with an imaging device, such as a camera.
The search unit 24 obtains, through the second communication unit 22, the search instruction and the tracking target information sent by the terminal 10, and controls the drone 20 to fly and acquire real-time images according to the search instruction. The search unit 24 is further configured to determine, according to the tracking target information and the currently acquired real-time image, whether feedback information needs to be sent to the terminal 10. Specifically, the search unit 24 includes a flight control unit 240, a shooting control unit 242, and an image processing unit 244.
The flight control unit 240 is configured to control the drone 20 to fly according to the search instruction. Specifically, the search instruction includes a flight strategy, such as the drone's search path, attitude, heading, and airspeed.
The shooting control unit 242 is configured to control the shooting unit 23 to acquire real-time images according to the search instruction. Specifically, the search instruction further includes an image acquisition strategy, such as the shooting mode, lighting, and framing strategy.
The image processing unit 244 is configured to extract, according to the tracking target information from the terminal and the currently acquired real-time image, the information of the object to be tracked from the currently acquired real-time image. For example, if the tracking target information obtained by the drone is text information, the drone extracts text information from the currently acquired real-time image; if the tracking target information obtained by the drone is a face image, the drone extracts face information from the currently acquired real-time image.
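The patent does not name a particular recognition algorithm for this extraction step. Purely as an illustration, the sketch below uses OpenCV's bundled Haar face detector for the face case and a placeholder OCR helper for the text case; neither choice is prescribed by the disclosure, and read_plate_text is a hypothetical stub.

```python
import cv2  # OpenCV is assumed to be available; the patent does not mandate it.

def read_plate_text(frame):
    """Placeholder for an OCR / license-plate reader; not specified by the patent."""
    return None

def extract_candidate(frame, target_kind):
    """Extract candidate tracked-object information from one real-time frame.

    target_kind: "text" when the tracking target information is text
    (e.g. a license plate number), "face" when it is a face image.
    """
    if target_kind == "text":
        return read_plate_text(frame)          # returns a string or None
    if target_kind == "face":
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        detector = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        # Return the first detected face region (x, y, w, h), if any.
        return faces[0] if len(faces) else None
    return None
```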
The image processing unit 244 is further configured to calculate a matching degree according to the tracking target information and the information of the object to be tracked. Specifically, the image processing unit 244 compares the tracking target information with the information of the object to be tracked to calculate the matching degree. In this embodiment, for text information, the matching degree may be set according to the proportion of matching characters; for example, if 60% of the characters are the same, the matching degree is 60%. For image information, the matching degree may be calculated by comparing feature values of the image information; for example, if the face contour is used as the feature value and 70% of the feature values are consistent, the matching degree is 70%.
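As a rough illustration of the two matching rules described above (character overlap for text, feature-value overlap for images), the sketch below computes both in plain Python. The position-wise character comparison and the equal-length feature vectors are assumptions; the patent only specifies that a percentage of agreement is produced.

```python
def text_matching_degree(target_text: str, extracted_text: str) -> float:
    """Fraction of characters that agree, compared position by position."""
    if not target_text or not extracted_text:
        return 0.0
    matches = sum(1 for a, b in zip(target_text, extracted_text) if a == b)
    return matches / max(len(target_text), len(extracted_text))

def image_matching_degree(target_features, candidate_features) -> float:
    """Fraction of feature values (e.g. face-contour descriptors) that agree."""
    if not target_features or len(target_features) != len(candidate_features):
        return 0.0
    agree = sum(1 for t, c in zip(target_features, candidate_features) if t == c)
    return agree / len(target_features)

# Example: target plate "B12345" vs an OCR result that differs in one character.
print(text_matching_degree("B12345", "B12845"))  # ~0.83
```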
The feedback unit 26 is configured to determine whether the calculated matching degree is greater than a preset matching degree and, if so, to send feedback information to the terminal 10 so that the terminal 10 can generate a tracking instruction according to the feedback information. Specifically, in this embodiment, the preset matching degree for text information is greater than the preset matching degree for image information. It can be understood that, since text recognition is more accurate than image recognition, its threshold is raised accordingly to further reduce errors. In this embodiment, the feedback information is preferably picture information.
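A minimal sketch of the feedback decision just described, with a higher preset threshold for text than for images. The concrete values (0.8 and 0.7) are illustrative assumptions; the patent only states that the text threshold is set higher.

```python
# Illustrative preset thresholds; the patent does not give concrete numbers.
PRESET_MATCH = {"text": 0.8, "image": 0.7}

def should_send_feedback(kind: str, matching_degree: float) -> bool:
    """Return True when the drone should send feedback (e.g. a picture) to the terminal."""
    return matching_degree > PRESET_MATCH[kind]

# The drone would then attach the current frame and its matching degree
# to the feedback message it sends to the terminal.
assert should_send_feedback("image", 0.75)
assert not should_send_feedback("text", 0.75)
```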
The tracking unit 27 is configured to send the information of the object currently being tracked in response to the tracking instruction sent by the terminal 10. The tracking unit 27 is further configured to track the object to be tracked and to send real-time tracking information to the terminal 10. The tracking information includes the GPS positioning information of the tracking target, the tracking path, the time, the matching degree, the type of matching algorithm used, and the like.
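The tracking information enumerated above could be serialized in many ways; the JSON-style structure below is only a hedged guess at how the drone might package it for the terminal, and every field name is an assumption.

```python
import json
import time

def build_tracking_report(drone_id, lat, lon, path, matching_degree, algorithm):
    """Bundle the real-time tracking information listed in the description."""
    report = {
        "drone_id": drone_id,                  # which drone is reporting
        "target_gps": {"lat": lat, "lon": lon},
        "tracking_path": path,                 # list of past (lat, lon) fixes
        "timestamp": time.time(),
        "matching_degree": matching_degree,
        "matching_algorithm": algorithm,       # e.g. "char-overlap" or "face-contour"
    }
    return json.dumps(report)

# Example report the tracking unit 27 might send back to the terminal 10.
print(build_tracking_report("20", 22.5431, 114.0579,
                            [(22.5428, 114.0570)], 0.82, "face-contour"))
```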
In some feasible embodiments, the flight control unit 240 is further configured to adjust the flight strategy of the drone 20 in response to an adjustment instruction sent by the terminal 10.
Referring to FIG. 7 and FIG. 21, which show a second embodiment of the tracking system 200 provided by the present invention, the tracking system 200 includes a terminal 10', a ground station 30', and several drones 20'. The terminal 10' is communicatively connected to the drones 20' through the ground station 30'.
The terminal 10' includes a setting unit 11', an instruction generating unit 12', a first communication unit 13', and a tracking management unit 14'.
The setting unit 11' is configured to set search strategies in response to user operations. Specifically, if the tracking target is, for example, a vehicle or a person, the user may set corresponding search strategies according to the possible range of activity or trail of the tracking target. A search strategy includes a flight strategy, such as the drone's search path, attitude, heading, and airspeed, and an image acquisition strategy, such as the shooting mode, lighting, and framing strategy. The user sets the corresponding search strategies through the terminal to generate several different search instructions, which are sent to the drones 20' respectively, so as to control the drones 20' to fly and acquire real-time images according to different search strategies. The setting unit 11' differs from the setting unit 11 in that the setting unit 11' is configured to set a different search strategy for each of the drones 20'.
The setting unit 11' is further configured to set tracking target information in response to a user operation and to send the tracking target information to the drones 20' through the first communication unit 13'. The tracking target information sent to the drones 20' is the same for all of them. The functions of the setting unit 11' are otherwise basically the same as those of the setting unit 11 and are not repeated here.
The tracking management unit 14' is configured to receive feedback information from the drones 20' through the first communication unit 13' and to identify, according to the feedback information, whether a tracking instruction needs to be generated. The tracking management unit 14' is further configured to receive tracking information from the drones 20' through the first communication unit 13'. Here, the feedback information includes a picture. The generation and sending of the feedback information and the tracking information are described in detail above.
Referring to FIG. 22, which is a functional block diagram of a first embodiment of the tracking management unit 14', the tracking management unit 14' specifically includes a display unit 14001, an interface providing unit 14002, and an instruction generating unit 14004. The functional modules of the tracking management unit 14' are basically the same as those of the tracking management unit 14. The difference is that the interface providing unit 14002 is configured to provide several buttons for the user to select in order to generate tracking requests. The buttons correspond one-to-one to the displayed pictures, so that tracking requests corresponding to the pictures can be generated. In this embodiment, each picture is associated with the unique identification code of the drone 20' that sent it, so as to distinguish that drone. Further, the interface providing unit 14002 generates the tracking request according to the unique identification code, that is, the tracking request includes the unique identification code of the drone 20'. In some other feasible embodiments, the user may also input, through the interface providing unit 14002, the unique identification code of the drone 20' that needs the tracking instruction in order to generate the tracking request.
The instruction generating unit 14004 is configured to generate a corresponding tracking instruction according to the tracking request input by the user and to send it to the drone 20' through the first communication unit 13'. Accordingly, the tracking instruction includes the unique identification code of the drone 20', so that the corresponding drone 20' can obtain it. The instruction generating unit 14004 is further configured to generate a return instruction according to the drone 20' that needs the tracking instruction, so as to control the drones 20' to which the tracking instruction does not need to be sent to return. Accordingly, the return instruction contains the unique identification codes of those drones 20'. It can be understood that, once the user has identified the drone that needs the tracking instruction, the remaining drones that are still searching no longer need to continue; the instruction generating unit 14004 therefore sends return instructions through the first communication unit 13' to the corresponding drones 20', so that the remaining searching drones end their search.
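A hedged sketch of how the terminal might fan out one tracking instruction and the corresponding return instructions, keyed by the drones' unique identification codes. The message shapes and the send callback are assumptions for illustration, not part of the disclosure.

```python
def dispatch_instructions(all_drone_ids, selected_id, send):
    """Send a tracking instruction to the selected drone and return
    instructions to every other drone, each addressed by its unique ID.

    send(drone_id, message) stands in for the first communication unit;
    its existence and signature are assumptions, not part of the patent.
    """
    for drone_id in all_drone_ids:
        if drone_id == selected_id:
            send(drone_id, {"type": "track", "drone_id": drone_id})
        else:
            send(drone_id, {"type": "return", "drone_id": drone_id})

# Example: drone "20-2" reported the best match, so the others are recalled.
dispatch_instructions(["20-1", "20-2", "20-3"], "20-2",
                      send=lambda drone_id, msg: print(drone_id, msg))
```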
Referring to FIG. 23, which is a functional block diagram of a second embodiment of the tracking management unit 14', the tracking management unit 14' is configured to receive feedback information from the drones 20' through the first communication unit 13' and to identify, according to the feedback information, whether a tracking instruction needs to be generated. In this embodiment, the feedback information includes the matching degree. Specifically, the tracking management unit 14' includes an identification unit 14011 and an instruction generating unit 14012.
The identification unit 14011 is configured to identify the drone 20' that needs the tracking instruction according to the received matching degrees and a preset condition. The identification unit 14011 includes a counting unit 14013 and an analysis unit 14015.
The counting unit 14013 is configured to count the matching degrees received within a preset time.
The analysis unit 14015 is configured to, if a matching degree is received from only one drone 20' within the preset time (that is, the count is 1), identify that this drone needs the tracking instruction. The analysis unit 14015 is further configured to, if matching degrees are received from multiple drones 20' within the preset time (that is, the count is greater than 1), compare the received matching degrees to obtain the highest matching degree and identify that the drone 20' that sent the highest matching degree needs the tracking instruction. In some other feasible embodiments, the analysis unit 14015 is configured to, upon receiving matching degrees from multiple drones 20' within the preset time, compare the received matching degrees to obtain a preset number of the highest matching degrees and identify that the drones 20' that sent those matching degrees need the tracking instruction.
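To make the selection logic concrete, here is a small Python sketch. It assumes the reports collected within the preset window arrive as (drone_id, matching_degree) pairs; the pair format and the top_k parameter are assumptions added for illustration.

```python
def select_drones(reports, top_k=1):
    """Pick the drone(s) that should receive the tracking instruction.

    reports: list of (drone_id, matching_degree) pairs collected within
    the preset time window.
    """
    if not reports:
        return []                      # nothing to decide yet
    if len(reports) == 1:
        return [reports[0][0]]         # only one drone reported
    ranked = sorted(reports, key=lambda r: r[1], reverse=True)
    return [drone_id for drone_id, _ in ranked[:top_k]]

# Example: three drones report; only the best match is selected for tracking.
print(select_drones([("20-1", 0.62), ("20-2", 0.91), ("20-3", 0.74)]))  # ['20-2']
```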
The instruction generating unit 14012 is configured to generate a corresponding tracking instruction according to the tracking request input by the user and to send it to the drone 20' through the first communication unit 13'. Accordingly, the tracking instruction includes the unique identification code of the drone 20', so that the corresponding drone 20' can obtain it. The instruction generating unit 14012 is further configured to generate a return instruction according to the drone 20' that needs the tracking instruction, so as to control the drones 20' to which the tracking instruction does not need to be sent to return. Accordingly, the return instruction contains the unique identification codes of those drones 20'. It can be understood that, once the drone that needs the tracking instruction has been identified, the remaining drones that are still searching no longer need to continue; the instruction generating unit 14012 therefore sends return instructions through the first communication unit 13' to the corresponding drones 20', so that the remaining searching drones end their search.
In this embodiment, the drone that needs the tracking instruction can be identified according to the matching degree and the tracking instruction can be sent to that drone, which reduces the chance of tracking the wrong target and improves the efficiency and success rate of tracking.
In this embodiment, the functions of the drone 20' are basically the same as those of the drone 20, the difference being that the drone 20' is further configured to return in response to the return instruction. For the specific functional modules of the drone 20', reference may be made to the drone 20, and they are not repeated here.
FIG. 24 is a schematic structural diagram of another embodiment of the terminal of the present invention. As shown in FIG. 24, the terminal 1000 may include a first memory 1005 and a first processor 1006, where:
the first memory 1005 is configured to store a first computer-readable program. In a specific implementation, the first memory 1005 of this embodiment of the present invention may be a system memory, for example volatile memory (such as RAM), non-volatile memory (such as ROM or flash memory), or a combination of the two. In a specific implementation, the first memory 1005 of this embodiment of the present invention may also be an external memory outside the system, such as a magnetic disk, an optical disc, or a magnetic tape.
The first processor 1006 is configured to call the first computer-readable program stored in the first memory 1005 and perform the following operations:
sending a search instruction to the drone;
sending tracking target information to the drone;
if feedback information from the drone is received, identifying, according to the feedback information, whether the drone needs a tracking instruction.
Preferably, the first processor 1006 further performs the following operations:
displaying the received picture for the user to identify whether the drone needs the tracking instruction; specifically, if the picture contains an image of the tracking target, the user can identify that the drone needs the tracking instruction;
if a tracking request input by the user is received, generating the tracking instruction; specifically, the terminal also displays a corresponding confirmation button, and if the user identifies a tracking target in the picture displayed on the terminal, the user clicks the confirmation button to generate the tracking request;
sending the tracking instruction to the drone.
In some other feasible embodiments, the first processor 1006 performs the following operations:
sending several search instructions to several drones;
sending tracking target information to the several drones;
if feedback information from a drone is received, identifying, according to the feedback information, whether the drone needs a tracking instruction.
In some feasible embodiments, the first processor 1006 further performs the following operations:
displaying the received pictures within a preset time, so that the user can identify the drone that needs the tracking instruction and input a corresponding tracking request according to the pictures;
if a tracking request input by the user is received, generating the tracking instruction according to the tracking request;
sending the tracking instruction to the drone that needs it;
generating a return instruction according to the drone that needs the tracking instruction, so as to control the drones to which the tracking instruction does not need to be sent to return.
In some feasible embodiments, the first processor 1006 further performs the following operations:
identifying the drone that needs the tracking instruction according to the received matching degrees and a preset condition;
if a drone that needs the tracking instruction is identified, generating the tracking instruction and sending it to that drone.
In some feasible embodiments, the first processor 1006 further performs the following operations:
counting the matching degrees received within a preset time;
if the terminal receives a matching degree from only one drone within the preset time, identifying that the drone that sent the matching degree needs the tracking instruction;
if matching degrees are received from multiple drones within the preset time, comparing, by the terminal, the received matching degrees to obtain the highest matching degree;
identifying that the drone that sent the highest matching degree needs the tracking instruction.
In some feasible embodiments, the first processor 1006 further performs the following operations:
counting the matching degrees received within a preset time;
if a matching degree is received from only one drone within the preset time, identifying that the drone that sent the matching degree needs the tracking instruction;
if matching degrees are received from multiple drones within the preset time, comparing the received matching degrees to obtain a preset number of the highest matching degrees;
identifying that the drones that sent the preset number of highest matching degrees need the tracking instruction.
FIG. 25 is a schematic structural diagram of another embodiment of the drone of the present invention. As shown in FIG. 25, the drone 2000 may include a second memory 2005 and a second processor 2006, where:
the second memory 2005 is configured to store a second computer-readable program. In a specific implementation, the second memory 2005 of this embodiment of the present invention may be a system memory, for example volatile memory (such as RAM), non-volatile memory (such as ROM or flash memory), or a combination of the two. In a specific implementation, the second memory 2005 of this embodiment of the present invention may also be an external memory outside the system, such as a magnetic disk, an optical disc, or a magnetic tape.
The second processor 2006 is configured to call the second computer-readable program stored in the second memory 2005 and perform the following operations:
flying according to the search instruction obtained from the terminal and acquiring real-time images;
extracting, according to the tracking target information from the terminal and the currently acquired real-time image, the information of the object to be tracked from the currently acquired real-time image;
calculating the matching degree between the tracking target information and the information of the object to be tracked;
if the calculated matching degree is greater than the preset matching degree, sending feedback information to the terminal so that the terminal can generate a tracking instruction according to the feedback information;
if a tracking instruction from the terminal is received, tracking the object to be tracked and sending corresponding tracking information to the terminal.
Preferably, the second processor 2006 further performs the following steps:
detecting whether an adjustment instruction from the terminal is received;
if an adjustment instruction is detected, flying according to the adjustment instruction.
Preferably, the second processor 2006 may further perform the following steps:
tracking the object to be tracked;
determining whether a tracking instruction is received;
if a tracking instruction is received, tracking the object to be tracked and sending corresponding tracking information to the terminal;
if no tracking instruction is received, flying again according to the received search instruction to continue searching for the tracking target.
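A minimal sketch of the drone-side control flow these steps describe, assuming a message queue for instructions from the terminal plus hypothetical helpers track_one_step, tracking_report, resume_search, send, and keep_tracking; none of these names come from the patent.

```python
import queue

def post_feedback_loop(instructions, wait_s, track_one_step,
                       tracking_report, resume_search, send, keep_tracking):
    """After sending feedback, wait up to wait_s seconds for a tracking
    instruction; track and report while keep_tracking() holds, otherwise
    resume the original search. All callables are hypothetical stand-ins."""
    try:
        msg = instructions.get(timeout=wait_s)       # instruction queue from the terminal
    except queue.Empty:
        resume_search()                              # no tracking instruction: search again
        return
    if msg.get("type") == "track":
        while keep_tracking():                       # confirmed by the terminal: keep tracking
            track_one_step()
            send(tracking_report())                  # GPS fix, path, time, matching degree
```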
In addition, an embodiment of the present invention further provides a computer storage medium. The computer storage medium may store a program, and when the program is executed, some or all of the steps of the methods described in the embodiments of the present invention may be performed. In a specific implementation, the computer storage medium of the embodiments of the present invention includes RAM, ROM, EEPROM, flash memory, CD-ROM, DVD or other optical storage, magnetic tape, magnetic disk or other magnetic storage, or any other medium that can be used to store the required information and can be accessed by a computer device.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include them.
The above are merely preferred embodiments of the present invention and certainly cannot be used to limit the scope of rights of the present invention; therefore, equivalent changes made according to the claims of the present invention still fall within the scope covered by the present invention.

Claims (10)

  1. A drone-based tracking method, characterized in that the method comprises:
    the drone flying according to an obtained search instruction and acquiring real-time images;
    the drone extracting, according to target information from a terminal and a currently acquired real-time image, information of an object to be tracked from the currently acquired real-time image;
    the drone calculating a matching degree between the target information and the information of the object to be tracked;
    if the calculated matching degree is greater than a preset matching degree, the drone sending feedback information to the terminal so that the terminal can generate a tracking instruction according to the feedback information;
    if a tracking instruction from the terminal is received, the drone sending current tracking information of the object to be tracked to the terminal.
  2. The tracking method according to claim 1, characterized in that the feedback information comprises a picture containing the object to be tracked, and the method further comprises:
    the terminal displaying the picture for the user to identify whether a tracking request needs to be sent;
    if the terminal receives a tracking request input by the user, determining that the tracking instruction needs to be sent and generating the tracking instruction;
    the terminal sending the tracking instruction to the drone.
  3. The tracking method according to claim 1, characterized in that the method further comprises:
    if the drone receives an adjustment instruction from the terminal, the drone flying according to the adjustment instruction, wherein the adjustment instruction is sent by the terminal to the drone when the terminal has not received feedback information within a preset time.
  4. The tracking method according to claim 3, characterized in that:
    if the drone does not receive the tracking instruction or the adjustment instruction within a preset time after sending the matching degree, the step of the drone extracting, according to the target information from the terminal and the currently acquired real-time image, the information of the object to be tracked from the currently acquired real-time image is performed again.
  5. The tracking method according to claim 1, characterized in that, after the drone sends the feedback information, the method further comprises:
    the drone tracking the object to be tracked;
    if the tracking instruction is not received within a preset time, performing again the step of the drone extracting, according to the target information from the terminal and the currently acquired real-time image, the information of the object to be tracked from the currently acquired real-time image.
  6. A drone, characterized in that the drone comprises:
    a first memory, configured to store a first computer-readable program; and
    a first processor, configured to execute the first computer-readable program to implement a tracking method, the method comprising:
    flying according to an obtained search instruction and acquiring real-time images;
    extracting, according to target information from a terminal and a currently acquired real-time image, information of an object to be tracked from the currently acquired real-time image;
    calculating a matching degree between the target information and the information of the object to be tracked;
    if the calculated matching degree is greater than a preset matching degree, sending feedback information to the terminal so that the terminal can generate a tracking instruction according to the feedback information; and
    if a tracking instruction from the terminal is received, tracking the corresponding object and sending corresponding tracking information to the terminal.
  7. The drone according to claim 6, characterized in that, after the feedback information is sent, the tracking method further comprises:
    tracking the object to be tracked;
    if the tracking instruction is not received within a preset time, performing again the step of extracting, according to the target information from the terminal and the currently acquired real-time image, the information of the object to be tracked from the currently acquired real-time image.
  8. The drone according to claim 7, characterized in that:
    if the drone does not receive the tracking instruction or an adjustment instruction within a preset time after sending the matching degree, the step of extracting, according to the target information from the terminal and the currently acquired real-time image, the information of the object to be tracked from the currently acquired real-time image is performed again.
  9. A terminal, characterized in that the terminal comprises:
    a second memory, configured to store a second computer-readable program; and
    a second processor, configured to execute the second computer-readable program to implement a tracking method, the method comprising:
    sending a search instruction to a drone to control the drone to fly according to the search instruction and acquire real-time images;
    if feedback information from the drone is received, identifying, according to the feedback information, whether a tracking instruction needs to be sent to the drone;
    if it is identified that the tracking instruction needs to be sent to the drone, sending the tracking instruction to the drone to control the drone to return current tracking information of the object to be tracked.
  10. A drone-based tracking system, the tracking system comprising a drone and a terminal, characterized in that the drone comprises:
    a first memory, configured to store a first computer-readable program; and
    a first processor, configured to execute the first computer-readable program to implement a tracking method, the method comprising:
    flying according to an obtained search instruction and acquiring real-time images;
    extracting, according to target information from the terminal and a currently acquired real-time image, information of an object to be tracked from the currently acquired real-time image;
    calculating a matching degree between the target information and the information of the object to be tracked;
    if the calculated matching degree is greater than a preset matching degree, sending feedback information to the terminal so that the terminal can generate a tracking instruction according to the feedback information; and
    if a tracking instruction from the terminal is received, tracking the corresponding object and sending corresponding tracking information to the terminal;
    and the terminal comprises:
    a second memory, configured to store a second computer-readable program; and
    a second processor, configured to execute the second computer-readable program to implement a tracking method, the method comprising:
    sending the search instruction to the drone to control the drone to fly according to the search instruction and acquire real-time images;
    if feedback information from the drone is received, determining, according to the feedback information, whether the tracking instruction needs to be sent to the drone;
    if it is determined that the tracking instruction needs to be sent to the drone, sending the tracking instruction to the drone.
PCT/CN2019/109558 2018-10-17 2019-09-30 Unmanned aerial vehicle-based tracking method and system, unmanned aerial vehicle and terminal WO2020078217A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811211036.9A CN109445465A (en) 2018-10-17 2018-10-17 Method for tracing, system, unmanned plane and terminal based on unmanned plane
CN201811211036.9 2018-10-17

Publications (1)

Publication Number Publication Date
WO2020078217A1 true WO2020078217A1 (en) 2020-04-23

Family

ID=65547302

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/109558 WO2020078217A1 (en) 2018-10-17 2019-09-30 Unmanned aerial vehicle-based tracking method and system, unmanned aerial vehicle and terminal

Country Status (2)

Country Link
CN (1) CN109445465A (en)
WO (1) WO2020078217A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109445465A (en) * 2018-10-17 2019-03-08 深圳市道通智能航空技术有限公司 Method for tracing, system, unmanned plane and terminal based on unmanned plane
CN110162102A (en) * 2019-05-17 2019-08-23 广东技术师范大学 Unmanned plane automatic identification tracking and system based on cloud platform and machine vision
CN110580053A (en) * 2019-08-13 2019-12-17 深圳市道通智能航空技术有限公司 Target tracking method, aircraft and flight system
CN110717386A (en) * 2019-08-30 2020-01-21 深圳壹账通智能科技有限公司 Method and device for tracking affair-related object, electronic equipment and non-transitory storage medium
CN114096463A (en) * 2020-04-28 2022-02-25 深圳市大疆创新科技有限公司 Control method and device for movable platform, movable platform and storage medium
CN112163455B (en) * 2020-08-27 2023-08-25 东风汽车集团有限公司 Method for searching target object and vehicle cloud platform
CN112130588B (en) * 2020-08-27 2022-03-01 东风汽车集团有限公司 Method for searching target person, vehicle-mounted terminal and unmanned aerial vehicle
CN112344798B (en) * 2020-11-19 2022-12-30 中国人民解放军国防科技大学 Non-cooperative flight target flexible capturing system inspired by humane magic spider creatures
CN113516106B (en) * 2021-09-08 2021-12-10 深圳联和智慧科技有限公司 Unmanned aerial vehicle intelligent vehicle identification method and system based on city management
CN113759986A (en) * 2021-09-27 2021-12-07 深圳市道通智能航空技术股份有限公司 Unmanned aerial vehicle monitoring and tracking method, device, equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140200744A1 (en) * 2011-05-26 2014-07-17 Saab Ab Method and system for steering an unmanned aerial vehicle
CN105759839A (en) * 2016-03-01 2016-07-13 深圳市大疆创新科技有限公司 Unmanned aerial vehicle (UAV) visual tracking method, apparatus, and UAV
CN106292721A (en) * 2016-09-29 2017-01-04 腾讯科技(深圳)有限公司 A kind of aircraft that controls follows the tracks of the method for destination object, equipment and system
CN108375986A (en) * 2018-03-30 2018-08-07 深圳市道通智能航空技术有限公司 Control method, device and the terminal of unmanned plane
CN108398960A (en) * 2018-03-02 2018-08-14 南京航空航天大学 A kind of multiple no-manned plane collaboration target tracking method for improving APF and being combined with segmentation Bezier
CN108513641A (en) * 2017-05-08 2018-09-07 深圳市大疆创新科技有限公司 Unmanned plane filming control method, unmanned plane image pickup method, control terminal, unmanned aerial vehicle (UAV) control device and unmanned plane
CN109445465A (en) * 2018-10-17 2019-03-08 深圳市道通智能航空技术有限公司 Method for tracing, system, unmanned plane and terminal based on unmanned plane

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106056075A (en) * 2016-05-27 2016-10-26 广东亿迅科技有限公司 Important person identification and tracking system in community meshing based on unmanned aerial vehicle
CN106406343B (en) * 2016-09-23 2020-07-10 北京小米移动软件有限公司 Control method, device and system of unmanned aerial vehicle
CN106598226B (en) * 2016-11-16 2019-05-21 天津大学 A kind of unmanned plane man-machine interaction method based on binocular vision and deep learning
CN106778669A (en) * 2016-12-30 2017-05-31 易瓦特科技股份公司 The method and device that destination object is identified is carried out based on unmanned plane
CN107255468B (en) * 2017-05-24 2019-11-19 纳恩博(北京)科技有限公司 Method for tracking target, target following equipment and computer storage medium
CN107748860A (en) * 2017-09-01 2018-03-02 中国科学院深圳先进技术研究院 Method for tracking target, device, unmanned plane and the storage medium of unmanned plane
CN107908195B (en) * 2017-11-06 2021-09-21 深圳市道通智能航空技术股份有限公司 Target tracking method, target tracking device, tracker and computer-readable storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140200744A1 (en) * 2011-05-26 2014-07-17 Saab Ab Method and system for steering an unmanned aerial vehicle
CN105759839A (en) * 2016-03-01 2016-07-13 深圳市大疆创新科技有限公司 Unmanned aerial vehicle (UAV) visual tracking method, apparatus, and UAV
CN106292721A (en) * 2016-09-29 2017-01-04 腾讯科技(深圳)有限公司 A kind of aircraft that controls follows the tracks of the method for destination object, equipment and system
CN108513641A (en) * 2017-05-08 2018-09-07 深圳市大疆创新科技有限公司 Unmanned plane filming control method, unmanned plane image pickup method, control terminal, unmanned aerial vehicle (UAV) control device and unmanned plane
CN108398960A (en) * 2018-03-02 2018-08-14 南京航空航天大学 A kind of multiple no-manned plane collaboration target tracking method for improving APF and being combined with segmentation Bezier
CN108375986A (en) * 2018-03-30 2018-08-07 深圳市道通智能航空技术有限公司 Control method, device and the terminal of unmanned plane
CN109445465A (en) * 2018-10-17 2019-03-08 深圳市道通智能航空技术有限公司 Method for tracing, system, unmanned plane and terminal based on unmanned plane

Also Published As

Publication number Publication date
CN109445465A (en) 2019-03-08

Similar Documents

Publication Publication Date Title
WO2020078217A1 (en) Unmanned aerial vehicle-based tracking method and system, unmanned aerial vehicle and terminal
CN109584276B (en) Key point detection method, device, equipment and readable medium
US11073389B2 (en) Hover control
CN106657779B (en) Surrounding shooting method and device and unmanned aerial vehicle
CN110310326B (en) Visual positioning data processing method and device, terminal and computer readable storage medium
EP3644015A1 (en) Position estimation system and position estimation method
WO2018214068A1 (en) Flight control method, device and system, and machine readable storage medium
US20200125100A1 (en) Movable object control method, device and system
WO2020024104A1 (en) Return control method, apparatus and device
WO2018210305A1 (en) Image identification and tracking method and device, intelligent terminal and readable storage medium
WO2019061111A1 (en) Path adjustment method and unmanned aerial vehicle
US20180032793A1 (en) Apparatus and method for recognizing objects
KR101827249B1 (en) Intelligent system for object detection drone based on smart device using VPN, and Intelligent object detection method for the same
WO2019144286A1 (en) Obstacle detection method, mobile platform, and computer readable storage medium
JP2022130588A (en) Registration method and apparatus for autonomous vehicle, electronic device, and vehicle
CN112631333B (en) Target tracking method and device of unmanned aerial vehicle and image processing chip
WO2018121794A1 (en) Control method, electronic device and storage medium
CN110554420B (en) Equipment track obtaining method and device, computer equipment and storage medium
CN110853098A (en) Robot positioning method, device, equipment and storage medium
CN109283942A (en) For controlling the flying method and device that unmanned plane is tracked
CN109240319A (en) The method and device followed for controlling unmanned plane
CN109472258A (en) Tracking and device
CN205356525U (en) Unmanned aerial vehicle
WO2020252688A1 (en) Target recognition based on image information, system for target recognition based on image information
KR20220068606A (en) Automatic landing algorithm of drone considering partial images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19873764

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19873764

Country of ref document: EP

Kind code of ref document: A1