CN109656319A - Method and apparatus for presenting ground action auxiliary information - Google Patents

Method and apparatus for presenting ground action auxiliary information Download PDF

Info

Publication number
CN109656319A
CN109656319A CN201811397300.2A CN201811397300A
Authority
CN
China
Prior art keywords
information
UAV
ground
auxiliary information
action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811397300.2A
Other languages
Chinese (zh)
Other versions
CN109656319B (en)
Inventor
杜威
许家文
杜虎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bright Wind Taiwan (shanghai) Mdt Infotech Ltd
Original Assignee
Bright Wind Taiwan (shanghai) Mdt Infotech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bright Wind Taiwan (shanghai) Mdt Infotech Ltd
Priority to CN201811397300.2A
Publication of CN109656319A
Application granted
Publication of CN109656319B
Active legal status
Anticipated expiration legal status

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/163: Wearable computers, e.g. on a belt
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/187: Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04B: TRANSMISSION
    • H04B 7/00: Radio transmission systems, i.e. using radiation field
    • H04B 7/14: Relay systems
    • H04B 7/15: Active relay systems
    • H04B 7/185: Space-based or airborne stations; Stations for satellite systems
    • H04B 7/18502: Airborne stations
    • H04B 7/18506: Communications with or from aircraft, i.e. aeronautical mobile service
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/183: Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • H04N 7/185: Closed-circuit television [CCTV] systems for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30204: Marker

Abstract

The purpose of this application is to provide a method and apparatus for presenting ground action auxiliary information: a UAV control device sends UAV auxiliary information to a corresponding user device; the user device receives the UAV auxiliary information sent by the UAV control device and presents the ground action auxiliary information corresponding to that UAV auxiliary information; the ground action auxiliary information is used to assist ground action. The application can improve a team's ground action efficiency.

Description

Method and apparatus for presenting ground action auxiliary information
Technical field
This application relates to the field of computers, and in particular to a technique for presenting ground action auxiliary information.
Background
With the development of technology, UAVs (unmanned aerial vehicles, or drones) have gradually come into wide use. In general, a set of UAV equipment includes the UAV itself (the airframe) and a UAV control device for controlling it. Because they maneuver flexibly, UAVs are often used to assist ground actions: the user operating the UAV control device (the drone "pilot") provides action guidance to ground personnel based on the scene images captured by the UAV (the "aerial footage"), for example by describing the surrounding environment to the ground personnel or suggesting routes of action, with the pilot and ground personnel communicating by radio or similar means.
Although UAVs enrich the information available to ground personnel, that information is still quite limited. On the one hand, what the pilot relays by radio (for example, a walkie-talkie) is not the original on-scene information, so the information ground personnel receive may be corrupted during communication; on the other hand, relying on the pilot's verbal description, ground personnel may misjudge the scene. Both reduce the team's efficiency. In addition, when the ground personnel are police officers, an officer must hold a police device for communicating with the pilot, which hinders carrying out the task.
Summary of the invention
The purpose of this application is to provide a method for presenting ground action auxiliary information.
According to one aspect of the application, a method for presenting ground action auxiliary information at a user device is provided, the method comprising:
receiving UAV auxiliary information sent by a corresponding UAV control device; and
presenting the ground action auxiliary information corresponding to the UAV auxiliary information;
wherein the ground action auxiliary information is used to assist ground action.
According to another aspect of the application, a method for presenting ground action auxiliary information at a UAV control device is provided, the method comprising:
sending UAV auxiliary information to a corresponding user device, so that the user device presents the corresponding ground action auxiliary information.
According to one aspect of the application, a user device for presenting ground action auxiliary information is provided, the user device comprising:
a first module for receiving UAV auxiliary information sent by a corresponding UAV control device; and
a second module for presenting the ground action auxiliary information corresponding to the UAV auxiliary information;
wherein the ground action auxiliary information is used to assist ground action.
According to another aspect of the application, a UAV control device for presenting ground action auxiliary information is provided, the UAV control device comprising:
a sending module for sending UAV auxiliary information to a corresponding user device, so that the user device presents the corresponding ground action auxiliary information.
According to one aspect of the application, a method for presenting ground action auxiliary information is provided, the method comprising:
a UAV control device sending UAV auxiliary information to a corresponding user device; and
the user device receiving the UAV auxiliary information sent by the UAV control device and presenting the ground action auxiliary information corresponding to the UAV auxiliary information;
wherein the ground action auxiliary information is used to assist ground action.
According to one aspect of the application, a device for presenting ground action auxiliary information is provided, the device comprising:
a processor; and
a memory arranged to store computer-executable instructions which, when executed, cause the processor to perform the operations of the method described above.
According to another aspect of the application, a computer-readable medium comprising instructions is provided, the instructions, when executed, causing a system to perform the operations of the method described above.
In this application, the UAV control device sends UAV auxiliary information to the user device of the ground personnel, so that the user device presents the corresponding ground action auxiliary information to assist ground action, thereby strengthening the interaction between the drone pilot and the ground personnel. Compared with the prior art, the information obtained by ground personnel is more intuitive and diverse, and ground personnel can obtain the original on-scene information, so the possibility of misjudgment is greatly reduced and the team's action efficiency is greatly improved. In addition, when the ground personnel are police officers, the user device can be configured as a head-mounted display device such as smart glasses, so the officers need not hold a police device for communicating with the drone pilot, which facilitates carrying out the task.
Brief description of the drawings
Other features, objects and advantages of the application will become more apparent from the following detailed description of non-restrictive embodiments, read with reference to the accompanying drawings:
Fig. 1 shows a system topology in which a UAV, a UAV control device and a user device cooperate to assist ground action, according to one embodiment of the application;
Fig. 2 is a flowchart of a method for presenting ground action auxiliary information at a user device according to one embodiment of the application;
Fig. 3 is a flowchart of a method for presenting ground action auxiliary information at a user device according to another embodiment of the application;
Fig. 4 is a flowchart of a method for presenting ground action auxiliary information at a UAV control device according to one embodiment of the application;
Fig. 5 is a flowchart of a method for presenting ground action auxiliary information at a UAV control device according to another embodiment of the application;
Fig. 6 is a flowchart of a method for presenting ground action auxiliary information at a UAV control device according to yet another embodiment of the application;
Fig. 7 is a functional block diagram of a user device for presenting ground action auxiliary information according to one embodiment of the application;
Fig. 8 is a functional block diagram of a user device for presenting ground action auxiliary information according to another embodiment of the application;
Fig. 9 is a functional block diagram of a UAV control device for presenting ground action auxiliary information according to one embodiment of the application;
Fig. 10 is a functional block diagram of a UAV control device for presenting ground action auxiliary information according to another embodiment of the application;
Fig. 11 is a functional block diagram of a UAV control device for presenting ground action auxiliary information according to yet another embodiment of the application;
Fig. 12 shows an exemplary system of the application.
The same or similar reference numerals in the drawings denote the same or similar components.
Detailed description of the embodiments
The application is described in further detail below with reference to the accompanying drawings.
In a typical configuration of this application, the terminal, the device of the service network and the trusted party each include one or more processors (CPUs), an input/output interface, a network interface and memory.
The memory may include non-volatile storage in a computer-readable medium, random access memory (RAM) and/or non-volatile memory such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may store information by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The devices referred to in this application include, but are not limited to, user devices, network devices, or devices formed by integrating a user device and a network device through a network. The user device includes, but is not limited to, any mobile electronic product capable of human-computer interaction with a user (for example, via a touchpad), such as a smartphone, a tablet computer or smart glasses; the mobile electronic product may run any operating system, such as Android or iOS. The network device includes an electronic device that can automatically perform numerical computation and information processing according to preset or stored instructions, and its hardware includes, but is not limited to, microprocessors, application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), digital signal processors (DSPs), embedded devices, and so on. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud composed of multiple servers; here, the cloud is composed of a large number of computers or network servers based on cloud computing, where cloud computing is a kind of distributed computing: a virtual supercomputer consisting of a set of loosely coupled computers. The network includes, but is not limited to, the Internet, wide area networks, metropolitan area networks, local area networks, VPNs and wireless ad hoc networks. Preferably, the device may also be a program running on the user device, on the network device, or on a device formed by integrating the user device and the network device, the network device and a touch terminal, or the network device and a touch terminal through a network.
Of course, those skilled in the art will understand that the above devices are only examples; other existing or future devices, if applicable to this application, should also be included within its protection scope and are incorporated herein by reference.
In the description of this application, "plurality" means two or more, unless specifically defined otherwise.
The user device referred to in this application includes, but is not limited to, computing devices such as smartphones, tablet computers, smart glasses or smart helmets. In some embodiments, the user device further includes a camera for capturing image information; the camera generally includes a photosensitive element for converting optical signals into electrical signals and, as needed, may also include light refraction/reflection components (such as a lens or lens assembly) for adjusting the propagation path of incident light. To facilitate user operation, in some embodiments the user device further includes a display device for presenting content to the user and/or for configuring augmented reality content; in some embodiments, the augmented reality content is superimposed on a target object and presented by the user device (such as see-through glasses or another user device with a display screen). In some embodiments the display device is a touch screen, which can be used not only to output graphics but also as an input device of the user device to receive the user's operating instructions (such as instructions for interacting with the aforementioned augmented reality content). Of course, those skilled in the art will understand that the input device of the user device is not limited to a touch screen; other existing input technologies, if applicable to this application, are also included in its protection scope and incorporated herein by reference. For example, in some embodiments, the input technology for receiving the user's operating instructions is based on voice control, gesture control and/or eye tracking.
Referring to the system topology shown in Fig. 1, the UAV control device communicates with the UAV to transmit data, so that the drone pilot can control the UAV's heading, attitude and so on, and the UAV can send data to the UAV control device (including but not limited to one or more kinds of sensing information, such as the UAV's own state and scene image information). Meanwhile, the UAV control device communicates with the user device of the ground personnel, so that the UAV control device can send UAV auxiliary information to the user device (for example, scene images captured by the UAV, or other information determined by the pilot's operations), and the user device presents the ground action auxiliary information corresponding to the UAV auxiliary information, to assist the actions of the ground personnel. The UAV may carry multiple sensors for sensing its own position and attitude or for collecting information about the external environment. For example, based on a GPS sensor, a real-time kinematic (RTK) module, a barometric sensor, a gyroscope and an electronic compass, the UAV collects its own angular velocity, attitude, position, acceleration, altitude, airspeed and so on, and captures scene images with an image sensor; these data can be transmitted to the UAV control device. In some cases, a gimbal may be mounted on the UAV to carry the camera, isolating external disturbances such as UAV attitude changes, body vibration and wind torque that would adversely affect shooting, to keep the onboard camera's optical axis stable.
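To make the telemetry exchange above concrete, here is a minimal sketch of the kind of message a UAV might report to its control device; the field names and values are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass, asdict

@dataclass
class UavTelemetry:
    """Illustrative sketch of one telemetry sample sent to the control device."""
    lat: float          # latitude from GPS/RTK, degrees
    lon: float          # longitude from GPS/RTK, degrees
    alt_m: float        # barometric altitude, metres
    heading_deg: float  # electronic-compass heading, degrees
    pitch_deg: float    # attitude from the gyroscope/IMU
    roll_deg: float
    airspeed_mps: float

sample = UavTelemetry(lat=31.2304, lon=121.4737, alt_m=120.0,
                      heading_deg=90.0, pitch_deg=-2.5, roll_deg=0.3,
                      airspeed_mps=8.4)
print(asdict(sample)["alt_m"])  # → 120.0
```

In practice such a sample would be serialized and streamed over the UAV-to-controller link; the dataclass simply names the quantities the passage lists.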
Based on the system shown in Fig. 1, this application provides a method for presenting ground action auxiliary information, the method comprising the following steps:
the UAV control device sends UAV auxiliary information to a corresponding user device; and
the user device receives the UAV auxiliary information sent by the UAV control device and presents the ground action auxiliary information corresponding to the UAV auxiliary information;
wherein the ground action auxiliary information is used to assist ground action.
The application is described in detail below from the two perspectives of the user device and the UAV control device.
According to one aspect of the application, a method for presenting ground action auxiliary information at a user device is provided. Referring to Fig. 2, the method comprises step S110 and step S120.
In step S110, the user device receives UAV auxiliary information sent by a corresponding UAV control device. In some embodiments, the UAV auxiliary information includes, but is not limited to, one or more of the following:
1) target-related data information, including but not limited to the position of the destination of the ground action, information about surrounding landmarks, and the names or features of relevant ground targets (including but not limited to target objects and target persons);
2) image information captured by the UAV (hereinafter "UAV image information"), including but not limited to static images and dynamic images (such as video);
3) annotation information, including but not limited to annotations added by the user, the drone pilot or other users; the annotated object includes but is not limited to a point in the scene (for example, a point determined by the latitude/longitude of the destination of an actual or simulated ground action) or a region (for example, a region determined by the latitude/longitude of each of its vertices), and the annotation content includes but is not limited to colored dots, lines, models, images, text, etc.;
4) target position information, i.e. the latitude and longitude of a target for the ground personnel's reference, which can be used, among other things, to determine the target's orientation relative to the ground personnel.
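As a sketch of how item 4) could be used, the target's orientation relative to the ground personnel can be computed from the two latitude/longitude pairs with the standard initial-bearing formula; this is a generic illustration, not necessarily the patent's exact method:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0

# A target slightly east of the ground user yields a bearing near 90 degrees.
b = bearing_deg(31.0, 121.0, 31.0, 121.01)
print(round(b))  # → 90
```

The user device could compare this bearing against its own compass heading to tell the wearer which way the target lies.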
In some embodiments, where ground personnel need to identify a target (for example, when the ground personnel are police officers), the target-related data information may also include the target's (for example, a suspect's) face, physical features, height, gender, age, and so on.
Where the UAV auxiliary information includes surrounding landmark information, that landmark information can be determined based on the UAV's sensing data. For example, the UAV carries multiple sensors, including a GPS sensor, a barometric sensor, a gyroscope and an electronic compass, which collect the UAV's latitude/longitude, attitude, speed, angular velocity, acceleration, altitude, airspeed, and so on. The UAV control device, which communicates with the UAV, obtains the UAV's latitude/longitude and issues a query to a geographic information system (GIS); based on the received UAV coordinates, the GIS returns the surrounding building landmark information to the UAV control device. The UAV control device then obtains the required sensing data and, combining the UAV's altitude, heading and latitude/longitude, superimposes the surrounding building landmarks onto the image currently captured by the UAV, so that the officers and the pilot can understand the surrounding geography.
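The GIS lookup described above can be illustrated with a minimal sketch that filters a hypothetical landmark table down to those near the UAV's coordinates, using the haversine distance; the landmark data and the radius are assumptions made for the example:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def landmarks_near(uav_lat, uav_lon, landmarks, radius_m=500.0):
    """Select landmarks within radius_m of the UAV, as a GIS query might."""
    return [name for name, lat, lon in landmarks
            if haversine_m(uav_lat, uav_lon, lat, lon) <= radius_m]

# Hypothetical landmark table: (name, lat, lon)
lm = [("Tower A", 31.2310, 121.4740), ("Mall B", 31.2500, 121.5000)]
print(landmarks_near(31.2304, 121.4737, lm))  # → ['Tower A']
```

A real GIS would also return geometry for rendering; projecting each landmark into the aerial image would additionally use the UAV's altitude and heading, which this sketch omits.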
In step S120, the user device presents the ground action auxiliary information corresponding to the UAV auxiliary information. For example, the user device presents the target-related data information, UAV image information or annotation information described above. Alternatively, the user device generates the corresponding ground action auxiliary information from the UAV auxiliary information: for example, the UAV auxiliary information includes target position information; accordingly, in step S120 the user device determines the corresponding ground action auxiliary information from that target position information (for example, based on the target position information and the user device's own position, it calls a map application interface to generate live-view navigation information, such as superimposing virtual arrows on the real scene via see-through glasses for navigation, or providing a separate scene map with navigation directions), and presents that ground action auxiliary information to guide the ground personnel's actions.
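A minimal sketch of turning the bearing to the target into the kind of live-view navigation cue described above; the angle thresholds and cue labels are assumptions made for illustration:

```python
def arrow_direction(user_heading_deg, bearing_to_target_deg):
    """Map the signed heading difference to a coarse AR navigation cue."""
    # Normalize the difference into (-180, 180]: negative = target to the left.
    diff = (bearing_to_target_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= 20.0:
        return "straight"
    if abs(diff) >= 160.0:
        return "turn around"
    return "right" if diff > 0 else "left"

# User faces north (0 deg); target bears 90 deg (due east) → turn right.
print(arrow_direction(0.0, 90.0))  # → right
```

On see-through glasses, the returned cue would select which virtual arrow to superimpose; a map-based presentation would instead pass the raw bearing to the navigation interface.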
In some embodiments, the ground action auxiliary information includes UAV image information captured by the UAV, and the method further comprises the following steps: determining corresponding UAV image annotation information according to the user's annotation operations on the UAV image (such as taps, touches or swipes on the touch screen); and sending the UAV image annotation information to the UAV control device for the UAV user to view. For example, the UAV image annotation information is produced when the ground personnel annotate the above UAV image information on their user device and send the annotations back to the UAV control device for the pilot's reference; correspondingly, the UAV control device receives and presents the UAV image annotation information about the UAV image information sent by the user device.
The user device described above may be a head-mounted smart device such as smart glasses or a smart helmet, or a mobile phone, tablet computer or navigation device (for example, hand-held or fixed on a vehicle). In some cases, the user device can also capture the ground personnel's first-person video, and present the above ground action auxiliary information, interaction information with other users, command and dispatch information sent by the command platform responsible for the ground action, and so on. In some embodiments, the ground action auxiliary information is presented in a fixed area of the user device's display (such as a rectangular area or the entire displayable area); in other embodiments, it is presented in an augmented-reality manner, for example projected on see-through glasses so that virtual information is superimposed on the relevant region of the real world, for an experience combining the virtual and the real. Those skilled in the art will understand that the user devices described above are only examples; other existing or future user devices, if applicable to this application, are also included in its protection scope and incorporated herein by reference.
In some embodiments, in step S110 the user device receives the UAV auxiliary information sent by the corresponding UAV control device via a corresponding network device (including but not limited to a cloud server), to enable multi-party information sharing: for example, where there are multiple drone pilots or multiple ground personnel, the other pilots or ground personnel can also obtain the UAV auxiliary information through the network device. In some embodiments, the UAV auxiliary information includes UAV image information (such as video) pushed by the UAV control device to the network device, so that each participant can view it in real time or review the corresponding footage afterwards.
In some embodiments, referring to Fig. 3, the method further includes step S130. In step S130, the user device obtains ground image information of the scene where it is located and sends the ground image information to the UAV control device. For example, the user device captures the ground personnel's current location in real time with a camera fixed on it, takes the captured image as the ground image information, and sends it to the UAV control device. Similarly to the situation described above, in some embodiments the user device sends the ground image information to the UAV control device via a corresponding network device, for example pushing the ground image stream to the network device so that each participant can view it in real time or review the corresponding footage afterwards.
In some embodiments, the above method further includes step S140 (not shown) and step S150 (not shown).
Specifically, after obtaining the above ground image information, the user device performs a target recognition operation on the ground image information in step S140. For example, the target recognition operation identifies a specific object (fixed or movable, such as a building or a vehicle) or a person. In one specific embodiment, the target recognition operation is implemented with a deep learning algorithm: first, prepare a training set (for example, images of pedestrians wearing clothes of different colors) and the corresponding labels (for example, the pedestrians' positions in the images); then, train a deep learning model, iterating its parameters on the training set until the model converges; finally, feed the image captured by the user device into the trained deep learning model to obtain the position in the picture of the pedestrian wearing the specific clothing color, completing the target recognition operation.
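The trained detector described above ultimately returns a target's position in the picture. A stand-in sketch using a toy label grid shows just that bounding-box step, without any actual deep learning model; the grid and labels are purely illustrative:

```python
def locate_target(image, target):
    """Stand-in for the trained detector: return the bounding box
    (top, left, bottom, right) of cells labelled `target`, or None if absent."""
    rows = [r for r, row in enumerate(image) if target in row]
    if not rows:
        return None
    cols = [c for row in image for c, v in enumerate(row) if v == target]
    return (min(rows), min(cols), max(rows), max(cols))

# Tiny synthetic "image": 0 = background, 1 = pedestrian in the target clothing color.
img = [[0, 0, 0, 0],
       [0, 1, 1, 0],
       [0, 1, 1, 0],
       [0, 0, 0, 0]]
print(locate_target(img, 1))  # → (1, 1, 2, 2)
```

A real implementation would replace `locate_target` with inference on the converged model; the returned box is what step S150 then overlays on the displayed image.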
In step S150, the user equipment presents the ground image information, and presents corresponding target tracking information based on the operation result of the target recognition operation. For example, continuing the preceding example, after the position of the pedestrian in the picture is obtained through the target recognition operation, the user equipment superimposes corresponding target tracking information at that position, to distinguish that pedestrian from other objects or pedestrians in the picture; for example, the target tracking information is a highlighted contour line around the target, or a box, color bar, color dot, arrow, etc. In some embodiments, when the ground image information changes over time (e.g., the user equipment captures video rather than still images), in order to keep the recognized target distinguished from other objects or pedestrians in the picture, the target tracking information can follow the recognized target as it moves, e.g., by performing the above target recognition operation frame by frame on the video captured by the user equipment, or by performing the above target recognition operation on multiple key frames in the video.
Here, in addition to being obtained from target recognition performed locally by the user equipment, the above target tracking information may also be obtained based on UAV auxiliary information sent by the UAV. In some embodiments, the UAV auxiliary information includes target auxiliary tracking information, and the ground action auxiliary information includes target tracking information corresponding to the target auxiliary tracking information. For example, the UAV control device performs a target recognition operation on the image captured by the UAV to recognize a specific object or person in the image, so as to obtain the above target auxiliary tracking information (e.g., the position of the specific object or person in the image captured by the UAV). The process by which the UAV control device performs the target recognition operation is the same as or substantially the same as the process by which the above user equipment performs the target recognition operation; it is not repeated here and is incorporated herein by reference.
In some embodiments, the UAV auxiliary information includes annotation information (for example, the annotation information is added by the UAV pilot via the UAV control device, or by another action participant able to communicate with the user equipment, such as a command platform), and the annotation information includes an annotation element (including but not limited to a box, color bar, color dot, arrow, picture/video, animation, three-dimensional model, etc.) and its presentation position information (for determining the location of the annotation element in the picture). Correspondingly, in step S120, the user equipment presents, based on the annotation information, the ground action auxiliary information corresponding to the UAV auxiliary information (for example, the ground action auxiliary information includes the annotation information). For example, when the annotation information corresponds to a certain point (e.g., the annotation information includes the latitude and longitude of the destination of an actual or simulated ground action), the user equipment superimposes a color dot at that position in the real scene, in the map picture, or in the image picture transmitted by the UAV; and when the annotation information corresponds to a certain region (e.g., the annotation information includes the latitude and longitude of each vertex of the region), the user equipment superimposes a color block on the corresponding region. Specifically, in some embodiments, the above annotation information may include, but is not limited to, the following: route planning information, e.g., a planned route for a front-line police officer's arrest action, added according to the position of the suspect currently being tracked; tactical planning information, e.g., a region marked in the UAV picture (an encirclement region during an emergency, an assigned area during a duty task, a designated region for a tactical deployment, etc.), so that front-line police officers can intuitively locate the specific region during an arrest action; and combat exercise information (e.g., annotating the ground state to assist a tactical exercise), in which the position of a simulated suspect can be calibrated and highlighted with a red circle. Nearby ground personnel can check the UAV surveillance picture while heading to the destination; with these annotations, ground personnel understand and execute tasks better, and action efficiency is improved.
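The point-versus-region distinction above can be sketched as a simple dispatch from the annotation's structure to its on-screen form. The field names ("kind", "position", "vertices") are illustrative assumptions, not terms from the original disclosure.

```python
# Sketch of dispatching an annotation to its superimposed form, per the
# point vs. region distinction above: a point becomes a color dot at
# its latitude/longitude, a region becomes a color block over its
# vertices. Field names are illustrative assumptions.

def present(annotation):
    """Return a (form, geometry) pair describing what to superimpose."""
    if annotation["kind"] == "point":
        # A single destination: superimpose a color dot at its lat/lon.
        return ("color_dot", annotation["position"])
    if annotation["kind"] == "region":
        # A region given by its vertices: superimpose a color block.
        return ("color_block", annotation["vertices"])
    raise ValueError("unknown annotation kind")

destination = {"kind": "point", "position": (31.23, 121.47)}
cordon = {"kind": "region",
          "vertices": [(31.0, 121.0), (31.0, 121.1), (31.1, 121.05)]}
print(present(destination))  # ('color_dot', (31.23, 121.47))
print(present(cordon)[0])    # 'color_block'
```

Route planning, tactical planning, and exercise annotations would all reduce to such point or region geometries plus the chosen annotation element.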
Here, in some embodiments, the UAV auxiliary information further includes UAV image information, and the UAV image information is video. On the premise that network delay can be ignored, after the UAV control device sends the video and the above annotation information (including the annotation element and its presentation position information) to the user equipment, the user equipment presents the video and presents the annotation element based on the presentation position information of the annotation element, thereby realizing real-time presentation of the UAV pilot's annotated content at the user equipment end, so that the user reacts quickly based on the annotated content, improving the collaboration efficiency between the user and the UAV pilot. In some situations, including but not limited to the situation in which network delay is not negligible and the situation in which the video and annotation information need to be replayed, the annotation information further includes timeline position information corresponding to the annotation element; the timeline position information is used to determine the video frame to which the annotation element accurately corresponds (for example, by determining the position of the relevant video frame on the timeline), and the annotation element is superimposed on that video frame, so as to avoid annotation misalignment caused by superimposing the annotation element on a non-corresponding video frame.
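The timeline-position mechanism above can be sketched as follows: instead of drawing an annotation on whatever frame happens to be on screen when it arrives, the annotation carries a timeline position, which is resolved to the exact frame it was made on before being superimposed. Frame-rate-based indexing is an assumed implementation detail, not specified in the text.

```python
# Sketch of resolving timeline position information to a video frame,
# so an annotation element overlays the frame it was made on rather
# than whichever frame is currently displayed.

def frame_for(timeline_seconds, fps):
    """Resolve a timeline position to the index of its video frame."""
    return round(timeline_seconds * fps)

def align(annotations, fps):
    """Group annotation elements by the frame they must overlay."""
    per_frame = {}
    for a in annotations:
        per_frame.setdefault(frame_for(a["t"], fps), []).append(a["element"])
    return per_frame

marks = [{"t": 2.0, "element": "box"}, {"t": 2.02, "element": "arrow"}]
print(align(marks, 30))  # {60: ['box'], 61: ['arrow']}
```

Under negligible delay the two frames coincide with live playback; under delay or replay, the lookup keeps each element on its correct frame.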
To further improve team collaboration efficiency, the same user equipment can communicate with multiple UAV control devices simultaneously or separately; for example, the UAVs corresponding to the multiple UAV control devices each cover a different part of the same operational region, and by interacting with the UAVs via the multiple UAV control devices simultaneously or separately, team resources are rationally utilized. In some embodiments, in step S110, the user equipment receives UAV auxiliary information sent by at least one of the corresponding multiple UAV control devices; in step S120, the user equipment presents the ground action auxiliary information corresponding to the UAV auxiliary information sent by the at least one of the multiple UAV control devices. For example, the user equipment may present the UAV auxiliary information sent by the multiple UAV control devices in turn, or present multiple items of UAV auxiliary information simultaneously (e.g., multiple kinds of different information superimposed at the same time), or present the one or more items of UAV auxiliary information the user needs according to the user's selection operation; for each item of UAV auxiliary information, the manner of presentation is the same as or substantially the same as the manner of presenting UAV auxiliary information described above, is not repeated here, and is incorporated herein by reference.
According to another aspect of the present application, a method for presenting ground action auxiliary information at a UAV control device end is provided. With reference to Fig. 4, the method includes step S210. In step S210, the UAV control device sends UAV auxiliary information to a corresponding user equipment, for the user equipment to present corresponding ground action auxiliary information. In some embodiments, the UAV auxiliary information includes, but is not limited to, one or more of the following:
1) target-related data information, including but not limited to the destination position information of the ground action, surrounding landmark information, the name or features of a related ground target (including but not limited to a target object and a target person), and other related information; in some embodiments, the UAV control device performs a target recognition operation on the image captured by the UAV to recognize a specific object or person target, and then reads the target-related data information from a local or otherwise accessible database, to improve task execution efficiency;
2) UAV image information, including but not limited to still image information and moving image information, such as video;
3) annotation information, including but not limited to annotation information added by the aforementioned user, the UAV pilot, or another user, where the annotated object includes but is not limited to a certain point in the scene (for example, the point is determined based on the latitude and longitude of the destination of an actual or simulated ground action) or a certain region (for example, the region is determined based on the latitude and longitude of each of its vertices), and the form of annotation includes but is not limited to color dots, lines, models, images, text, etc.;
4) target position information, e.g., the latitude and longitude of the target's location, for reference by ground personnel; it can be used, among other things, to determine the orientation of the target relative to the ground personnel.
In some embodiments, the UAV control device sends the UAV auxiliary information to the corresponding user equipment via a corresponding network device (e.g., including but not limited to a cloud server), to realize multi-terminal information sharing; for example, where there are multiple UAV pilots or multiple ground personnel, other UAV pilots or ground personnel can also obtain the above UAV auxiliary information through the network device. In some embodiments, the UAV auxiliary information includes UAV image information (e.g., video information) and is streamed by the UAV control device to the network device, so that each participant can view the corresponding image archive in real time or replay it afterwards.
In addition to sending UAV auxiliary information to the user equipment, the UAV control device can also receive ground image information sent by the user equipment, for reference. Correspondingly, in some embodiments the above method further includes step S220, as shown in Fig. 5. In step S220, the UAV control device receives and presents the ground image information sent by the user equipment. Furthermore, based on the ground image information sent by the user equipment, the UAV control device can also perform a target recognition operation on the ground image information and, based on the operation result of the target recognition operation, send corresponding target auxiliary tracking information (e.g., the position of the recognized target in the picture) to the user equipment, so that the user equipment presents corresponding target tracking information (e.g., a highlighted contour line around the target, or a box, color bar, color dot, arrow, etc.) to the ground personnel based on the target auxiliary tracking information. By receiving the ground image information sent by the user equipment, the UAV pilot can obtain the user's first-person-view picture and comprehensively grasp field conditions, and can also assist the ground personnel by performing a target recognition operation on the ground image information, further improving collaboration efficiency. For example, the target recognition operation is used to recognize a specific object (fixed or non-fixed, e.g., a building or a vehicle) or person. In a specific embodiment, the target recognition operation is implemented based on a deep learning algorithm: first, a training set (e.g., images of pedestrians wearing clothes of different colors) and corresponding labels (e.g., the positions of the pedestrians in the images) are prepared; then a deep learning model is trained, its parameters being iterated over the training set until the model converges; finally, the image captured by the user equipment is input into the trained deep learning model to obtain the position in the picture of a pedestrian wearing the specific clothing color, thereby completing the target recognition operation.
Here, in some embodiments, the above method further includes the following step: determining corresponding ground image annotation information according to the user's ground image annotation operation on the ground image information, wherein the UAV auxiliary information includes the ground image annotation information. For example, after receiving the ground image captured and sent by the user equipment corresponding to the ground personnel, the user of the UAV control device adds corresponding annotation information on the image, and sends the image and the corresponding annotation information back to the aforementioned user equipment.
In some cases, the above UAV auxiliary information can be obtained based on UAV image information captured by the UAV. In some embodiments, with reference to Fig. 6, the above method includes step S250. In step S250, the UAV control device obtains UAV image information (including but not limited to still images, video, etc.) captured by the corresponding UAV, and then, in step S210, sends UAV auxiliary information based on the UAV image information to the corresponding user equipment, for the user equipment to present corresponding ground action auxiliary information. For example, the ground action auxiliary information includes the above UAV image information; or, the UAV control device performs a target recognition operation on the UAV image information to recognize a specific target and determines corresponding target auxiliary tracking information; or, the number of specific targets in the UAV picture is recognized, and the quantity information is sent to the user equipment as part of the ground action auxiliary information.
On this basis, in some embodiments, the above method further includes step S260 (not shown). In step S260, the UAV control device determines image annotation information about the UAV image information based on a user operation of the UAV user (e.g., including but not limited to a click, box-select, or drag operation, or a text input operation), wherein the image annotation information includes an annotation element (including but not limited to a box, color bar, color dot, arrow, picture/video, animation, three-dimensional model, etc.) and its presentation position information (for determining the location of the annotation element in the picture). Correspondingly, in step S210, the UAV control device sends UAV auxiliary information to the corresponding user equipment based on the UAV image information and the image annotation information, for the user equipment to present corresponding ground action auxiliary information.
Here, in some embodiments, the UAV auxiliary information further includes UAV image information, and the UAV image information is video. For example, in step S260, the UAV control device determines image annotation information about the UAV image information based on a user operation of the UAV user, wherein the image annotation information includes an annotation element and its presentation position information, and further includes timeline position information corresponding to the annotation element. For example, on the premise that network delay can be ignored, after the UAV control device sends the video and the above annotation information (including the annotation element and its presentation position information) to the user equipment, the user equipment presents the video and presents the annotation element based on the presentation position information of the annotation element, thereby realizing real-time presentation of the UAV pilot's annotated content at the user equipment end, so that the user reacts quickly based on the annotated content, improving the collaboration efficiency between the user and the UAV pilot. In some situations, including but not limited to the situation in which network delay is not negligible and the situation in which the video and annotation information need to be replayed, the annotation information further includes the timeline position information corresponding to the annotation element; the timeline position information is used to determine the video frame to which the annotation element accurately corresponds (for example, by determining the position of the relevant video frame on the timeline), and the annotation element is superimposed on that video frame, so as to avoid annotation misalignment caused by superimposing the annotation element on a non-corresponding video frame.
In some embodiments, the above method further includes step S270 (not shown). In step S270, the UAV control device determines the target position information of a specified target based on relative orientation information between the corresponding UAV and the specified target, and on spatial position information of the UAV. Here, in some embodiments, the specified target is determined by the UAV pilot on the UAV control device, for example by a click, box selection, or similar operation on the display screen of the UAV control device. For example, in one embodiment, the UAV control device determines the corresponding specified target based on a selection operation of the user; the UAV control device then controls the UAV to measure the straight-line distance between the specified target and the UAV (e.g., obtained by an onboard laser rangefinder) and, combining this with the altitude information of the UAV itself (e.g., obtained from a barometer), obtains the horizontal distance between the UAV and the specified target; then, from the latitude and longitude of the UAV itself (e.g., obtained from a GPS sensor) and the azimuth of the target relative to the UAV, the latitude and longitude of the specified target is finally determined and taken as the target position information of the target. As another example, in another embodiment, the UAV control device determines, based on the pitch angle of the UAV (e.g., obtained from a gyroscope), the angle between the plumb line and the line connecting the UAV and the specified target, calculates the horizontal distance between the UAV and the specified target from this angle and the altitude of the UAV (e.g., obtained from a barometer), and then, from the latitude and longitude of the UAV itself (e.g., obtained from a GPS sensor) and the azimuth of the target relative to the UAV, finally determines the latitude and longitude of the specified target and takes it as the target position information of the target. Of course, those skilled in the art will understand that the above-described manners of obtaining target position information are merely examples; other existing or future manners of obtaining it, where applicable to the present application, are also included in the protection scope of the present application and are incorporated herein by reference.
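The second geolocation variant above can be sketched numerically: the plumb-line angle and the UAV's altitude give the horizontal distance, which is then offset from the UAV's own fix along the measured azimuth. The flat-earth offset and the 6371 km Earth radius are simplifying assumptions, adequate only at the short ranges of a ground action.

```python
# Sketch of geolocating a specified target from the UAV's altitude,
# the plumb-line angle to the target, and the azimuth, as described
# above. The local flat-earth offset is a simplifying assumption.
import math

EARTH_RADIUS_M = 6_371_000.0

def target_lat_lon(uav_lat, uav_lon, altitude_m, plumb_angle_deg, azimuth_deg):
    # Horizontal distance from the right triangle formed by the
    # altitude and the angle between the sight line and the plumb line.
    horizontal_m = altitude_m * math.tan(math.radians(plumb_angle_deg))
    # Decompose along the azimuth (0 deg = north, 90 deg = east).
    north_m = horizontal_m * math.cos(math.radians(azimuth_deg))
    east_m = horizontal_m * math.sin(math.radians(azimuth_deg))
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M *
                                  math.cos(math.radians(uav_lat))))
    return uav_lat + dlat, uav_lon + dlon

# UAV hovering 100 m up, target seen 45 deg off the plumb line, due east:
lat, lon = target_lat_lon(31.2304, 121.4737, 100.0, 45.0, 90.0)
print(round(lat, 6), round(lon, 6))
```

The first variant (laser rangefinder) differs only in how the horizontal distance is obtained, via the slant range and altitude instead of the plumb-line angle.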
As mentioned above, to further improve team collaboration efficiency, the same user equipment can communicate with multiple UAV control devices simultaneously or separately. Similarly, the same UAV control device can also communicate with multiple user equipments simultaneously or separately; for example, the ground operational personnel corresponding to the multiple user equipments are located at different positions within the same operational region, and by interacting with the UAV control device simultaneously or separately, team resources are rationally utilized. In some embodiments, in step S210, the UAV control device sends UAV auxiliary information to at least one corresponding user equipment, for the at least one user equipment to present corresponding ground action auxiliary information. For example, the UAV control device may send UAV auxiliary information to multiple user equipments in turn, or send multiple items of UAV auxiliary information simultaneously (e.g., multiple kinds of different information sent respectively to different user equipments), or send the one or more items of UAV auxiliary information needed by certain ground personnel according to the selection operation of the UAV user; for each item of UAV auxiliary information, the manner of generating or sending it is the same as or substantially the same as the manner described above, is not repeated here, and is incorporated herein by reference.
According to one aspect of the present application, a user equipment for presenting ground action auxiliary information is also provided. With reference to Fig. 7, the user equipment 100 includes a first-first module 110 and a first-second module 120.
The first-first module 110 receives UAV auxiliary information sent by a corresponding UAV control device. In some embodiments, the UAV auxiliary information includes, but is not limited to, one or more of the following:
1) target-related data information, including but not limited to the destination position information of the ground action, surrounding landmark information, the name or features of a related ground target (including but not limited to a target object and a target person), and other related information;
2) image information captured by the UAV (hereinafter referred to as UAV image information), including but not limited to still image information and moving image information (such as video);
3) annotation information, including but not limited to annotation information added by the aforementioned user, the UAV pilot, or another user, where the annotated object includes but is not limited to a certain point in the scene (for example, the point is determined based on the latitude and longitude of the destination of an actual or simulated ground action) or a certain region (for example, the region is determined based on the latitude and longitude of each of its vertices), and the form of annotation includes but is not limited to color dots, lines, models, images, text, etc.;
4) target position information, e.g., the latitude and longitude of the target's location, for reference by ground personnel; it can be used, among other things, to determine the orientation of the target relative to the ground personnel.
Here, in some embodiments, the ground personnel need to recognize the relevant target (e.g., the ground personnel are police officers), and the above target-related data information may also include information such as the face, physical features, height, gender, and age of the target (e.g., a suspect).
Where the UAV auxiliary information includes surrounding landmark information, the landmark information can be determined based on sensing information of the UAV. For example, the UAV carries multiple sensors, including a GPS sensor, a barometric sensor, a gyroscope, an electronic compass, etc.; these sensors collect information such as the UAV's latitude and longitude position, attitude, velocity, angular velocity, acceleration, altitude, and airspeed. The UAV control device communicating with the UAV obtains the UAV's latitude and longitude information and then issues an associated request to a Geographic Information System (GIS); the GIS returns the surrounding building landmark information to the UAV control device according to the received UAV latitude and longitude data. Afterwards, the UAV control device obtains the required sensing data and, combining data such as the UAV's altitude, orientation, and latitude and longitude position, superimposes the surrounding building landmarks onto the image currently captured by the UAV, so that police officers and rear commanders can understand the surrounding geographic information.
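The overlay step above can be sketched as follows: once the GIS has returned nearby landmarks for the UAV's coordinates, each landmark's compass bearing relative to the UAV's heading decides where its label lands horizontally in the picture. The flat-earth bearing math, the field layout, and the pinhole-free linear mapping across the field of view are all simplifying assumptions.

```python
# Sketch of placing GIS-returned landmark labels in the UAV picture:
# compute each landmark's bearing from the UAV, keep those inside the
# camera's horizontal field of view, and map the bearing offset to a
# pixel column. Geometry is an assumed, simplified model.
import math

def bearing_deg(from_lat, from_lon, to_lat, to_lon):
    """Approximate compass bearing (0 = north, 90 = east) at short range."""
    north = to_lat - from_lat
    east = (to_lon - from_lon) * math.cos(math.radians(from_lat))
    return math.degrees(math.atan2(east, north)) % 360

def overlay(uav, landmarks, heading_deg, fov_deg=90, width_px=1920):
    """Map each landmark visible in the field of view to an x pixel
    column where its name can be superimposed on the UAV image."""
    labels = {}
    for name, lat, lon in landmarks:
        off = (bearing_deg(uav[0], uav[1], lat, lon)
               - heading_deg + 180) % 360 - 180
        if abs(off) <= fov_deg / 2:
            labels[name] = round((off / fov_deg + 0.5) * width_px)
    return labels

uav_pos = (31.2300, 121.4700)
marks = [("Tower", 31.2400, 121.4700),   # due north: centered in view
         ("Mall", 31.2300, 121.4800)]    # due east: outside a north-facing view
print(overlay(uav_pos, marks, heading_deg=0))  # {'Tower': 960}
```

A production system would also use the UAV's altitude and camera pitch to place the label vertically; only the horizontal placement is sketched here.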
The first-second module 120 presents the ground action auxiliary information corresponding to the UAV auxiliary information. For example, the user equipment presents the above target-related data information, UAV image information, or annotation information. Alternatively, the user equipment generates corresponding ground action auxiliary information based on the above UAV auxiliary information. For example, the UAV auxiliary information includes target position information; correspondingly, the first-second module 120 determines, based on the target position information, the ground action auxiliary information corresponding to the UAV auxiliary information (for example, based on the above target position information and the position information of the user equipment itself, a map application interface is invoked to generate live-scene navigation information, e.g., on transmissive glasses a virtual arrow or the like is superimposed on the real scene for navigation, or a separate scene map and navigation information are provided), and presents the ground action auxiliary information, so as to guide the action of the ground personnel.
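The step of turning received target position information into a navigation cue can be sketched as follows: from the user equipment's own fix and the target's latitude and longitude, derive the distance and the compass direction an AR arrow should point. The equirectangular approximation is an assumption, fine at typical ground-action ranges.

```python
# Sketch of deriving a navigation cue (distance and bearing for a
# superimposed arrow) from the user equipment's own position and the
# target position information received from the UAV control device.
import math

EARTH_RADIUS_M = 6_371_000.0

def navigation_cue(user_lat, user_lon, tgt_lat, tgt_lon):
    north = math.radians(tgt_lat - user_lat) * EARTH_RADIUS_M
    east = (math.radians(tgt_lon - user_lon) * EARTH_RADIUS_M
            * math.cos(math.radians(user_lat)))
    distance_m = math.hypot(north, east)
    bearing = math.degrees(math.atan2(east, north)) % 360
    return distance_m, bearing

# Target roughly 1.1 km due north of the ground officer:
dist, brg = navigation_cue(31.2300, 121.4700, 31.2400, 121.4700)
print(round(dist), round(brg))  # 1112 0
```

The arrow shown on the transmissive glasses would then be rotated to `bearing` minus the wearer's current heading from the compass.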
Here, in some embodiments, the ground action auxiliary information includes UAV image information captured by the UAV, and the above user equipment further includes a first-sixth module (not shown). The first-sixth module is used to: determine corresponding UAV image annotation information according to the user's UAV image annotation operation (e.g., a click, touch, or slide operation performed by the user on the touch screen); and send the UAV image annotation information to the UAV control device for the UAV user to check. For example, the UAV image annotation information is annotated by the ground personnel, via their corresponding user equipment, on the basis of the above UAV image information, and is then sent back to the UAV control device for the user of the UAV control device to refer to; for example, the UAV control device receives and presents the UAV image annotation information about the UAV image information sent by the user equipment.
Here, the above-described user equipment may be a head-mounted smart device such as smart glasses or a smart helmet, or a mobile phone, tablet computer, or navigation device (e.g., a hand-held computing device or one fixed on a vehicle); in some cases, the above-described user equipment can be used to capture the first-person-view video of the ground personnel, and to present the above ground action auxiliary information, interaction information of other users, command and dispatch information sent by the command platform in charge of commanding the ground action, etc. In some embodiments, the ground action auxiliary information is presented in a fixed area (e.g., a rectangular area, or the entire displayable area) of the display device of the user equipment; in other embodiments, the ground action auxiliary information is presented in an augmented reality manner, e.g., by the projection device of transmissive glasses, so that virtual information is superimposed on the relevant area of the real world, realizing an experience combining the virtual and the real. Those skilled in the art will understand that the above-described user equipments are merely examples; other existing or future user equipments, where applicable to the present application, are also included in the protection scope of the present application and are incorporated herein by reference.
In some embodiments, the first-first module 110 receives, via a corresponding network device (e.g., including but not limited to a cloud server), the UAV auxiliary information sent by the corresponding UAV control device, to realize multi-terminal information sharing; for example, where there are multiple UAV pilots or multiple ground personnel, other UAV pilots or ground personnel can also obtain the above UAV auxiliary information through the network device. In some embodiments, the UAV auxiliary information includes UAV image information (e.g., video information) and is streamed by the UAV control device to the network device, so that each participant can view the corresponding image archive in real time or replay it afterwards.
In some embodiments, with reference to Fig. 8, the user equipment 100 further includes a first-third module 130. The first-third module 130 obtains ground image information of the scene where the user equipment is located, and sends the ground image information to the UAV control device. For example, the user equipment takes an image of the ground personnel's current position, captured in real time by a camera fixed on the user equipment, as the ground image information, and sends it to the UAV control device. Similar to the situation described above, in some embodiments the user equipment sends the ground image information to the UAV control device via a corresponding network device, for example by streaming the ground image information to the network device, so that each participant can view the corresponding image archive in real time or replay it afterwards.
In some embodiments, the user equipment 100 further includes a 1-4 module 140 (not shown) and a 1-5 module 150 (not shown).
Specifically, after the ground image information is obtained, the 1-4 module 140 performs a target recognition operation on it. For example, the target recognition operation identifies a specific object (fixed or movable, such as a building or a vehicle) or a person. In one specific embodiment, the target recognition operation is implemented with a deep learning algorithm: first, a training set (e.g., images of pedestrians wearing clothes of different colors) and corresponding labels (e.g., each pedestrian's position in the image) are prepared; then a deep learning model is trained, its parameters iterated over the training set until the model converges; finally, an image captured by the user equipment is fed to the trained model, which outputs the position in the picture of a pedestrian wearing the specific clothing color, completing the target recognition operation.
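The detection step above returns a position in the picture for a target matching a given clothing color. As a minimal, self-contained illustration of that input/output contract (the real system would use a trained deep-learning detector, not this toy color scan; all names here are ours, not the patent's):

```python
# Toy stand-in for the deep-learning detector described in the text: it scans
# a small "image" (a grid of color names) and returns the position of the
# first cell matching the target clothing color. A production system would
# run a trained detection model over real pixels instead.

def detect_by_color(image, target_color):
    """Return (row, col) of the first cell matching target_color, or None."""
    for r, row in enumerate(image):
        for c, color in enumerate(row):
            if color == target_color:
                return (r, c)
    return None

frame = [
    ["gray", "gray", "gray"],
    ["gray", "red",  "gray"],   # pedestrian in a red jacket
    ["gray", "gray", "gray"],
]
print(detect_by_color(frame, "red"))  # → (1, 1)
```

The returned position is exactly what the 1-5 module needs in order to superimpose tracking markers, as described next.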
The 1-5 module 150 presents the ground image information and, based on the result of the target recognition operation, presents the corresponding target tracking information. Continuing the example above, after the pedestrian's position in the picture has been obtained through target recognition, the user equipment superimposes target tracking information at that position to distinguish that pedestrian from other objects or pedestrians; for example, the target tracking information may be a highlighted contour line, box, colored band, colored dot, or arrow around the target. In some embodiments, when the ground image information changes over time (i.e., the user equipment captures video rather than still images), the target tracking information follows the identified target as it moves, so that it remains distinguished from other objects or pedestrians; for example, the target recognition operation is performed on every frame of the video captured by the user equipment, or on a number of key frames.
Besides being obtained from the target recognition performed locally by the user equipment, the target tracking information may also be obtained from the UAV auxiliary information sent by the UAV side. In some embodiments, the UAV auxiliary information includes target auxiliary tracking information, and the ground action auxiliary information includes target tracking information corresponding to it. For example, the UAV control equipment performs a target recognition operation on images captured by the UAV to identify a specific object or person in those images and thus obtain the target auxiliary tracking information (e.g., the position of that object or person in the UAV's image). The target recognition process performed by the UAV control equipment is the same as, or essentially the same as, the one performed by the user equipment described above; it is not repeated here and is incorporated by reference.
In some embodiments, the UAV auxiliary information includes markup information (added, for example, by the drone pilot through the UAV control equipment, or by other action participants able to communicate with the user equipment, such as the command platform). The markup information includes a mark element (including but not limited to a box, colored band, colored dot, arrow, picture/video, animation, or three-dimensional model) and its presentation position information (used to determine where the mark element appears in the picture). Correspondingly, in step S120 the user equipment presents, based on the markup information, the ground action auxiliary information corresponding to the UAV auxiliary information (e.g., the ground action auxiliary information includes the markup information). For example, when the markup information corresponds to a point (e.g., it contains the latitude and longitude of a destination of an actual or simulated ground action), the user equipment superimposes a colored dot at that position in the real scene, on the map, or in the image frame transmitted by the UAV; when the markup information corresponds to a region (e.g., it contains the latitude and longitude of each vertex of the region), the user equipment superimposes a colored patch over that region. Specifically, in some embodiments the markup information may include, but is not limited to: route planning information, e.g., a planned route for front-line police to make an arrest, added according to the current position of a tracked suspect; tactical planning information, e.g., a region marked in the picture transmitted by the UAV (the cordoned-off area in an emergency, the area of responsibility during duty, or a region designated in a tactical deployment), letting front-line police in an arrest action grasp its position intuitively; and combat exercise information (e.g., annotations of the ground situation to assist a tactical rehearsal), where the position of a simulated suspect can be marked and highlighted with a red circle. Nearby ground staff can watch the UAV's monitoring picture while moving toward the destination; with these markups, ground staff understand and execute their tasks better, and operational efficiency improves.
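The paragraph above has geo-referenced marks (a destination's latitude/longitude) being drawn into the UAV's image frame. As a sketch of one way such a projection could work, under assumptions that are ours and not the patent's (a nadir, north-up camera with a known ground resolution, and a local flat-earth approximation):

```python
import math

# Project a geo-referenced annotation point into drone-image pixel
# coordinates, assuming a straight-down (nadir), north-up camera and a
# known meters-per-pixel ground resolution. Function and parameter names
# are illustrative.

EARTH_RADIUS_M = 6_371_000

def geo_to_pixel(cam_lat, cam_lon, mark_lat, mark_lon,
                 img_w, img_h, meters_per_pixel):
    # Local equirectangular approximation around the camera position.
    north_m = math.radians(mark_lat - cam_lat) * EARTH_RADIUS_M
    east_m = (math.radians(mark_lon - cam_lon) * EARTH_RADIUS_M
              * math.cos(math.radians(cam_lat)))
    px = img_w / 2 + east_m / meters_per_pixel
    py = img_h / 2 - north_m / meters_per_pixel   # image y grows downward
    return round(px), round(py)

# A mark at the camera's own ground position lands at the image center.
print(geo_to_pixel(31.0, 121.0, 31.0, 121.0, 1920, 1080, 0.1))  # → (960, 540)
```

A real implementation would additionally account for the gimbal attitude and lens model; the point here is only the mapping from a latitude/longitude mark to a presentation position in the picture.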
In some embodiments, the UAV auxiliary information further includes UAV image information, and that image information is video. Assuming network delay can be ignored, after the UAV control equipment sends the video and the markup information (including the mark element and its presentation position information) to the user equipment, the user equipment presents the video and, based on the mark element's presentation position information, presents the mark element. The drone pilot's markup is thereby presented in real time on the user equipment, letting the user react to it quickly and improving cooperation between the user and the drone pilot. In some situations, including but not limited to cases where network delay is not negligible or where the video and markup need to be played back for review, the markup information further includes timeline position information for the mark element. The timeline position information is used to determine the exact video frame the mark element corresponds to (e.g., by determining the position of the relevant frame on the timeline), and the mark element is superimposed on that frame, avoiding the misalignment that occurs when a mark element is overlaid on the wrong frame.
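The timeline-alignment idea above can be sketched concretely: a mark element carries a timeline position (here, seconds into the video), and the receiver maps it to a frame index so the overlay lands on the correct frame during playback review. All names and the mark-record shape are illustrative assumptions:

```python
# Map a mark element's timeline position to the video frame it should
# overlay, then collect the marks belonging to a given frame.

def frame_for_mark(mark_time_s, fps, total_frames):
    """Convert a timeline position (seconds) to a clamped frame index."""
    idx = int(round(mark_time_s * fps))
    return max(0, min(idx, total_frames - 1))

def overlays_for_frame(frame_idx, marks, fps, total_frames):
    """Return every mark element that belongs on this frame."""
    return [m["element"] for m in marks
            if frame_for_mark(m["time_s"], fps, total_frames) == frame_idx]

marks = [{"time_s": 2.0, "element": "red-box"},
         {"time_s": 2.04, "element": "arrow"}]
print(frame_for_mark(2.0, 25, 1000))            # → 50
print(overlays_for_frame(50, marks, 25, 1000))  # → ['red-box']
```

Note that the two marks, only 40 ms apart, resolve to different frames at 25 fps; overlaying by timeline position rather than "whatever frame is showing now" is exactly what prevents the misalignment the text describes.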
To further improve team cooperation, the same user equipment may communicate with multiple UAV control equipments, simultaneously or separately. For example, the UAVs corresponding to the multiple UAV control equipments each cover a different part of the same operational region; by interacting with the UAVs through the multiple control equipments, simultaneously or in turn, team resources are used efficiently. In some embodiments, the 1-1 module 110 receives UAV auxiliary information sent by at least one of the corresponding multiple UAV control equipments, and the 1-2 module 120 presents the ground action auxiliary information corresponding to that UAV auxiliary information. For example, the user equipment may present the UAV auxiliary information from the multiple control equipments in turn, present several items at once (e.g., superimpose several different pieces of information simultaneously), or present the one or more items the user needs according to the user's selection; each item is presented in the same way, or essentially the same way, as the UAV auxiliary information described above, which is not repeated here and is incorporated by reference.
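The "in turn" presentation mode mentioned above amounts to cycling the display over the connected control equipments. A minimal sketch (device identifiers are invented for illustration):

```python
from itertools import cycle

# Rotate the display over feeds from several UAV control equipments,
# one per display step, wrapping around indefinitely.

def rotate_feeds(device_ids, steps):
    """Return which device's auxiliary info is shown at each display step."""
    shown, rotation = [], cycle(device_ids)
    for _ in range(steps):
        shown.append(next(rotation))
    return shown

print(rotate_feeds(["uav-1", "uav-2", "uav-3"], 5))
# → ['uav-1', 'uav-2', 'uav-3', 'uav-1', 'uav-2']
```

The simultaneous and selection-driven modes would instead present several feeds at once or filter `device_ids` by the user's choice before display.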
According to another aspect of the present application, UAV control equipment for presenting ground action auxiliary information is provided. With reference to Fig. 9, the UAV control equipment 200 includes a 2-1 module 210. The 2-1 module 210 sends UAV auxiliary information to the corresponding user equipment so that the user equipment presents the corresponding ground action auxiliary information. In some embodiments, the UAV auxiliary information includes, but is not limited to, one or more of the following:
1) target-related data information, including but not limited to the position of the ground action's destination, information on nearby landmarks, and the names or features of relevant ground targets (including but not limited to target objects and target persons); in some embodiments, the UAV control equipment performs a target recognition operation on images captured by the UAV to identify a specific object or person, then reads that target's related data from a local or otherwise accessible database, improving task execution efficiency;
2) UAV image information, including but not limited to still images and dynamic images such as video;
3) markup information, including but not limited to marks added by the user, the drone pilot, or other users, where the marked object includes but is not limited to a point in the scene (e.g., determined by the latitude and longitude of the destination of an actual or simulated ground action) or a region (e.g., determined by the latitude and longitude of each of its vertices), and the form of the mark includes but is not limited to colored dots, lines, models, images, and text;
4) target position information, e.g., the latitude and longitude of a target, for the ground staff's reference; among other uses, it can determine the target's bearing relative to the ground staff.
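Item 4) turns a target's latitude/longitude into a bearing relative to the ground staff's own position. A sketch using the standard great-circle initial-bearing formula (0° = north, 90° = east; function and parameter names are ours):

```python
import math

# Initial great-circle bearing from observer (lat1, lon1) to target
# (lat2, lon2), in degrees clockwise from north.

def bearing_deg(lat1, lon1, lat2, lon2):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

# A target slightly due east of the observer sits at bearing ~90°.
print(round(bearing_deg(31.0, 121.0, 31.0, 121.001)))  # → 90
```

Comparing this bearing with the ground staff's own heading (e.g., from a compass sensor) would then tell them which way to turn toward the target.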
In some embodiments, the UAV control equipment sends the UAV auxiliary information to the corresponding user equipment via a corresponding network device (including but not limited to a cloud server), thereby enabling multi-party information sharing: where there are multiple drone pilots or multiple ground staff, the other drone pilots or ground staff can also obtain the UAV auxiliary information through the network device. In some embodiments, the UAV auxiliary information includes UAV image information (such as video) that the UAV control equipment streams to the network device, so that each participant can view it in real time or play it back for review.
Besides sending UAV auxiliary information to the user equipment, the UAV control equipment can also receive, for reference, the ground image information sent by the user equipment. Correspondingly, in some embodiments the UAV control equipment further includes a 2-2 module 220, as shown in Fig. 10. The 2-2 module 220 receives and presents the ground image information sent by the user equipment. The UAV control equipment may also perform a target recognition operation on that ground image information and, based on its result, send the corresponding target auxiliary tracking information (e.g., the identified target's position in the picture) to the user equipment, so that the user equipment presents the corresponding target tracking information (such as a highlighted contour line, box, colored band, colored dot, or arrow around the target) to the ground staff.
By receiving the ground image information sent by the user equipment, the drone pilot obtains the user's first-person view and a fuller grasp of the scene, and can also assist the ground staff by performing target recognition on the ground image information, further improving cooperation efficiency. For example, the target recognition operation identifies a specific object (fixed or movable, such as a building or a vehicle) or a person. In one specific embodiment, the target recognition operation is implemented with a deep learning algorithm: first, a training set (e.g., images of pedestrians wearing clothes of different colors) and corresponding labels (e.g., each pedestrian's position in the image) are prepared; then a deep learning model is trained, its parameters iterated over the training set until the model converges; finally, an image captured by the user equipment is fed to the trained model, which outputs the position in the picture of a pedestrian wearing the specific clothing color, completing the target recognition operation.
In some embodiments, the UAV control equipment further includes a 2-8 module (not shown). The 2-8 module determines corresponding ground image markup information according to the user's markup operation on the ground image information, and the UAV auxiliary information includes that ground image markup information. For example, after receiving the ground image captured and sent by the ground staff's user equipment, the user of the UAV control equipment adds the corresponding markup information on the image, and the image together with its markup information is sent back to the user equipment.
In some cases, the UAV auxiliary information described above can be obtained from the UAV image information captured by the UAV. In some embodiments, with reference to Fig. 11, the UAV control equipment 200 includes a 2-5 module 250. The 2-5 module 250 obtains the UAV image information captured by the corresponding UAV (including but not limited to still images and video), after which the 2-1 module 210 sends UAV auxiliary information, based on that image information, to the corresponding user equipment so that the user equipment presents the corresponding ground action auxiliary information. For example, the ground action auxiliary information includes the UAV image information; alternatively, the UAV control equipment performs target recognition on the UAV image information to identify specific targets and determine the corresponding target auxiliary tracking information; alternatively, it counts the number of specific targets in the UAV's picture and sends that count to the user equipment as part of the ground action auxiliary information.
On this basis, in some embodiments, the UAV control equipment 200 further includes a 2-6 module 260 (not shown). Based on operations by the UAV user (including but not limited to clicks, box selections, drags, or text input), the 2-6 module 260 determines image markup information about the UAV image information, where the image markup information includes a mark element (including but not limited to a box, colored band, colored dot, arrow, picture/video, animation, or three-dimensional model) and its presentation position information (used to determine where the mark element appears in the picture). Correspondingly, the 2-1 module 210 sends UAV auxiliary information, based on the UAV image information and the image markup information, to the corresponding user equipment so that the user equipment presents the corresponding ground action auxiliary information.
In some embodiments, the UAV auxiliary information further includes UAV image information, and that image information is video. For example, based on the UAV user's operations, the 2-6 module 260 determines image markup information about the UAV image information, where the image markup information includes a mark element and its presentation position information. Assuming network delay can be ignored, after the UAV control equipment sends the video and the markup information (including the mark element and its presentation position information) to the user equipment, the user equipment presents the video and, based on the mark element's presentation position information, presents the mark element; the drone pilot's markup is thereby presented in real time on the user equipment, letting the user react to it quickly and improving cooperation between the user and the drone pilot. In some situations, including but not limited to cases where network delay is not negligible or where the video and markup need to be played back for review, the markup information further includes timeline position information for the mark element; the timeline position information is used to determine the exact video frame the mark element corresponds to (e.g., by determining the position of the relevant frame on the timeline), and the mark element is superimposed on that frame, avoiding the misalignment that occurs when a mark element is overlaid on the wrong frame.
In some embodiments, the UAV control equipment 200 further includes a 2-7 module 270 (not shown). The 2-7 module 270 determines the target position information of a specified target based on the relative bearing between the corresponding UAV and the specified target, together with the UAV's spatial position information. In some embodiments, the specified target is designated by the drone pilot in the UAV control equipment, e.g., by clicking or box-selecting on its display screen. For example, in one embodiment, the UAV control equipment determines the specified target based on the user's selection, then controls the UAV to measure the straight-line distance between the specified target and the UAV (e.g., with an onboard laser rangefinder); combined with the UAV's own altitude (e.g., from a barometer), this yields the horizontal distance between the UAV and the specified target. From the UAV's own latitude and longitude (e.g., from a GPS sensor) and the target's azimuth relative to the UAV, the specified target's latitude and longitude are finally determined and used as the target position information. As another example, in another embodiment, the UAV control equipment determines, from the UAV's pitch angle (e.g., from a gyroscope), the angle between the vertical and the line joining the UAV to the specified target, and from that angle and the UAV's altitude (e.g., from a barometer) computes the horizontal distance between the UAV and the specified target; from the UAV's own latitude and longitude (e.g., from a GPS sensor) and the target's azimuth relative to the UAV, the specified target's latitude and longitude are finally determined and used as the target position information. Of course, those skilled in the art should understand that the ways of obtaining target position information described above are merely examples; other existing or future ways of obtaining it are equally applicable to the present application, fall within its scope of protection, and are incorporated herein by reference.
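The second geolocation method above can be sketched as follows: from the UAV's altitude and the angle between the UAV-target line and the vertical, derive the horizontal distance, then offset the UAV's own latitude/longitude along the azimuth toward the target. This uses a flat-earth local approximation; all variable names are ours:

```python
import math

# Geolocate a specified target from the drone's altitude, the pitch of the
# line of sight measured from the vertical, and the target's azimuth
# relative to the drone (degrees clockwise from north).

EARTH_RADIUS_M = 6_371_000

def locate_target(drone_lat, drone_lon, altitude_m,
                  pitch_from_vertical_deg, azimuth_deg):
    # horizontal distance = altitude * tan(angle from vertical)
    horiz = altitude_m * math.tan(math.radians(pitch_from_vertical_deg))
    north = horiz * math.cos(math.radians(azimuth_deg))
    east = horiz * math.sin(math.radians(azimuth_deg))
    lat = drone_lat + math.degrees(north / EARTH_RADIUS_M)
    lon = drone_lon + math.degrees(
        east / (EARTH_RADIUS_M * math.cos(math.radians(drone_lat))))
    return lat, lon

# Drone at 100 m altitude, line of sight 45° off vertical, looking due east:
# the target is ~100 m east of the drone's ground position.
lat, lon = locate_target(31.0, 121.0, 100.0, 45.0, 90.0)
print(round(lat, 6), round(lon, 6))
```

The first method (laser rangefinder) differs only in how the horizontal distance is obtained: `horiz = sqrt(slant_range**2 - altitude_m**2)` instead of the tangent step.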
As stated above, to further improve team cooperation, the same user equipment may communicate with multiple UAV control equipments simultaneously or separately. Similarly, the same UAV control equipment may communicate with multiple user equipments simultaneously or separately; for example, the ground staff corresponding to the multiple user equipments are located at different positions within the same operational region, and by interacting with the UAV control equipment simultaneously or in turn, team resources are used efficiently. In some embodiments, the 2-1 module 210 sends UAV auxiliary information to at least one corresponding user equipment so that the at least one user equipment presents the corresponding ground action auxiliary information. For example, the UAV control equipment may send UAV auxiliary information to multiple user equipments in turn, send several items at once (e.g., different pieces of information to different user equipments), or send the one or more items a given ground staff member needs according to the UAV user's selection; each item is generated or sent in the same way, or essentially the same way, as discussed above, which is not repeated here and is incorporated by reference.
It should be noted that, for purposes of the present application, the markup information sent by the user equipment or the UAV control equipment may be presented in several ways. For example, the recipient may superimpose it (e.g., based on the mark element's position information, or on its position information together with its timeline position information, superimposing the mark element on the corresponding video frame); alternatively, the sender may superimpose the relevant elements onto the captured video, save the result as a new video or video stream, and transmit the newly generated video or stream to the recipient for presentation.
The present application also provides a computer-readable storage medium storing computer code which, when executed, performs any of the methods described above.
The present application also provides a computer program product which, when executed by a computer device, performs any of the methods described above.
The present application also provides a computer device, comprising:

one or more processors; and

a memory for storing one or more computer programs, which, when executed by the one or more processors, cause the one or more processors to implement any of the methods described above.
Figure 12 shows an exemplary system that can be used to implement the embodiments described herein.
As shown in Fig. 12, in some embodiments the system 300 can serve as any of the user equipment or UAV control equipment in the embodiments above. In some embodiments, system 300 may include one or more computer-readable media (e.g., system memory or NVM/storage device 320) and one or more processors (e.g., processor(s) 305) coupled to the one or more computer-readable media and configured to execute instructions so as to implement the modules and perform the actions described herein.
For one embodiment, system control module 310 may include any suitable interface controller to provide any suitable interface to at least one of the processor(s) 305 and/or to any suitable device or component in communication with system control module 310.
System control module 310 may include a memory controller module 330 to provide an interface to system memory 315. The memory controller module 330 may be a hardware module, a software module, and/or a firmware module.
System memory 315 may be used, for example, to load and store data and/or instructions for system 300. For one embodiment, system memory 315 may include any suitable volatile memory, such as suitable DRAM. In some embodiments, system memory 315 may include double data rate type four synchronous dynamic random-access memory (DDR4 SDRAM).
For one embodiment, system control module 310 may include one or more input/output (I/O) controllers to provide an interface to NVM/storage device 320 and communication interface(s) 325.
For example, NVM/storage device 320 may be used to store data and/or instructions. NVM/storage device 320 may include any suitable non-volatile memory (e.g., flash memory) and/or any suitable non-volatile storage device(s) (e.g., one or more hard disk drives (HDDs), one or more compact disc (CD) drives, and/or one or more digital versatile disc (DVD) drives).
NVM/storage device 320 may include storage resources physically installed as part of the device on which system 300 runs, or it may be accessible by that device without being part of it. For example, NVM/storage device 320 may be accessed over a network via communication interface(s) 325.
Communication interface(s) 325 may provide an interface for system 300 to communicate over one or more networks and/or with any other suitable device. System 300 may communicate wirelessly with one or more components of a wireless network in accordance with any of one or more wireless network standards and/or protocols.
For one embodiment, at least one of the processor(s) 305 may be packaged together with the logic of one or more controllers of system control module 310 (e.g., memory controller module 330). For one embodiment, at least one of the processor(s) 305 may be packaged together with the logic of one or more controllers of system control module 310 to form a system in package (SiP). For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with the logic of one or more controllers of system control module 310. For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with the logic of one or more controllers of system control module 310 to form a system on chip (SoC).
In various embodiments, system 300 may be, but is not limited to: a server, a workstation, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet computer, a netbook, etc.). In various embodiments, system 300 may have more or fewer components and/or a different architecture. For example, in some embodiments system 300 includes one or more cameras, a keyboard, a liquid crystal display (LCD) screen (including a touch-screen display), a non-volatile memory port, multiple antennas, a graphics chip, an application-specific integrated circuit (ASIC), and a speaker.
It should be noted that the present application can be implemented in software and/or in a combination of software and hardware, for example using an application-specific integrated circuit (ASIC), a general-purpose computer, or any other similar hardware device. In one embodiment, the software program of the present application can be executed by a processor to implement the steps or functions described above. Likewise, the software program of the present application (including related data structures) can be stored in a computer-readable recording medium, e.g., RAM, a magnetic or optical drive, a floppy disk, or a similar device. In addition, some steps or functions of the present application can be implemented in hardware, for example as a circuit cooperating with a processor to perform each step or function.
In addition, part of the present application can be embodied as a computer program product, such as computer program instructions, which, when executed by a computer, can invoke or provide the methods and/or technical solutions of the present application through the operation of that computer. Those skilled in the art will understand that computer program instructions exist in computer-readable media in forms including but not limited to source files, executable files, and installation package files; correspondingly, the ways in which computer program instructions are executed by a computer include but are not limited to: the computer executing the instructions directly; the computer compiling the instructions and then executing the compiled program; or the computer reading and installing the instructions and then executing the installed program. Here, a computer-readable medium can be any available computer-readable storage medium or communication medium that can be accessed by a computer.
Communication media include media by which communication signals containing, for example, computer-readable instructions, data structures, program modules, or other data are transmitted from one system to another. Communication media may include conductive transmission media (such as cables and wires, e.g., optical fiber or coaxial cable) and wireless (non-conductive) media capable of propagating energy waves, such as acoustic, electromagnetic, RF, microwave, and infrared media. Computer-readable instructions, data structures, program modules, or other data can be embodied, for example, as a modulated data signal in a wireless medium (such as a carrier wave, or a similar mechanism embodied as part of a spread-spectrum technique). The term "modulated data signal" refers to a signal one or more of whose characteristics are altered or set in such a way as to encode information in the signal. The modulation can be an analog, digital, or hybrid modulation technique.
By way of example and not limitation, computer-readable storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for the storage of information such as computer-readable instructions, data structures, program modules, or other data. For example, computer-readable storage media include, but are not limited to: volatile memory, such as random-access memory (RAM, DRAM, SRAM); non-volatile memory, such as flash memory, various read-only memories (ROM, PROM, EPROM, EEPROM), and magnetic and ferromagnetic/ferroelectric memories (MRAM, FeRAM); magnetic and optical storage devices (hard disks, magnetic tape, CDs, DVDs); and other media, now known or later developed, capable of storing computer-readable information/data for use by a computer system.
An embodiment of the present application further provides a device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein, when the computer program instructions are executed by the processor, the device is triggered to perform the methods and/or technical solutions of the foregoing embodiments of the present application.
It will be apparent to those skilled in the art that the present application is not limited to the details of the above exemplary embodiments, and that the present application may be implemented in other specific forms without departing from its spirit or essential characteristics. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description; all changes that fall within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. No reference sign in the claims should be construed as limiting the claim concerned. Furthermore, the word "comprising" does not exclude other units or steps, and the singular does not exclude the plural. Multiple units or devices recited in a device claim may also be implemented by a single unit or device through software or hardware. Words such as "first" and "second" are used to denote names and do not denote any particular order.

Claims (28)

1. A method at a user equipment for presenting ground action auxiliary information, wherein the method comprises:
receiving UAV auxiliary information sent by a corresponding unmanned aerial vehicle (UAV) control device;
presenting ground action auxiliary information corresponding to the UAV auxiliary information;
wherein the ground action auxiliary information is used to assist a ground action.
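By way of illustration only (this sketch is not part of the claims), the flow of claim 1 — receiving UAV auxiliary information from a UAV control device and presenting the corresponding ground action auxiliary information — might look as follows; the JSON message schema and all field names are assumptions, not defined by this application:

```python
import json

def present_ground_action_info(uav_message):
    """Parse a UAV auxiliary-information message (hypothetical JSON schema)
    and turn it into human-readable ground action auxiliary lines."""
    info = json.loads(uav_message)
    lines = []
    if "target" in info:
        t = info["target"]
        lines.append(f"Target '{t['name']}' at lat={t['lat']}, lon={t['lon']}")
    if "annotation" in info:
        a = info["annotation"]
        lines.append(f"Annotation '{a['element']}' at screen position {a['position']}")
    return lines

# Example message a UAV control device might send (schema assumed):
msg = json.dumps({
    "target": {"name": "vehicle-1", "lat": 31.2, "lon": 121.5},
    "annotation": {"element": "arrow", "position": [120, 80]},
})
for line in present_ground_action_info(msg):
    print(line)
```

In practice the transport (direct link or via a network device, per claim 4) and the rendering layer are implementation choices the claims leave open.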
2. The method according to claim 1, wherein the ground action auxiliary information comprises at least one of the following:
target-related data information;
UAV image information;
annotation information;
real-scene navigation information.
3. The method according to claim 1, wherein the ground action auxiliary information comprises UAV image information;
and the method further comprises:
determining corresponding UAV image annotation information according to a UAV image annotation operation of a user;
sending the UAV image annotation information to the UAV control device.
4. The method according to claim 1, wherein the receiving UAV auxiliary information sent by a corresponding UAV control device comprises:
receiving, via a corresponding network device, the UAV auxiliary information sent by the corresponding UAV control device.
5. The method according to claim 1, wherein the method further comprises:
obtaining ground image information of a scene in which the user equipment is located, and sending the ground image information to the UAV control device.
6. The method according to claim 5, wherein the obtaining ground image information of a scene in which the user equipment is located and sending the ground image information to the UAV control device comprises:
obtaining the ground image information of the scene in which the user equipment is located, and sending the ground image information to the UAV control device via a corresponding network device.
7. The method according to claim 5, wherein the method further comprises:
performing a target recognition operation on the ground image information;
presenting the ground image information, and presenting corresponding target tracking information based on a result of the target recognition operation.
8. The method according to claim 1, wherein the UAV auxiliary information comprises annotation information, the annotation information comprising an annotation element and presentation position information thereof;
and the presenting ground action auxiliary information corresponding to the UAV auxiliary information comprises:
presenting, based on the annotation information, the ground action auxiliary information corresponding to the UAV auxiliary information.
9. The method according to claim 8, wherein the annotation information further comprises timeline position information corresponding to the annotation element.
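Illustratively (outside the claims), the annotation information of claims 8 and 9 — an annotation element, its presentation position, and an optional timeline position — might be modeled as a small data structure; the field names here are assumptions for the sketch:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class AnnotationInfo:
    """Annotation information per claims 8-9: an element, where to present
    it, and (optionally, claim 9) when on the video timeline it applies."""
    element: str                        # e.g. "arrow", "circle", freehand id
    position: Tuple[float, float]       # presentation position in the image
    timeline_s: Optional[float] = None  # timeline position, in seconds

# An annotation anchored at 40%/70% of the frame, 12.5 s into the video:
note = AnnotationInfo(element="arrow", position=(0.4, 0.7), timeline_s=12.5)
print(note)
```

A timeline field like this is what lets an annotation made on recorded UAV footage be replayed at the matching moment on the ground side.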
10. The method according to claim 1, wherein the UAV auxiliary information comprises target position information, and the presenting ground action auxiliary information corresponding to the UAV auxiliary information comprises:
determining, based on the target position information, the ground action auxiliary information corresponding to the UAV auxiliary information, and presenting the ground action auxiliary information.
11. The method according to claim 1, wherein the UAV auxiliary information comprises target auxiliary tracking information, and the ground action auxiliary information comprises target tracking information corresponding to the target auxiliary tracking information.
12. The method according to claim 1, wherein the receiving UAV auxiliary information sent by a corresponding UAV control device comprises:
receiving UAV auxiliary information sent by at least one of a plurality of corresponding UAV control devices;
and the presenting ground action auxiliary information corresponding to the UAV auxiliary information comprises:
presenting the ground action auxiliary information corresponding to the UAV auxiliary information sent by at least one of the plurality of UAV control devices.
13. A method at a UAV control device for presenting ground action auxiliary information, wherein the method comprises:
sending UAV auxiliary information to a corresponding user equipment, so that the user equipment presents corresponding ground action auxiliary information.
14. The method according to claim 13, wherein the sending UAV auxiliary information to a corresponding user equipment so that the user equipment presents corresponding ground action auxiliary information comprises:
sending, via a corresponding network device, UAV auxiliary information to the corresponding user equipment, so that the user equipment presents the corresponding ground action auxiliary information.
15. The method according to claim 13, wherein the method further comprises:
receiving and presenting ground image information sent by the user equipment.
16. The method according to claim 15, wherein the method further comprises:
performing a target recognition operation on the ground image information;
sending corresponding target auxiliary tracking information to the user equipment based on a result of the target recognition operation.
17. The method according to claim 15, wherein the method further comprises:
determining corresponding ground image annotation information according to a ground image annotation operation of a user on the ground image information;
wherein the UAV auxiliary information comprises the ground image annotation information.
18. The method according to claim 13, wherein the method further comprises:
obtaining UAV image information captured by a corresponding UAV;
and the sending UAV auxiliary information to a corresponding user equipment so that the user equipment presents corresponding ground action auxiliary information comprises:
sending, based on the UAV image information, UAV auxiliary information to the corresponding user equipment, so that the user equipment presents the corresponding ground action auxiliary information.
19. The method according to claim 18, wherein the method further comprises:
determining, based on a user operation of a UAV user, image annotation information about the UAV image information, wherein the image annotation information comprises an annotation element and presentation position information thereof;
and the sending, based on the UAV image information, UAV auxiliary information to the corresponding user equipment so that the user equipment presents corresponding ground action auxiliary information comprises:
sending, based on the UAV image information and the image annotation information, UAV auxiliary information to the corresponding user equipment, so that the user equipment presents the corresponding ground action auxiliary information.
20. The method according to claim 19, wherein the determining, based on a user operation of a UAV user, image annotation information about the UAV image information, the image annotation information comprising an annotation element and presentation position information thereof, comprises:
determining, based on the user operation of the UAV user, the image annotation information about the UAV image information, wherein the image annotation information comprises the annotation element and the presentation position information thereof, and further comprises timeline position information corresponding to the annotation element.
21. The method according to claim 18, wherein the method further comprises:
receiving and presenting UAV image annotation information about the UAV image information sent by the user equipment.
22. The method according to claim 13, wherein the method further comprises:
determining target position information of a specified target based on relative orientation information between a corresponding UAV and the specified target and on spatial position information of the UAV.
23. The method according to claim 13, wherein the sending UAV auxiliary information to a corresponding user equipment so that the user equipment presents corresponding ground action auxiliary information comprises:
sending UAV auxiliary information to at least one corresponding user equipment, so that the at least one user equipment presents corresponding ground action auxiliary information.
24. A user equipment for presenting ground action auxiliary information, wherein the user equipment comprises:
a one-one module, configured to receive UAV auxiliary information sent by a corresponding UAV control device;
a one-two module, configured to present ground action auxiliary information corresponding to the UAV auxiliary information;
wherein the ground action auxiliary information is used to assist a ground action.
25. A UAV control device for presenting ground action auxiliary information, wherein the UAV control device comprises:
a two-one module, configured to send UAV auxiliary information to a corresponding user equipment, so that the user equipment presents corresponding ground action auxiliary information.
26. A method for presenting ground action auxiliary information, wherein the method comprises:
a UAV control device sending UAV auxiliary information to a corresponding user equipment;
the user equipment receiving the UAV auxiliary information sent by the UAV control device, and presenting ground action auxiliary information corresponding to the UAV auxiliary information;
wherein the ground action auxiliary information is used to assist a ground action.
27. A device for presenting ground action auxiliary information, wherein the device comprises:
a processor; and
a memory arranged to store computer-executable instructions, wherein the executable instructions, when executed, cause the processor to perform the operations of the method according to any one of claims 1 to 23.
28. A computer-readable medium comprising instructions which, when executed, cause a system to perform the operations of the method according to any one of claims 1 to 23.
CN201811397300.2A 2018-11-22 2018-11-22 Method and equipment for presenting ground action auxiliary information Active CN109656319B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811397300.2A CN109656319B (en) 2018-11-22 2018-11-22 Method and equipment for presenting ground action auxiliary information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811397300.2A CN109656319B (en) 2018-11-22 2018-11-22 Method and equipment for presenting ground action auxiliary information

Publications (2)

Publication Number Publication Date
CN109656319A true CN109656319A (en) 2019-04-19
CN109656319B CN109656319B (en) 2021-06-15

Family

ID=66111289

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811397300.2A Active CN109656319B (en) 2018-11-22 2018-11-22 Method and equipment for presenting ground action auxiliary information

Country Status (1)

Country Link
CN (1) CN109656319B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110248157A (en) * 2019-05-25 2019-09-17 亮风台(上海)信息科技有限公司 A kind of method and apparatus carrying out scheduling on duty
CN110460808A (en) * 2019-06-27 2019-11-15 安徽科力信息产业有限责任公司 Target designation real-time display method, device and unmanned plane
CN112221149A (en) * 2020-09-29 2021-01-15 中北大学 Artillery and soldier continuous intelligent combat drilling system based on deep reinforcement learning
CN115439635A (en) * 2022-06-30 2022-12-06 亮风台(上海)信息科技有限公司 Method and equipment for presenting mark information of target object

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100004798A1 (en) * 2005-01-25 2010-01-07 William Kress Bodin Navigating a UAV to a next waypoint
KR20140112588A (en) * 2013-03-11 2014-09-24 한국항공우주산업 주식회사 Method of terminal guidance of airplane and apparatuse for using the same
CN105741213A (en) * 2016-01-13 2016-07-06 天津中科智能识别产业技术研究院有限公司 Disaster relief force scheduling deployment command and control system based on GIS
CN107054654A (en) * 2017-05-09 2017-08-18 广东容祺智能科技有限公司 A kind of unmanned plane target tracking system and method
CN107416207A (en) * 2017-06-13 2017-12-01 深圳市易成自动驾驶技术有限公司 Unmanned plane rescue mode, unmanned plane and computer-readable recording medium
US20180009279A1 (en) * 2016-07-05 2018-01-11 SkyRunner, LLC Dual engine air and land multimodal vehicle
CN107851308A (en) * 2016-03-01 2018-03-27 深圳市大疆创新科技有限公司 system and method for identifying target object
CN107968932A (en) * 2017-10-31 2018-04-27 易瓦特科技股份公司 The method, system and device being identified based on earth station to destination object
CN108139759A (en) * 2015-09-15 2018-06-08 深圳市大疆创新科技有限公司 For unmanned vehicle path planning and the system and method for control
CN108510689A (en) * 2018-04-23 2018-09-07 成都鹏派科技有限公司 A kind of Forest Fire Alarm reaction system

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110248157A (en) * 2019-05-25 2019-09-17 亮风台(上海)信息科技有限公司 A kind of method and apparatus carrying out scheduling on duty
CN110460808A (en) * 2019-06-27 2019-11-15 安徽科力信息产业有限责任公司 Target designation real-time display method, device and unmanned plane
CN112221149A (en) * 2020-09-29 2021-01-15 中北大学 Artillery and soldier continuous intelligent combat drilling system based on deep reinforcement learning
CN112221149B (en) * 2020-09-29 2022-07-19 中北大学 Artillery and soldier continuous intelligent combat drilling system based on deep reinforcement learning
CN115439635A (en) * 2022-06-30 2022-12-06 亮风台(上海)信息科技有限公司 Method and equipment for presenting mark information of target object

Also Published As

Publication number Publication date
CN109656319B (en) 2021-06-15

Similar Documents

Publication Publication Date Title
US11887312B2 (en) Fiducial marker patterns, their automatic detection in images, and applications thereof
CN109561282A (en) A kind of method and apparatus of the action of ground for rendering auxiliary information
CN109656319A (en) A kind of action of ground for rendering auxiliary information method and apparatus
EP2572336B1 (en) Mobile device, server arrangement and method for augmented reality applications
CN103460256B (en) In Augmented Reality system, virtual image is anchored to real world surface
CN109596118A (en) It is a kind of for obtaining the method and apparatus of the spatial positional information of target object
CN109541584A (en) A kind of low flyer reconnaissance warning system and method based on intelligent terminal
CN108279679A (en) A kind of Intelligent meal delivery robot system and its food delivery method based on wechat small routine and ROS
CN109459029A (en) It is a kind of for determining the method and apparatus of the navigation routine information of target object
CN109671118A (en) A kind of more people's exchange methods of virtual reality, apparatus and system
KR20150131744A (en) Method, system and recording medium for providing augmented reality service and file distribution system
CN109656259A (en) It is a kind of for determining the method and apparatus of the image location information of target object
CN104982090A (en) Personal information communicator
CN109618131A (en) A kind of method and apparatus of information to aid in decision for rendering
US10846901B2 (en) Conversion of 2D diagrams to 3D rich immersive content
Ribeiro et al. Web AR solution for UAV pilot training and usability testing
CN110248157B (en) Method and equipment for scheduling on duty
CN115460539B (en) Method, equipment, medium and program product for acquiring electronic fence
WO2024000733A1 (en) Method and device for presenting marker information of target object
Zhang et al. Mixed reality annotations system for museum space based on the UWB positioning and mobile device
Huang et al. Space robot teleoperation based on active vision
Anderson et al. High-Throughput Computing and Multi-Sensor Unmanned Aerial Systems in Support of Explosive Hazard Detection
CN115760964A (en) Method and equipment for acquiring screen position information of target object
CN110160532A (en) Localization method and device and terminal device
CN108107913A (en) A kind of preposition tracking of balance car and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: 201210 7th Floor, No. 1, Lane 5005, Shenjiang Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai

Patentee after: HISCENE INFORMATION TECHNOLOGY Co.,Ltd.

Address before: Room 501 / 503-505, 570 shengxia Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai, 201203

Patentee before: HISCENE INFORMATION TECHNOLOGY Co.,Ltd.