CN106767179A - Flying target-launching monitoring device and monitoring method - Google Patents
Flying target-launching monitoring device and monitoring method
- Publication number
- CN106767179A CN106767179A CN201710001497.2A CN201710001497A CN106767179A CN 106767179 A CN106767179 A CN 106767179A CN 201710001497 A CN201710001497 A CN 201710001497A CN 106767179 A CN106767179 A CN 106767179A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41J—TARGETS; TARGET RANGES; BULLET CATCHERS
- F41J9/00—Moving targets, i.e. moving when fired at
- F41J9/08—Airborne targets, e.g. drones, kites, balloons
Abstract
The present invention provides a flying target-launching and monitoring device and a monitoring method, belonging to the field of target launching. The flying target-launching and monitoring device includes a flying unit, a target-launching unit and a monitoring unit. The flying unit carries the target-launching unit in flight; the target-launching unit releases, according to a received instruction, a target body for a user to aim at and shoot; and the monitoring unit monitors shooting information as the user shoots at the target body. The device of the invention is mobile and easy to carry, which broadens its range of use, and the monitoring unit can also monitor the accuracy of the shooting.
Description
Technical field
The present invention relates to the field of target launching.
Background technology
In its early days, the flying-saucer (trap shooting) event involved shooting at released pigeons; it was later changed to shooting at clay targets. Trap shooting approximates a hunting activity, is highly entertaining, and is widely popular. Existing trap machines, however, cannot fly, nor can they monitor the result of shots fired at the launched targets, which limits the range of use and the experience of users.
Content of the invention
The present invention provides a flying target-launching and monitoring device, and a monitoring method, intended to launch targets in flight and to monitor and record the shooting.
The specific technical scheme is as follows:
A flying target-launching and monitoring device, the device including a flying unit, a target-launching unit and a monitoring unit. The flying unit carries the target-launching unit in flight; the target-launching unit releases, according to a received instruction, a target body for a user to aim at and shoot; and the monitoring unit monitors shooting information as the user shoots at the target body.
Preferably, the flying unit uses a rotary-wing flight structure, or
the flying unit uses a flapping-wing flight structure, or
the flying unit uses a fixed-wing flight structure, or
the flying unit uses an umbrella-wing (parafoil) flight structure, or
the flying unit uses a jet-propelled flight structure, or
the flying unit uses an airship structure.
Preferably, the monitoring unit includes:
a target body acquisition module, used to obtain movement track information and position information of the target body;
a shooting acquisition module, used to obtain shooting information of the user with respect to the target body; and
a processing module, used to calculate shooting accuracy according to the movement track information and position information of the target body and the shooting information.
Preferably, the monitoring unit includes:
an acquisition module, used to obtain movement track information and position information of the target body; and
a hit processing module, used to obtain shooting information of shots on the target body according to the movement track information and the position information.
Preferably, the target body includes a physical target body and/or a virtual target body.
Preferably, the physical target body uses a plate-shaped, ring-shaped or rectangular structure, or
the physical target body is a gas-filled balloon, or
the physical target body is a target body provided with a light source.
Preferably, the virtual target body is a virtual target body presented optically.
Preferably, the virtual target body is presented by means of holographic imaging.
Preferably, the target-launching unit includes:
a first receiving module, used to receive a control instruction;
a target storage compartment, used to store the physical target body;
an ejection portion, used to eject the physical target body out through the outlet of the target-launching unit;
a first detection module, used to detect position data of the target-launching unit; and
a first control module, connected respectively to the first receiving module, the ejection portion and the first detection module, and used to control the ejection portion to eject the physical target body according to the position data detected by the first detection module and the control instruction.
Preferably, the target-launching unit includes:
a second receiving module, used to receive a control instruction;
an emission portion, used to output light by means of holographic imaging so as to present the virtual target body;
a second detection module, used to detect position data of the target-launching unit, the position data including altitude data, wind speed data and coordinate data; and
a second control module, connected respectively to the second receiving module, the emission portion and the second detection module, and used to control the emission portion to form the image according to the position data detected by the second detection module and the control instruction.
The present invention also provides a flying target-launching and monitoring method, including the steps of:
the flying unit carrying the target-launching unit in flight;
the target-launching unit releasing, according to a received instruction, a target body for a user to aim at and shoot; and
monitoring shooting information as the user shoots at the target body.
Preferably, the process of monitoring the shooting information as the user shoots at the target body is:
obtaining movement track information and position information of the target body;
obtaining shooting information of the user with respect to the target body; and
calculating shooting accuracy according to the movement track information and position information of the target body and the shooting information.
Preferably, the process of monitoring the shooting information as the user shoots at the target body is:
obtaining movement track information and position information of the target body; and
obtaining shooting information of shots on the target body according to the movement track information and the position information.
Preferably, the target body includes a physical target body and/or a virtual target body;
the virtual target body is presented by means of holographic imaging.
Preferably, the process of calculating the shooting accuracy is:
compiling statistics on the shooting information and analyzing it, and calculating the shooting accuracy according to the shooting information and shooting deviation information; and
displaying the shooting information and the shooting accuracy.
Preferably, the process of obtaining the shooting information of shots on the target body according to the movement track information and the position information is:
obtaining the shooting information of shots on the physical target body according to the movement track, the position information and vibration information of the physical target body, or
obtaining the shooting information of shots on the virtual target body according to the image movement track and position feedback information of the virtual target body.
The beneficial effects of the above technical scheme are as follows:
the flying target-launching and monitoring device is mobile and easy to carry, which broadens its range of use, and the monitoring unit can also monitor the accuracy of the shooting; and
the flying target-launching and monitoring method can monitor the shooting process to obtain shooting information, improving the user experience.
Brief description of the drawings
Fig. 1 is a schematic diagram of the overall structure of an embodiment of a flying target-launching and monitoring device of the present invention.
Fig. 2 is a block diagram of an embodiment of a target-launching unit of the present invention.
Fig. 3 is a block diagram of another embodiment of the target-launching unit of the present invention.
Fig. 4 is a block diagram of an embodiment of a monitoring unit of the present invention.
Fig. 5 is a block diagram of another embodiment of the monitoring unit of the present invention.
Fig. 6 is a flow chart of an embodiment of a flying target-launching and monitoring method of the present invention.
Fig. 7 is a flow chart of an embodiment of monitoring a user shooting at the target body according to the present invention.
Fig. 8 is a flow chart of an embodiment in which the target-launching unit of the present invention releases the target body according to an instruction.
Fig. 9 is a flow chart of another embodiment in which the target-launching unit of the present invention releases the target body according to an instruction.
Fig. 10 is a flow chart of the flying unit of the present invention carrying the target-launching unit in flight.
Specific embodiments
The technical schemes in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Evidently, the described embodiments are only some, rather than all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art on the basis of the embodiments of the present invention without creative work fall within the scope of protection of the invention.
It should be noted that, where no conflict arises, the embodiments of the present invention and the features in the embodiments may be combined with one another.
The invention is further described below with reference to the accompanying drawings and specific embodiments, which are not to be taken as limiting the invention.
As shown in Fig. 1, a flying target-launching and monitoring device includes a flying unit 1, a target-launching unit 2 and a monitoring unit 3. The flying unit 1 carries the target-launching unit 2 in flight; the target-launching unit 2 releases, according to a received instruction, a target body for a user to aim at and shoot; and the monitoring unit 3 monitors shooting information as the user shoots at the target body.
In this embodiment, the device can launch targets in flight using the target-launching unit 2, so that mobile target launching meets the user's needs. The device is mobile and easy to carry, which broadens its range of use; it imposes no particular requirement on the user's firing point and so enlarges the user's range of shooting activity. At the same time, the monitoring unit 3 can monitor the accuracy of the shooting.
As one embodiment, the flying unit 1 may use a rotary-wing flight structure, a flapping-wing flight structure, a fixed-wing flight structure, an umbrella-wing (parafoil) flight structure, a jet-propelled flight structure, or an airship structure.
On the basis of the above technical scheme, the flying unit may further be a rotary-wing aircraft, a fixed-wing aircraft, or an aircraft combining fixed wings with rotors. If the flying unit uses a rotary-wing aircraft, it may be an aircraft including one or more propellers, such as a single-propeller helicopter, a quadrotor, a hexarotor or an octorotor. The flying unit may fly to the target area automatically according to pre-configured route data, or it may receive flight control data sent from a ground-side control device with a radio function, such as a remote controller, a smartphone, a tablet computer or a smart wearable device, and fly to the target area on the basis of that flight control data.
The aircraft may generally also include power components, such as a gimbal motor, propellers and propeller motors, and mechanical parts such as a cooling fan.
The flying unit may also include a flight control board for controlling flight. The flight control board includes a circuit board and functional modules arranged on the circuit board to realize predetermined functions; the functional modules may include a controller module, a processor module, a sensor module and the like. Further, the sensor module may include an inertial measurement module, a temperature sensor module, an altitude sensor module, a distance sensor module and the like. The inertial measurement module includes an acceleration sensor and a gyroscope, and can measure the flight attitude information of the unmanned aircraft.
In a preferred embodiment, the target body may include a physical target body and/or a virtual target body.
Further, by way of example and without limitation, the physical target body may use a plate-shaped, ring-shaped or rectangular structure; or it may be a gas-filled balloon; or it may be a target body provided with a light source, for example with LEDs arranged on the target body so that the user can shoot launched targets at night.
In a preferred embodiment, the virtual target body is a virtual target body presented optically.
Further, by way of example and without limitation, the virtual target body is presented by means of holographic imaging.
As shown in Fig. 4, in a preferred embodiment, the monitoring unit 3 includes:
a target body acquisition module 31, used to obtain movement track information and position information of the target body;
a shooting acquisition module 32, used to obtain shooting information of the user with respect to the target body; and
a processing module 33, used to calculate shooting accuracy according to the movement track information and position information of the target body and the shooting information.
Further, by way of example and without limitation, the shooting information may include firing point information, aiming attitude information, the shot trajectory and the like.
In this embodiment, the target body acquisition module 31 obtains the movement track information and position information of the target body, and the shooting acquisition module 32 monitors the user's firing operation to obtain the shooting information of the fired bullet or laser; from this information, the shooting information of shots on the target body is obtained accordingly.
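The patent does not specify how the processing module combines the target's track with the shot information; a minimal sketch, assuming the target position at the moment of fire, the firing point, and the shot direction are all available, might compute a miss distance per shot and a hit ratio as follows (function names, the hit radius and all data shapes are hypothetical illustrations, not from the patent):

```python
import math

def miss_distance(target_pos, muzzle_pos, shot_dir):
    """Perpendicular distance from the target to the shot's line of fire."""
    # Vector from the firing point to the target at the moment of fire
    to_target = [t - m for t, m in zip(target_pos, muzzle_pos)]
    norm = math.sqrt(sum(d * d for d in shot_dir))
    unit = [d / norm for d in shot_dir]
    # Project onto the shot direction to find the closest point on the line
    proj = sum(a * b for a, b in zip(to_target, unit))
    closest = [m + proj * u for m, u in zip(muzzle_pos, unit)]
    return math.dist(target_pos, closest)

def shooting_accuracy(shots, hit_radius=0.5):
    """shots: list of (target_pos, muzzle_pos, shot_dir) triples, one per shot.
    Returns (hit_ratio, mean_miss_distance); a shot counts as a hit when its
    miss distance is within hit_radius (an illustrative threshold)."""
    if not shots:
        return 0.0, 0.0
    misses = [miss_distance(t, m, d) for t, m, d in shots]
    hits = sum(1 for d in misses if d <= hit_radius)
    return hits / len(shots), sum(misses) / len(misses)
```

The mean miss distance stands in for the "shooting deviation information" mentioned later in the description.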
As shown in Fig. 5, in a preferred embodiment, the monitoring unit 3 includes:
an acquisition module 35, used to obtain movement track information and position information of the target body; and
a hit processing module 36, used to obtain shooting information of shots on the target body according to the movement track information and the position information.
In this embodiment, the hit processing module 36 mainly obtains for the user the shooting information of target bodies that are hit, judging from the movement track and position information of the target body which target bodies were hit.
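The patent leaves open how a hit is inferred from the track alone. One plausible heuristic, offered purely as an illustration, is to flag an abrupt velocity change along the target's trajectory, since a hit clay target shatters or deflects sharply (the threshold value is invented):

```python
def detect_hit(track, threshold=3.0):
    """track: consecutive (vx, vy, vz) velocity samples of the target body.
    Returns True when the velocity jumps by more than `threshold` (same units
    as the samples, e.g. m/s) between two samples - a hypothetical heuristic."""
    for prev, cur in zip(track, track[1:]):
        delta = sum((c - p) ** 2 for c, p in zip(cur, prev)) ** 0.5
        if delta > threshold:
            return True
    return False
```

In practice the vibration information mentioned later in the description could be fused with this trajectory cue to reduce false positives.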
As shown in Fig. 2, in a preferred embodiment, the target-launching unit 2 includes:
a first receiving module 21, used to receive a control instruction;
a target storage compartment 22, used to store physical target bodies;
an ejection portion 23, used to eject the physical target body out through the outlet 210 of the target-launching unit 2;
a first detection module 24, used to detect position data of the target-launching unit 2; and
a first control module 25, connected respectively to the first receiving module 21, the ejection portion 23 and the first detection module 24, and used to control the ejection portion 23 to eject the physical target body according to the position data detected by the first detection module 24 and the control instruction.
Further, by way of example and without limitation, the position data may include altitude data, wind speed data and coordinate data.
In this embodiment, target bodies are stored in the target storage compartment 22; the first detection module 24 detects the altitude data, wind speed data and coordinate data of the target-launching unit 2; and the first control module 25 controls the ejection portion 23 to eject a physical target body from the storage compartment 22 out through the outlet 210 of the target-launching unit 2, thereby releasing a target body for the user to aim at and shoot.
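The control logic above can be sketched as a gating decision: eject only when the detected altitude, wind speed and coordinates fall inside a safe launch envelope. The thresholds and the rectangular zone below are illustrative assumptions, not values from the patent:

```python
def should_eject(altitude_m, wind_speed_ms, pos, launch_zone,
                 min_alt=15.0, max_wind=8.0):
    """Decide whether the ejection portion may release a physical target body.
    pos: (x, y) coordinates of the target-launching unit.
    launch_zone: ((x_min, x_max), (y_min, y_max)) permitted launch area.
    min_alt / max_wind are hypothetical safety limits."""
    x, y = pos
    (x0, x1), (y0, y1) = launch_zone
    in_zone = x0 <= x <= x1 and y0 <= y <= y1
    return altitude_m >= min_alt and wind_speed_ms <= max_wind and in_zone
```

A first control module along these lines would hold the ejection command until all three conditions are satisfied, even after the control instruction arrives.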
As shown in Fig. 3, in a preferred embodiment, the target-launching unit 2 includes:
a second receiving module 26, used to receive a control instruction;
an emission portion 27, used to output light by means of holographic imaging so as to present the virtual target body;
a second detection module 28, used to detect position data of the target-launching unit 2, the position data including altitude data, wind speed data and coordinate data; and
a second control module 29, connected respectively to the second receiving module 26, the emission portion 27 and the second detection module 28, and used to control the emission portion 27 to form the image according to the position data detected by the second detection module 28 and the control instruction.
In this embodiment, the second detection module 28 detects the altitude data, wind speed data and coordinate data of the target-launching unit 2, and the second control module 29 controls the emission portion 27 to output light by means of holographic imaging so as to present the virtual target body, thereby releasing a target body for the user to aim at and shoot.
In a preferred embodiment, the processing module 33 analyzes the shooting information and calculates the shooting accuracy according to the shooting information and shooting deviation information.
In this embodiment, by way of example and without limitation, the processing module 33 may compile statistics on the shooting information and calculate the shooting accuracy from the scoring rings or the number of hits of all previous shots, together with the shooting deviation information.
In a preferred embodiment, the monitoring unit 3 also includes:
a display module 34, used to display the shooting information and the shooting accuracy.
In this embodiment, by way of example and without limitation, the display module 34 may display the shooting information and the calculated shooting accuracy, so that the user obtains the shooting information intuitively.
In a preferred embodiment, the hit processing module 36 is used to obtain the shooting information of shots on the physical target body according to the movement track, position information and vibration information of the physical target body, or
the hit processing module 36 is used to obtain the shooting information of shots on the virtual target body according to the image movement track and position feedback information of the virtual target body.
In a preferred embodiment, the flying unit 1 includes:
a positioning module, used to determine the position information of the flying unit 1;
a third receiving module, used to receive a flight instruction; and
a flight module, connected respectively to the positioning module and the third receiving module, and used to fly according to the flight instruction and the position information, or to fly to a target location according to the flight instruction and the position information.
In this embodiment, by way of example and without limitation, the positioning module determines the position information of the flying unit 1, and the flight module performs the corresponding flight operation according to the flight instruction received by the third receiving module. The target-launching unit 2 may release target bodies while the flying unit 1 is in flight; it may also release target bodies while the flying unit 1 is stationary.
In a preferred embodiment, the device also includes an image recognition trigger unit, which includes:
a camera module, used to capture images containing the user;
a recognition module, used to recognize the user's demand state; and
a demand determination module, used to judge the demand to release a target body according to the captured image and the demand state.
In this embodiment, by way of example and without limitation, image recognition can be used as the technical means to trigger the flying unit 1 to fly: when the user's demand state recognized by the recognition module indicates a demand to release a target body, the flying unit 1 carrying the target-launching unit 2 is triggered to take off from its initial position and fly to the target location.
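The demand determination step can be sketched as a small decision stub sitting downstream of the recognition module. The gesture vocabulary here ("ready", "wave_off") is entirely invented for illustration; the patent does not define any demand states:

```python
def needs_target_release(user_detected, gesture):
    """Judge the release demand from the image recognition results.
    user_detected: whether the captured image contains the user at all.
    gesture: the recognized demand state (hypothetical labels)."""
    return user_detected and gesture == "ready"
```

The real recognition module would produce `user_detected` and `gesture` from the camera frames; only the final judgment is sketched here.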
In a preferred embodiment, the device also includes a voice recognition trigger unit, which includes:
a collection module, used to collect the user's voice signal;
a recognition module, used to recognize the semantic content of the voice signal; and
a demand determination module, used to judge the demand to release a target body according to the semantic content.
In this embodiment, by way of example and without limitation, voice signal recognition can be used as the technical means to trigger the flying unit 1 to fly: when the semantic content recognized by the recognition module is a demand to release a target body, the flying unit 1 carrying the target-launching unit 2 is triggered to take off from its initial position and fly to the target location.
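After speech-to-text, the semantic judgment can be as simple as keyword matching on the transcript. The keyword list below is an illustrative assumption ("pull" being the traditional trap-shooting call), not part of the patent:

```python
def voice_requests_release(transcript, keywords=("pull", "release target")):
    """Judge the release demand from the recognized semantic content.
    transcript: text produced by the speech recognition module.
    keywords: hypothetical phrases that signal a release demand."""
    text = transcript.lower()
    return any(k in text for k in keywords)
```

A production recognizer would use a proper intent classifier, but keyword spotting is enough to show where the demand determination module fits in the pipeline.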
As shown in Fig. 6, a flying target-launching and monitoring method includes the steps of:
S1. the flying unit carrying the target-launching unit in flight;
S2. the target-launching unit releasing, according to a received instruction, a target body for a user to aim at and shoot; and
S3. monitoring shooting information as the user shoots at the target body.
In this embodiment, the method can, during flight, control the target-launching unit to release a target body for aimed shooting according to a received instruction, and monitor the shooting process to obtain shooting information, improving the user experience.
As shown in Fig. 7, in a preferred embodiment, the process of monitoring the shooting information as the user shoots at the target body is:
S31. obtaining movement track information and position information of the target body;
S32. obtaining shooting information of the user with respect to the target body; and
S33. calculating shooting accuracy according to the movement track information and position information of the target body and the shooting information.
In this embodiment, by obtaining the movement track information and position information of the target body and monitoring the user's firing operation, the shooting information of the fired bullet or laser is obtained, and the shooting information of shots on the target body is obtained accordingly.
In a preferred embodiment, the process of monitoring the shooting information as the user shoots at the target body is:
obtaining movement track information and position information of the target body; and
obtaining shooting information of shots on the target body according to the movement track information and the position information.
In this embodiment, the shooting information of target bodies that are hit is mainly obtained for the user, judged from the movement track and position information of the target body.
In a preferred embodiment, the target body includes a physical target body and/or a virtual target body;
the virtual target body is presented by means of holographic imaging.
In a preferred embodiment, the process of calculating the shooting accuracy is:
compiling statistics on the shooting information and analyzing it, and calculating the shooting accuracy according to the shooting information and shooting deviation information; and
displaying the shooting information and the shooting accuracy.
In this embodiment, by way of example and without limitation, statistics can be compiled on the shooting information, and the shooting accuracy calculated from the scoring rings or the number of hits of all previous shots, together with the shooting deviation information; the shooting information and the calculated accuracy are displayed so that the user obtains the shooting information intuitively.
In a preferred embodiment, the process of obtaining the shooting information of shots on the target body according to the movement track information and the position information is:
obtaining the shooting information of shots on the physical target body according to the movement track, position information and vibration information of the physical target body, or
obtaining the shooting information of shots on the virtual target body according to the image movement track and position feedback information of the virtual target body.
As shown in Fig. 8, in a preferred embodiment, the process by which the target-launching unit releases the target body according to an instruction is:
A1. receiving a control instruction;
A2. detecting position data of the target-launching unit; and
A3. ejecting the physical target body out through the outlet of the target-launching unit according to the detected position data and the control instruction.
Further, the position data includes altitude data, wind speed data and coordinate data.
In this embodiment, by detecting the altitude data, wind speed data and coordinate data of the target-launching unit, the physical target body is ejected out through the outlet of the target-launching unit, thereby releasing a target body for the user to aim at and shoot.
As shown in Fig. 9, in a preferred embodiment, the process by which the target-launching unit releases the target body according to an instruction is:
B1. receiving a control instruction;
B2. detecting position data of the target-launching unit, the position data including altitude data, wind speed data and coordinate data; and
B3. outputting light by means of holographic imaging according to the detected position data and the control instruction, so as to present the virtual target body.
In this embodiment, by detecting the altitude data, wind speed data and coordinate data of the target-launching unit, light is output by means of holographic imaging to present the virtual target body, thereby releasing a target body for the user to aim at and shoot.
As shown in Fig. 10, in a preferred embodiment, the process of the flying unit carrying the target-launching unit in flight is:
C1. receiving a flight instruction;
C2. determining the position information of the flying unit; and
C3. flying according to the flight instruction and the position information, or flying to a target location according to the flight instruction and the position information.
In this embodiment, the position information of the flying unit is determined, and the flying unit performs the corresponding flight operation according to the received flight instruction. The target-launching unit may release target bodies while the flying unit is in flight; it may also release target bodies while the flying unit is stationary.
In a preferred embodiment, before the target-launching unit releases the target body according to the instruction, the method also includes:
capturing an image containing the user;
recognizing the user's demand state; and
judging the demand to release a target body according to the captured image and the demand state.
In this embodiment, image recognition can be used as the technical means to trigger the flying unit to fly: when the recognized demand state of the user indicates a demand to release a target body, the flying unit carrying the target-launching unit is triggered to take off from its initial position and fly to the target location.
In a preferred embodiment, before the target-launching unit releases the target body according to the instruction, the method also includes:
collecting the user's voice signal;
recognizing the semantic content of the voice signal; and
judging the demand to release a target body according to the semantic content.
In this embodiment, voice signal recognition can be used as the technical means to trigger the flying unit to fly: when the recognized semantic content is a demand to release a target body, the flying unit carrying the target-launching unit is triggered to take off from its initial position and fly to the target location.
In the several embodiments provided by the present invention, it should be understood that the disclosed devices and methods may be realized in other ways. For example, the device embodiments described above are only schematic; the division into modules or units is only a division by logical function, and other divisions are possible in actual realization. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
Units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to realize the purpose of the scheme of an embodiment.
In addition, the functional units or modules in each embodiment of the invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated unit may be realized in the form of hardware, or in the form of a software functional unit.
The foregoing is only embodiments of the present invention and does not thereby limit the scope of the claims of the invention. Any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of the present invention, whether used directly or indirectly in other related technical fields, is likewise included within the scope of protection of the present invention.
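For illustration only, the shooting-accuracy computation described in the foregoing (combining the target body's motion track information, position information, and shooting information) can be sketched as below. The hit-radius criterion and all names are assumptions for this sketch, not the patented method.

```python
# Illustrative accuracy sketch: a shot counts as a hit if its aim point
# lies within hit_radius of the target body's position at the shot time.

import math

def shooting_accuracy(track, shots, hit_radius=0.5):
    """track: {t: (x, y, z)} target-body positions over time;
    shots: list of (t, (x, y, z)) aim points; returns hits / total shots."""
    hits = 0
    for t, aim in shots:
        target = track.get(t)
        if target is None:
            continue
        deviation = math.dist(aim, target)  # shooting deviation information
        if deviation <= hit_radius:
            hits += 1
    return hits / len(shots) if shots else 0.0

track = {0: (0, 0, 10), 1: (1, 0, 9), 2: (2, 0, 8)}
shots = [(0, (0.1, 0, 10)), (1, (3, 0, 9)), (2, (2, 0.2, 8))]
print(shooting_accuracy(track, shots))  # ~0.667 (2 hits out of 3 shots)
```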
Claims (16)
1. A flight-type target-throwing monitoring device, characterised in that the device comprises: a flying unit, a target-throwing unit, and a monitoring unit; the flying unit carries the target-throwing unit in flight; the target-throwing unit is used to release, according to a received instruction, a target body to be aimed at for shooting; and the monitoring unit is used to monitor shooting information of a user shooting the target body.
2. The flight-type target-throwing monitoring device according to claim 1, characterised in that: the flying unit uses a rotary-wing flight structure, or
the flying unit uses a flapping-wing flight structure, or
the flying unit uses a fixed-wing flight structure, or
the flying unit uses an umbrella-wing flight structure, or
the flying unit uses a jet-propelled flight structure, or
the flying unit uses an airship structure.
3. The flight-type target-throwing monitoring device according to claim 1, characterised in that the monitoring unit comprises:
a target-body acquisition module, used to obtain motion track information and position information of the target body;
a shooting acquisition module, used to obtain shooting information of the user on the target body;
a processing module, used to calculate a shooting accuracy according to the motion track information of the target body, the position information, and the shooting information.
4. The flight-type target-throwing monitoring device according to claim 1, characterised in that the monitoring unit comprises:
an acquisition module, used to obtain motion track information and position information of the target body;
a processing module, used to obtain the shooting information of shooting the target body according to the motion track information and the position information.
5. The flight-type target-throwing monitoring device according to claim 1 or 3, characterised in that: the target body comprises a physical target body and/or a virtual target body.
6. The flight-type target-throwing monitoring device according to claim 5, characterised in that: the physical target body uses a plate-shaped structure, a ring-shaped structure, or a rectangular structure, or
the physical target body is a balloon filled with gas, or
the physical target body is a target body provided with a light source.
7. The flight-type target-throwing monitoring device according to claim 5, characterised in that: the virtual target body is a virtual target body presented by optical means.
8. The flight-type target-throwing monitoring device according to claim 7, characterised in that: the virtual target body is presented using a holographic imaging mode.
9. The flight-type target-throwing monitoring device according to claim 5, characterised in that the target-throwing unit comprises:
a first receiver module, used to receive a control instruction;
a target storage cabin, used to store the physical target body;
an ejection portion, used to eject the physical target body out of an outlet of the target-throwing unit;
a first detection module, used to detect position data of the target-throwing unit;
a first control module, respectively connected to the first receiver module, the ejection portion, and the first detection module, and used to control the ejection portion to eject the physical target body according to the position data detected by the first detection module and the control instruction.
10. The flight-type target-throwing monitoring device according to claim 8, characterised in that the target-throwing unit comprises:
a second receiver module, used to receive a control instruction;
an emission portion, used to output light in a holographic imaging mode so that the virtual target body is presented;
a second detection module, used to detect position data of the target-throwing unit, the position data including altitude data, wind speed data, and coordinate data;
a second control module, respectively connected to the second receiver module, the emission portion, and the second detection module, and used to control the emission portion to perform imaging according to the position data detected by the second detection module and the control instruction.
11. A flight-type target-throwing monitoring method, characterised by comprising the steps of:
a flying unit carrying a target-throwing unit in flight;
the target-throwing unit releasing, according to a received instruction, a target body to be aimed at for shooting;
monitoring shooting information of a user shooting the target body.
12. The flight-type target-throwing monitoring method according to claim 11, characterised in that the process of monitoring the shooting information of the user shooting the target body is:
obtaining motion track information and position information of the target body;
obtaining shooting information of the user on the target body;
calculating a shooting accuracy according to the motion track information of the target body, the position information, and the shooting information.
13. The flight-type target-throwing monitoring method according to claim 12, characterised in that the process of monitoring the shooting information of the user shooting the target body is:
obtaining motion track information and position information of the target body;
obtaining the shooting information of shooting the target body according to the motion track information and the position information.
14. The flight-type target-throwing monitoring method according to claim 11 or 12, characterised in that: the target body comprises a physical target body and/or a virtual target body;
the virtual target body is presented using a holographic imaging mode.
15. The flight-type target-throwing monitoring method according to claim 11, characterised in that the process of calculating the shooting accuracy is:
counting and analyzing the shooting information, and calculating the shooting accuracy according to the shooting information and shooting deviation information;
displaying the shooting information and the shooting accuracy.
16. The flight-type target-throwing monitoring method according to claim 12, characterised in that the process of obtaining the shooting information of shooting the target body according to the motion track information and the position information is:
obtaining the shooting information of shooting the physical target body according to the motion track, the position information, and vibration information of the physical target body, or
obtaining the shooting information of shooting the virtual target body according to image-information motion track and position feedback information of the virtual target body.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710001497.2A CN106767179B (en) | 2017-01-03 | 2017-01-03 | Flight formula throws target monitoring device and monitoring method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106767179A true CN106767179A (en) | 2017-05-31 |
CN106767179B CN106767179B (en) | 2018-09-11 |
Family
ID=58952922
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710001497.2A Expired - Fee Related CN106767179B (en) | 2017-01-03 | 2017-01-03 | Flight formula throws target monitoring device and monitoring method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106767179B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6269726B1 (en) * | 1997-12-16 | 2001-08-07 | Barnet Resnick | Multi-shot, non-lethal, taser cartridge remote firing system for protection of facilities and vehicles against personnel |
WO2007015282A1 (en) * | 2005-08-03 | 2007-02-08 | D Adamo Mario | Equipment for target shooting without using live targets |
GB2439744A (en) * | 2006-07-04 | 2008-01-09 | Christopher Bee | Shot pattern and target display |
US20160223298A1 (en) * | 2015-02-03 | 2016-08-04 | Lumir J. Drahota | Method of and System for Capturing and Reporting Scores |
CN206399295U (en) * | 2017-01-03 | 2017-08-11 | 上海量明科技发展有限公司 | Flight formula throws target monitoring device |
Also Published As
Publication number | Publication date |
---|---|
CN106767179B (en) | 2018-09-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102233490B1 (en) | System having 3dimensions target for ball sports | |
CN106628180A (en) | Flight airdropping device and airdropping method thereof | |
CN109443097A (en) | A kind of acquisition equipment and catching method for rotor wing unmanned aerial vehicle | |
CN109579620A (en) | A kind of unmanned plane flexibility capture systems and application method | |
CN110114631A (en) | Utilize the gunnery system and method for unmanned plane | |
CN105597308A (en) | Unmanned plane, simulative air combat gaming device and simulative air combat gaming system | |
CN109562830A (en) | The method of unmanned flight's object and control unmanned flight's object | |
CN107339914B (en) | A kind of anti-UAV system based on sound wave | |
CN102530255A (en) | Accurate parachute landing device for traction type unmanned plane and method | |
CN110465036A (en) | Controllable guidance fire extinguisher bomb, fire extinguishing system and extinguishing method | |
CN106741888B (en) | Bionic unmanned reconnaissance helicopter | |
CN206399295U (en) | Flight formula throws target monitoring device | |
KR20170104812A (en) | Drone collecting and furnishing image data for bomb damage assessment and air-to-ground weaponry system including the same | |
CN103599631A (en) | Flying saucer simulation training system and method based on machine vision | |
CN107403481A (en) | Information interaction system and information collecting device for unmanned vehicle | |
US20220114906A1 (en) | Weapon targeting training system and method therefor | |
CN106767179B (en) | Flight formula throws target monitoring device and monitoring method | |
CN206378060U (en) | Flight formula throwing device and system | |
CN110009960A (en) | Virtual implementing helmet formula weaponry simulated training method | |
CN207832032U (en) | Four duct electric power rockets and its emitter | |
CN210922352U (en) | Unmanned aerial vehicle intercepting device based on catch net battle array | |
RU2709562C1 (en) | Drone control method and system for its implementation | |
CN106725360A (en) | Flight formula temperature-detecting device, detection method and system | |
CN106767180A (en) | Flight formula throwing device, throwing Target process and system | |
CN103471472A (en) | Aerial anti-terrorist unit for firing mini-rocket to propel special ammunition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 20180911 |