CN106959051A - Automated calibration method for a weapon aiming system based on spatial perception positioning technology - Google Patents
- Publication number
- CN106959051A (application CN201710200401.5A; granted as CN106959051B)
- Authority
- CN
- China
- Prior art keywords
- glasses
- sight
- weapon
- aiming system
- spatial perception
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G1/00—Sighting devices
- F41G1/54—Devices for testing or checking; Tools for adjustment of sights
Abstract
The present invention provides an automated calibration method for a weapon aiming system based on spatial perception positioning technology, comprising the following steps: providing a weapon aiming system that includes glasses and a sight, with communication between the glasses and the sight established through a wired or wireless connection; completing spatial perception positioning with the glasses, which are worn while walking for a period of time so that a 3D map of the surrounding environment is incrementally reconstructed; parsing, by the main control unit of the glasses, the images sent by the sight, deriving the sight's viewing angle from the images, and then estimating the sight's approximate position from image depth information; and, during subsequent movement, keeping real-time track of the sight's position, combining information from other sensors, and reducing measurement error and improving positioning accuracy through repeated iterative calculation. Compared with the prior art, the present invention completes calibration automatically and maintains calibration in real time.
Description
Technical field
The invention belongs to the field of automatic calibration, and in particular relates to an automated calibration method for a weapon aiming system based on spatial perception positioning technology.
Background art
An existing weapon aiming system generally comprises glasses and a sight, which can communicate with each other through a wired or wireless connection. To ensure that the weapon aiming system works properly, the glasses and the sight must pair quickly and remain paired without interruption. Before use, the glasses and the sight must first be calibrated quickly. The purposes of calibration are:
a) to calibrate the initial values of the sensors and the relative positional relationships between the sensors, so that spatial perception positioning can be carried out;
b) to make the aiming point of the sight as seen on the glasses consistent with the actual aiming point of the sight, so as to ensure effective firing accuracy.
The prior art has the following shortcomings: joint calibration generally has to be performed manually, automation is lacking, and errors accumulate, so that after errors have accumulated for a period of time, recalibration becomes necessary.
Summary of the invention
The object of the present invention is to provide an automated calibration method for a weapon aiming system based on spatial perception positioning technology.
The present invention adopts the following technical scheme: an automated calibration method for a weapon aiming system based on spatial perception positioning technology, characterized by comprising the following steps:
S1: providing a weapon aiming system, the weapon aiming system comprising glasses and a sight, with communication between the glasses and the sight established through a wired or wireless connection;
S2: completing spatial perception positioning with the glasses: the glasses are worn while walking for a period of time, and a 3D map of the surrounding environment is incrementally reconstructed;
S3: the sight collects image information, depth information and attitude information and sends them to the glasses through a communication module;
S4: the main control unit of the glasses parses the images sent by the sight and derives the sight's viewing angle from the images;
S5: the main control unit of the glasses estimates the position of the sight from the image depth information;
S6: the aiming point or a virtual trajectory ray of the sight is projected or displayed on the glasses;
S7: during subsequent movement, the glasses keep real-time track of the sight's position and, combining information from other sensors, reduce measurement error and improve positioning accuracy through repeated iterative calculation.
Preferably, the glasses and the sight are each calibrated individually according to industry standards before leaving the factory.
Further, what is calculated in S5 is the position and attitude of the sight's axis in the three-dimensional space constructed by the glasses; the projection of the extended line of the axis onto the imaging plane corresponding to the glasses' viewpoint is the current aiming point, and this point is displayed on the glasses as a red crosshair.
Compared with the prior art, the error of the present invention not only does not accumulate but gradually shrinks as the number of samples increases, and the calibration is kept up to date in real time. The calibration method of the present invention does not use conventional calibration techniques; instead, it breaks new ground by completing calibration by means of spatial perception positioning technology. This is a new, forward-looking calibration method that is particularly suitable for situations with no prior information that require autonomous positioning and self-calibration.
Brief description of the drawings
Fig. 1 is the three-dimensional space constructed by the glasses.
Fig. 2 is the projection of Fig. 1 onto the image from the glasses' viewpoint.
Detailed description of the embodiments
The present invention is further explained below with specific embodiments in conjunction with the accompanying drawings.
The present invention provides an automated calibration method for a weapon aiming system based on spatial perception positioning technology, which comprises the following steps:
S1: providing a weapon aiming system, the weapon aiming system comprising glasses and a sight, with communication between the glasses and the sight established through a wired or wireless connection;
S2: completing spatial perception positioning with the glasses: the glasses are worn while walking for a period of time, and a 3D map of the surrounding environment is incrementally reconstructed;
S3: the sight collects image information, depth information and attitude information and sends them to the glasses through a communication module;
S4: the main control unit of the glasses parses the images sent by the sight and derives the sight's viewing angle from the images;
S5: the main control unit of the glasses estimates the position of the sight from the image depth information;
S6: the aiming point or a virtual trajectory ray of the sight is projected or displayed on the glasses;
S7: during subsequent movement, the glasses keep real-time track of the sight's position and, combining information from other sensors, reduce measurement error and improve positioning accuracy through repeated iterative calculation.
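As a toy illustration of S5 (not the patent's actual algorithm), suppose the glasses' 3D map already contains a landmark that is visible in the sight's image, the sight's viewing direction has been derived in S4, and the sight reports the measured depth to that landmark; the sight's position then follows by stepping back from the landmark along the viewing direction. All names below are our own.

```python
import numpy as np

def estimate_sight_position(landmark_xyz, view_dir, depth):
    """S5 (toy version): the sight lies `depth` metres behind a mapped
    landmark along the sight's (unit-normalised) viewing direction."""
    d = np.asarray(view_dir, float)
    d = d / np.linalg.norm(d)  # normalise the viewing direction
    return np.asarray(landmark_xyz, float) - depth * d

# Landmark 10 m away, viewed straight along +Z: the sight is at the origin.
pos = estimate_sight_position([0.0, 0.0, 10.0], [0.0, 0.0, 1.0], 10.0)
print(pos)  # [0. 0. 0.]
```

In practice a single landmark fixes only a point on a ray; the repeated observations of S7 are what make the estimate unambiguous and increasingly precise.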
A weapon aiming system needs to know its own position and attitude in the real world (positioning), the precise position and direction of the observer's viewpoint, and the three-dimensional structure of the real world (map construction). It can then:
- for simulated-training or gaming applications, place virtual objects at the correct positions in three-dimensional space and compute which image should be displayed where;
- for military combat applications, compute the attitude of the sight's axis in three-dimensional space and its projection onto the image from the glasses' viewpoint (as shown in Figs. 1 and 2, where Fig. 1 is the three-dimensional space constructed by the glasses and Fig. 2 is the projection onto the image from the glasses' viewpoint).
Spatial perception positioning technology fuses various sensors (for example: a laser radar providing depth information, an optical camera providing image information, a depth camera providing depth information, and inertial sensors providing attitude information) and, combined with SLAM techniques from computer vision and robotics, enables a device to start moving from an unknown position in an unknown environment, compute its own precise position in three-dimensional space from location estimates and environmental information gathered while moving, and at the same time incrementally reconstruct a three-dimensional map of the surroundings in real time on the basis of its own positioning. The spatial perception process must run in real time, so that the wearer of the device perceives no delay. Moreover, as externally gathered information keeps increasing, positioning accuracy is further improved after repeated iterative calculation.
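The claim that errors shrink rather than accumulate as samples grow can be illustrated with the simplest possible refinement rule: an incremental mean over repeated noisy measurements of a fixed position. This is a stand-in of our own choosing, since the patent does not specify the iterative calculation.

```python
import random

def running_mean(estimates):
    """Fuse repeated noisy measurements of a fixed quantity with an
    incremental mean; the estimation error shrinks as samples grow
    (roughly as 1/sqrt(N)), so it does not accumulate over time."""
    est, n = 0.0, 0
    for z in estimates:
        n += 1
        est += (z - est) / n  # incremental mean update
    return est

print(running_mean([1.0, 2.0, 3.0]))  # 2.0

# Simulated position readings around a true value of 5.0 m.
random.seed(42)
noisy = [5.0 + random.gauss(0.0, 0.3) for _ in range(2000)]
refined = running_mean(noisy)  # converges toward 5.0 as N grows
```

A production system would use a recursive filter (e.g. a Kalman filter) rather than a plain mean, but the variance-reduction effect with growing sample count is the same.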
In general, spatial perception positioning technology involves at least the following three steps:
- parsing and tracking object surfaces;
- SLAM three-dimensional reconstruction and self-positioning;
- sensor fusion to improve SLAM accuracy.
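The third step, sensor fusion, can be illustrated by inverse-variance weighting, a textbook rule for combining two independent estimates of the same quantity (for example, depth from a depth camera and from a laser radar): the fused variance is smaller than either input variance, which is how fusion improves SLAM accuracy. The function below is an illustrative sketch, not the patent's method.

```python
def fuse_estimates(x1, var1, x2, var2):
    """Inverse-variance weighted fusion of two independent estimates.
    Returns the fused value and its (strictly smaller) variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)  # precision-weighted average
    var = 1.0 / (w1 + w2)                # fused uncertainty
    return x, var

# Two depth readings of the same surface with equal uncertainty:
# the fusion is a plain average and the variance is halved.
x, var = fuse_estimates(10.2, 0.04, 9.8, 0.04)
```

With equal input variances the result is x ≈ 10.0 with var ≈ 0.02; with unequal variances the more certain sensor dominates the fused value.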
Using the calibration method of the present invention, calibration is completed automatically and maintained in real time.
The foregoing describes only preferred embodiments of the present invention; all equivalent changes and modifications made within the scope of the patent claims of the present invention shall fall within the scope of the present invention.
Claims (3)
1. An automated calibration method for a weapon aiming system based on spatial perception positioning technology, characterized by comprising the following steps:
S1: providing a weapon aiming system, the weapon aiming system comprising glasses and a sight, with communication between the glasses and the sight established through a wired or wireless connection;
S2: completing spatial perception positioning with the glasses: the glasses are worn while walking for a period of time, and a 3D map of the surrounding environment is incrementally reconstructed;
S3: the sight collecting image information, depth information and attitude information and sending them to the glasses through a communication module;
S4: the main control unit of the glasses parsing the images sent by the sight and deriving the sight's viewing angle from the images;
S5: the main control unit of the glasses estimating the position of the sight from the image depth information;
S6: projecting or displaying the aiming point or a virtual trajectory ray of the sight on the glasses;
S7: during subsequent movement, the glasses keeping real-time track of the sight's position and, combining information from other sensors, reducing measurement error and improving positioning accuracy through repeated iterative calculation.
2. The automated calibration method for a weapon aiming system based on spatial perception positioning technology according to claim 1, characterized in that the glasses and the sight are each calibrated individually according to industry standards before leaving the factory.
3. The automated calibration method for a weapon aiming system based on spatial perception positioning technology according to claim 1, characterized in that what is calculated in S5 is the position and attitude of the sight's axis in the three-dimensional space constructed by the glasses; the projection of the extended line of the axis onto the imaging plane corresponding to the glasses' viewpoint is the current aiming point, and this point is displayed on the glasses as a red crosshair.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710200401.5A CN106959051B | 2017-03-30 | 2017-03-30 | Automated calibration method for a weapon aiming system based on spatial perception positioning technology |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106959051A | 2017-07-18 |
CN106959051B | 2018-09-25 |
Family
ID=59471592
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710200401.5A Expired - Fee Related CN106959051B | 2017-03-30 | 2017-03-30 | Automated calibration method for a weapon aiming system based on spatial perception positioning technology |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106959051B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107367831A | 2017-09-04 | 2017-11-21 | Shenzhen Gongfenxiang Network Technology Co., Ltd. | A virtual display device |
CN110275181A | 2019-07-08 | 2019-09-24 | Wuhan Zhonghaiting Data Technology Co., Ltd. | A vehicle-mounted mobile measurement system and data processing method therefor |
CN111609760A | 2020-06-01 | 2020-09-01 | Zhongguang Zhikong (Beijing) Technology Co., Ltd. | Method and system for determining the shooting opportunity of an intelligent sighting telescope |
CN114577064A | 2021-12-31 | 2022-06-03 | Harwar International Aviation Technology (Shenzhen) Co., Ltd. | Sighting device calibration method and unmanned aerial vehicle |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103282089A | 2010-11-17 | 2013-09-04 | Sony Computer Entertainment Inc. | Smart shell to a game controller |
US20140075820A1 | 2011-04-03 | 2014-03-20 | Ipu Industries Ltd | Firearm gun-sight |
CN104457418A | 2014-10-27 | 2015-03-25 | Li Changfu | Sniper rifle with camera recording and intelligent communication functions |
CN104699247A | 2015-03-18 | 2015-06-10 | Beijing 7invensun Information Technology Co., Ltd. | Virtual reality interactive system and method based on machine vision |
CN104801039A | 2015-04-30 | 2015-07-29 | Zhejiang Gongshang University | Virtual reality gaming device and scene realization method |
CN105988232A | 2015-02-11 | 2016-10-05 | Guizhou Jinghao Technology Co., Ltd. | Electronic collimation device with wearable display device |
Also Published As
Publication number | Publication date |
---|---|
CN106959051B | 2018-09-25 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 20180925 |