CN107607939B - Optical target tracking and positioning radar device based on real map and image


Info

Publication number
CN107607939B
Authority
CN
China
Prior art keywords
module
target
coordinate
real
algorithm
Prior art date
Legal status
Active
Application number
CN201710813571.0A
Other languages
Chinese (zh)
Other versions
CN107607939A (en)
Inventor
晋建志
何伍斌
许凯华
冯毓伟
张娆
柳黎
范为广
洪标
解铮
Current Assignee
Jiangsu Such As Earth Space Mdt Infotech Ltd
Original Assignee
Jiangsu Such As Earth Space Mdt Infotech Ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Such As Earth Space Mdt Infotech Ltd
Priority to CN201710813571.0A
Publication of CN107607939A
Application granted
Publication of CN107607939B


Abstract

The invention discloses an optical target tracking and positioning radar device based on a real map and an image. The device comprises an information acquisition module and a data center. The information acquisition module comprises an optical sensing module, an omnidirectional motor module and an unmanned aerial vehicle module; the data center comprises a target identification module, a coordinate calculation module, a projection transformation module, an image fusion module and a target tracking module. The optical sensing module, the omnidirectional motor module and the unmanned aerial vehicle module are connected with the data center through a communication network. The device identifies and tracks a target, obtains the accurate radar coordinate of the tracked target in real time through the real map and live-action images, visualizes the real environment around the tracked target so that it can be observed directly, and warns relevant personnel to support their decision making.

Description

Optical target tracking and positioning radar device based on real map and image
Technical Field
The invention relates to the field of target tracking radars, and in particular to an optical target tracking and positioning radar device based on a real map and an image. It can be applied in military, economic, social and other fields, and has good application prospects in traffic monitoring, public security control, water and road administration law enforcement, forest resource inventory, geological survey and the like.
Background
With the continuous development of science and technology, target tracking radars are widely applied in military, economic, social and other fields. The main functions of a conventional tracking radar are to measure target coordinates and tracks in real time and to predict the future positions of targets. However, such a radar can only indicate approximately where a target may appear: it cannot identify and track a specific target, cannot provide accurate position information for the target, and cannot reveal the real environment around the target's position. For many industries this introduces errors into decision making.
Disclosure of Invention
The invention aims to provide an optical target tracking and positioning radar device based on a real map and an image that overcomes the defects of the prior art. The device can identify and track a target, obtain the accurate radar coordinate of the tracked target in real time through the real map and live-action images, visualize the real environment around the tracked target so that it can be observed directly, and warn relevant personnel to support their decision making.
The technical solution of the invention is as follows:
an optical target tracking and positioning radar device based on a real map and an image is characterized in that: the system comprises an information acquisition module and a data center, wherein the information acquisition module comprises an optical sensing module, an all-directional motor module and an unmanned aerial vehicle module, the data center comprises a target identification module, a coordinate calculation module, a projection transformation module, an image fusion module and a target tracking module, the optical sensing module, the all-directional motor module and the unmanned aerial vehicle module are connected with the data center through a communication network, the optical sensing module is responsible for outputting an optical view field image, the all-directional motor module is responsible for outputting a horizontal angle and a vertical angle, the target identification module is responsible for identifying the optical view field image output by the optical sensing module and obtaining a target view field coordinate, the coordinate calculation module is responsible for calculating a space rectangular coordinate, a space-time coordinate and a radar coordinate of a target, and the unmanned aerial vehicle module is responsible for aerial photography, the radar three-dimensional map recognition method comprises the steps of restoring a real three-dimensional map, outputting an orthographic view image and an orthographic two-dimensional map by the projection transformation module, fusing the orthographic view image and the orthographic two-dimensional map to generate a final radar map by the image fusion module, and tracking a recognized target by the target tracking module.
Preferably, the communication network is one or more of CDMA, 4G and WIFI ad hoc networks, or a wired communication network.
Preferably, the optical sensing module comprises a base, a support and a camera; the support is arranged on the base and the camera is arranged on the support. The omnidirectional motor module comprises a horizontal-axis steering assembly, a vertical-axis steering assembly and a drive control module in which a control program is arranged. The horizontal-axis steering assembly is arranged in the support and comprises a horizontal-axis code wheel, a horizontal-axis stepping motor, a horizontal-axis synchronous belt and a horizontal-axis synchronous pulley; the horizontal-axis code wheel is connected with the horizontal-axis synchronous pulley through a connecting rod, and the horizontal-axis synchronous pulley is connected with the output shaft of the horizontal-axis stepping motor through the horizontal-axis synchronous belt. The vertical-axis steering assembly comprises a vertical-axis code wheel, a vertical-axis stepping motor, a vertical-axis synchronous belt and a vertical-axis synchronous pulley; the vertical-axis code wheel is connected with the vertical-axis synchronous pulley through a connecting rod, and the vertical-axis synchronous pulley is connected with the output shaft of the vertical-axis stepping motor through the vertical-axis synchronous belt. The vertical-axis code wheel is arranged in the support, while the vertical-axis stepping motor, the vertical-axis synchronous belt and the vertical-axis synchronous pulley are arranged in the base. Under the action of the control program, the horizontal-axis and vertical-axis stepping motors actuate so that the camera automatically and cyclically moves at a certain speed within a range of 360 degrees horizontally and 120 degrees vertically, and the horizontal-axis code wheel and the vertical-axis code wheel output the horizontal angle and the vertical angle respectively.
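A minimal control-loop sketch of this patrol behaviour follows; the `pan_motor`, `tilt_motor` and `encoder` driver objects are hypothetical, and the step size and dwell time are illustrative values, not figures from the patent.

```python
import time

H_MIN, H_MAX = 0.0, 360.0   # horizontal patrol range, degrees
V_MIN, V_MAX = -60.0, 60.0  # 120-degree vertical range, degrees
STEP_DEG = 0.9              # illustrative per-step increment
DWELL_S = 0.05              # settle time per step

def frange(start, stop, step):
    x = start
    while x <= stop:
        yield x
        x += step

def patrol(pan_motor, tilt_motor, encoder):
    """Cycle the camera over the full field and yield encoder readings."""
    while True:
        for tilt in frange(V_MIN, V_MAX, STEP_DEG):
            tilt_motor.move_to(tilt)                # hypothetical driver call
            for pan in frange(H_MIN, H_MAX, STEP_DEG):
                pan_motor.move_to(pan)              # hypothetical driver call
                time.sleep(DWELL_S)
                # the code wheels report the true angles, not the commanded ones
                yield encoder.read()                # -> (horizontal_deg, vertical_deg)
```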
Preferably, target standard parameters are pre-stored in the target identification module, and the target identification module comprises a CNN algorithm and an RNN algorithm.
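The patent names CNN and RNN algorithms without fixing a network, so the sketch below substitutes a stock torchvision detector for the CNN stage and reduces the pre-stored target standard parameters to a class label and a score threshold; both substitutions are assumptions.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def target_field_coordinate(image, target_label=1, score_thresh=0.8):
    """Detect the target in a field-of-view image; return its (u, v) pixel centre."""
    with torch.no_grad():
        pred = model([to_tensor(image)])[0]
    for box, label, score in zip(pred["boxes"], pred["labels"], pred["scores"]):
        if int(label) == target_label and float(score) >= score_thresh:
            x0, y0, x1, y1 = box.tolist()
            return (x0 + x1) / 2.0, (y0 + y1) / 2.0  # target field coordinate
    return None  # no target of interest in this frame
```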
Preferably, the coordinate calculation module comprises a space coordinate mapping algorithm and a space coordinate transformation algorithm. The target field coordinate, together with the horizontal angle and the vertical angle output by the omnidirectional motor module, generates the target space rectangular coordinate under the action of the space coordinate mapping algorithm; the target space rectangular coordinate generates the target earth space-time coordinate under the action of the space coordinate transformation algorithm; and the target earth space-time coordinate generates the radar coordinate under the action of the space coordinate transformation algorithm.
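The patent does not give the underlying formulas. The sketch below assumes the target observation has already been reduced to azimuth, elevation and range at a surveyed station, uses the pymap3d library for the geodetic step, and takes a local east-north-up frame for the "space rectangular coordinate" and a bearing/ground-range pair for the "radar coordinate"; the station fix and these conventions are assumptions.

```python
import time
import pymap3d as pm

STATION_LAT, STATION_LON, STATION_ALT = 30.5, 114.3, 25.0  # assumed station fix

def resolve(az_deg, el_deg, range_m):
    """Chain: angles + range -> rectangular -> earth space-time -> radar coordinate."""
    # 1) target space rectangular coordinate: local east-north-up metres
    e, n, u = pm.aer2enu(az_deg, el_deg, range_m)
    # 2) target earth space-time coordinate: geodetic position plus a timestamp
    lat, lon, alt = pm.aer2geodetic(az_deg, el_deg, range_m,
                                    STATION_LAT, STATION_LON, STATION_ALT)
    spacetime = (lat, lon, alt, time.time())
    # 3) radar coordinate: bearing and ground range for the radar display
    radar = (az_deg, (e * e + n * n) ** 0.5)
    return (e, n, u), spacetime, radar
```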
Preferably, the projection transformation module comprises a spatial mathematical projection transformation algorithm. The optical field-of-view image output by the optical sensing module and the horizontal angle and vertical angle output by the omnidirectional motor module generate the orthographic view image under the action of the spatial mathematical projection transformation algorithm; the large number of two-dimensional sequence photos output by the unmanned aerial vehicle module generate the real three-dimensional map through modeling; and the real three-dimensional map generates the orthographic two-dimensional map under the action of the spatial mathematical projection transformation algorithm.
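One way to realise the map branch of this projection is to project a textured 3-D point cloud orthographically onto the ground plane, dropping the height axis and keeping colour. This is a hedged sketch of that idea, not the patent's algorithm; the raster resolution is an assumption.

```python
import numpy as np

def ortho_map(points_xyz, colors_rgb, resolution=0.1):
    """Top-down orthographic raster from 3-D points (N, 3) and uint8 colours (N, 3)."""
    xy = points_xyz[:, :2]
    mins = xy.min(axis=0)
    size = np.ceil((xy.max(axis=0) - mins) / resolution).astype(int) + 1
    image = np.zeros((size[1], size[0], 3), dtype=np.uint8)
    order = np.argsort(points_xyz[:, 2])          # paint the highest points last
    idx = ((xy[order] - mins) / resolution).astype(int)
    image[idx[:, 1], idx[:, 0]] = colors_rgb[order]
    return image
```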
Preferably, the image fusion module comprises a video angle transformation algorithm and a video three-dimensional fusion algorithm, and the orthographic view image and the orthographic two-dimensional map generate the real radar map under the action of the video angle transformation algorithm and the video three-dimensional fusion algorithm.
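As a hedged sketch of this fusion step, the view image can be warped onto the map with a homography (standing in for the "video angle transformation") and the overlap alpha-blended (a simple stand-in for the three-dimensional fusion algorithm); the four matched control points are assumed to come from calibration.

```python
import cv2
import numpy as np

def fuse(view_img, map_img, view_pts, map_pts, alpha=0.5):
    """view_pts/map_pts: four matching (x, y) control points in each image."""
    H = cv2.getPerspectiveTransform(np.float32(view_pts), np.float32(map_pts))
    warped = cv2.warpPerspective(view_img, H,
                                 (map_img.shape[1], map_img.shape[0]))
    mask = warped.any(axis=2)                       # pixels covered by the view
    blended = cv2.addWeighted(warped, alpha, map_img, 1 - alpha, 0)
    out = map_img.copy()
    out[mask] = blended[mask]
    return out
```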
Preferably, the target tracking module comprises a target tracking algorithm.
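The patent does not specify the tracking algorithm; a minimal sketch is a proportional controller that turns the pixel offset of the tracked target from the image centre into pan/tilt corrections for the omnidirectional motor module. The gains and the motor driver API are assumptions.

```python
def track_step(uv, frame_w, frame_h, pan_motor, tilt_motor,
               k_pan=0.02, k_tilt=0.02):
    """One control tick: steer so the target stays centred in the field of view."""
    if uv is None:
        return  # target lost this frame; hold the last commanded angles
    du = uv[0] - frame_w / 2.0      # +right, pixels
    dv = uv[1] - frame_h / 2.0      # +down, pixels
    pan_motor.step_by(k_pan * du)   # hypothetical driver call, degrees
    tilt_motor.step_by(-k_tilt * dv)
```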
The working steps of the optical target tracking and positioning radar device based on a real map and an image are as follows:
(1) First step: data acquisition. After the omnidirectional motor module starts to work, the horizontal-axis stepping motor and the vertical-axis stepping motor act under the control program, driving the camera to move automatically and cyclically at a certain speed within a range of 360 degrees horizontally and 120 degrees vertically, realizing an automatic inspection function. After the optical sensing module calibrates the relationship between the horizontal field angle, the vertical field angle, the optical field coordinates and the spatial distance (a pinhole-model sketch of this calibration appears after these steps), it forms a real-time optical field-of-view image, which is transmitted to the data center through the communication network. The horizontal-axis code wheel and the vertical-axis code wheel in the omnidirectional motor module accurately measure the current horizontal and vertical angles and transmit them to the data center through the communication network. The unmanned aerial vehicle module generates a real three-dimensional map through aerial photography and modeling and transmits it to the data center through the communication network.
(2) Second step: radar coordinate generation. The CNN algorithm and the RNN algorithm in the target identification module compare the optical field-of-view image output by the optical sensing module with the target standard parameters pre-stored in the target identification module, thereby identifying a specific target and generating its target field coordinate. The target field coordinate and the horizontal and vertical angles output by the omnidirectional motor module generate the target space rectangular coordinate under the action of the space coordinate mapping algorithm in the coordinate calculation module; the target space rectangular coordinate generates the target earth space-time coordinate under the action of the space coordinate transformation algorithm; and the target earth space-time coordinate generates the radar coordinate under the action of the space coordinate transformation algorithm.
(3) Third step: radar map generation. The optical field-of-view image output by the optical sensing module and the horizontal and vertical angles output by the omnidirectional motor module generate an orthographic view image under the action of the spatial mathematical projection transformation algorithm in the projection transformation module. The large number of two-dimensional sequence photos output by the unmanned aerial vehicle module generate a real three-dimensional map through modeling, and the real three-dimensional map generates an orthographic two-dimensional map under the action of the same algorithm. The orthographic view image and the orthographic two-dimensional map then generate the real radar map under the action of the video angle transformation algorithm and the video three-dimensional fusion algorithm in the image fusion module.
(4) Fourth step: the radar coordinates generated in the second step are displayed on the radar map generated in the third step, providing warning and reminding, while the target tracking algorithm in the target tracking module controls the omnidirectional motor module so as to track the identified target.
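The calibration referred to in the first step is sketched below under a pinhole-camera assumption: a pixel offset from the principal point maps to a field angle through the focal length, so the absolute pointing angle of any pixel is the motor angle plus the field angle. The focal length, principal point and sign conventions are all assumptions for illustration.

```python
import math

FOCAL_PX = 1200.0       # assumed focal length in pixels
CX, CY = 960.0, 540.0   # assumed principal point for a 1920x1080 sensor

def pixel_to_angles(u, v, horizontal_deg, vertical_deg):
    """Field coordinate (u, v) -> absolute horizontal/vertical pointing angles (deg)."""
    h = horizontal_deg + math.degrees(math.atan2(u - CX, FOCAL_PX))
    w = vertical_deg - math.degrees(math.atan2(v - CY, FOCAL_PX))  # image v grows down
    return h, w
```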
The invention provides an instant, simple, efficient and accurate optical target tracking and positioning radar based on a real map and an image. It can identify and track a target, obtain the accurate radar coordinates of the tracked target in real time through the real map and live-action images, visualize the real environment around the tracked target so that it can be observed directly, and warn relevant personnel to support their decision making. It can be widely applied in traffic monitoring, public security control, water and road administration law enforcement, forest resource inventory, geological survey and other fields.
Drawings
Fig. 1 is a block diagram of the working principle of the present invention.
Fig. 2 is a left perspective view of the optical sensing module and the omnidirectional motor module.
Fig. 3 is a right perspective view of the optical sensing module and the omnidirectional motor module.
Wherein: 1 base, 2 support, 3 camera, 4 horizontal-axis code wheel, 5 horizontal-axis stepping motor, 6 horizontal-axis synchronous belt, 7 horizontal-axis synchronous pulley, 8 vertical-axis code wheel, 9 vertical-axis stepping motor, 10 vertical-axis synchronous belt, 11 vertical-axis synchronous pulley.
Detailed Description
The invention is further illustrated with reference to the following figures and examples:
an optical target tracking and positioning radar device based on a real map and an image comprises an information acquisition module and a data center. The information acquisition module comprises an optical sensing module, an omnibearing motor module and an unmanned aerial vehicle module. The data center comprises a target identification module, a coordinate calculation module, a projection transformation module, an image fusion module and a target tracking module. The optical sensing module, the omnibearing motor module and the unmanned aerial vehicle module are connected with a data center through a communication network, wherein the communication network is one or more of CDMA and 4G, WIFI ad hoc networks or a wired communication network. The optical sensing module is responsible for outputting an optical view field image, and the omnibearing motor module is responsible for outputting a horizontal angle and a vertical angle. The optical induction module comprises a base 1, a support 2 and a camera 3, the support 2 is arranged on the base 1, the camera 3 is arranged on the support 2, the omnibearing motor module comprises a transverse shaft steering engine component, a vertical shaft steering engine component and a drive control module, a control program is arranged in the drive control module, the transverse shaft steering engine component is arranged in the support 2 and comprises a transverse shaft code wheel 4, a transverse shaft stepping motor 5, a transverse shaft synchronous belt 6 and a transverse shaft synchronous pulley 7, the transverse shaft code wheel 4 is connected with the transverse shaft synchronous pulley 7 through a connecting rod, the transverse shaft synchronous pulley 7 is connected with an output shaft of the transverse shaft stepping motor 5 through the transverse shaft synchronous belt 6, the vertical shaft steering engine component comprises a vertical shaft code wheel 8, a vertical shaft stepping motor 9, a vertical shaft synchronous belt 10 and a vertical shaft synchronous pulley 11, the vertical shaft code wheel 8 is connected with the vertical shaft synchronous pulley 11 through a connecting rod, the vertical shaft synchronous pulley 11 is, the vertical shaft coded disc 8 is arranged in the support 2, the vertical shaft stepping motor 9, the vertical shaft synchronous belt 10 and the vertical shaft synchronous belt pulley 11 are arranged in the base 1, and the horizontal shaft stepping motor 5 and the vertical shaft stepping motor 9 act under the action of a control program, so that the camera 3 automatically and circularly moves at a certain speed within the range of 360 degrees horizontally and 120 degrees vertically, and the horizontal shaft coded disc 4 and the vertical shaft coded disc 8 respectively output a horizontal angle and a vertical angle. The target identification module is responsible for identifying the optical view field image output from the optical sensing module and obtaining a target view field coordinate, and the target identification module is prestored with target standard parameters and comprises a CNN algorithm and an RNN algorithm. 
The coordinate calculation module is responsible for calculating the space rectangular coordinate, earth space-time coordinate and radar coordinate of the target. It comprises a space coordinate mapping algorithm and a space coordinate transformation algorithm: the target field coordinate and the horizontal and vertical angles output by the omnidirectional motor module generate the target space rectangular coordinate under the action of the space coordinate mapping algorithm; the target space rectangular coordinate generates the target earth space-time coordinate under the action of the space coordinate transformation algorithm; and the target earth space-time coordinate generates the radar coordinate under the action of the space coordinate transformation algorithm.

The unmanned aerial vehicle module is responsible for aerial photography, modeling and restoring the real three-dimensional map. It comprises a ground station and an unmanned aerial vehicle with a camera function; the ground station and the unmanned aerial vehicle are connected through one or more of CDMA, 4G and WIFI ad hoc wireless networks.

The projection transformation module is responsible for outputting the orthographic view image and the orthographic two-dimensional map, and comprises a spatial mathematical projection transformation algorithm. The optical field-of-view image output by the optical sensing module and the horizontal and vertical angles output by the omnidirectional motor module generate the orthographic view image under the action of this algorithm; the large number of two-dimensional sequence photos output by the unmanned aerial vehicle module generate the real three-dimensional map through modeling, and the real three-dimensional map generates the orthographic two-dimensional map under the action of the same algorithm.

The image fusion module is responsible for fusing the orthographic view image and the orthographic two-dimensional map to generate the final radar map. It comprises a video angle transformation algorithm and a video three-dimensional fusion algorithm, under whose action the orthographic view image and the orthographic two-dimensional map generate the real radar map.

The target tracking module is responsible for tracking the identified target and comprises a target tracking algorithm.
The working process of the invention is as follows:
(1) First step: data acquisition. After the omnidirectional motor module starts to work, the horizontal-axis stepping motor 5 and the vertical-axis stepping motor 9 act under the control program, driving the camera 3 to move automatically and cyclically at a certain speed within a range of 360 degrees horizontally and 120 degrees vertically, realizing the automatic inspection function. After calibrating the relationship between the horizontal field angle, the vertical field angle, the optical field coordinates and the spatial distance, the optical sensing module forms a real-time optical field-of-view image, which is transmitted to the data center through the communication network. The horizontal-axis code wheel 4 and the vertical-axis code wheel 8 in the omnidirectional motor module accurately measure the current horizontal and vertical angles and transmit them to the data center through the communication network. The unmanned aerial vehicle module generates the real three-dimensional map through aerial photography and modeling and transmits it to the data center through the communication network.
(2) Second step: radar coordinate generation. The CNN algorithm and the RNN algorithm in the target identification module compare the optical field-of-view image output by the optical sensing module with the pre-stored target standard parameters, thereby identifying a specific target and generating its target field coordinate. The target field coordinate and the horizontal and vertical angles output by the omnidirectional motor module generate the target space rectangular coordinate under the action of the space coordinate mapping algorithm in the coordinate calculation module; the target space rectangular coordinate generates the target earth space-time coordinate under the action of the space coordinate transformation algorithm; and the target earth space-time coordinate generates the radar coordinate under the action of the space coordinate transformation algorithm.
(3) Third step: radar map generation. The optical field-of-view image output by the optical sensing module and the horizontal and vertical angles output by the omnidirectional motor module generate an orthographic view image under the action of the spatial mathematical projection transformation algorithm in the projection transformation module. The large number of two-dimensional sequence photos output by the unmanned aerial vehicle module generate a real three-dimensional map through modeling, and the real three-dimensional map generates an orthographic two-dimensional map under the action of the same algorithm. The orthographic view image and the orthographic two-dimensional map then generate the real radar map under the action of the video angle transformation algorithm and the video three-dimensional fusion algorithm in the image fusion module.
(4) Fourth step: the radar coordinates generated in the second step are displayed on the radar map generated in the third step, providing warning and reminding, while the target tracking algorithm in the target tracking module drives the omnidirectional motor module so as to track the identified target.
The device identifies and tracks the target, obtains the accurate radar coordinates of the tracked target in real time through the real map and live-action images, visualizes the real environment around the tracked target so that it can be observed directly, and warns relevant personnel to support their decision making.
In conclusion, the invention achieves the expected effect.

Claims (9)

1. An optical target tracking and positioning radar device based on a real map and an image, characterized in that: the device comprises an information acquisition module and a data center; the information acquisition module comprises an optical sensing module, an omnidirectional motor module and an unmanned aerial vehicle module; the data center comprises a target identification module, a coordinate calculation module, a projection transformation module, an image fusion module and a target tracking module; the optical sensing module, the omnidirectional motor module and the unmanned aerial vehicle module are connected with the data center through a communication network; the optical sensing module is responsible for outputting an optical field-of-view image; the omnidirectional motor module is responsible for outputting a horizontal angle and a vertical angle; the target identification module is responsible for identifying the optical field-of-view image output by the optical sensing module and obtaining a target field coordinate; the coordinate calculation module is responsible for calculating the space rectangular coordinate, earth space-time coordinate and radar coordinate of a target; the unmanned aerial vehicle module is responsible for aerial photography, modeling and restoring a real three-dimensional map; the projection transformation module is responsible for outputting an orthographic view image and an orthographic two-dimensional map; the image fusion module is responsible for fusing the orthographic view image and the orthographic two-dimensional map to generate a final radar map; and the target tracking module is responsible for tracking the identified target.
2. The radar apparatus for tracking and positioning optical targets based on real maps and images as claimed in claim 1, wherein: the communication network is one or more of CDMA, 4G and WIFI ad hoc networks, or a wired communication network.
3. The radar apparatus for tracking and positioning optical targets based on real maps and images as claimed in claim 1, wherein: the optical sensing module comprises a base, a support and a camera; the support is arranged on the base and the camera is arranged on the support; the omnidirectional motor module comprises a horizontal-axis steering assembly, a vertical-axis steering assembly and a drive control module in which a control program is arranged; the horizontal-axis steering assembly is arranged in the support and comprises a horizontal-axis code wheel, a horizontal-axis stepping motor, a horizontal-axis synchronous belt and a horizontal-axis synchronous pulley; the horizontal-axis code wheel is connected with the horizontal-axis synchronous pulley through a connecting rod; the vertical-axis steering assembly comprises a vertical-axis code wheel, a vertical-axis stepping motor, a vertical-axis synchronous belt and a vertical-axis synchronous pulley; the vertical-axis code wheel is connected with the vertical-axis synchronous pulley through a connecting rod; the vertical-axis code wheel is arranged in the support, while the vertical-axis stepping motor, the vertical-axis synchronous belt and the vertical-axis synchronous pulley are arranged in the base; under the action of the control program, the stepping motors actuate so that the camera automatically and cyclically moves at a certain speed within a range of 360 degrees horizontally and 120 degrees vertically, and the horizontal-axis code wheel and the vertical-axis code wheel output the horizontal angle and the vertical angle respectively.
4. The radar apparatus for tracking and positioning optical targets based on real maps and images as claimed in claim 1, wherein: target standard parameters are pre-stored in the target identification module, and the target identification module comprises a CNN algorithm and an RNN algorithm.
5. The radar apparatus for tracking and positioning optical targets based on real maps and images as claimed in claim 1, wherein: the coordinate calculation module comprises a space coordinate mapping algorithm and a space coordinate transformation algorithm; the target field coordinate and the horizontal angle and vertical angle output by the omnidirectional motor module generate a target space rectangular coordinate under the action of the space coordinate mapping algorithm; the target space rectangular coordinate generates a target earth space-time coordinate under the action of the space coordinate transformation algorithm; and the target earth space-time coordinate generates a radar coordinate under the action of the space coordinate transformation algorithm.
6. The radar apparatus for tracking and positioning optical targets based on real maps and images as claimed in claim 1, wherein: the projection transformation module comprises a spatial mathematical projection transformation algorithm; the optical field-of-view image output by the optical sensing module and the horizontal angle and vertical angle output by the omnidirectional motor module generate an orthographic view image under the action of the spatial mathematical projection transformation algorithm; the large number of two-dimensional sequence photos output by the unmanned aerial vehicle module generate a real three-dimensional map through modeling; and the real three-dimensional map generates an orthographic two-dimensional map under the action of the spatial mathematical projection transformation algorithm.
7. The radar apparatus for tracking and positioning optical targets based on real maps and images as claimed in claim 1, wherein: the image fusion module comprises a video angle transformation algorithm and a video three-dimensional fusion algorithm, and the orthographic view image and the orthographic two-dimensional map generate a real radar map under the action of the video angle transformation algorithm and the video three-dimensional fusion algorithm.
8. The radar apparatus for tracking and positioning optical targets based on real maps and images as claimed in claim 1, wherein: the target tracking module includes a target tracking algorithm.
9. The radar apparatus for tracking and positioning optical targets based on real maps and images as claimed in claim 1, wherein the working process comprises the following steps:
(1) First step: data acquisition. After the omnidirectional motor module starts to work, the horizontal-axis stepping motor and the vertical-axis stepping motor act under the control program, driving the camera to move automatically and cyclically at a certain speed within a range of 360 degrees horizontally and 120 degrees vertically and realizing an automatic inspection function; after the optical sensing module calibrates the relationship between the horizontal field angle, the vertical field angle, the optical field coordinates and the spatial distance, it forms a real-time optical field-of-view image, which is transmitted to the data center through the communication network; the horizontal-axis code wheel and the vertical-axis code wheel in the omnidirectional motor module accurately measure the current horizontal and vertical angles and transmit them to the data center through the communication network; and the unmanned aerial vehicle module generates a real three-dimensional map through aerial photography and modeling and transmits it to the data center through the communication network;
(2) Second step: radar coordinate generation. The CNN algorithm and the RNN algorithm in the target identification module compare the optical field-of-view image output by the optical sensing module with the target standard parameters pre-stored in the target identification module, thereby identifying a specific target and generating a target field coordinate; the target field coordinate and the horizontal angle and vertical angle output by the omnidirectional motor module generate a target space rectangular coordinate under the action of the space coordinate mapping algorithm in the coordinate calculation module; the target space rectangular coordinate generates a target earth space-time coordinate under the action of the space coordinate transformation algorithm in the coordinate calculation module; and the target earth space-time coordinate generates a radar coordinate under the action of the space coordinate transformation algorithm in the coordinate calculation module;
(3) Third step: radar map generation. The optical field-of-view image output by the optical sensing module and the horizontal angle and vertical angle output by the omnidirectional motor module generate an orthographic view image under the action of the spatial mathematical projection transformation algorithm in the projection transformation module; the large number of two-dimensional sequence photos output by the unmanned aerial vehicle module generate a real three-dimensional map through modeling; the real three-dimensional map generates an orthographic two-dimensional map under the action of the spatial mathematical projection transformation algorithm in the projection transformation module; and the orthographic view image and the orthographic two-dimensional map generate a real radar map under the action of the video angle transformation algorithm and the video three-dimensional fusion algorithm in the image fusion module;
(4) Fourth step: the radar coordinates generated in the second step are displayed on the radar map generated in the third step, providing warning and reminding, while the target tracking algorithm in the target tracking module controls the omnidirectional motor module so as to track the identified target.
Application CN201710813571.0A, priority date 2017-09-11, filing date 2017-09-11: Optical target tracking and positioning radar device based on real map and image. Granted as CN107607939B (Active).

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710813571.0A 2017-09-11 2017-09-11 Optical target tracking and positioning radar device based on real map and image


Publications (2)

Publication Number Publication Date
CN107607939A 2018-01-19
CN107607939B 2019-12-13

Family

Family ID: 61062095


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109001676B (en) * 2018-05-31 2020-08-21 北京科技大学 Robot positioning navigation system
CN109720280A (en) * 2019-03-01 2019-05-07 山东华宇信息空间技术有限公司 A kind of exact image information transmission system combined based on radar with camera
CN113365030B (en) * 2021-06-01 2023-07-04 珠海大横琴科技发展有限公司 Multi-angle target tracking method and tracking system


Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US9977123B2 (en) * 2014-05-20 2018-05-22 Bae Systems Information And Electronic Systems Integration Inc. Automated track projection bias removal using frechet distance and road networks
US9944390B2 (en) * 2016-02-29 2018-04-17 Intel Corporation Technologies for managing data center assets using unmanned aerial vehicles

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN104166137A (en) * 2014-08-19 2014-11-26 东北电力大学 Target comprehensive parameter tracking measurement method based on display of radar warning situation map
CN104457704A (en) * 2014-12-05 2015-03-25 北京大学 System and method for positioning ground targets of unmanned planes based on enhanced geographic information
CN106454209A (en) * 2015-08-06 2017-02-22 航天图景(北京)科技有限公司 Unmanned aerial vehicle emergency quick action data link system and unmanned aerial vehicle emergency quick action monitoring method based on spatial-temporal information fusion technology



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant