CN113776540A - Control method for vehicle-mounted tethered unmanned aerial vehicle to track moving vehicle in real time based on visual navigation positioning - Google Patents

Control method for vehicle-mounted tethered unmanned aerial vehicle to track moving vehicle in real time based on visual navigation positioning Download PDF

Info

Publication number
CN113776540A
CN113776540A (application CN202111321792.9A; granted publication CN113776540B)
Authority
CN
China
Prior art keywords
vehicle
unmanned aerial vehicle
pose
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111321792.9A
Other languages
Chinese (zh)
Other versions
CN113776540B (en)
Inventor
严晓明
岳野
陈明
马倩如
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing shengjiaxiang Construction Engineering Co.,Ltd.
Original Assignee
Beijing Aikelite Optoelectronic Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Aikelite Optoelectronic Technology Co ltd filed Critical Beijing Aikelite Optoelectronic Technology Co ltd
Priority to CN202111321792.9A priority Critical patent/CN113776540B/en
Publication of CN113776540A publication Critical patent/CN113776540A/en
Application granted granted Critical
Publication of CN113776540B publication Critical patent/CN113776540B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12 Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a control method, based on visual navigation positioning, for a vehicle-mounted tethered unmanned aerial vehicle to track a moving vehicle in real time, comprising the following steps: performing visual ranging on the tracked vehicle with a ranging sensor, and feeding the visual ranging information back to the cloud; estimating the pose of the tracked vehicle from the visual ranging information stored in the cloud, and determining the pose of the tracked vehicle; detecting obstacles according to the pose of the tracked vehicle, and planning a local path according to the obstacle detection result; and determining a tracking waypoint in the local path and using it to guide the flight state of the unmanned aerial vehicle. The invention automatically positions the tracked object and, after positioning, moves with the tracked object as it travels.

Description

Control method for vehicle-mounted tethered unmanned aerial vehicle to track moving vehicle in real time based on visual navigation positioning
Technical Field
The invention relates to the technical field of real-time unmanned aerial vehicle tracking, and in particular to a control method for a vehicle-mounted tethered unmanned aerial vehicle to track a moving vehicle in real time based on visual navigation positioning.
Background
Unmanned aerial vehicles are now used ever more widely. Existing unmanned aerial vehicles are flown by manual remote operation: an operator flies the vehicle, which acquires information during flight and finally transmits the acquired information back to a terminal.
However, when several unmanned aerial vehicles work at the same time, a single terminal can hardly control all of them simultaneously. A control method is therefore lacking that can automatically position a tracked object and, after positioning, move with the tracked object as it travels.
Disclosure of Invention
The invention provides a control method, based on visual navigation positioning, for a vehicle-mounted tethered unmanned aerial vehicle to track a moving vehicle in real time, which automatically positions a tracked object and, after positioning, moves with the tracked object as it travels.
The control method provided by the invention comprises the following steps:
performing visual ranging on the tracked vehicle with the ranging sensor, and feeding the visual ranging information back to the cloud;
estimating the pose of the tracked vehicle from the visual ranging information stored in the cloud, and determining the pose of the tracked vehicle;
detecting obstacles according to the pose of the tracked vehicle, and planning a local path according to the obstacle detection result;
and determining a tracking waypoint in the local path and using it to guide the flight of the unmanned aerial vehicle.
Preferably, performing visual ranging on the tracked vehicle with the ranging sensor and feeding the visual ranging information back to the cloud comprises:
acquiring image information from a plurality of cameras of the unmanned aerial vehicle, and extracting the internal parameters of the image information and the external parameters of the cameras;
undistorting the image information of the plurality of cameras to obtain initially calibrated image information;
extracting features from the pieces of initially calibrated image information, and matching the same features across images to obtain secondarily corrected image information;
projecting the pieces of secondarily corrected image information onto a common horizontal plane, and applying a homography deformation to obtain the third-pass corrected image information;
and performing distance measurement and calculation with the third-pass corrected image information to obtain the visual ranging information.
Preferably, the internal parameters of the unmanned aerial vehicle cameras include: focal length and pixel information;
the external parameters of the unmanned aerial vehicle cameras are the camera position and the camera rotation direction in the world coordinate system.
Preferably, the third-pass corrected image information comprises a first image group and a second image group;
performing distance measurement and calculation with the third-pass corrected image information to obtain the visual ranging information comprises the following steps:
extracting from the first image group and the second image group the images corresponding to the current time t and the previous time t−1, respectively, to obtain four image sets;
acquiring feature matching information from the feature correspondences of the four image sets;
constructing, from the feature matching information, reconstruction positions in the reconstructed scene features of the first image group and the second image group;
and performing coordinate transformation according to the reconstruction positions using formula (1):

(X, s) = argmin over (X, s) of Σ_{q=1..n} ‖ P_q^t − (X·P_q^{t−1} + s) ‖²    (1)

where n is the number of points in the reconstruction positions; P_q^{t−1} is the location of point q in the reconstruction positions at the previous time t−1; P_q^t is the location of point q in the reconstruction positions at the current time t; X is the rotation component, and s the translation component, of the reconstruction-position coordinate transformation;
if a coordinate system is defined at the center of one of the image groups and the images are rectified, the reconstructed position coordinates of the feature matching information are given by formula (2):

Z = (d1·J/c, d2·J/c, f·J/c)    (2)

where (d1, d2) are the feature's image-plane coordinates in that image group, f is the focal length of the camera, J is the baseline between the cameras, c is the feature disparity across each image group, and Z is the reconstructed position coordinate.
Preferably, estimating the pose of the tracked vehicle from the visual ranging information stored in the cloud and determining the pose of the tracked vehicle further comprises:
setting a known starting point for the reconstructed coordinates and accumulating the frame-to-frame transformations from that starting point; if the pose of the first frame is the origin (0, 0, 0) of the world coordinate system, the pose of the tracked vehicle at the current time t is given by formula (3):

O_t = O_{t−1}·X_t,    q_t = O_{t−1}·S_t + q_{t−1}    (3)

where O_{t−1} is a 3×3 matrix recording the direction of the tracked vehicle at the previous time t−1; q_{t−1} is a 3×1 vector recording the pose coordinates of the tracked vehicle at t−1; O_t is the 3×3 direction matrix, and q_t the 3×1 pose-coordinate vector, of the tracked vehicle at time t; X_t is the rotation parameter, and S_t the translation parameter, of the reconstruction-position coordinate transformation at time t.
Preferably, detecting obstacles according to the pose of the tracked vehicle and planning a local path according to the obstacle detection result further comprises:
performing three-dimensional matching between the pose of the tracked vehicle and the pose of the unmanned aerial vehicle to obtain a real-time travel path;
detecting obstacles along the real-time travel path of the tracked vehicle;
and, if no obstacle exists, planning the plurality of local paths present in the flight path of the unmanned aerial vehicle.
Preferably, if an obstacle exists, a temporary waypoint m is matched, and the distance between the temporary waypoint m and the position of the unmanned aerial vehicle at the current time t is calculated;
the distance is gridded to obtain a plurality of sub-grid cells;
the area outside the plurality of sub-grid cells is defined as obstacle-free;
and the obstacle-free boundary is acquired, and the unmanned aerial vehicle is moved to the obstacle-free area.
Preferably, determining a tracking waypoint in the local path and using the tracking waypoint to guide the flight of the unmanned aerial vehicle comprises:
segmenting the local path, extracting features from the sub-local path in each segment, and defining the pieces of feature information as waypoints;
connecting the waypoints in time order to obtain a waypoint line;
and taking the waypoint line as the guide line for the flight state of the unmanned aerial vehicle.
In the invention, the unmanned aerial vehicle monitors the distance to the tracked vehicle by visual ranging, estimates the pose of the tracked vehicle from the visual ranging information, and then determines and estimates its own pose, so that the unmanned aerial vehicle moves adaptively as the tracked vehicle moves. At the same time, obstacles are detected during tracking, so the unmanned aerial vehicle can avoid them while pursuing the target, improving its safety and stability while executing flight missions. Several unmanned aerial vehicles can thus work simultaneously and independently, each automatically positioning its tracked object and, after positioning, moving with it.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention may be realized and attained by the structure particularly pointed out in the written description.
The technical solution of the present invention is further described in detail by the following examples.
Detailed Description
The following description is presented in conjunction with the preferred embodiments of the present invention, and it is to be understood that the preferred embodiments described herein are presented only for the purpose of illustrating and explaining the present invention, and are not intended to limit the present invention.
The embodiment of the invention provides a control method for a vehicle-mounted tethered unmanned aerial vehicle to track a moving vehicle in real time based on visual navigation positioning, comprising the following steps:
performing visual ranging on the tracked vehicle with the ranging sensor, and feeding the visual ranging information back to the cloud;
estimating the pose of the tracked vehicle from the visual ranging information stored in the cloud, and determining the pose of the tracked vehicle;
detecting obstacles according to the pose of the tracked vehicle, and planning a local path according to the obstacle detection result;
and determining a tracking waypoint in the local path and using it to guide the flight of the unmanned aerial vehicle.
In the invention, the unmanned aerial vehicle monitors the distance to the tracked vehicle by visual ranging, estimates the pose of the tracked vehicle from the visual ranging information, and then determines and estimates its own pose, so that the unmanned aerial vehicle moves adaptively as the tracked vehicle moves. At the same time, obstacles are detected during tracking, so the unmanned aerial vehicle can avoid them while pursuing the target, improving its safety and stability while executing flight missions. Several unmanned aerial vehicles can thus work simultaneously and independently, each automatically positioning its tracked object and, after positioning, moving with it.
In one embodiment, performing visual ranging on the tracked vehicle with the ranging sensor and feeding the visual ranging information back to the cloud comprises:
acquiring image information from a plurality of cameras of the unmanned aerial vehicle, and extracting the internal parameters of the image information and the external parameters of the cameras;
undistorting the image information of the plurality of cameras to obtain initially calibrated image information;
extracting features from the pieces of initially calibrated image information, and matching the same features across images to obtain secondarily corrected image information;
projecting the pieces of secondarily corrected image information onto a common horizontal plane, and applying a homography deformation to obtain the third-pass corrected image information;
and performing distance measurement and calculation with the third-pass corrected image information to obtain the visual ranging information.
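The three correction passes end with a homography deformation that maps each secondarily corrected image onto a common plane. As an illustration of that last step only, the following numpy sketch (not part of the patent; the 3×3 matrix H is a hypothetical example) applies a homography to image points:

```python
import numpy as np

def apply_homography(H, pts):
    """Map 2-D image points through a 3x3 homography H, as in the
    third-pass (homography-deformation) correction."""
    pts = np.asarray(pts, dtype=float)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    out = pts_h @ H.T                                 # apply H to every point
    return out[:, :2] / out[:, 2:3]                   # back to 2-D coordinates

# Hypothetical example: a homography that is a pure 90-degree in-plane rotation
H = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
print(apply_homography(H, [(1.0, 0.0), (0.0, 1.0)]))
```

In a real pipeline H would be estimated from the matched features of the secondarily corrected images rather than written by hand.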
The internal parameters of the unmanned aerial vehicle cameras include: focal length and pixel information;
the external parameters of the unmanned aerial vehicle cameras are the camera position and the camera rotation direction in the world coordinate system.
The third-pass corrected image information comprises a first image group and a second image group;
performing distance measurement and calculation with the third-pass corrected image information to obtain the visual ranging information comprises the following steps:
extracting from the first image group and the second image group the images corresponding to the current time t and the previous time t−1, respectively, to obtain four image sets;
acquiring feature matching information from the feature correspondences of the four image sets;
constructing, from the feature matching information, reconstruction positions in the reconstructed scene features of the first image group and the second image group;
and performing coordinate transformation according to the reconstruction positions using formula (1):

(X, s) = argmin over (X, s) of Σ_{q=1..n} ‖ P_q^t − (X·P_q^{t−1} + s) ‖²    (1)

where n is the number of points in the reconstruction positions; P_q^{t−1} is the location of point q in the reconstruction positions at the previous time t−1; P_q^t is the location of point q in the reconstruction positions at the current time t; X is the rotation component, and s the translation component, of the reconstruction-position coordinate transformation;
if a coordinate system is defined at the center of one of the image groups and the images are rectified, the reconstructed position coordinates of the feature matching information are given by formula (2):

Z = (d1·J/c, d2·J/c, f·J/c)    (2)

where (d1, d2) are the feature's image-plane coordinates in that image group, f is the focal length of the camera, J is the baseline between the cameras, c is the feature disparity across each image group, and Z is the reconstructed position coordinate.
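The two formulas can be sketched in code. Below, the back-projection of formula (2) is direct, while the least-squares rigid transform of formula (1) is solved with the SVD-based Kabsch method — the patent does not name a solver, so that choice is an assumption:

```python
import numpy as np

def stereo_point(d1, d2, f, J, c):
    """Formula (2): back-project a feature with image-plane coordinates
    (d1, d2), focal length f, camera baseline J and disparity c into a
    3-D reconstruction position Z."""
    return np.array([d1, d2, f]) * (J / c)

def estimate_motion(prev_pts, cur_pts):
    """Formula (1): least-squares rotation X and translation s that map the
    reconstruction positions at time t-1 onto those at time t.
    Solved here with the SVD-based Kabsch method (an assumed choice)."""
    mu_p, mu_c = prev_pts.mean(axis=0), cur_pts.mean(axis=0)
    H = (prev_pts - mu_p).T @ (cur_pts - mu_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    X = Vt.T @ D @ U.T
    s = mu_c - X @ mu_p
    return X, s
```

With exact, noise-free correspondences the recovered (X, s) reproduces the applied motion; with noisy features it is the least-squares fit over all n matched points.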
Estimating the pose of the tracked vehicle from the visual ranging information stored in the cloud and determining the pose of the tracked vehicle further comprises:
setting a known starting point for the reconstructed coordinates and accumulating the frame-to-frame transformations from that starting point; if the pose of the first frame is the origin (0, 0, 0) of the world coordinate system, the pose of the tracked vehicle at the current time t is given by formula (3):

O_t = O_{t−1}·X_t,    q_t = O_{t−1}·S_t + q_{t−1}    (3)

where O_{t−1} is a 3×3 matrix recording the direction of the tracked vehicle at the previous time t−1; q_{t−1} is a 3×1 vector recording the pose coordinates of the tracked vehicle at t−1; O_t is the 3×3 direction matrix, and q_t the 3×1 pose-coordinate vector, of the tracked vehicle at time t; X_t is the rotation parameter, and S_t the translation parameter, of the reconstruction-position coordinate transformation at time t.
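A minimal numpy sketch of this pose accumulation: the identity/zero start corresponds to taking the first-frame pose as the world origin, and the per-frame motions fed in below are made-up values for illustration.

```python
import numpy as np

def update_pose(O_prev, q_prev, X_t, S_t):
    """Formula (3): fold the frame-to-frame rotation X_t and translation S_t
    into the accumulated direction matrix O and pose coordinates q."""
    O_t = O_prev @ X_t
    q_t = O_prev @ S_t + q_prev
    return O_t, q_t

# First-frame pose at the world origin, then three unit steps along x
O, q = np.eye(3), np.zeros(3)
for _ in range(3):
    O, q = update_pose(O, q, np.eye(3), np.array([1.0, 0.0, 0.0]))
print(q)  # accumulated pose coordinates after three steps
```

Because each step is left-multiplied by the accumulated direction, drift in any single frame-to-frame estimate propagates to every later pose, which is why the method anchors the chain at a known starting point.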
In this embodiment, wide-angle image information is distorted toward its edges, and distorted image information leads to inaccurate data. Correcting the distortion makes the wide-angle image information better match the real scene and reduces large positioning deviations of the unmanned aerial vehicle caused by image distortion. Feature extraction and coordinate transformation are then applied to the corrected image information so that the extracted features yield the pose coordinates of the tracked vehicle; matching those pose coordinates with the pose coordinates of the unmanned aerial vehicle yields the tracking path of the unmanned aerial vehicle, achieving automatic tracking.
In one embodiment, detecting obstacles according to the pose of the tracked vehicle and planning a local path according to the obstacle detection result further comprises:
performing three-dimensional matching between the pose of the tracked vehicle and the pose of the unmanned aerial vehicle to obtain a real-time travel path;
detecting obstacles along the real-time travel path of the tracked vehicle;
and, if no obstacle exists, planning the plurality of local paths present in the flight path of the unmanned aerial vehicle.
If an obstacle exists, a temporary waypoint m is matched, and the distance between the temporary waypoint m and the position of the unmanned aerial vehicle at the current time t is calculated;
the distance is gridded to obtain a plurality of sub-grid cells;
the area outside the plurality of sub-grid cells is defined as obstacle-free;
and the obstacle-free boundary is acquired, and the unmanned aerial vehicle is moved to the obstacle-free area.
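The gridding step can be illustrated with a small sketch: the straight segment from the drone to the temporary waypoint m is divided into sub-grid cells, cells occupied by the obstacle are marked, and the remaining cells are reported as obstacle-free. The `blocked` index set is a hypothetical input; a real system would rasterise sensor detections into it.

```python
import numpy as np

def free_cells_toward_waypoint(drone, waypoint, blocked, n_cells):
    """Grid the distance from the drone position to temporary waypoint m
    into n_cells sub-grid cells and return the obstacle-free ones as
    (index, cell-center) pairs."""
    drone, waypoint = np.asarray(drone, float), np.asarray(waypoint, float)
    centers = [drone + (waypoint - drone) * (i + 0.5) / n_cells
               for i in range(n_cells)]
    return [(i, c) for i, c in enumerate(centers) if i not in blocked]

# Hypothetical scene: the obstacle occupies sub-cells 3 and 4 of 5
free = free_cells_toward_waypoint((0.0, 0.0), (10.0, 0.0), {3, 4}, n_cells=5)
print([i for i, _ in free])  # indices of obstacle-free sub-grid cells
```

The boundary of the free region (here, the last free cell before a blocked one) is the point toward which the drone would be moved.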
In one embodiment, determining a tracking waypoint in the local path and using the tracking waypoint to guide the flight of the unmanned aerial vehicle comprises the following steps:
segmenting the local path, extracting features from the sub-local path in each segment, and defining the pieces of feature information as waypoints;
connecting the waypoints in time order to obtain a waypoint line;
and taking the waypoint line as the guide line for the flight state of the unmanned aerial vehicle.
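The waypoint-line construction — segment the local path, take one feature point per sub-path, connect them in time order — can be sketched as follows, using the segment mean as a stand-in for the extracted feature point (an assumption; the patent does not specify the feature):

```python
import numpy as np

def waypoint_line(local_path, n_segments):
    """Segment the local path, reduce each sub-local path to one waypoint
    (here its mean point), and return the waypoints in time order."""
    segments = np.array_split(np.asarray(local_path, dtype=float), n_segments)
    return np.array([seg.mean(axis=0) for seg in segments])

path = [(x, 0.1 * x) for x in range(10)]  # a sampled local path
print(waypoint_line(path, 5))             # five waypoints along the path
```

Because the waypoints are ordered by time, connecting them in sequence directly yields the guide line for the flight state.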
In this embodiment, obstacles are detected in the image information so that the unmanned aerial vehicle can effectively avoid them while executing a flight mission, reducing damage from collisions with obstacles. The guide route for the flight state is then re-planned along a path that avoids the obstacle; that is, once an obstacle is judged to exist, the method adjusts the flight state or flight attitude of the unmanned aerial vehicle for the next moment in advance and guides it around the obstacle.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (8)

1. A control method for a vehicle-mounted tethered unmanned aerial vehicle to track a moving vehicle in real time based on visual navigation positioning, characterized by comprising:
performing visual ranging on the tracked vehicle according to the ranging sensor, and feeding back visual ranging information to the cloud;
estimating the pose of the tracked vehicle according to the visual ranging information stored in the cloud, and determining the pose of the tracked vehicle;
detecting an obstacle according to the pose of the tracked vehicle, and planning a local path according to the obstacle detection result;
and determining a tracking navigation point in the local path, and guiding the flight of the unmanned aerial vehicle by using the tracking navigation point.
2. The control method for the vehicle-mounted tethered unmanned aerial vehicle to track the moving vehicle in real time based on visual navigation positioning as claimed in claim 1, wherein performing visual ranging on the tracked vehicle with the ranging sensor and feeding the visual ranging information back to the cloud comprises:
acquiring image information from a plurality of cameras of the unmanned aerial vehicle, and extracting the internal parameters of the image information and the external parameters of the cameras;
undistorting the image information of the plurality of cameras to obtain initially calibrated image information;
extracting features from the pieces of initially calibrated image information, and matching the same features across images to obtain secondarily corrected image information;
projecting the pieces of secondarily corrected image information onto a common horizontal plane, and applying a homography deformation to obtain the third-pass corrected image information;
and performing distance measurement and calculation with the third-pass corrected image information to obtain the visual ranging information.
3. The control method for the vehicle-mounted tethered unmanned aerial vehicle to track the moving vehicle in real time based on visual navigation positioning as claimed in claim 2,
the internal parameters of the unmanned aerial vehicle camera include: focal length and pixel information;
the external parameters of the unmanned aerial vehicle camera are the camera position and the camera rotation direction in the world coordinate system.
4. The control method for the vehicle-mounted tethered unmanned aerial vehicle to track the moving vehicle in real time based on visual navigation positioning as claimed in claim 2,
the information of the three times of corrected images comprises a first image group and a second image group;
the distance measurement and calculation by utilizing the three times of corrected image information to obtain the visual ranging information comprises the following steps:
the current time in the first image group and the second image grouptAnd the last momentt-1Respectively extracting corresponding images in the images to obtain four groups of images;
acquiring feature matching information in the feature corresponding relation of the four groups of image groups;
according to the feature matching information, constructing reconstruction positions in the reconstruction scene features of the first image group and the second image group;
and performing coordinate transformation according to the reconstruction positions using formula (1):

(X, s) = argmin over (X, s) of Σ_{q=1..n} ‖ P_q^t − (X·P_q^{t−1} + s) ‖²    (1)

where n is the number of points in the reconstruction positions; P_q^{t−1} is the location of point q in the reconstruction positions at the previous time t−1; P_q^t is the location of point q in the reconstruction positions at the current time t; X is the rotation component, and s the translation component, of the reconstruction-position coordinate transformation;
if a coordinate system is defined at the center of one of the image groups and the images are rectified, the reconstructed position coordinates of the feature matching information are given by formula (2):

Z = (d1·J/c, d2·J/c, f·J/c)    (2)

where (d1, d2) are the feature's image-plane coordinates in that image group, f is the focal length of the camera, J is the baseline between the cameras, c is the feature disparity across each image group, and Z is the reconstructed position coordinate.
5. The control method for the vehicle-mounted tethered unmanned aerial vehicle to track the moving vehicle in real time based on visual navigation positioning as claimed in claim 2,
estimating the pose of the tracked vehicle according to the visual ranging information stored in the cloud, and determining the pose of the tracked vehicle further comprises:
setting a known starting point for the reconstructed coordinates and accumulating the frame-to-frame transformations from that starting point; if the pose of the first frame is (0, 0, 0) of the world coordinate system, the pose of the tracked vehicle at the current time t is given by formula (3):

O_t = O_{t−1}·X_t,    q_t = O_{t−1}·S_t + q_{t−1}    (3)

where O_{t−1} is a 3×3 matrix recording the direction of the tracked vehicle at the previous time t−1; q_{t−1} is a 3×1 vector recording the pose coordinates of the tracked vehicle at t−1; O_t is the 3×3 direction matrix, and q_t the 3×1 pose-coordinate vector, of the tracked vehicle at time t; X_t is the rotation parameter, and S_t the translation parameter, of the reconstruction-position coordinate transformation at time t.
6. The control method for the vehicle-mounted tethered unmanned aerial vehicle to track the moving vehicle in real time based on visual navigation positioning as claimed in claim 5,
detecting obstacles according to the pose of the tracked vehicle and planning a local path according to the obstacle detection result further comprises:
performing three-dimensional matching between the pose of the tracked vehicle and the pose of the unmanned aerial vehicle to obtain a real-time travel path;
detecting obstacles along the real-time travel path of the tracked vehicle;
and, if no obstacle exists, planning the plurality of local paths present in the flight path of the unmanned aerial vehicle.
7. The control method for the vehicle-mounted tethered unmanned aerial vehicle to track the moving vehicle in real time based on visual navigation positioning as claimed in claim 6,
if an obstacle exists, matching a temporary waypoint m, and calculating the distance between the temporary waypoint m and the position of the unmanned aerial vehicle at the current time t;
gridding the distance to obtain a plurality of sub-grid units;
defining an area outside the plurality of sub-grid cells as unobstructed;
and acquiring an obstacle-free boundary, and moving the unmanned aerial vehicle to an obstacle-free area.
8. The control method for a vehicle-mounted tethered unmanned aerial vehicle to track a moving vehicle in real time based on visual navigation positioning as claimed in claim 1, wherein said determining a tracking waypoint in the local path with which to guide the flight of the unmanned aerial vehicle comprises:
segmenting the local path, extracting features of the sub-local path in each segment, and defining the pieces of extracted feature information as waypoints;
connecting the plurality of waypoints in time order to obtain a waypoint line;
and taking the waypoint line as a guide line for the flight state of the unmanned aerial vehicle.
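For illustration only (not part of the claim text): a minimal sketch of claim 8's waypoint-line construction. Using the segment centroid as the extracted feature is an illustrative choice; the claim does not specify which feature is extracted per segment.

```python
import numpy as np

def extract_waypoints(local_path, n_segments=4):
    """Segment the local path and reduce each sub-local path to one feature
    point (here its centroid, an assumed feature)."""
    segments = np.array_split(np.asarray(local_path), n_segments)
    return [seg.mean(axis=0) for seg in segments]

def waypoint_line(waypoints):
    """Connect the waypoints in time order into consecutive guide segments
    for the flight state of the unmanned aerial vehicle."""
    return list(zip(waypoints[:-1], waypoints[1:]))

# 8 path points split into 4 segments -> 4 waypoints -> 3 guide segments.
path = [np.array([float(i), 0.0]) for i in range(8)]
wps = extract_waypoints(path, n_segments=4)
line = waypoint_line(wps)
```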
CN202111321792.9A 2021-11-09 2021-11-09 Control method for vehicle-mounted tethered unmanned aerial vehicle to track moving vehicle in real time based on visual navigation positioning Active CN113776540B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111321792.9A CN113776540B (en) 2021-11-09 2021-11-09 Control method for vehicle-mounted tethered unmanned aerial vehicle to track moving vehicle in real time based on visual navigation positioning


Publications (2)

Publication Number Publication Date
CN113776540A true CN113776540A (en) 2021-12-10
CN113776540B CN113776540B (en) 2022-03-22

Family

ID=78956911

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111321792.9A Active CN113776540B (en) 2021-11-09 2021-11-09 Control method for vehicle-mounted tethered unmanned aerial vehicle to track moving vehicle in real time based on visual navigation positioning

Country Status (1)

Country Link
CN (1) CN113776540B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105759839A (en) * 2016-03-01 2016-07-13 深圳市大疆创新科技有限公司 Unmanned aerial vehicle (UAV) visual tracking method, apparatus, and UAV
WO2017177533A1 (en) * 2016-04-12 2017-10-19 深圳市龙云创新航空科技有限公司 Method and system for controlling laser radar based micro unmanned aerial vehicle
CN107424196A (en) * 2017-08-03 2017-12-01 江苏钜芯集成电路技术股份有限公司 Stereo matching method, apparatus and system based on weakly calibrated multi-view cameras
CN107610157A (en) * 2016-07-12 2018-01-19 深圳雷柏科技股份有限公司 Unmanned aerial vehicle target tracking method and system
CN108594851A (en) * 2015-10-22 2018-09-28 飞智控(天津)科技有限公司 Binocular-vision-based autonomous obstacle detection system and method for an unmanned aerial vehicle, and unmanned aerial vehicle
CN108731587A (en) * 2017-04-14 2018-11-02 中交遥感载荷(北京)科技有限公司 Vision-based dynamic target tracking and positioning method for an unmanned aerial vehicle
CN109753076A (en) * 2017-11-03 2019-05-14 南京奇蛙智能科技有限公司 Unmanned aerial vehicle visual tracking implementation method
WO2020135446A1 (en) * 2018-12-24 2020-07-02 深圳市道通智能航空技术有限公司 Target positioning method and device and unmanned aerial vehicle
US20210041896A1 (en) * 2019-03-13 2021-02-11 Goertek Inc. Method for controlling a drone, drone and system
CN113467500A (en) * 2021-07-19 2021-10-01 天津大学 Unmanned aerial vehicle non-cooperative target tracking system based on binocular vision


Also Published As

Publication number Publication date
CN113776540B (en) 2022-03-22

Similar Documents

Publication Publication Date Title
JP7326720B2 (en) Mobile position estimation system and mobile position estimation method
CN108051002B (en) Transport vehicle space positioning method and system based on inertial measurement auxiliary vision
CN111326023B (en) Unmanned aerial vehicle route early warning method, device, equipment and storage medium
CN109324337B (en) Unmanned aerial vehicle route generation and positioning method and device and unmanned aerial vehicle
CN112197770B (en) Robot positioning method and positioning device thereof
CN111448478B (en) System and method for correcting high-definition maps based on obstacle detection
CN111046743B (en) Barrier information labeling method and device, electronic equipment and storage medium
CN111462200A (en) Cross-video pedestrian positioning and tracking method, system and equipment
CN108896994A (en) A kind of automatic driving vehicle localization method and equipment
CN106092123B (en) A kind of video navigation method and device
Shunsuke et al. GNSS/INS/on-board camera integration for vehicle self-localization in urban canyon
CN108151713A (en) A kind of quick position and orientation estimation methods of monocular VO
JP2018081008A (en) Self position posture locating device using reference video map
JP6349737B2 (en) Moving object tracking device and moving object tracking method
CN112991401B (en) Vehicle running track tracking method and device, electronic equipment and storage medium
Caballero et al. Improving vision-based planar motion estimation for unmanned aerial vehicles through online mosaicing
CN116359873A (en) Method, device, processor and storage medium for realizing SLAM processing of vehicle-end 4D millimeter wave radar by combining fisheye camera
Bazin et al. UAV attitude estimation by vanishing points in catadioptric images
JP2023525927A (en) Vehicle localization system and method
CN115077519A (en) Positioning and mapping method and device based on template matching and laser inertial navigation loose coupling
CN111539305A (en) Map construction method and system, vehicle and storage medium
CN113776540B (en) Control method for vehicle-mounted tethered unmanned aerial vehicle to track moving vehicle in real time based on visual navigation positioning
CN114820768B (en) Method for aligning geodetic coordinate system and slam coordinate system
CN113554705B (en) Laser radar robust positioning method under changing scene
EP3816938A1 (en) Region clipping method and recording medium storing region clipping program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220413

Address after: 101399 room 184, 1st floor, building 17, 16 Caixiang East Road, Nancai Town, Shunyi District, Beijing

Patentee after: Beijing shengjiaxiang Construction Engineering Co.,Ltd.

Address before: 100080 3rd floor, building 1, 66 Zhongguancun East Road, Haidian District, Beijing

Patentee before: BEIJING AIKELITE OPTOELECTRONIC TECHNOLOGY Co.,Ltd.