CN115031732A - Ground engineering vehicle positioning method utilizing binocular vision of unmanned aerial vehicle - Google Patents
Info
- Publication number
- CN115031732A (application CN202210481404.1A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- unmanned aerial
- aerial vehicle
- tag
- ground
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Image Processing (AREA)
Abstract
The invention discloses a method for positioning ground engineering vehicles using binocular vision of an unmanned aerial vehicle, comprising the following steps. First, AprilTag tags are set up as ground tags and vehicle tags. Second, an unmanned aerial vehicle equipped with a binocular camera observes the tags, shoots one or more depth images containing all the tags, and resolves the position-attitude coordinates of each tag in the shot depth images under the camera coordinate system; the unmanned aerial vehicle binocularly recognizes the ground tags and establishes a plane rectangular coordinate system of the construction site as a reference coordinate system; it then binocularly recognizes the vehicle tags in the construction site, resolves their attitude coordinates in the image against the reference coordinate system to position them, and generates position information for the corresponding vehicles. Third, the unmanned aerial vehicle sends the vehicle position information to terminal devices in real time. Each driver can thus obtain the positions of the other vehicles through the terminal device, avoiding the drawbacks of a small, limited detection range and the attendant safety hazards while the vehicles are moving.
Description
Technical Field
The invention relates to the technical field of visual navigation, in particular to a ground engineering vehicle positioning method utilizing binocular vision of an unmanned aerial vehicle.
Background
On a typical outdoor construction site, engineering vehicles of several types often work in close coordination: when a new expressway surface is being laid, for example, an asphalt paver, a road roller and material-conveying vehicles may operate in the site simultaneously while constructors move back and forth among them. These vehicles are large, heavily loaded and have large blind areas; collisions involving them frequently injure or kill drivers and constructors, posing serious safety hazards throughout the construction process. At present many engineering vehicles are equipped with sensors such as front, rear and surround-view cameras and ultrasonic radars to help drivers avoid obstacles safely. This approach is limited, however: a vehicle can only learn the relative positions of vehicles and people within a certain range around itself, not the positions of the other vehicles across the whole construction site.
Disclosure of Invention
The technical problem to be solved by the invention is therefore to overcome the problems of the prior art on outdoor construction sites: blind areas exist while an engineering vehicle is moving, the trajectories of surrounding vehicles and operators cannot be fully observed, accidents happen easily, and safety hazards remain. To this end, a ground engineering vehicle positioning method utilizing binocular vision of an unmanned aerial vehicle is provided.
A ground engineering vehicle positioning method utilizing binocular vision of an unmanned aerial vehicle comprises the following steps:
firstly, arranging AprilTag tags at multiple points on the ground along the boundary of a construction site as ground tags, and arranging an AprilTag tag on the top surface of each vehicle as a vehicle tag;
secondly, having an unmanned aerial vehicle equipped with a binocular camera observe the tags, shoot one or more depth images containing all the tags, and resolve the position-attitude coordinates of each tag in the shot depth images under the camera coordinate system;
the unmanned aerial vehicle binocularly recognizes the ground tags and establishes a plane rectangular coordinate system of the construction site as a reference coordinate system;
the unmanned aerial vehicle binocularly recognizes the vehicle tags in the construction site, resolves the attitude coordinates of each vehicle tag in the image against the reference coordinate system to position it, and generates corresponding vehicle positioning information;
and thirdly, the unmanned aerial vehicle sends the vehicle positioning information to the terminal equipment in real time.
Preferably, in the second step:
the unmanned aerial vehicle is suspended above the construction site, after the position is stable, the first frame of depth image is generated through the binocular camera, and the plane rectangular coordinate system of the construction site is established through the first frame of depth image.
Preferably, in the second step, the specific step of establishing the planar rectangular coordinate system is as follows:
firstly, a ground tag at one position on the boundary line is selected as the coordinate origin; the line connecting it to a ground tag at another position on the boundary line forms one coordinate axis; a second, perpendicular coordinate axis is then drawn through the origin so that all tags fall within the first quadrant of the plane rectangular coordinate system.
Preferably, in the second step, the attitude coordinates of the vehicle tag are resolved against the reference coordinate system to position the vehicle tag and generate the corresponding vehicle positioning information, as follows:
the unmanned aerial vehicle obtains the coordinates of three ground tags around the target vehicle tag from a shot image and calculates the distance from the target vehicle tag to each ground tag; with each distance as radius and the corresponding ground tag as centre, three circles are drawn; every two circles intersect in a common chord, giving three chords; the coordinates of the midpoints of the three chords are obtained, and their mean is taken as the position of the target vehicle tag.
Preferably, in the third step, the terminal device is a mobile phone of a vehicle driver.
Preferably, in the third step, the vehicle positioning information is sent in the form of an array comprising the vehicle number, the position of the correspondingly numbered vehicle in the rectangular coordinate system, and the heading of the vehicle head.
The invention has the advantages that, after the unmanned aerial vehicle reaches a suitable height, the camera's field of view can cover most or even all vehicles on the site; the combination of the binocular camera and the AprilTag tags detects vehicle position information clearly and promptly and feeds it back accurately, avoiding the limitation of surround-view cameras and ultrasonic radars, which can only sense the environment immediately around a vehicle.
All vehicle drivers can learn the positions of the other vehicles on the construction site, which improves driving safety and reduces the risk of collision.
A commander on the engineering site can obtain the positioning of all vehicles over the network and schedule the construction operations of the engineering vehicles effectively.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a positioning calculation schematic diagram according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of AprilTag in accordance with an embodiment of the present invention;
FIG. 3 is a three-point positioning schematic of an embodiment of the present invention;
FIG. 4 is a top view of an embodiment of the present invention;
fig. 5 is a schematic positioning diagram according to an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
In addition, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
As shown in fig. 1, a ground engineering vehicle positioning method using binocular vision of an unmanned aerial vehicle comprises the following steps:
step one, AprilTag labels are arranged on multiple ground at the boundary of a construction site and serve as ground labels; an aprilat label is arranged on the top surface of each vehicle to be used as a vehicle label.
And secondly, setting an unmanned aerial vehicle observation tag with a binocular camera, shooting one or more depth images including all tags, and calculating position-posture coordinates of each tag in the shot depth images under a camera coordinate system.
The unmanned aerial vehicle binocular recognizes the ground label, and a plane rectangular coordinate system of a construction site is established as a reference coordinate system.
The unmanned aerial vehicle binocular identification construction site vehicle label generation method comprises the steps of identifying a vehicle label in a construction site, resolving an attitude coordinate of the vehicle label in an image by using a reference coordinate system so as to position the vehicle label, and generating corresponding vehicle positioning information.
And thirdly, the unmanned aerial vehicle sends the vehicle positioning information to the terminal equipment in real time.
In the second step, the unmanned aerial vehicle hovers above the construction site; after its position has stabilized, a first depth image is generated by the binocular camera, and the plane rectangular coordinate system of the construction site is established from this first frame.
In the second step, the plane rectangular coordinate system is established as follows: a ground tag at one position on the boundary line is selected as the coordinate origin; the line connecting it to a ground tag at another position on the boundary line forms one coordinate axis of the rectangular coordinate system; the other coordinate axis then extends from the origin, so that all tags are contained in the first quadrant of the plane rectangular coordinate system.
In the second step, the attitude coordinates of the vehicle tag are resolved against the reference coordinate system to position the vehicle tag, as follows:
the unmanned aerial vehicle obtains the coordinates of three ground tags around the target vehicle tag from a shot image and calculates the distance from the target vehicle tag to each ground tag; with each distance as radius and the corresponding ground tag as centre, three circles are drawn; every two circles intersect in a common chord, giving three chords; the coordinates of the midpoints of the three chords are obtained, and their mean is taken as the position of the target vehicle tag.
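The chord-midpoint construction above can be sketched as follows. Function names are illustrative, and note that the mean of the chord midpoints only approximates the exact trilateration result: each midpoint is the foot of the perpendicular from the true position onto a line joining two circle centres, so the average carries a geometry-dependent bias.

```python
import math

def chord_midpoint(c1, r1, c2, r2):
    """Midpoint of the common chord of two intersecting circles.
    It lies on the line joining the centres, at distance
    a = (d^2 + r1^2 - r2^2) / (2d) from c1, where d = |c2 - c1|."""
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    d = math.hypot(dx, dy)
    a = (d * d + r1 * r1 - r2 * r2) / (2 * d)
    return (c1[0] + a * dx / d, c1[1] + a * dy / d)

def locate_vehicle_tag(ground_tags, distances):
    """Estimate the vehicle-tag position as the mean of the three
    pairwise chord midpoints, following the method described above."""
    (g1, g2, g3), (d1, d2, d3) = ground_tags, distances
    mids = [chord_midpoint(g1, d1, g2, d2),
            chord_midpoint(g1, d1, g3, d3),
            chord_midpoint(g2, d2, g3, d3)]
    return (sum(m[0] for m in mids) / 3.0,
            sum(m[1] for m in mids) / 3.0)
```

For a vehicle at (4, 3) with ground tags at (0, 0), (10, 0) and (0, 10), the three midpoints are (4, 0), (0, 3) and (5.5, 4.5), so the estimate is roughly (3.17, 2.5) — close to, but biased away from, the true point.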
And in the third step, the terminal equipment is a mobile phone of a vehicle driver.
The unmanned aerial vehicle sends the vehicle positioning information to the driver's mobile phone over the network in real time, which is convenient and fast.
The AprilTag tag shown in fig. 2 is a visual fiducial tag; it is recognized by the binocular camera mounted on the unmanned aerial vehicle, and the position-attitude of each tag relative to the unmanned aerial vehicle is obtained by calculation.
The raw attitude information is obtained from the camera, whose coordinate system is the camera's own three-dimensional coordinate system, in the format roll, pitch, yaw. Combined with the three-dimensional position information, the position-attitude coordinate of each tag finally acquired by the camera is represented as (x, y, z, roll, pitch, yaw).
Here x, y, z denote the position of the corresponding tag in the camera's three-dimensional rectangular coordinate system, and roll, pitch and yaw denote the rotations about the x-, y- and z-axes respectively.
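A tag attitude (roll, pitch, yaw) can be expanded into a rotation matrix for coordinate-frame conversions. The sketch below assumes the common Z-Y-X (yaw-pitch-roll) Euler convention, which the text does not state explicitly:

```python
import math

def rpy_to_matrix(roll, pitch, yaw):
    """Rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll), the common
    Z-Y-X convention (an assumption; the patent leaves this open)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
```

A quick sanity check: zero angles give the identity, and a yaw of 90 degrees rotates the camera x-axis onto the y-axis.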
After the unmanned aerial vehicle completes the calculation, it sends the vehicle position information in the form of an array comprising the vehicle number, the position of the correspondingly numbered vehicle in the rectangular coordinate system, and the vehicle attitude.
The vehicle attitude is the orientation of the vehicle in the two-dimensional plane coordinate system, i.e. the heading of the vehicle head, represented by theta. Combined with the position of the vehicle in the two-dimensional coordinate system, the format is (x, y, theta).
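A minimal sketch of packing this array, assuming a simple tuple layout; the text fixes only the content (number, position, heading theta), not an encoding, and the function names are illustrative:

```python
import math

def heading_theta(yaw):
    """Normalize a tag yaw angle to a heading theta in [0, 2*pi)."""
    return yaw % (2 * math.pi)

def build_position_message(vehicles):
    """Pack per-vehicle data (id, x, y, yaw) into the array format
    described above: (vehicle number, x, y, theta)."""
    return [(v_id, x, y, heading_theta(yaw))
            for (v_id, x, y, yaw) in vehicles]
```

For example, a vehicle with yaw -pi/2 is reported with heading 3*pi/2 after normalization.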
As shown in fig. 3, the relative pose is solved by three-point positioning: in a rectangular spatial coordinate system, a point is determined once its distances to three fixed points are known. Therefore, before formal construction begins, the unmanned aerial vehicle shoots a first frame of depth image after stabilizing and obtains the positions and attitudes of three or more site points relative to the camera. One site point is then selected as the origin of the coordinate system, and the line connecting it to another site point is selected as one plane coordinate axis, thereby constructing the plane rectangular coordinate system of the construction site; the positions of all ground tags in this coordinate system are then calculated. Thereafter, as long as three or more site points are detected and their positions and attitudes are resolved in each frame of image, the relative positions of all machines on the construction site can be obtained.
The positioning exploits the invariance of spatial relations: although the unmanned aerial vehicle's own position differs on each flight, if the photographed objects do not move, the relative relations among the objects captured by the same camera remain unchanged. Since the site anchor points are fixed, once the plane coordinate system has been set up, any three site points in view suffice to locate a vehicle in that coordinate system.
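This invariance can be checked numerically: applying an arbitrary planar rigid motion to all points (standing in for a different camera pose) leaves their site-frame coordinates unchanged. A minimal 2-D sketch with illustrative function names:

```python
import math

def rigid_transform(points, angle, t):
    """Apply one planar rigid motion (rotate by `angle`, then translate
    by `t`) to every point -- simulating a change of camera pose."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y + t[0], s * x + c * y + t[1])
            for (x, y) in points]

def site_coords(points):
    """Express all points in the site frame defined by the first two:
    origin at points[0], x-axis toward points[1]."""
    (ox, oy), (x1, y1) = points[0], points[1]
    n = math.hypot(x1 - ox, y1 - oy)
    ex = ((x1 - ox) / n, (y1 - oy) / n)
    ey = (-ex[1], ex[0])  # perpendicular, counter-clockwise
    return [((px - ox) * ex[0] + (py - oy) * ex[1],
             (px - ox) * ey[0] + (py - oy) * ey[1])
            for (px, py) in points]
```

Running `site_coords` before and after any rotation-plus-translation of the whole scene yields identical coordinates, which is exactly why the fixed ground tags can anchor the vehicle positions across frames.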
In this embodiment, after the unmanned aerial vehicle reaches a suitable height, the camera's field of view can cover most or even all vehicles on the site; the combination of the binocular camera and the AprilTag tags detects vehicle position information clearly and promptly and feeds it back accurately, avoiding the limitation of surround-view cameras and ultrasonic radars, which can only sense the environment immediately around a vehicle.
All vehicle position information can be acquired, which facilitates vehicle management and scheduling: with this method, the unmanned aerial vehicle can transmit the positions of all vehicles on the construction site to the mobile phone of the person in charge of the project and display them, providing position information to the engineering commanders.
Sensor cost is reduced: only one unmanned aerial vehicle fitted with a binocular camera is needed, and each vehicle only needs one paper AprilTag tag attached, which greatly reduces equipment cost.
The application range is flexible and the number of vehicles is easily expanded: there are very few restrictions on the place of use, and recognition and computation are all performed by the computing unit carried on the unmanned aerial vehicle, so the system can be quickly redeployed to another site by following the deployment steps. During use, if additional vehicles join midway, they are simply assigned numbers and fitted with the corresponding tags.
Example 2
As shown in fig. 4 and 5, a method for positioning a ground engineering vehicle by using binocular vision of an unmanned aerial vehicle includes the following steps:
1. and constructing a site coordinate system.
a. A plurality of tags are selected around the site (at least 3: the subsequent vehicle positioning is based on the three-point positioning principle, so the number of site tags must be greater than or equal to 3).
b. The unmanned aerial vehicle hovers at a suitable position (one from which as many tags as possible can be shot; if all tags cannot be covered in one shot, several images may be taken), collects and recognizes images, and through the binocular camera calculates the position and attitude information of the m site tags:

{(id_0, x_0, y_0, z_0, roll_0, pitch_0, yaw_0), …, (id_m, x_m, y_m, z_m, roll_m, pitch_m, yaw_m)}   (1)
c. Constructing a site coordinate system, which comprises the following specific steps:
selecting 1 point as the origin of a field coordinate system, for example, selecting the point id _ 0; and selecting another field location, if id _1, determining a straight line principle according to two points, and setting id _0 and id _1 as an x axis.
Since the construction site can be considered as a plane, all the sites are also on one plane under the camera coordinate system. A straight line passing through the point id _0 and perpendicular to the line connecting id _0 and id _1 is selected on the plane, and the straight line is set as the y-axis (high school mathematics knowledge). And at this point, the construction of the site coordinate system is completed.
d. The positions and attitudes of all points other than the origin id_0 and the x-axis point id_1 are calculated in the site coordinate system as follows:
x_n is the distance of site point n from the y-axis;
y_n is the distance of site point n from the x-axis;
theta_n is the spatial angle between the line connecting site point n to id_0 and the x-axis, obtained by projecting the connecting line onto the coordinate plane.
2. Vehicle positioning
a. The unmanned aerial vehicle image contains at least 3 site points together with the vehicles.
b. According to the three-point positioning principle, the position and attitude of a vehicle can be determined from its distances to 3 site points.
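For comparison, planar three-point positioning also has a standard closed-form solution: subtracting the circle equations pairwise eliminates the quadratic terms and leaves a 2-by-2 linear system. This sketch is the exact textbook solution, not the patent's chord-midpoint average, which approximates it:

```python
def trilaterate(p1, d1, p2, d2, p3, d3):
    """Exact planar trilateration from three anchor points and distances.
    Subtracting circle equations pairwise gives two linear equations
    in (x, y); solve them by Cramer's rule."""
    ax, ay = 2 * (p2[0] - p1[0]), 2 * (p2[1] - p1[1])
    bx, by = 2 * (p3[0] - p1[0]), 2 * (p3[1] - p1[1])
    c1 = d1 * d1 - d2 * d2 + p2[0] ** 2 - p1[0] ** 2 + p2[1] ** 2 - p1[1] ** 2
    c2 = d1 * d1 - d3 * d3 + p3[0] ** 2 - p1[0] ** 2 + p3[1] ** 2 - p1[1] ** 2
    det = ax * by - ay * bx  # zero iff the three anchors are collinear
    return ((c1 * by - c2 * ay) / det, (ax * c2 - bx * c1) / det)
```

With exact distances this recovers the vehicle position precisely, provided the three site points are not collinear.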
It should be understood that the above examples are given only for clarity of illustration and are not intended to limit the embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to list all embodiments exhaustively, and obvious variations or modifications derived therefrom remain within the protection scope of the invention.
Claims (6)
1. A ground engineering vehicle positioning method utilizing binocular vision of an unmanned aerial vehicle is characterized by comprising the following steps:
step one, AprilTag labels are arranged on multiple ground at the boundary of a construction site and serve as ground labels; an Apriltag is arranged on the top surface of each vehicle and serves as a vehicle tag;
secondly, having an unmanned aerial vehicle equipped with a binocular camera observe the tags, shoot one or more depth images containing all the tags, and resolve the position-attitude coordinates of each tag in the shot depth images under the camera coordinate system;
the unmanned aerial vehicle binocularly recognizes the ground tags, and a plane rectangular coordinate system of the construction site is established as a reference coordinate system;
the unmanned aerial vehicle binocularly recognizes the vehicle tags in the construction site, resolves the attitude coordinates of each vehicle tag in the image against the reference coordinate system to position it, and generates corresponding vehicle positioning information;
and thirdly, the unmanned aerial vehicle sends the vehicle positioning information to the terminal equipment in real time.
2. The method of claim 1, wherein in the second step:
the unmanned aerial vehicle is suspended above the construction site, after the position is stable, the first frame of depth image is generated through the binocular camera, and the plane rectangular coordinate system of the construction site is established through the first frame of depth image.
3. The method for positioning the ground engineering vehicle by using binocular vision of the unmanned aerial vehicle as claimed in claim 2, wherein in the second step, the specific step of establishing the rectangular plane coordinate system is as follows:
firstly, one ground label on the boundary line is selected as the origin of coordinates, then another ground label on the boundary line is selected to be connected with the origin of coordinates to form a coordinate axis, and another vertical coordinate axis extends from the origin of coordinates so as to bring all labels into a first quadrant of the plane rectangular coordinate system.
4. The ground engineering vehicle positioning method using binocular vision of the unmanned aerial vehicle as claimed in claim 3, wherein in the second step, the attitude coordinates of the vehicle tag are resolved by a reference coordinate system to position the vehicle tag, and corresponding vehicle positioning information is generated, and the specific steps are as follows:
the unmanned aerial vehicle acquires coordinates of three ground tags around a target vehicle tag by shooting an image, calculates the distance between the target vehicle tag and each ground tag, takes the length of the distance as a radius, takes the corresponding ground tag as a circle center to make a circle, takes every two circles as intersecting chord lines, acquires coordinates of the midpoints of the three chord lines, and takes the coordinate mean value of the midpoints of the three chord lines as the position of the target vehicle tag.
5. The method for positioning a ground engineering vehicle by using binocular vision of an unmanned aerial vehicle as recited in claim 4, wherein in the third step, the terminal device is a mobile phone of a vehicle driver.
6. The method as claimed in claim 4, wherein in the third step the vehicle positioning information is sent in the form of an array comprising the vehicle number, the position of the correspondingly numbered vehicle in the rectangular coordinate system, and the heading of the vehicle head.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210481404.1A CN115031732A (en) | 2022-05-05 | 2022-05-05 | Ground engineering vehicle positioning method utilizing binocular vision of unmanned aerial vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210481404.1A CN115031732A (en) | 2022-05-05 | 2022-05-05 | Ground engineering vehicle positioning method utilizing binocular vision of unmanned aerial vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115031732A true CN115031732A (en) | 2022-09-09 |
Family
ID=83119731
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210481404.1A Withdrawn CN115031732A (en) | 2022-05-05 | 2022-05-05 | Ground engineering vehicle positioning method utilizing binocular vision of unmanned aerial vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115031732A (en) |
2022-05-05: application CN202210481404.1A filed; published as CN115031732A; status: withdrawn.
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116990830A (en) * | 2023-09-27 | 2023-11-03 | 锐驰激光(深圳)有限公司 | Distance positioning method and device based on binocular and TOF, electronic equipment and medium |
CN116990830B (en) * | 2023-09-27 | 2023-12-29 | 锐驰激光(深圳)有限公司 | Distance positioning method and device based on binocular and TOF, electronic equipment and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11320833B2 (en) | Data processing method, apparatus and terminal | |
CN110174093B (en) | Positioning method, device, equipment and computer readable storage medium | |
US10878288B2 (en) | Database construction system for machine-learning | |
CN105512646B (en) | A kind of data processing method, device and terminal | |
JP2020500767A (en) | Automatic vehicle parking system and method | |
CN110462343A (en) | The automated graphics for vehicle based on map mark | |
CN109783588A (en) | Error message detection method, device, equipment, vehicle and the storage medium of map | |
CN112365549B (en) | Attitude correction method and device for vehicle-mounted camera, storage medium and electronic device | |
US20200218907A1 (en) | Hybrid lane estimation using both deep learning and computer vision | |
CN104590573A (en) | Barrier avoiding system and method for helicopter | |
EP3799618B1 (en) | Method of navigating a vehicle and system thereof | |
CN113240939B (en) | Vehicle early warning method, device, equipment and storage medium | |
CN113033280A (en) | System and method for trailer attitude estimation | |
CN110491156A (en) | A kind of cognitive method, apparatus and system | |
CN210377164U (en) | Air-ground cooperative operation system | |
CN109375629A (en) | A kind of cruiser and its barrier-avoiding method that navigates | |
CN115440034B (en) | Vehicle-road cooperation realization method and realization system based on camera | |
CN110162066A (en) | Intelligent cruise control system | |
CN115031732A (en) | Ground engineering vehicle positioning method utilizing binocular vision of unmanned aerial vehicle | |
CN112884892A (en) | Unmanned mine car position information processing system and method based on road side device | |
CN112447058B (en) | Parking method, parking device, computer equipment and storage medium | |
EP3223188A1 (en) | A vehicle environment mapping system | |
JP2019519051A (en) | Intelligent lighting system, lighting device, vehicle, in-vehicle terminal, vehicle driving support system, and vehicle driving support method | |
CN213999507U (en) | Mobile robot system | |
CN111964673A (en) | Unmanned vehicle positioning system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | | |
SE01 | Entry into force of request for substantive examination | | |
WW01 | Invention patent application withdrawn after publication | | Application publication date: 20220909 |