CN112327898A - Unmanned aerial vehicle and well patrol navigation method and device thereof - Google Patents


Info

Publication number
CN112327898A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle; vehicle body; current; initial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011228810.4A
Other languages
Chinese (zh)
Other versions
CN112327898B (en)
Inventor
唐崇
仲兆峰
黄立明
李基源
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Building Technology Guangzhou Co Ltd
Original Assignee
Hitachi Building Technology Guangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Building Technology Guangzhou Co Ltd filed Critical Hitachi Building Technology Guangzhou Co Ltd
Priority to CN202011228810.4A priority Critical patent/CN112327898B/en
Publication of CN112327898A publication Critical patent/CN112327898A/en
Application granted granted Critical
Publication of CN112327898B publication Critical patent/CN112327898B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Abstract

The application relates to a hoistway inspection navigation method and device for an unmanned aerial vehicle, and to the unmanned aerial vehicle. The hoistway inspection navigation method comprises: outputting an ascending instruction, and acquiring initial position information of the unmanned aerial vehicle body when the body is located at the initial station; acquiring current position information of the body during flight, and processing the initial position information and the current position information to obtain a position deviation; correcting the ascending route of the body according to the position deviation; upon detecting that the body arrives at the next station along the corrected ascending route, entering the acquisition flow for point cloud data and image data of the current station; and outputting a takeoff instruction once the acquisition flow is completed, until all ascending stations have completed the acquisition flow. The takeoff instruction instructs the body to move from the current station to the next station. By this method, the unmanned aerial vehicle can complete autonomous inspection navigation even when the GPS signal in the hoistway is weak.

Description

Unmanned aerial vehicle and well patrol navigation method and device thereof
Technical Field
The application relates to the technical field of hoistway inspection, in particular to a hoistway inspection navigation method and device of an unmanned aerial vehicle and the unmanned aerial vehicle.
Background
The hoistway is an important component of elevator equipment: it provides a closed space that insulates sound, absorbs shock and protects the safe operation of the elevator. As a novel information-acquisition carrier, the unmanned aerial vehicle offers high flexibility, strong operability, low cost and low requirements on the operating environment, and can therefore be used for inspection work.
In the implementation process, the inventor finds that at least the following problems exist in the conventional technology: the traditional unmanned aerial vehicle navigation method cannot be realized in an elevator shaft.
Disclosure of Invention
In view of the above, it is necessary to provide a hoistway inspection navigation method and apparatus for an unmanned aerial vehicle, and an unmanned aerial vehicle, which can implement hoistway inspection.
In order to achieve the above object, in one aspect, an embodiment of the present invention provides a hoistway inspection navigation method for an unmanned aerial vehicle, including:
outputting a rising instruction, and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
acquiring current position information of an unmanned aerial vehicle body in a flight process, and processing the initial position information and the current position information to obtain position deviation;
correcting the ascending route of the unmanned aerial vehicle body according to the position deviation;
detecting that the unmanned aerial vehicle body arrives at the next station according to the corrected ascending route, and entering the acquisition flow of point cloud data and image data of the current station; outputting a takeoff instruction in the case that the acquisition flow is completed, until all ascending stations have completed the acquisition flow; the takeoff instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station.
In one embodiment, the current location information includes a first current horizontal coordinate of the hoistway wall relative to the drone body; the initial position information comprises first initial horizontal coordinates of four walls of the shaft relative to the unmanned aerial vehicle body;
the step of processing the initial position information and the current position information to obtain the position deviation comprises the following steps:
and confirming the coordinate difference value of the first current horizontal coordinate and the first initial horizontal coordinate as the position deviation.
In one embodiment, the step of processing the initial location information and the current location information to obtain the location offset comprises:
acquiring first current horizontal coordinates within a preset time length, and acquiring average horizontal coordinates according to the first current horizontal coordinates;
and confirming the coordinate difference value of the average horizontal coordinate and the first initial horizontal coordinate as the position deviation.
In one embodiment, the step of obtaining the initial position information of the unmanned aerial vehicle body includes:
obtaining an initial distance value and an initial scanning angle returned by the beam light in a scanning period through a laser radar;
obtaining a first initial horizontal coordinate according to the initial distance value and the initial scanning angle;
the step of obtaining the current horizontal position of the unmanned aerial vehicle body during flight includes:
obtaining a current distance value returned by the beam light in a scanning period through a laser radar;
acquiring the attitude variation of the unmanned aerial vehicle body through an inertia measurement unit; the attitude variation comprises a roll angle and a pitch angle;
and obtaining a first current horizontal coordinate according to the current distance value, the roll angle and the pitch angle.
In one embodiment, the initial position information includes a second initial horizontal coordinate of the drone body relative to the laser emitting device; the laser emitting device is arranged in a well pit;
the step of obtaining the initial position information of the unmanned aerial vehicle body includes:
acquiring the initial position coordinate transmitted by the photoelectric position sensor, and determining the initial position coordinate as the second initial horizontal coordinate; the photoelectric position sensor is arranged on the unmanned aerial vehicle body; the initial position coordinate is obtained by the photoelectric position sensor sensing the laser emitted by the laser emitting device when the unmanned aerial vehicle body has arrived at the initial station.
In one embodiment, the current position information includes a second current horizontal coordinate of the drone body relative to the laser emitting device;
the step of obtaining the current horizontal position of the unmanned aerial vehicle body during flight includes:
acquiring a current position coordinate transmitted by a photoelectric position sensor, and acquiring the attitude variation of the unmanned aerial vehicle body through an inertia measurement unit; the attitude variation comprises a roll angle and a pitch angle; the current position coordinate is obtained by a photoelectric position sensor responding to laser emitted by a laser emitting device in the flight process;
acquiring a distance value between the gravity center of the unmanned aerial vehicle body and an induction surface of the photoelectric position sensor;
and processing the current position coordinate, the distance value, the roll angle and the pitch angle to obtain a second current horizontal coordinate.
In one embodiment, in the step of processing the current position coordinates, the distance value, the roll angle, and the pitch angle to obtain the second current horizontal coordinate, the second current horizontal coordinate is obtained based on the following formula:
X_ti = X_bias,t - L_a·cos(arctan((tan²θ_t + tan²Φ_t)^(1/2)))·tanθ_t
Y_ti = Y_bias,t - L_a·cos(arctan((tan²θ_t + tan²Φ_t)^(1/2)))·tanΦ_t
wherein X_ti is the abscissa and Y_ti the ordinate of the second current horizontal coordinate; X_bias,t and Y_bias,t are the current position coordinates transmitted by the photoelectric position sensor; L_a is the distance value; θ_t is the pitch angle; Φ_t is the roll angle.
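The two expressions above can be sketched in code. This is a minimal illustration, not the disclosed implementation: the function name and argument conventions are assumptions, and angles are taken in radians.

```python
import math


def second_current_horizontal(x_bias, y_bias, l_a, theta, phi):
    """Project the photoelectric-sensor reading back to the drone's centre
    of gravity, compensating for body tilt.

    x_bias, y_bias: coordinates reported by the photoelectric position sensor
    l_a:            distance from the centre of gravity to the sensing surface
    theta, phi:     pitch and roll angles in radians
    """
    # Combined tilt of the body axis away from vertical.
    tilt = math.atan(math.sqrt(math.tan(theta) ** 2 + math.tan(phi) ** 2))
    x_t = x_bias - l_a * math.cos(tilt) * math.tan(theta)
    y_t = y_bias - l_a * math.cos(tilt) * math.tan(phi)
    return x_t, y_t
```

A quick sanity check on the geometry: with zero roll and pitch the tilt term vanishes and the sensor reading passes through unchanged.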
In one embodiment, the step of processing the initial location information and the current location information to obtain the location offset comprises:
and confirming the coordinate difference value of the second current horizontal coordinate and the second initial horizontal coordinate as the position deviation.
In one embodiment, the method further comprises the following steps:
and entering a return journey flow under the condition that all ascending stations finish the acquisition flow.
In one embodiment, the return journey process comprises:
outputting a descending instruction, and correcting the descending route of the unmanned aerial vehicle body according to the position deviation;
detecting that the unmanned aerial vehicle body arrives at the next station according to the corrected descending route, and entering the acquisition flow of point cloud data and image data of the current station; outputting a descending instruction in the case that the acquisition flow is completed, until all descending stations have completed the acquisition flow; the descending instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station;
and entering a landing process under the condition that all descending stations finish the acquisition process.
In one embodiment, the step of detecting that the unmanned aerial vehicle body arrives at the next station according to the corrected ascending route includes:
acquiring the attitude variation of the unmanned aerial vehicle body through an inertia measurement unit; the attitude variation comprises a roll angle and a pitch angle;
acquiring a first distance between the unmanned aerial vehicle body and a pit, a second distance between the beam center of the laser radar and a radiation surface of a lower right-angle emission prism and a third distance between an intersection point of rotation axes of all swing arms and the beam center, wherein the first distance is output by the laser radar; the lower right-angle emission prism and the laser radar are both arranged on any swing arm; the swing arm is arranged on the unmanned aerial vehicle body and used for swinging the laser radar;
processing the first distance, the second distance, the third distance, the roll angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body;
and if the current height is the same as the height of the next station, confirming that the unmanned aerial vehicle body reaches the next station according to the corrected ascending route.
In one embodiment, in the step of processing the first distance, the second distance, the third distance, the roll angle, and the pitch angle to obtain the current height of the drone body, the current height is obtained based on the following formula:
Z = H·cos(arctan((tan²θ + tan²Φ)^(1/2))) ± L_b·sin(arctan((tan²θ + tan²Φ)^(1/2))) - (L_c - L_c·cos(arctan((tan²θ + tan²Φ)^(1/2))))
wherein Z is the current height; θ is the pitch angle; Φ is the roll angle; H is the first distance; L_b is the second distance; L_c is the third distance.
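A minimal sketch of this height computation follows. Angles are assumed to be in radians, and a `sign` argument stands in for the ± term, since the source does not state which branch applies for a given swing-arm orientation; the function name is illustrative.

```python
import math


def current_height(h, l_b, l_c, theta, phi, sign=+1):
    """Tilt-compensated height of the drone body above the pit.

    h:          first distance (lidar range to the pit floor)
    l_b:        second distance (beam centre to the radiating face of the
                lower right-angle prism)
    l_c:        third distance (swing-arm axis intersection to the beam centre)
    theta, phi: pitch and roll in radians
    sign:       +1 or -1, selecting the branch of the ± term
    """
    tilt = math.atan(math.sqrt(math.tan(theta) ** 2 + math.tan(phi) ** 2))
    return (h * math.cos(tilt)
            + sign * l_b * math.sin(tilt)
            - (l_c - l_c * math.cos(tilt)))
```

In level flight (zero roll and pitch) every correction term drops out and the lidar range is returned unchanged.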
On one hand, the embodiment of the invention also provides a hoistway inspection navigation device of the unmanned aerial vehicle, which comprises:
the initial position information acquisition module is used for outputting a rising instruction and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
the position deviation acquiring module is used for acquiring current position information of the unmanned aerial vehicle body in the flight process, and processing the initial position information and the current position information to obtain position deviation;
the correction module is used for correcting the flight path of the unmanned aerial vehicle body according to the position deviation;
the acquisition module is used for detecting that the unmanned aerial vehicle body arrives at the next station according to the corrected flight path, and entering the acquisition flow of point cloud data and image data of the current station; outputting a takeoff instruction in the case that the acquisition flow is completed, until all stations have completed the acquisition flow; the takeoff instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station.
On one hand, the embodiment of the invention also provides the unmanned aerial vehicle which comprises an unmanned aerial vehicle body, a memory and a processor, wherein the memory and the processor are arranged on the unmanned aerial vehicle body, the memory stores a computer program, and the processor realizes the steps of any one of the methods when executing the computer program.
In one embodiment, the unmanned aerial vehicle further comprises a laser radar, a photoelectric position sensor, an inertial measurement unit and image acquisition equipment which are arranged on the unmanned aerial vehicle body;
the processor is respectively connected with the laser radar, the photoelectric position sensor, the inertia measurement unit and the image acquisition equipment.
In another aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of any one of the methods described above.
One of the above technical solutions has the following advantages and beneficial effects:
the application provides an unmanned aerial vehicle's well routing inspection navigation method, through acquireing initial position information in initial website to will be handled at the current position information and the initial position information of flight in-process unmanned aerial vehicle body, obtain positional deviation. And correcting the ascending route of the unmanned aerial vehicle body according to the position deviation, so that the unmanned aerial vehicle can vertically ascend in the hoistway without colliding with the four walls of the hoistway. When the unmanned aerial vehicle body arrives at the next station, then get into and gather the flow until all stations have all accomplished the collection flow. By the method, the unmanned aerial vehicle can finish self-service inspection navigation under the condition that the GPS signal of the shaft is weak.
Drawings
The foregoing and other objects, features and advantages of the application will be apparent from the following more particular description of preferred embodiments of the application, as illustrated in the accompanying drawings. Like reference numerals refer to like parts throughout the drawings, and the drawings are not intended to be drawn to scale in actual dimensions, emphasis instead being placed upon illustrating the subject matter of the present application.
Fig. 1 is a schematic flow chart of a hoistway inspection navigation method of an unmanned aerial vehicle according to an embodiment;
FIG. 2 is a schematic flow chart of the steps to obtain a position offset in one embodiment;
fig. 3 is a schematic flow chart illustrating a step of acquiring initial position information of an unmanned aerial vehicle body in one embodiment;
fig. 4 is a first flowchart illustrating the step of obtaining the current horizontal position of the drone body during flight in one embodiment;
fig. 5 is a second flowchart illustrating the step of obtaining the current horizontal position of the drone body during flight in one embodiment;
FIG. 6 is a schematic flow chart of a return journey process in one embodiment;
fig. 7 is a schematic flow chart illustrating steps of detecting that the main body of the drone arrives at a next station according to a corrected ascending route in one embodiment;
fig. 8 is a block diagram of a hoistway inspection navigation device of the unmanned aerial vehicle in one embodiment;
FIG. 9 is a front view of a drone in one embodiment;
fig. 10 is a left side view of the drone in one embodiment;
fig. 11 is a three-view of a radar pan-tilt structure of an unmanned aerial vehicle in an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, there is provided a hoistway inspection navigation method for an unmanned aerial vehicle, including the steps of:
s110, outputting a rising instruction, and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
wherein, the instruction of rising is used for instructing the vertical (perpendicular) flight that rises of unmanned aerial vehicle body. Specifically, the ascending instruction can be output to a propeller motor of the unmanned aerial vehicle. The initial station is the first station that the unmanned aerial vehicle body arrived, the position when also being the unmanned aerial vehicle body has not broken away from the pit yet. The initial position information can be used as a reference to judge the positions of the rest stations. Note that the height at which each station is located is different. The station positions are generally preset according to a hoistway design drawing and can be upper and lower frames of each door opening, upper and lower frames of a ring beam and the like. The initial position information may be any data representing position information in the art.
Specifically, whether the unmanned aerial vehicle body reaches the initial station can be detected by any means in the field. In a specific example, the height of the unmanned aerial vehicle body can be obtained through a laser radar, and whether the unmanned aerial vehicle body reaches the initial station or not is confirmed according to the height. In another embodiment, whether the initial station is reached can also be confirmed by setting an identifier at each station and identifying the identifier.
It should be noted that, before outputting the ascending instruction, the drone needs to be instructed to enter the initialization and self-inspection steps.
S120, acquiring current position information of the unmanned aerial vehicle body in the flight process, and processing the initial position information and the current position information to obtain position deviation;
specifically, the current position information of the unmanned aerial vehicle body can be the real-time position information of the unmanned aerial vehicle body in the flight process. The current position information can be obtained through any means in the field, for example, the positions of the four walls of the shaft can be obtained through a laser radar, and then the current position information of the unmanned aerial vehicle body is obtained. Laser radar locates on the unmanned aerial vehicle body, and each cross section size of well is invariable, and the length and width height of each cross section of well and the size of relative cross section central point put are invariable also. On this basis, can reflect the current position information of unmanned aerial vehicle body according to the position of well wall. For another example: be equipped with photoelectric position sensor on the unmanned aerial vehicle body, be equipped with laser emission device at the well pit, the light beam that laser emission device sent can be received by photoelectric position sensor's response face to output corresponding positional information. Under the condition that the unmanned aerial vehicle body removed, the light beam that laser emission device sent was received by the different positions of response face, output different positional information. Therefore, the current position information of the unmanned aerial vehicle body can be reflected through the position information output by the photoelectric position sensor. In one specific example, the current position information and the initial position information are both horizontal position information.
Further, the current position information and the initial position information may be processed by any means in the art to obtain the position deviation. For example, when the current position information and the initial position information are represented by coordinates, the position deviation may be represented by an X-axis coordinate difference and a Y-axis coordinate difference. For another example, the distance between the current position and the initial position may be obtained according to the current position information and the initial position information, and the position deviation may be represented by the distance.
S130, correcting the ascending route of the unmanned aerial vehicle body according to the position deviation;
in particular, the ascending path may be modified by a position offset. In one example, if the position deviation is greater than the set value, the ascending route of the unmanned aerial vehicle body is adjusted to maintain the position deviation within the set value.
S140, detecting that the unmanned aerial vehicle body arrives at the next station according to the corrected ascending route, and enters the acquisition flow of point cloud data and image data of the current station; outputting a takeoff instruction until all ascending stations complete the acquisition process under the condition that the acquisition process is completed; the takeoff instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station.
The point cloud data may include, among other things, the geometric location and color information of the current site.
Specifically, the image data of the well station can be collected through the image collecting equipment, and the point cloud data of the well can be collected through the laser radar. Whether the unmanned aerial vehicle body reaches the next station according to the corrected ascending route can be detected by any means in the field. It should be noted that, in the case of completing the acquisition process, a takeoff instruction may be output until the distance from the top layer is less than the preset distance, that is, the distance from the top layer is used as a condition for finishing the inspection.
In a specific example, the current height may be directly obtained by the distance detection sensor, and when the current height is the same as the height of the next station, it is determined that the unmanned aerial vehicle body arrives at the next station according to the corrected ascending route. In another specific example, the step of detecting that the unmanned aerial vehicle body arrives at the next station according to the corrected ascending route includes: acquiring the attitude variation of the unmanned aerial vehicle body through an inertia measurement unit; the attitude variation comprises a roll angle and a pitch angle; acquiring a first distance between the unmanned aerial vehicle body and a pit, a second distance between the beam center of the laser radar and a radiation surface of a lower right-angle emission prism and a third distance between an intersection point of rotation axes of all swing arms and the beam center, wherein the first distance is output by the laser radar; the lower right-angle emission prism and the laser radar are both arranged on any swing arm; the swing arm is arranged on the unmanned aerial vehicle body and used for swinging the laser radar; processing the first distance, the second distance, the third distance, the roll angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body; and if the current height is the same as the height of the next station, confirming that the unmanned aerial vehicle body reaches the next station according to the corrected ascending route. It should be noted that, in the step of processing the first distance, the second distance, the third distance, the roll angle, and the pitch angle to obtain the current height of the unmanned aerial vehicle body, the current height is obtained based on the following formula:
Z = H·cos(arctan((tan²θ + tan²Φ)^(1/2))) ± L_b·sin(arctan((tan²θ + tan²Φ)^(1/2))) - (L_c - L_c·cos(arctan((tan²θ + tan²Φ)^(1/2))))
wherein Z is the current height; θ is the pitch angle; Φ is the roll angle; H is the first distance; L_b is the second distance; L_c is the third distance.
It should be noted that the flow of acquiring point cloud data and image data of the current station may be any one of the flow of acquiring point cloud data and image data in the field. When the collection process is completed, a takeoff instruction is output for indicating the unmanned aerial vehicle body to move to the next station. And circulating the actions of moving from the current station to the next station until all the ascending stations finish the acquisition process. The ascending station is a station which needs to perform an acquisition process in the ascending process.
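The station-by-station cycle described above can be sketched as a simple loop. The `drone` object and its methods are purely illustrative stand-ins for the takeoff instruction, the acquisition flow and the return journey flow; none of these names come from the disclosure.

```python
def inspect_hoistway(stations, drone):
    """Ascend through the hoistway station by station, collecting data at
    each, then enter the return journey flow (a minimal sketch)."""
    for station in stations:        # ascending stations, bottom to top
        drone.ascend_to(station)    # fly the corrected ascending route
        drone.collect()             # point cloud + image acquisition flow
    # all ascending stations have completed the acquisition flow
    drone.begin_return()
```

The same structure applies to the descending pass, with the landing flow in place of the return journey.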
According to the hoistway inspection navigation method of the unmanned aerial vehicle, initial position information is obtained at the initial station, and during flight the current position information and the initial position information of the unmanned aerial vehicle body are processed to obtain the position deviation. The ascending route of the body is corrected according to the position deviation, so that the unmanned aerial vehicle ascends vertically in the hoistway without colliding with its four walls. When the body arrives at the next station, it enters the acquisition flow, until all stations have completed the acquisition flow. By this method, the unmanned aerial vehicle can complete autonomous inspection navigation even when the GPS signal in the hoistway is weak.
In one embodiment, the current location information includes a first current horizontal coordinate of the hoistway wall relative to the drone body; the initial position information comprises first initial horizontal coordinates of four walls of the shaft relative to the unmanned aerial vehicle body;
the step of processing the initial position information and the current position information to obtain the position deviation comprises the following steps:
and confirming the coordinate difference value of the first current horizontal coordinate and the first initial horizontal coordinate as the position deviation.
Wherein, the first current horizontal coordinate of the hoistway wall relative to the unmanned aerial vehicle body may take the body as origin, or another reference (such as a working origin); it is the real-time horizontal coordinate during flight. The first initial horizontal coordinate of the hoistway wall relative to the body may likewise take the body or another reference as origin; it is the horizontal coordinate when the body is at the initial station.
Specifically, the coordinate difference between the X-axis coordinate of the first current horizontal coordinate and the X-axis coordinate of the first initial horizontal coordinate, and the coordinate difference between the Y-axis coordinate of the first current horizontal coordinate and the Y-axis coordinate of the first initial horizontal coordinate are determined as the positional deviation.
In one embodiment, as shown in fig. 2, the step of processing the initial position information and the current position information to obtain the position deviation comprises:
s210, acquiring first current horizontal coordinates within a preset time length, and obtaining average horizontal coordinates according to the first current horizontal coordinates;
specifically, each first current horizontal coordinate within a preset time length is obtained, and an average horizontal coordinate of each first current horizontal coordinate is calculated.
And S220, confirming the coordinate difference value of the average horizontal coordinate and the first initial horizontal coordinate as the position deviation.
Specifically, the coordinate difference of the X-axis coordinate of the average horizontal coordinate and the X-axis coordinate of the first initial horizontal coordinate, and the coordinate difference of the Y-axis coordinate of the average horizontal coordinate and the Y-axis coordinate of the first initial horizontal coordinate are determined as the positional deviation.
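A minimal sketch of steps S210 and S220, assuming the first current horizontal coordinates sampled over the preset duration have already been collected into a list (the function name is illustrative):

```python
def position_deviation(samples, initial):
    """Average the first current horizontal coordinates sampled over the
    preset duration, then subtract the first initial horizontal coordinate.

    samples: list of (x, y) readings taken during the preset duration
    initial: (x0, y0), the first initial horizontal coordinate
    """
    avg_x = sum(x for x, _ in samples) / len(samples)
    avg_y = sum(y for _, y in samples) / len(samples)
    return avg_x - initial[0], avg_y - initial[1]
```

Averaging before differencing damps out per-scan lidar noise, which is presumably why this embodiment exists alongside the single-sample one.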
In one embodiment, as shown in fig. 3, the step of obtaining the initial position information of the drone body includes:
s310, obtaining an initial distance value and an initial scanning angle returned by the beam light in a scanning period through a laser radar;
specifically, when the unmanned aerial vehicle body is located at the initial station, the laser radar emits beam light to the surroundings, and the initial distance value and the initial scanning angle returned by any line of beam light in one scanning period are obtained.
S320, obtaining a first initial horizontal coordinate according to the initial distance value and the initial scanning angle;
specifically, the first initial horizontal coordinate may be obtained by the following formula:
X1i = r1i·cos εi
Y1i = r1i·sin εi
wherein ri and εi are respectively the distance value and the scanning angle value returned by each beam of light of the laser radar in one scanning period, and r1i is the distance value returned by each beam of light of the laser radar at the initial station in one scanning period.
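The polar-to-Cartesian conversion of the two formulas above can be sketched as follows (a minimal illustration; the function name is an assumption):

```python
import math

def initial_horizontal_coords(r_1i, eps_i):
    """Polar-to-Cartesian conversion of a lidar return at the initial
    station: X1i = r1i*cos(eps_i), Y1i = r1i*sin(eps_i)."""
    return r_1i * math.cos(eps_i), r_1i * math.sin(eps_i)

# A 2.0 m return at a 30-degree scan angle
x, y = initial_horizontal_coords(2.0, math.pi / 6)
```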
As shown in fig. 4, the step of obtaining the current horizontal position of the unmanned aerial vehicle body in the flight process includes:
s410, obtaining a current distance value returned by the beam light in one scanning period through the laser radar;
specifically, when the unmanned aerial vehicle body is located at the next station, the laser radar emits beam light to the surroundings and obtains the current distance value returned by any line of beam light in one scanning period.
S420, acquiring the attitude variation of the unmanned aerial vehicle body through an inertia measurement unit; the attitude variation comprises a roll angle and a pitch angle;
and S430, obtaining a first current horizontal coordinate according to the current distance value, the roll angle and the pitch angle.
Specifically, the first current horizontal coordinate may be obtained by the following formula:
Xti = rti·cos(arctan((tan²θt + tan²Φt)^(1/2)))·cos εi
Yti = rti·cos(arctan((tan²θt + tan²Φt)^(1/2)))·sin εi
wherein rti is the current distance value returned by each beam of light of the laser radar at the t-th station in one scanning period; θt is the pitch angle of the unmanned aerial vehicle body at the t-th station; and Φt is the roll angle of the unmanned aerial vehicle body at the t-th station.
Based on this, the positional deviation is:
△Xti = rti·cos(arctan((tan²θt + tan²Φt)^(1/2)))·cos εi − r1i·cos εi
△Yti = rti·cos(arctan((tan²θt + tan²Φt)^(1/2)))·sin εi − r1i·sin εi
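The tilt-compensated projection and the resulting per-beam position deviation can be sketched as follows (a minimal illustration; function names are assumptions, not part of the disclosure):

```python
import math

def tilt_factor(theta_t, phi_t):
    """cos(arctan(sqrt(tan^2(theta) + tan^2(phi)))): projects a range
    measured by the tilted lidar onto the horizontal plane."""
    return math.cos(math.atan(math.hypot(math.tan(theta_t), math.tan(phi_t))))

def beam_deviation(r_ti, r_1i, eps_i, theta_t, phi_t):
    """Per-beam position deviation (dX_ti, dY_ti) between the current
    station and the initial station, per the formulas above."""
    k = tilt_factor(theta_t, phi_t)
    return ((r_ti * k - r_1i) * math.cos(eps_i),
            (r_ti * k - r_1i) * math.sin(eps_i))
```

With zero roll and pitch the tilt factor is 1 and the deviation reduces to the plain range difference projected onto the scan angle.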
in one embodiment, the initial position information includes a second initial horizontal coordinate of the drone body relative to the laser emitting device; the laser emitting device is arranged in a well pit;
the step of obtaining the initial position information of the unmanned aerial vehicle body includes:
acquiring initial position coordinates transmitted by the photoelectric position sensor, and determining the initial position coordinates as the second initial horizontal coordinates; the photoelectric position sensor is arranged on the unmanned aerial vehicle body; the initial position coordinates are obtained by the photoelectric position sensor sensing the laser emitted by the laser emitting device when the unmanned aerial vehicle body has arrived at the initial station.
Specifically, the photoelectric position sensor is arranged on the unmanned aerial vehicle body, and the laser emitting device is arranged in a well pit; a beam of light emitted by the laser emitting device upwardly from the hoistway pit may be received by a sensing surface of the photoelectric position sensor, the sensing surface being responsive to the beam and outputting initial position coordinates. In one specific example, the initial horizontal coordinates may be confirmed as (0,0), that is, the initial position information may be confirmed as the reference coordinates.
In one embodiment, the current position information includes a second current horizontal coordinate of the drone body relative to the laser emitting device;
as shown in fig. 5, the step of obtaining the current horizontal position of the unmanned aerial vehicle body in the flight process includes:
s510, acquiring current position coordinates transmitted by the photoelectric position sensor, and acquiring the attitude variation of the unmanned aerial vehicle body through the inertial measurement unit; the attitude variation comprises a roll angle and a pitch angle; the current position coordinate is obtained by a photoelectric position sensor responding to laser emitted by a laser emitting device in the flight process;
specifically, if the unmanned aerial vehicle body moves, the laser irradiates a different position on the sensing surface of the photoelectric position sensor, and therefore the current position coordinates transmitted by the photoelectric position sensor differ.
S520, obtaining a distance value between the center of gravity of the unmanned aerial vehicle body and a sensing surface of the photoelectric position sensor;
wherein, the distance value between the center of gravity of the unmanned aerial vehicle body and the sensing surface of the photoelectric position sensor may be a preset value, stored in advance in the memory of the unmanned aerial vehicle body or elsewhere, and called directly when needed.
And S530, processing the current position coordinate, the distance value, the roll angle and the pitch angle to obtain a second current horizontal coordinate.
In one embodiment, in the step of processing the current position coordinates, the distance value, the roll angle, and the pitch angle to obtain the second current horizontal coordinate, the second current horizontal coordinate is obtained based on the following formula:
Xti = Xbias,t − La·cos(arctan((tan²θt + tan²Φt)^(1/2)))·tan θt
Yti = Ybias,t − La·cos(arctan((tan²θt + tan²Φt)^(1/2)))·tan Φt
wherein Xti is the abscissa of the second current horizontal coordinate; Yti is the ordinate of the second current horizontal coordinate; Xbias,t and Ybias,t are the current position coordinates transmitted by the photoelectric position sensor; La is the distance value; θt is the pitch angle; and Φt is the roll angle.
Note that, when the second initial horizontal coordinate is set as the reference value, that is, (0,0), the second current horizontal coordinate is itself the position deviation.
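The lever-arm correction above can be sketched as follows (variable names are assumptions; a minimal illustration, not the disclosed implementation):

```python
import math

def second_current_coords(x_bias_t, y_bias_t, l_a, theta_t, phi_t):
    """Second current horizontal coordinate from the laser-spot position
    reported by the photoelectric position sensor, corrected for the
    lever arm La between the UAV centre of gravity and the sensing
    surface (names are assumptions for illustration)."""
    k = math.cos(math.atan(math.hypot(math.tan(theta_t), math.tan(phi_t))))
    return (x_bias_t - l_a * k * math.tan(theta_t),
            y_bias_t - l_a * k * math.tan(phi_t))
```

When roll and pitch are both zero, the correction terms vanish and the sensor reading is used directly; with the initial coordinate at (0,0), this result is also the position deviation.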
In one embodiment, the step of processing the initial location information and the current location information to obtain the location offset comprises:
and confirming the coordinate difference value of the second current horizontal coordinate and the second initial horizontal coordinate as the position deviation.
Specifically, the coordinate difference between the X-axis coordinate of the second current horizontal coordinate and the X-axis coordinate of the second initial horizontal coordinate, and the coordinate difference between the Y-axis coordinate of the second current horizontal coordinate and the Y-axis coordinate of the second initial horizontal coordinate, are determined as the position deviation.
In one embodiment, the method further comprises the steps of:
and entering a return journey flow under the condition that all ascending stations finish the acquisition flow.
Specifically, the return journey process may be landing by returning to the takeoff point.
In one embodiment, as shown in fig. 6, the return journey process includes:
s610, inputting a descending instruction, and correcting a descending route of the unmanned aerial vehicle body according to the position deviation;
specifically, under the condition that all ascending stations finish the acquisition process, a descending instruction is input, and the descending route of the unmanned aerial vehicle body is corrected according to the output position deviation.
S620, detecting that the unmanned aerial vehicle body arrives at the next station according to the corrected descending route, and entering the acquisition flow of point cloud data and image data of the current station; and under the condition of completing the acquisition process, outputting a descending instruction until all the stations complete the acquisition process; the descending instruction is used for indicating the unmanned aerial vehicle body to move from the current station to the next station;
specifically, all stations can be acquired again on the descending route in order of height, so as to facilitate verification. Furthermore, new stations can be added to the original stations.
And S630, entering a landing process under the condition that all descending stations finish the acquisition process.
Specifically, the landing procedure may be any one of the landing procedures in the art, and is not limited herein.
In one embodiment, as shown in fig. 7, the step of detecting that the main body of the drone arrives at the next station according to the corrected ascending route includes:
s710, acquiring the attitude variation of the unmanned aerial vehicle body through an inertia measurement unit; the attitude variation comprises a roll angle and a pitch angle;
specifically, the inertial measurement unit may be a 9-axis MEMS inertial measurement unit, including a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer.
S720, acquiring a first distance between the unmanned aerial vehicle body and a pit, a second distance between the beam center of the laser radar and the radiation surface of the lower right-angle emission prism and a third distance between the intersection point of the rotation axis of each swing arm and the beam center, wherein the first distance is output by the laser radar; the lower right-angle emission prism and the laser radar are both arranged on any swing arm; the swing arm is arranged on the unmanned aerial vehicle body and used for swinging the laser radar;
specifically, the lower right-angle emission prism of the laser radar can reflect the beam of the laser radar to the pit of the hoistway, so that the height of the unmanned aerial vehicle relative to the pit can be measured. Still further, an upper right-angle emission prism is also provided; the upper right-angle emission prism can reflect the beam of the laser radar to the top of the hoistway, and is used for measuring the height of the unmanned aerial vehicle body relative to the top. The position information of each station may be referenced to the pit or to the top.
S730, processing the first distance, the second distance, the third distance, the roll angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body;
specifically, in the step of processing the first distance, the second distance, the third distance, the roll angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body, the current height is obtained based on the following formula:
Z = H·cos(arctan((tan²θ + tan²Φ)^(1/2))) ± Lb·sin(arctan((tan²θ + tan²Φ)^(1/2))) − (Lc − Lc·cos(arctan((tan²θ + tan²Φ)^(1/2))))
wherein Z is the current height; θ is the pitch angle; Φ is the roll angle; H is the first distance; Lb is the second distance; and Lc is the third distance.
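The height formula can be sketched numerically as follows (the choice of the ± sign depends on the prism geometry, so it is taken here as a parameter; names are assumptions):

```python
import math

def current_height(h, l_b, l_c, theta, phi, sign=1.0):
    """Current height Z from the lidar-to-pit range H, correcting for
    body tilt and for the offsets Lb (beam centre to prism face) and
    Lc (swing-arm axis intersection to beam centre). The sign of the
    Lb term depends on the prism geometry (assumption)."""
    a = math.atan(math.hypot(math.tan(theta), math.tan(phi)))
    return h * math.cos(a) + sign * l_b * math.sin(a) - (l_c - l_c * math.cos(a))
```

With zero tilt the correction terms vanish and Z equals the raw range H, as expected from the formula.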
And S740, if the current height is the same as the height of the next station, confirming that the unmanned aerial vehicle body reaches the next station according to the corrected ascending route.
Specifically, if the current height of the unmanned aerial vehicle is the same as the height of the next station, the unmanned aerial vehicle body is confirmed to reach the next station.
It should be understood that although the various steps in the flow charts of fig. 1-7 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the performance of these steps is not strictly limited in order, and they may be performed in other orders. Moreover, at least some of the steps in fig. 1-7 may include multiple sub-steps or multiple stages that are not necessarily performed at the same moment, but may be performed at different moments; the order of their performance is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 8, there is provided a hoistway inspection navigation device of an unmanned aerial vehicle, including:
the initial position information acquisition module is used for outputting a rising instruction and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
the position deviation acquiring module is used for acquiring current position information of the unmanned aerial vehicle body in the flight process, and processing the initial position information and the current position information to obtain position deviation;
the correction module is used for correcting the flight path of the unmanned aerial vehicle body according to the position deviation;
the acquisition module is used for detecting that the unmanned aerial vehicle body arrives at the next station according to the corrected flight path and enters the acquisition process of point cloud data and image data of the current station; outputting a takeoff instruction until all the stations complete the acquisition process under the condition that the acquisition process is completed; the takeoff instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station.
In one embodiment, the position deviation acquiring module further includes:
and the first position deviation acquiring module is used for confirming the coordinate difference value of the first current horizontal coordinate and the first initial horizontal coordinate as the position deviation.
In one embodiment, the position deviation acquiring module further includes:
the second position deviation acquiring module is used for acquiring first current horizontal coordinates within a preset time length and acquiring average horizontal coordinates according to the first current horizontal coordinates; and confirming the coordinate difference value of the average horizontal coordinate and the first initial horizontal coordinate as the position deviation.
For specific definition of the hoistway inspection navigation device of the unmanned aerial vehicle, reference may be made to the above definition of the hoistway inspection navigation method of the unmanned aerial vehicle, and details are not repeated here. Each module in the shaft inspection navigation device of the unmanned aerial vehicle can be completely or partially realized through software, hardware and a combination of the software and the hardware. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, there is provided a drone, including a drone body, a memory provided on the drone body, and a processor, where the memory stores a computer program, and the processor implements the steps of any one of the above methods when executing the computer program.
In one embodiment, the unmanned aerial vehicle further comprises a laser radar, a photoelectric position sensor, an inertial measurement unit and image acquisition equipment which are arranged on the unmanned aerial vehicle body;
the processor is respectively connected with the laser radar, the photoelectric position sensor, the inertia measurement unit and the image acquisition equipment.
Specifically, the image acquisition device may be a camera; the photoelectric position sensor is used for receiving laser of the laser emitting device arranged in the pit and outputting position information.
In order to further explain the unmanned aerial vehicle of the present application, the following description is further made with reference to a specific example:
The unmanned aerial vehicle includes a propeller 1, an unmanned aerial vehicle body 2, a fixed landing gear 3, an upper emitting right-angle prism 4, a lower emitting right-angle prism 5, a two-dimensional laser radar 6, a radar pan-tilt 7, an HDR camera 8, a camera pan-tilt 9, an IMU module 10, an onboard processor 11, a power supply 12, a wireless image transmission and communication module 13, an SOS module 14, an RC module 15 (not shown), and a laser alignment system 16. The main structure is shown in figures 9, 10 and 11.
The overall layout of the unmanned aerial vehicle ensures that the center of gravity of the unmanned aerial vehicle is positioned at the geometric center as far as possible.
The upper/lower right-angle emitting prisms 4/5 are, like the laser radar 6, arranged on the swing arm 7d of the pan-tilt 7, so that their positions always remain consistent relative to the radar; they reflect a small part of the beams of the two-dimensional laser radar 6 to the top/pit of the hoistway, and are used for measuring the height of the unmanned aerial vehicle relative to the top surface/pit.
The two-dimensional laser radar 6 is fixed on the swing arm 7d of the pan-tilt 7, and its center of gravity is adjusted so as to pass through the axes of the two swing-arm motors. In the initial state, at the work origin, the laser radar center is adjusted to coincide with the center of gravity of the unmanned aerial vehicle in the Z-axis direction.
The radar pan-tilt 7 is a mechanical framework which is arranged on a bottom plate of the unmanned aerial vehicle and used for mounting the laser radar 6, and mainly comprises a lifting device 7a, a support 7b, a swing arm 7c and a swing arm 7 d. The lifting device 7a enables the support 7b to vertically lift, and the vertical height of the holder is finely adjusted. The swing arm 7c can rotate left and right around the support 7b, the swing arm 7d can rotate front and back around the swing arm 7c, and each axis is provided with a motor. The radar 6 is mounted on a swing arm 7d to swing with it.
The camera pan-tilt 9 is a mechanical framework mounted on the top of the unmanned aerial vehicle for suspending the HDR camera 8; it mainly consists of a rotating column 9a, a support 9b, a swing arm 9c, and a swing-arm-type camera mounting slot 9d. The support 9b can rotate around the center of the rotating column 9a, so that the camera can shoot through 360° in the same horizontal plane without dead angles. The swing arm 9c rotates left and right around the support 9b, the swing-arm-type camera mounting slot 9d rotates back and forth around the swing arm 9c, and each axis is provided with a motor. The HDR camera 8 is mounted on the swing-arm-type camera mounting slot 9d so as to swing with it.
The IMU module (i.e., the above inertial measurement unit) 10 employs a 9-axis MEMS inertial measurement unit (a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer); it can output three-axis acceleration, three-axis rotational speed, and three-axis geomagnetic field intensity, can output a roll angle Φ, a pitch angle θ, and a yaw angle Ψ without drift, and employs an anti-vibration gyroscope design.
The SOS module 14 sends a distress signal in an emergency by means of a flashing red signal light and ultrasonic waves.
The RC module 15 is used for an operator to take back control of the unmanned aerial vehicle in an emergency; when the unmanned aerial vehicle is controlled and operated manually, or when the SOS module 14 sends a distress signal, manual control brings the unmanned aerial vehicle to the ground.
The laser alignment system 16 includes a laser emitting device 16a and a photoelectric position sensor device 16b. The photoelectric position sensor can adopt an area-array CCD. The laser emitting device 16a is installed in the hoistway pit. The photoelectric position sensor device 16b is mounted on the laser radar, and the center of its target surface is adjusted to coincide with the laser radar center of gravity and with the unmanned aerial vehicle center of gravity in the Z-axis direction; its position relative to the laser radar 6 on the radar pan-tilt 7 thus always remains consistent.
In order to further explain the unmanned aerial vehicle hoistway inspection navigation method, the following specific examples are specifically combined for further explanation:
the method comprises the following steps: establishing a coordinate system
An unmanned aerial vehicle body coordinate system Bcoor is established. The unmanned aerial vehicle is placed horizontally and stably at the center position of the hoistway pit, and the laser beam of the laser emitting device 16a is adjusted to be aligned with the center position of the light spot of the photoelectric position sensor device 16b. Taking the unmanned aerial vehicle center of gravity as the origin (when designing the unmanned aerial vehicle, the geometric center and the center of gravity are made to coincide as far as possible), within the unmanned aerial vehicle plane the X direction defined in the three-axis acceleration output by the IMU (i.e., the above inertial measurement unit) is taken as the positive X-axis of the unmanned aerial vehicle body coordinate system, the direction obtained by rotating it 90° counterclockwise in the unmanned aerial vehicle plane is the positive Y-axis, and the upward direction perpendicular to the unmanned aerial vehicle plane is defined as the positive Z-axis.
A world coordinate system Gcoor with the work origin as its origin is established. The unmanned aerial vehicle body coordinate system Bcoor is translated in the vertical direction to the origin O0(0,0,0) of the world coordinate system Gcoor on the pit plane where the unmanned aerial vehicle works. The coordinate of the laser radar center in the world coordinate system is O1(0,0,Z1) (Z1 being the height of the laser radar above the hoistway pit at the work origin), and the origin of the unmanned aerial vehicle body coordinate system Bcoor in the world coordinate system is O(0,0,ZB) (ZB being the height of the unmanned aerial vehicle center of gravity above the hoistway pit at the work origin). The coordinate system and navigation-related parameters are shown in fig. 9, fig. 10, and fig. 11.
Step two: initialization work
And (5) turning on a power supply, and starting initialization work and unmanned aerial vehicle self-checking work.
Step three: first station workstation data acquisition
At the work origin, that is, where the coordinate of the laser radar center in the world coordinate system is O1(0,0,Z1), the data acquisition of the first station is carried out. At the same time, the photoelectric position sensor records the deviation value (0,0) of the position relative to the initial position of the laser spot.
The lidar data are X1i = r1i·cos εi, Y1i = r1i·sin εi, wherein ri and εi are respectively the distance value and the scanning angle value returned by each beam of light of the laser radar in one scanning period, and r1i is the distance value returned by each beam of light of the laser radar at the first station (namely the initial station) in one scanning period. The height dimension of the data collected by the first station is Z1.
Step four: course positioning in flight process after data acquisition of first station is finished
When the unmanned aerial vehicle displaces, the IMU module 10 outputs a roll angle Φt, a pitch angle θt, and a yaw angle Ψt (all angles counterclockwise positive), the height from the laser radar scanner to the pit is Ht, the photoelectric position sensor device 16b outputs (Xbias,t, Ybias,t), and the laser radar scans the relative position change of the profile of the four walls of the hoistway.
1) The horizontal direction positioning adopts a first method:
the laser radar scans the profile relative position change of the four walls of the shaft to determine whether the unmanned aerial vehicle deviates from the air route.
In the scan matching process, the relative deviation between the coordinates of the data points currently acquired by the laser radar and the data points on the four hoistway walls acquired by the laser radar at the first station is compared; alternatively, the average value of the set of data frames over a period of time is compared with the first-station data to obtain the relative deviation.
△Xti = rti·cos(arctan((tan²θt + tan²Φt)^(1/2)))·cos εi − r1i·cos εi
△Yti = rti·cos(arctan((tan²θt + tan²Φt)^(1/2)))·sin εi − r1i·sin εi
2) The horizontal direction positioning adopts a second method:
△Xti = Xbias,t − La·cos(arctan((tan²θt + tan²Φt)^(1/2)))·tan θt
△Yti = Ybias,t − La·cos(arctan((tan²θt + tan²Φt)^(1/2)))·tan Φt
wherein La is the distance from the center of gravity of the unmanned aerial vehicle to the sensing surface of the photoelectric position sensor device 16b at the work origin.
(△Xti, △Yti) is the estimated value of the deviation of the unmanned aerial vehicle from the flight path (i.e., the position deviation).
Step five: hovering second station data acquisition
When the laser radar reaches the set acquisition position Z2,est, the unmanned aerial vehicle hovers for shooting and laser radar data acquisition. The IMU module 10 outputs roll angle Φ2, pitch angle θ2, and yaw angle Ψ2 (all angles counterclockwise positive), and the laser radar scanner outputs the height H2 to the pit.
Z2,est = H2·cos(arctan((tan²θ2 + tan²Φ2)^(1/2))) ± Lb·sin(arctan((tan²θ2 + tan²Φ2)^(1/2))) − (Lc − Lc·cos(arctan((tan²θ2 + tan²Φ2)^(1/2))))
wherein Lb is the distance from the center of the laser radar beam to the radiating surface of the lower right-angle emitting prism 5, and Lc is the distance from the intersection point of the rotation axes of the swing arm 7c and the swing arm 7d to the center of the laser radar beam.
After hovering, the motors of the radar pan-tilt 7 and the camera pan-tilt 9 apply power in the corresponding directions to prevent the laser radar and the camera from tilting and jittering along with the unmanned aerial vehicle. That is, the swing arm 7b rotates left and right around the bracket 7a by −Φ2, and the swing arm 7c rotates back and forth around the swing arm 7b by −θ2; the swing arm 9c rotates around the bracket 9b by −Φ2, and the swing-arm-type camera mounting slot 9d rotates back and forth around the swing arm 9c by −θ2. That is, after the camera and the radar have recovered the initialized pose information, data acquisition is started.
At this moment, the photoelectric position sensor records the deviation value (X2,bias, Y2,bias) of the position relative to the initial position of the laser spot, and the precise distance Z2,precise from the laser radar 6 to the pit is measured through the lower emitting right-angle prism 5.
The lidar data are X2i = r2i·cos εi + X2,bias, Y2i = r2i·sin εi + Y2,bias, wherein r2i is the distance value returned by each beam of light of the second-station laser radar in one scanning period. The height of the second station for data acquisition is Z2,precise.
Meanwhile, the rotating column 9a in the camera pan-tilt 9 rotates 360 degrees to acquire image data of the well.
After data acquisition is finished, the swing arm 7b rotates left and right around the bracket 7a by Φ2, and the swing arm 7c rotates back and forth around the swing arm 7b by θ2; the swing arm 9c rotates around the bracket 9b by Φ2, and the swing-arm-type camera mounting slot 9d rotates back and forth around the swing arm 9c by θ2. The camera and the radar thus restore their pose unchanged relative to the unmanned aerial vehicle body.
Step six: repeated taking off and suspension
After the data acquisition of the second station is finished, the unmanned aerial vehicle takes off again, and hovers at the fixed point when it reaches the set acquisition position Zn,est.
The IMU module 10 outputs roll angle Φn, pitch angle θn, and yaw angle Ψn (all angles counterclockwise positive), and the laser radar scanner outputs the height Hn to the pit.
The n-th station hover height estimate is Zn,est = Hn·cos(arctan((tan²θn + tan²Φn)^(1/2))) ± Lb·sin(arctan((tan²θn + tan²Φn)^(1/2))) − (Lc − Lc·cos(arctan((tan²θn + tan²Φn)^(1/2))))
After hovering, the motors of the radar pan-tilt 7 and the camera pan-tilt 9 apply power in the corresponding directions, so that the laser radar and the camera are prevented from tilting and jittering along with the unmanned aerial vehicle. That is, the swing arm 7b rotates left and right around the bracket 7a by −Φn, and the swing arm 7c rotates back and forth around the swing arm 7b by −θn; the swing arm 9c rotates around the bracket 9b by −Φn, and the swing-arm-type camera mounting slot 9d rotates back and forth around the swing arm 9c by −θn. That is, after the camera and the radar have recovered the initialized pose information, data acquisition is started.
The precise distance Zn,precise from the laser radar 6 to the pit is measured through the lower emitting right-angle prism 5; that is, the height of the data collected at the n-th station is Zn,precise, and the lidar data are Xni = rni·cos εi + Xn,bias and Yni = rni·sin εi + Yn,bias, where (Xn,bias, Yn,bias) is the deviation value, recorded by the photoelectric position sensor, of the n-th station laser radar position relative to the initial position of the laser spot.
Meanwhile, the rotating column 9a in the camera pan-tilt 9 rotates 360 degrees to acquire image data of the well.
After data acquisition is finished, the swing arm 7b rotates left and right around the bracket 7a by Φn, and the swing arm 7c rotates back and forth around the swing arm 7b by θn; the swing arm 9c rotates around the bracket 9b by Φn, and the swing-arm-type camera mounting slot 9d rotates back and forth around the swing arm 9c by θn. The camera and the radar thus restore their pose unchanged relative to the unmanned aerial vehicle body.
Taking off and hovering at the fixed point are repeated until all stations have finished acquisition, or until the preset safe distance ZS,est from the top floor (this value is estimated from the output of the upper right-angle emitting prism 4 fused with the IMU) is reached, at which point the last acquisition is performed.
Step seven: return landing
Data acquisition work during the return journey may also be added, or a direct return landing may be carried out. During a direct return landing, or during an emergency landing under abnormal conditions, navigation with an obstacle avoidance strategy can be carried out by relying on the laser radar.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
outputting a rising instruction, and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
acquiring current position information of an unmanned aerial vehicle body in a flight process, and processing the initial position information and the current position information to obtain position deviation;
correcting the ascending route of the unmanned aerial vehicle body according to the position deviation;
detecting the acquisition flow of point cloud data and image data of the unmanned aerial vehicle body entering the current station when the unmanned aerial vehicle body arrives at the next station according to the corrected ascending route; outputting a takeoff instruction until all ascending stations complete the acquisition process under the condition that the acquisition process is completed; the takeoff instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station.
In one embodiment, the step of processing the initial position information and the current position information to obtain the position deviation is further performed by the processor to:
and confirming the coordinate difference value of the first current horizontal coordinate and the first initial horizontal coordinate as the position deviation.
In one embodiment, the step of processing the initial position information and the current position information to obtain the position deviation further implements the following when executed by the processor:
acquiring the first current horizontal coordinates within a preset duration, and obtaining an average horizontal coordinate from the first current horizontal coordinates;
determining the coordinate difference between the average horizontal coordinate and the first initial horizontal coordinate as the position deviation.
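As a minimal sketch, the averaging variant just described might look as follows (the function name and sample values are illustrative, not from the patent):

```python
def averaged_deviation(samples, initial):
    """Deviation between the mean of the coordinates sampled over the
    preset duration and the initial horizontal coordinate.

    samples: list of (x, y) first current horizontal coordinates.
    initial: (x, y) first initial horizontal coordinate.
    """
    n = len(samples)
    avg_x = sum(x for x, _ in samples) / n
    avg_y = sum(y for _, y in samples) / n
    return (avg_x - initial[0], avg_y - initial[1])


# Three noisy samples whose mean is (0.10, 0.00); the initial coordinate was (0.05, 0.00).
dev = averaged_deviation([(0.11, 0.02), (0.09, -0.02), (0.10, 0.00)], (0.05, 0.00))
print(dev)   # approximately (0.05, 0.0)
```

Averaging over the preset duration smooths out single-scan sensor noise before the route is corrected.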
In one embodiment, the step of acquiring the initial position information of the unmanned aerial vehicle body further implements the following steps when executed by the processor:
obtaining, through a laser radar, an initial distance value and an initial scanning angle returned by the beam light within one scanning period;
obtaining the first initial horizontal coordinate according to the initial distance value and the initial scanning angle.
In one embodiment, the step of acquiring the current horizontal position of the unmanned aerial vehicle body during flight further implements the following steps when executed by the processor:
obtaining, through the laser radar, a current distance value returned by the beam light within one scanning period;
acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit, the attitude variation comprising a roll angle and a pitch angle;
obtaining the first current horizontal coordinate according to the current distance value, the roll angle and the pitch angle.
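The patent does not spell out this computation, but one plausible reading is: convert the lidar's (distance, scan angle) return to Cartesian form after projecting the slant range onto the horizontal plane with the combined tilt angle arctan(√(tan²θ + tan²Φ)) that also appears in claims 7 and 12. A hedged sketch, illustration only:

```python
import math

def first_current_horizontal(distance, scan_angle, roll, pitch):
    """Illustrative only: horizontal (x, y) of a wall return, given the
    lidar range/angle and the IMU roll and pitch (all angles in radians)."""
    tilt = math.atan(math.sqrt(math.tan(pitch) ** 2 + math.tan(roll) ** 2))
    d_h = distance * math.cos(tilt)          # horizontal component of the range
    return (d_h * math.cos(scan_angle), d_h * math.sin(scan_angle))


# Level flight: no tilt, so a 2 m return at scan angle 0 maps to (2, 0).
print(first_current_horizontal(2.0, 0.0, 0.0, 0.0))   # -> (2.0, 0.0)
```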
In one embodiment, the step of acquiring the initial position information of the unmanned aerial vehicle body further implements the following steps when executed by the processor:
acquiring initial position coordinates transmitted by a photoelectric position sensor, and determining the initial position coordinates as the second initial horizontal coordinate; the photoelectric position sensor is arranged on the unmanned aerial vehicle body; the initial position coordinates are obtained by the photoelectric position sensor, in response to the laser emitted by a laser emitting device, when the unmanned aerial vehicle body arrives at the initial station.
In one embodiment, the step of acquiring the current horizontal position of the unmanned aerial vehicle body during flight further implements the following steps when executed by the processor:
acquiring current position coordinates transmitted by the photoelectric position sensor, and acquiring the attitude variation of the unmanned aerial vehicle body through the inertial measurement unit, the attitude variation comprising a roll angle and a pitch angle; the current position coordinates are obtained by the photoelectric position sensor in response to the laser emitted by the laser emitting device during flight;
acquiring a distance value between the center of gravity of the unmanned aerial vehicle body and the sensing surface of the photoelectric position sensor;
processing the current position coordinates, the distance value, the roll angle and the pitch angle to obtain the second current horizontal coordinate.
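Claim 7 below gives the exact formula for this processing step; a direct implementation follows (angles in radians; the variable names are ours, not the patent's):

```python
import math

def second_current_horizontal(x_t, y_t, la, pitch, roll):
    """Second current horizontal coordinate per the formula in claim 7:
    the sensor-reported position minus the offset caused by the body
    tilting over the sensing surface (la = distance from the center of
    gravity to the sensing surface)."""
    tilt = math.atan(math.sqrt(math.tan(pitch) ** 2 + math.tan(roll) ** 2))
    x_ti = x_t - la * math.cos(tilt) * math.tan(pitch)
    y_ti = y_t - la * math.cos(tilt) * math.tan(roll)
    return (x_ti, y_ti)


# In level flight the tilt terms vanish and the sensor reading passes through.
print(second_current_horizontal(0.12, -0.03, 0.15, 0.0, 0.0))   # -> (0.12, -0.03)
```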
In one embodiment, the step of processing the initial position information and the current position information to obtain the position deviation further implements the following when executed by the processor:
determining the coordinate difference between the second current horizontal coordinate and the second initial horizontal coordinate as the position deviation.
In one embodiment, the computer program, when executed by the processor, further implements the following step:
entering a return journey flow when all ascent stations have completed the acquisition flow.
In one embodiment, the return journey flow further implements the following steps when executed by the processor:
outputting a descent instruction, and correcting the descent route of the unmanned aerial vehicle body according to the position deviation;
upon detecting that the unmanned aerial vehicle body has arrived at the next station along the corrected descent route, entering the acquisition flow of point cloud data and image data for the current station; when the acquisition flow is completed, outputting a descent instruction, until all descent stations have completed the acquisition flow; the descent instruction instructs the unmanned aerial vehicle body to move from the current station to the next station;
entering a landing flow when all descent stations have completed the acquisition flow.
In one embodiment, the step of detecting that the unmanned aerial vehicle body has arrived at the next station along the corrected ascent route further implements the following steps when executed by the processor:
acquiring the attitude variation of the unmanned aerial vehicle body through the inertial measurement unit, the attitude variation comprising a roll angle and a pitch angle;
acquiring a first distance, output by the laser radar, between the unmanned aerial vehicle body and the hoistway pit, a second distance between the beam center of the laser radar and the reflecting surface of a lower right-angle reflecting prism, and a third distance between the intersection point of the rotation axes of the swing arm and the beam center; the lower right-angle reflecting prism and the laser radar are both arranged on the swing arm, and the swing arm is arranged on the unmanned aerial vehicle body and is used to swing the laser radar;
processing the first distance, the second distance, the third distance, the roll angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body;
if the current height is the same as the height of the next station, confirming that the unmanned aerial vehicle body has arrived at the next station along the corrected ascent route.
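Claim 12 below states the height formula used in this step; a direct implementation follows (angles in radians; the ± in the claim is exposed as a `sign` argument, since the applicable sign depends on the swing-arm geometry):

```python
import math

def current_height(h, lb, lc, pitch, roll, sign=1.0):
    """Current height per the formula in claim 12.

    h:  first distance (lidar to the pit)
    lb: second distance (beam center to the prism's reflecting surface)
    lc: third distance (swing-arm rotation axis to the beam center)
    sign: selects the +/- branch of the claim's formula.
    """
    t = math.atan(math.sqrt(math.tan(pitch) ** 2 + math.tan(roll) ** 2))
    return (h * math.cos(t)
            + sign * lb * math.sin(t)
            - (lc - lc * math.cos(t)))


# Level flight: every correction term vanishes and the height equals h.
print(current_height(10.0, 0.05, 0.08, 0.0, 0.0))   # -> 10.0
```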
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus DRAM (RDRAM), and direct Rambus DRAM (DRDRAM).
The technical features of the embodiments described above may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination contains no contradiction, it should be considered to be within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present application, and their descriptions are specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (16)

1. A hoistway inspection navigation method of an unmanned aerial vehicle, characterized by comprising the following steps:
outputting an ascent instruction, and acquiring initial position information of an unmanned aerial vehicle body when the unmanned aerial vehicle body is at an initial station;
acquiring current position information of the unmanned aerial vehicle body during flight, and processing the initial position information and the current position information to obtain a position deviation;
correcting the ascent route of the unmanned aerial vehicle body according to the position deviation;
upon detecting that the unmanned aerial vehicle body has arrived at the next station along the corrected ascent route, entering an acquisition flow of point cloud data and image data for the current station; when the acquisition flow is completed, outputting a takeoff instruction, until all ascent stations have completed the acquisition flow; wherein the takeoff instruction instructs the unmanned aerial vehicle body to move from the current station to the next station.
2. The hoistway inspection navigation method for the unmanned aerial vehicle according to claim 1, wherein the current position information comprises first current horizontal coordinates of the four hoistway walls relative to the unmanned aerial vehicle body, and the initial position information comprises first initial horizontal coordinates of the hoistway walls relative to the unmanned aerial vehicle body;
the step of processing the initial position information and the current position information to obtain the position deviation comprises:
determining the coordinate difference between the first current horizontal coordinate and the first initial horizontal coordinate as the position deviation.
3. The hoistway inspection navigation method for the unmanned aerial vehicle according to claim 2, wherein the step of processing the initial position information and the current position information to obtain the position deviation comprises:
acquiring the first current horizontal coordinates within a preset duration, and obtaining an average horizontal coordinate from the first current horizontal coordinates;
determining the coordinate difference between the average horizontal coordinate and the first initial horizontal coordinate as the position deviation.
4. The hoistway inspection navigation method for the unmanned aerial vehicle according to claim 2, wherein the step of acquiring the initial position information of the unmanned aerial vehicle body comprises:
obtaining, through a laser radar, an initial distance value and an initial scanning angle returned by the beam light within one scanning period;
obtaining the first initial horizontal coordinate according to the initial distance value and the initial scanning angle;
and the step of acquiring the current horizontal position of the unmanned aerial vehicle body during flight comprises:
obtaining, through the laser radar, a current distance value returned by the beam light within one scanning period;
acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit, the attitude variation comprising a roll angle and a pitch angle;
obtaining the first current horizontal coordinate according to the current distance value, the roll angle and the pitch angle.
5. The hoistway inspection navigation method for the unmanned aerial vehicle according to claim 1, wherein the initial position information comprises a second initial horizontal coordinate of the unmanned aerial vehicle body relative to a laser emitting device, the laser emitting device being arranged in the hoistway pit;
the step of acquiring the initial position information of the unmanned aerial vehicle body comprises:
acquiring initial position coordinates transmitted by a photoelectric position sensor, and determining the initial position coordinates as the second initial horizontal coordinate; wherein the photoelectric position sensor is arranged on the unmanned aerial vehicle body, and the initial position coordinates are obtained by the photoelectric position sensor, in response to the laser emitted by the laser emitting device, when the unmanned aerial vehicle body arrives at the initial station.
6. The hoistway inspection navigation method for the unmanned aerial vehicle according to claim 5, wherein the current position information comprises a second current horizontal coordinate of the unmanned aerial vehicle body relative to the laser emitting device;
the step of acquiring the current horizontal position of the unmanned aerial vehicle body during flight comprises:
acquiring current position coordinates transmitted by the photoelectric position sensor, and acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit, the attitude variation comprising a roll angle and a pitch angle; wherein the current position coordinates are obtained by the photoelectric position sensor in response to the laser emitted by the laser emitting device during flight;
acquiring a distance value between the center of gravity of the unmanned aerial vehicle body and the sensing surface of the photoelectric position sensor;
processing the current position coordinates, the distance value, the roll angle and the pitch angle to obtain the second current horizontal coordinate.
7. The unmanned aerial vehicle hoistway inspection tour navigation method according to claim 6, wherein in the step of processing the current position coordinates, the distance value, the roll angle and the pitch angle to obtain the second current horizontal coordinate, the second current horizontal coordinate is obtained based on the following formula:
X_ti = X_bias,t - La · cos(arctan(√(tan²θ_t + tan²Φ_t))) · tan θ_t
Y_ti = Y_bias,t - La · cos(arctan(√(tan²θ_t + tan²Φ_t))) · tan Φ_t
wherein X_ti is the abscissa of the second current horizontal coordinate; Y_ti is the ordinate of the second current horizontal coordinate; X_bias,t and Y_bias,t are the abscissa and ordinate of the current position coordinates; La is the distance value; θ_t is the pitch angle; Φ_t is the roll angle.
8. The hoistway inspection navigation method for the unmanned aerial vehicle according to claim 6, wherein the step of processing the initial position information and the current position information to obtain the position deviation comprises:
determining the coordinate difference between the second current horizontal coordinate and the second initial horizontal coordinate as the position deviation.
9. The hoistway inspection navigation method for the unmanned aerial vehicle according to any one of claims 1 to 8, further comprising the step of:
entering a return journey flow when all ascent stations have completed the acquisition flow.
10. The hoistway inspection navigation method for the unmanned aerial vehicle according to claim 9, wherein the return journey flow comprises the following steps:
outputting a descent instruction, and correcting the descent route of the unmanned aerial vehicle body according to the position deviation;
upon detecting that the unmanned aerial vehicle body has arrived at the next station along the corrected descent route, entering the acquisition flow of point cloud data and image data for the current station; when the acquisition flow is completed, outputting a descent instruction, until all descent stations have completed the acquisition flow; wherein the descent instruction instructs the unmanned aerial vehicle body to move from the current station to the next station;
entering a landing flow when all descent stations have completed the acquisition flow.
11. The hoistway inspection navigation method for the unmanned aerial vehicle according to claim 1, wherein the step of detecting that the unmanned aerial vehicle body has arrived at the next station along the corrected ascent route comprises:
acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit, the attitude variation comprising a roll angle and a pitch angle;
acquiring a first distance, output by the laser radar, between the unmanned aerial vehicle body and the hoistway pit, a second distance between the beam center of the laser radar and the reflecting surface of a lower right-angle reflecting prism, and a third distance between the intersection point of the rotation axes of the swing arm and the beam center; wherein the lower right-angle reflecting prism and the laser radar are both arranged on the swing arm, and the swing arm is arranged on the unmanned aerial vehicle body and is used to swing the laser radar;
processing the first distance, the second distance, the third distance, the roll angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body;
if the current height is the same as the height of the next station, confirming that the unmanned aerial vehicle body has arrived at the next station along the corrected ascent route.
12. The hoistway inspection navigation method for the unmanned aerial vehicle according to claim 11, wherein in the step of obtaining the current height of the unmanned aerial vehicle body by processing the first distance, the second distance, the third distance, the roll angle and the pitch angle, the current height is obtained based on the following formula:
Z = H · cos(arctan(√(tan²θ + tan²Φ))) ± Lb · sin(arctan(√(tan²θ + tan²Φ))) - (Lc - Lc · cos(arctan(√(tan²θ + tan²Φ))));
wherein Z is the current height; θ is the pitch angle; Φ is the roll angle; H is the first distance; Lb is the second distance; Lc is the third distance.
13. A hoistway inspection navigation device of an unmanned aerial vehicle, characterized by comprising:
an initial position information acquisition module, configured to output an ascent instruction and to acquire initial position information of an unmanned aerial vehicle body when the unmanned aerial vehicle body is at an initial station;
a position deviation acquisition module, configured to acquire current position information of the unmanned aerial vehicle body during flight, and to process the initial position information and the current position information to obtain a position deviation;
a correction module, configured to correct the flight route of the unmanned aerial vehicle body according to the position deviation;
an acquisition module, configured to enter an acquisition flow of point cloud data and image data for the current station upon detecting that the unmanned aerial vehicle body has arrived at the next station along the corrected flight route, and to output a takeoff instruction when the acquisition flow is completed, until all stations have completed the acquisition flow; wherein the takeoff instruction instructs the unmanned aerial vehicle body to move from the current station to the next station.
14. An unmanned aerial vehicle comprising an unmanned aerial vehicle body, a memory and a processor, the memory being provided on the unmanned aerial vehicle body and storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 12 when executing the computer program.
15. The unmanned aerial vehicle of claim 14, further comprising a lidar, a photoelectric position sensor, an inertial measurement unit, and an image acquisition device disposed on the unmanned aerial vehicle body;
the processor is respectively connected with the laser radar, the photoelectric position sensor, the inertia measurement unit and the image acquisition equipment.
16. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 12.
CN202011228810.4A 2020-11-06 2020-11-06 Unmanned aerial vehicle well inspection navigation method and device and unmanned aerial vehicle Active CN112327898B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011228810.4A CN112327898B (en) 2020-11-06 2020-11-06 Unmanned aerial vehicle well inspection navigation method and device and unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011228810.4A CN112327898B (en) 2020-11-06 2020-11-06 Unmanned aerial vehicle well inspection navigation method and device and unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN112327898A true CN112327898A (en) 2021-02-05
CN112327898B CN112327898B (en) 2023-08-29

Family

ID=74316246

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011228810.4A Active CN112327898B (en) 2020-11-06 2020-11-06 Unmanned aerial vehicle well inspection navigation method and device and unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN112327898B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113359829A (en) * 2021-06-10 2021-09-07 西安图迹信息科技有限公司 Unmanned aerial vehicle power plant intelligent inspection method based on big data
CN114217626A (en) * 2021-12-14 2022-03-22 集展通航(北京)科技有限公司 Railway engineering detection method and system based on unmanned aerial vehicle inspection video

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104843176A (en) * 2015-04-28 2015-08-19 武汉大学 Unmanned-gyroplane system used for automatic-inspection of bridges and tunnels and navigation method
WO2016065623A1 (en) * 2014-10-31 2016-05-06 SZ DJI Technology Co., Ltd. Systems and methods for surveillance with visual marker
US20160311648A1 (en) * 2015-04-23 2016-10-27 Kone Corporation Arrangement and a method for measuring the position of an installation platform in an elevator shaft
JP2017128440A (en) * 2016-01-22 2017-07-27 株式会社日立ビルシステム Elevator inspection device and elevator inspection system
CN107783106A (en) * 2016-08-25 2018-03-09 大连楼兰科技股份有限公司 Data fusion method between unmanned plane and barrier
JP2018144981A (en) * 2017-03-07 2018-09-20 株式会社日立ビルシステム Inspection device for elevator, inspection system for elevator, and control method and terminal device thereof
CN109101039A (en) * 2018-06-29 2018-12-28 太原理工大学 Vertical detection method and system
CN109189088A (en) * 2018-08-21 2019-01-11 中南林业科技大学 Captive unmanned plane adaptive cruise tracking, terminal and storage medium
US20190031342A1 (en) * 2017-07-31 2019-01-31 Queen's University At Kingston Autorotating unmanned aerial vehicle surveying platform
EP3489184A1 (en) * 2017-11-28 2019-05-29 Otis Elevator Company Hoistway inspection device
JP2020040781A (en) * 2018-09-10 2020-03-19 株式会社日立ビルシステム Measurement system and measurement method
JP6720382B1 (en) * 2019-04-24 2020-07-08 東芝エレベータ株式会社 Elevator system, unmanned aerial vehicle used therefor, and elevator pretreatment method
CN111573461A (en) * 2020-05-20 2020-08-25 迅达(中国)电梯有限公司 Elevator maintenance system
WO2020202289A1 (en) * 2019-03-29 2020-10-08 三菱電機株式会社 Physical distribution system and unmanned flying object


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
TSUN KIT HUI et al.: "Autonomous Elevator Inspection with Unmanned Aerial Vehicle", 2016 3rd Asia-Pacific World Congress on Computer Science and Engineering (APWC on CSE), pages 26-33
QU LIWEI: "Research on Control Technology of Unmanned Aerial Vehicles for Bridge Inspection", China Masters' Theses Full-text Database, Engineering Science and Technology II, no. 01, pages 034-1494
ZHANG PEI; SUN YUNQIANG; SHI XILING: "Design of Attitude Information Fusion for a Multi-rotor Aircraft Applied to Mine Shafts", Coal Technology, no. 05, pages 279-281
LEI JIAWEI et al.: "Implementation of a CAN Communication Design for MCF5214-based Elevator Parallel Group Control", Electronics World, no. 12, pages 63-64

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113359829A (en) * 2021-06-10 2021-09-07 西安图迹信息科技有限公司 Unmanned aerial vehicle power plant intelligent inspection method based on big data
CN113359829B (en) * 2021-06-10 2022-12-09 西安图迹信息科技有限公司 Unmanned aerial vehicle power plant intelligent inspection method based on big data
CN114217626A (en) * 2021-12-14 2022-03-22 集展通航(北京)科技有限公司 Railway engineering detection method and system based on unmanned aerial vehicle inspection video
CN114217626B (en) * 2021-12-14 2022-06-28 集展通航(北京)科技有限公司 Railway engineering detection method and system based on unmanned aerial vehicle routing inspection video

Also Published As

Publication number Publication date
CN112327898B (en) 2023-08-29

Similar Documents

Publication Publication Date Title
CN106124517B (en) The multi-rotor unmanned aerial vehicle detection platform system of detection structure part surface crack and its method for detection structure part surface crack
JP6029446B2 (en) Autonomous flying robot
JP5303873B2 (en) Vehicle shape measuring method and apparatus
KR102159376B1 (en) Laser scanning system, laser scanning method, mobile laser scanning system and program
CN108154084A (en) For the farm environment cognitive method of the unpiloted Multi-sensor Fusion of agricultural machinery
CN111670419A (en) Active supplemental exposure settings for autonomous navigation
JP2017144784A (en) Flight plan creation method and flight body guidance system
CN112363176B (en) Elevator hoistway inspection and modeling method and device and inspection and modeling system
CN112327898A (en) Unmanned aerial vehicle and well patrol navigation method and device thereof
CN109239725A (en) Ground mapping method and terminal based on laser ranging system
US20180147998A1 (en) Aerial Photogrammetric Device And Aerial Photogrammetric Method
CN112799422A (en) Unmanned aerial vehicle flight control method and device for power inspection
JP2014142828A (en) Autonomous mobile robot
JP7436657B2 (en) Flight photography system and method
JP6014484B2 (en) Autonomous mobile robot
JP6577083B2 (en) Measuring system
CN112478968B (en) Elevator hoistway inspection control method, device and system and storage medium
JP2023100642A (en) inspection system
US20210229810A1 (en) Information processing device, flight control method, and flight control system
JP2018138922A (en) Measuring system
CN115718298A (en) System for UGV and UAV automatically provide lidar data reference thereof for 3D detection
CN212623088U (en) Iron tower attitude early warning device based on image recognition and laser ranging
WO2020204201A1 (en) Aircraft
WO2021087785A1 (en) Terrain detection method, movable platform, control device and system, and storage medium
CN114942421A (en) Omnidirectional scanning multiline laser radar autonomous positioning device and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant