CN111099018B - Material throwing device for offshore unmanned aerial vehicle and control method


Info

Publication number
CN111099018B
CN111099018B (application CN201911362031.0A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
image
target
throwing
Prior art date
Legal status
Active
Application number
CN201911362031.0A
Other languages
Chinese (zh)
Other versions
CN111099018A (en)
Inventor
顾沈明
吴伟志
管林挺
谭小球
Current Assignee
Zhejiang Ocean University ZJOU
Original Assignee
Zhejiang Ocean University ZJOU
Priority date
Filing date
Publication date
Application filed by Zhejiang Ocean University ZJOU
Priority to CN201911362031.0A
Publication of CN111099018A
Application granted
Publication of CN111099018B
Legal status: Active
Anticipated expiration

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D: EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D1/00: Dropping, ejecting, releasing, or receiving articles, liquids, or the like, in flight
    • B64D1/02: Dropping, ejecting, or releasing articles
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00: UAVs specially adapted for particular uses or applications
    • B64U2101/60: UAVs specially adapted for particular uses or applications for transporting passengers; for transporting goods other than weapons
    • B64U2201/00: UAVs characterised by their flight controls

Abstract

The invention relates to a material throwing device and a control method for a marine unmanned aerial vehicle. The device comprises a laser ranging device, an image acquisition device, an image analysis device, a wireless communication device, a material grabbing frame and a controller. The controller deflects the unmanned aerial vehicle according to the deflection angle of the captured image of the drop target relative to the standard image of the drop target, so that the unmanned aerial vehicle directly faces the drop target, and adjusts the distance between the unmanned aerial vehicle and the drop target according to the measurements of the laser ranging device until that distance lies within a preset range. Once the unmanned aerial vehicle directly faces the drop target and the distance is within the preset range, the controller commands the material grabbing frame to release the material to be dropped. The invention thereby enables precise material drops from an offshore unmanned aerial vehicle.

Description

Material throwing device for offshore unmanned aerial vehicle and control method
Technical Field
The invention relates to the technical field of marine unmanned aerial vehicles, in particular to a material throwing device and a control method for a marine unmanned aerial vehicle.
Background
With the rapid development of society, unmanned aerial vehicles provide more and more services. They are generally small, agile and convenient, and are widely used in scenarios such as photography, monitoring and material delivery. Using unmanned aerial vehicles to deliver materials can greatly assist long-distance material transport and remote rescue.
In the prior art, however, material delivery by unmanned aerial vehicle is generally controlled only by position data: the vehicle flies to a specified position and then begins the drop. This approach is limited by positioning accuracy, and a large deflection may remain between the drone and the designated drop area, so the exact spot where the material lands on the drop target cannot be precisely controlled.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides a material throwing device and a control method for a marine unmanned aerial vehicle that accurately control the drop angle and drop position of the unmanned aerial vehicle and thereby achieve precise material delivery.
In order to achieve the above object, the present invention has the following configurations:
The invention provides a material throwing device for a marine unmanned aerial vehicle, which comprises:
The laser ranging device is arranged at the bottom of the unmanned aerial vehicle and used for measuring the distance between the unmanned aerial vehicle and a throwing target;
the image acquisition equipment is used for shooting an image of the throwing target;
the positioning equipment is used for acquiring the position data of the unmanned aerial vehicle and sending the position data to the remote server;
the wireless communication equipment is arranged in the unmanned aerial vehicle and used for communicating with a remote server, receiving a standard image and a position of a throwing target from the remote server, sending position data of the unmanned aerial vehicle to the remote server, and receiving a planned path from the position of the unmanned aerial vehicle to the position of the throwing target from the remote server;
the image analysis device is arranged inside the unmanned aerial vehicle and used for comparing the shot image of the throwing target with the standard image of the throwing target and calculating the deflection angle of the shot image of the throwing target relative to the standard image of the throwing target;
the material grabbing frame is arranged at the bottom of the unmanned aerial vehicle and used for grabbing materials to be thrown;
the controller is arranged in the unmanned aerial vehicle and is used for controlling the unmanned aerial vehicle to deflect according to the deflection angle of the shot image of the throwing target relative to the standard image of the throwing target, so that the unmanned aerial vehicle directly faces the throwing target, and for controlling the distance between the unmanned aerial vehicle and the throwing target according to the measurement data of the laser ranging device, so that this distance is within a preset distance range;
once the unmanned aerial vehicle has moved so that it directly faces the throwing target and its distance to the throwing target is within the preset distance range, the controller controls the material grabbing frame to release the material to be dropped.
Optionally, the image acquisition device is further configured to shoot the current image of the delivery target after the material delivery is completed, the wireless communication device sends the current image of the delivery target to the remote server, and the remote server takes the current image of the delivery target as the standard image of the delivery target and sends the updated standard image of the delivery target to the corresponding unmanned aerial vehicle when the next unmanned aerial vehicle requests to acquire the standard image.
Optionally, the image analysis device compares the shot image of the delivery target with the standard image of the delivery target, and calculates a deflection angle of the shot image of the delivery target relative to the standard image of the delivery target by the following steps:
the image analysis equipment selects at least one first reference line in the standard image and selects a corresponding second reference line in the shot image;
the image analysis equipment calculates the included angle between each first datum line and the corresponding second datum line as the deflection angle of each second datum line;
the image analysis device calculates an average value of the deflection angles of the respective second reference lines as the deflection angle of the captured image with respect to the standard image.
Optionally, the image analysis device selects at least one first reference line in the standard image, including the image analysis device determining at least two first marker points in the standard image, and connecting every two first marker points as a first reference line;
and selecting a corresponding second reference line in the shot image by the image analysis equipment, wherein the image analysis equipment adopts the first mark point to perform pattern matching in the shot image, matches the first mark point with the corresponding second mark point in the shot image, and connects every two second mark points to be used as a second reference line.
Optionally, the image acquisition device captures a current image of the delivered target after the material delivery is completed, and the image analysis device analyzes the current image of the delivered target and matches the current image to obtain a third mark point corresponding to the first mark point of the standard image;
when the wireless communication equipment sends the current image of the release target to the remote server, the wireless communication equipment sends the position of a third mark point in the current image of the release target to the remote server;
and after receiving the current image of the launching target, the remote server updates the standard image of the launching target and marks the third mark point as the first mark point in the standard image.
Optionally, the image analysis device is further configured to extract an indicator light region from the captured image of the target, identify the color of the indicator light region, analyze the RGB value of the color, and determine whether the RGB value of the color is within a first preset range or a second preset range.
Optionally, the controller is further configured to receive a color recognition result of the image analysis device before controlling the material grabbing frame to loosen materials, if the RGB value of the color is within a first preset range, the controller controls the material grabbing frame to loosen the materials, and if the RGB value of the color is within a second preset range, the controller does not control the material grabbing frame to loosen the materials until detecting that the RGB value of the color is within the first preset range.
Optionally, the controller is further configured to control the unmanned aerial vehicle to fly to the material storage location according to pre-stored position data of the material storage location when the wireless communication device receives a release task from the remote server.
The embodiment of the invention also provides a control method of the material throwing device for the offshore unmanned aerial vehicle, which comprises the following steps:
the wireless communication equipment receives a release task from the remote server;
the positioning equipment acquires position data of the unmanned aerial vehicle and sends the position data to the remote server through the wireless communication equipment;
the wireless communication device receiving a standard image and a location of a launch target from the remote server and receiving a planned path from the location of the drone to the location of the launch target from the remote server;
the controller controls the unmanned aerial vehicle to fly to a target position according to the planned path;
the laser ranging equipment measures the distance between the unmanned aerial vehicle and a throwing target;
the controller adjusts the height of the unmanned aerial vehicle according to the measurement data of the laser ranging device, so that the distance between the unmanned aerial vehicle and the throwing target is within a preset distance range;
the image acquisition equipment shoots an image of a throwing target;
the image analysis device compares the shot image of the throwing target with the standard image of the throwing target and calculates the deflection angle of the shot image of the throwing target relative to the standard image of the throwing target;
the controller controls the unmanned aerial vehicle to deflect according to the deflection angle of the shot image of the throwing target relative to the standard image of the throwing target, so that the unmanned aerial vehicle directly faces the throwing target;
once the unmanned aerial vehicle has moved so that it directly faces the throwing target and its distance to the throwing target is within the preset distance range, the controller controls the material grabbing frame to release the material to be dropped.
Optionally, after the wireless communication device receives the release task from the remote server, the method further includes the following steps:
the controller controls the unmanned aerial vehicle to fly to the material storage place according to prestored position data of the material storage place;
the laser ranging equipment measures the distance between the unmanned aerial vehicle and the material to be thrown;
the controller controls the distance between the unmanned aerial vehicle and the material to be thrown according to the measurement data of the laser ranging equipment;
the controller controls the material grabbing frame to grab the material to be thrown.
In summary, compared with the prior art, the controller of the present invention deflects the unmanned aerial vehicle according to the deflection angle of the shot image of the drop target relative to the standard image of the drop target, so that the vehicle directly faces the drop target, and adjusts the distance between the vehicle and the drop target according to the measurement data of the laser ranging device until that distance lies within the preset range. Accurate position control of the material drop is thereby achieved. The invention is particularly suitable for drops that require a precise position, for example placing the dropped material within a specified area at a specified angle or stacking the materials of several drops, and lends itself to wide popularization and application.
Drawings
Fig. 1 is a schematic structural diagram of a material delivery device for a marine unmanned aerial vehicle according to an embodiment of the present invention;
fig. 2 is a flowchart of a control method of a material delivery device for a marine unmanned aerial vehicle according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of calculating a deflection angle according to an embodiment of the present invention;
FIG. 4 is a flowchart of calculating a deflection angle according to an embodiment of the present invention.
Reference numerals:
laser ranging apparatus M100
Image capturing apparatus M200
Positioning device M300
Wireless communication device M400
Image analysis apparatus M500
Material grabbing frame M600
Controller M700
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus their repetitive description will be omitted.
The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention may be practiced without one or more of the specific details, or with other methods, components, materials, and so forth. In some instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring the invention.
As shown in fig. 1, in order to solve the technical problem in the prior art, the invention provides a material throwing device for a marine unmanned aerial vehicle, which comprises the following devices:
the laser ranging device M100 is arranged at the bottom of the unmanned aerial vehicle and measures the distance between the unmanned aerial vehicle and the drop target; the laser ranging device M100 may use a laser transmitter and a laser receiver: the transmitter emits a ranging pulse toward the drop target, the receiver picks up the laser reflected by the target, and the distance is determined from the time difference between emission and reception (a short numerical sketch of this conversion is given below, after the device list);
the image acquisition equipment M200 is used for shooting an image of the drop target; the image acquisition equipment M200 may be a camera arranged at the bottom or at the front of the unmanned aerial vehicle, preferably at the bottom, which makes it easier to calculate the deflection angle of the unmanned aerial vehicle from the image;
the positioning device M300 is configured to acquire position data of the unmanned aerial vehicle and send the position data to the remote server; the positioning device M300 may employ a GPS positioning module, a BeiDou satellite positioning module, a GNSS positioning module or the like;
the wireless communication equipment M400 is arranged in the unmanned aerial vehicle and is used for communicating with a remote server, receiving a standard image and a position of a throwing target from the remote server, sending position data of the unmanned aerial vehicle to the remote server, and receiving a planned path from the position of the unmanned aerial vehicle to the position of the throwing target from the remote server; the wireless communication device M400 may be a 4G communication device, a 3G communication device, a WIFI communication device, or the like;
the image analysis device M500 is arranged inside the unmanned aerial vehicle and used for comparing the shot image of the launching target with the standard image of the launching target and calculating the deflection angle of the shot image of the launching target relative to the standard image of the launching target;
the material grabbing frame M600 is arranged at the bottom of the unmanned aerial vehicle and is used for grabbing the material to be dropped; the material grabbing frame M600 comprises at least two clamps, which are opened and closed by a driving motor, and the driving motor receives instructions from the controller M700 that set the state of the clamps;
the controller M700 is arranged inside the unmanned aerial vehicle and is used for controlling the unmanned aerial vehicle to deflect according to the deflection angle of the shot image of the drop target relative to the standard image of the drop target, so that the unmanned aerial vehicle directly faces the drop target, and for controlling the distance between the unmanned aerial vehicle and the drop target according to the measurement data of the laser ranging device M100, so that this distance is within a preset distance range;
once the unmanned aerial vehicle has moved so that it directly faces the drop target and its distance to the drop target is within the preset distance range, the controller M700 controls the material grabbing frame M600 to release the material to be dropped.
Therefore, the invention achieves precise control of the deflection angle of the unmanned aerial vehicle through image comparison and precise distance measurement through laser ranging, so that the material drop position of the unmanned aerial vehicle is determined accurately and the material is dropped precisely. The invention is well suited to scenarios where the drone must drop into a precisely specified area, and also allows materials to be stacked when many drops are made, reducing the space the materials occupy at the drop target, which makes the invention suitable for wide popularization and application.
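As a purely illustrative sketch of the time-of-flight conversion used by the laser ranging device M100 above (the patent does not specify an implementation, and the function name is hypothetical):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light, metres per second

def distance_from_time_of_flight(emit_time_s: float, receive_time_s: float) -> float:
    """Convert a laser pulse's round-trip time into a one-way distance in metres.

    The pulse travels from the drone to the drop target and back, so the
    one-way distance is half of (time difference x speed of light).
    """
    round_trip_s = receive_time_s - emit_time_s
    if round_trip_s <= 0:
        raise ValueError("receive time must be later than emit time")
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# Example: a round trip of 20 nanoseconds corresponds to roughly 3 metres.
print(distance_from_time_of_flight(0.0, 20e-9))  # ~2.998 m
```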
In this embodiment, the image capturing device M200 is further configured to capture a current image of the delivery target after the material delivery is completed, the wireless communication device M400 sends the current image of the delivery target to the remote server, and the remote server takes the current image of the delivery target as a standard image of the delivery target and sends the updated standard image of the delivery target to the corresponding unmanned aerial vehicle when the next unmanned aerial vehicle requests to acquire the standard image.
Because the standard image of the drop target stored in the remote server is updated in real time, the situation where the scene has changed after the previous unmanned aerial vehicle dropped its materials but the standard image has not is avoided, which lets subsequent unmanned aerial vehicles drop their materials accurately. This is particularly useful in scenarios where materials must be placed in order or stacked on top of one another. After the previous unmanned aerial vehicle has completed its drop, it is directly facing the drop target and within a reasonable distance of it, so the image it captures at that moment can be taken as the new standard image and referenced when the next unmanned aerial vehicle drops its materials.
As shown in fig. 2, an embodiment of the present invention further provides a control method for a material delivery device for a marine unmanned aerial vehicle, where the material delivery device for a marine unmanned aerial vehicle is adopted, and the method includes the following steps:
the wireless communication device M400 receives the delivery task from the remote server; the remote server may actively allocate a task to each unmanned aerial vehicle and send a task instruction to the corresponding wireless communication device, or the controller M700 controls the wireless communication device M400 to actively request the remote server to receive a task after the unmanned aerial vehicle completes a task;
the positioning equipment M300 acquires the position data of the unmanned aerial vehicle and sends the position data to the remote server through the wireless communication equipment; the remote server is used for distributing tasks to all the unmanned aerial vehicles and supervising the completion condition of the tasks of all the unmanned aerial vehicles;
the wireless communication device M400 receives the standard image and the position of the launch target from the remote server, and receives the planned path from the position of the drone to the position of the launch target from the remote server;
the controller M700 controls the unmanned aerial vehicle to fly to the target position according to the planned path; this step is a coarse adjustment of the position of the unmanned aerial vehicle, i.e. the vehicle first flies to the position of the drop target according to the positioning data, while the fine adjustment of its position, which accounts for positioning accuracy and the angular deflection of the vehicle, follows below;
the laser ranging device M100 measures the distance between the unmanned aerial vehicle and a throwing target;
the controller M700 adjusts the height of the unmanned aerial vehicle according to the measurement data of the laser ranging device M100, so that the distance between the unmanned aerial vehicle and the drop target is within a preset distance range;
the image acquisition equipment M200 shoots an image of a target to be launched;
the image analysis device M500 compares the shot image of the launch target with the standard image of the launch target, and calculates a deflection angle of the shot image of the launch target with respect to the standard image of the launch target;
the controller M700 controls the unmanned aerial vehicle to deflect according to the deflection angle of the shot image of the drop target relative to the standard image of the drop target, so that the unmanned aerial vehicle directly faces the drop target;
once the unmanned aerial vehicle has moved so that it directly faces the drop target and its distance to the drop target is within the preset distance range, the controller M700 controls the material grabbing frame M600 to release the material to be dropped.
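Condensing the steps above into a short control sketch; every object and method name here (drone, laser, camera, analyzer, gripper, comms and their calls) is a hypothetical placeholder for the modules M100 to M700, and the distance and angle thresholds are illustrative only.

```python
def execute_drop_task(drone, laser, camera, analyzer, gripper, comms,
                      min_dist_m=2.0, max_dist_m=3.0, angle_tol_deg=2.0):
    """Illustrative drop sequence following the method steps above (all APIs hypothetical)."""
    task = comms.receive_task()            # standard image, drop-target position, planned path
    drone.follow_path(task.planned_path)   # coarse positioning along the planned path

    # Fine altitude adjustment: keep the laser-measured distance inside the preset range.
    distance = laser.measure_distance()
    while not (min_dist_m <= distance <= max_dist_m):
        drone.adjust_altitude(distance, min_dist_m, max_dist_m)
        distance = laser.measure_distance()

    # Fine heading adjustment: rotate until the shot image aligns with the standard image.
    deflection = analyzer.deflection_angle(camera.capture(), task.standard_image)
    while abs(deflection) > angle_tol_deg:
        drone.rotate(deflection)
        deflection = analyzer.deflection_angle(camera.capture(), task.standard_image)

    gripper.release()                           # drop the material onto the target
    comms.send_current_image(camera.capture())  # this photo becomes the next standard image
```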
In this embodiment, the controller M700 is further configured to control the unmanned aerial vehicle to fly to the material storage location according to the pre-stored position data of that location when the wireless communication device M400 receives a drop task from the remote server.
Specifically, after the wireless communication device receives the drop task from the remote server, the method further includes the following steps:
the controller M700 controls the unmanned aerial vehicle to fly to the material storage place according to the pre-stored position data of the material storage place;
the laser ranging device M100 measures the distance between the unmanned aerial vehicle and the material to be launched;
the controller M700 controls the distance between the unmanned aerial vehicle and the material to be dropped according to the measurement data of the laser ranging device M100, so that the unmanned aerial vehicle approaches closely enough for the clamps of the material grabbing frame M600 to grab the material;
the controller M700 controls the material grabbing frame M600 to grab the material to be delivered.
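A minimal sketch of the clamp actuation used when grabbing and releasing the material; the patent only states that a driving motor opens and closes the clamps of the material grabbing frame M600 on the controller's instruction, so the motor interface below (`set_angle`) is an assumed placeholder.

```python
class MaterialGripper:
    """Two-clamp gripper driven by a single motor (illustrative sketch)."""

    OPEN_ANGLE_DEG = 90.0    # clamps released
    CLOSED_ANGLE_DEG = 0.0   # clamps holding the payload

    def __init__(self, motor):
        self._motor = motor  # hypothetical driver exposing set_angle(degrees)

    def grab(self) -> None:
        """Close the clamps around the material to be dropped."""
        self._motor.set_angle(self.CLOSED_ANGLE_DEG)

    def release(self) -> None:
        """Open the clamps so the material falls onto the drop target."""
        self._motor.set_angle(self.OPEN_ANGLE_DEG)
```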
Therefore, the material throwing device and control method for the marine unmanned aerial vehicle automate the execution of drop tasks. Workers only need to add and assign tasks at the remote server. A worker may set up a task list containing a number of tasks and their task information, including the position data of the material storage location for each task, the position data of the drop target and the standard image of the drop target. The remote server receives task-execution feedback from each unmanned aerial vehicle and records whether each vehicle is currently idle; when an idle vehicle appears, the server takes the topmost task from the task list and assigns it to that vehicle, so that tasks are executed in an automatic, streamlined fashion. Workers can also view, in real time, the execution status of the tasks in the current list and the state of each unmanned aerial vehicle at the remote server.
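The task allocation just described can be sketched as follows; the class and method names are hypothetical and only outline one way the remote server could hand the topmost pending task to an idle drone.

```python
from collections import deque

class TaskScheduler:
    """Sketch of the remote server's task allocation (hypothetical API)."""

    def __init__(self, task_list):
        self._pending = deque(task_list)  # ordered task list set up by the worker
        self._idle_drones = set()

    def on_drone_feedback(self, drone_id, finished):
        """Record task-execution feedback; a drone that has finished becomes idle."""
        if finished:
            self._idle_drones.add(drone_id)

    def dispatch(self):
        """Assign the topmost pending tasks to idle drones; return the assignments."""
        assignments = []
        while self._pending and self._idle_drones:
            drone_id = self._idle_drones.pop()
            task = self._pending.popleft()  # the uppermost task in the list
            assignments.append((drone_id, task))
        return assignments
```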
As shown in fig. 3 and 4, in this embodiment, the image analysis device M500 compares the shot image of the drop target with the standard image of the drop target, and calculates the deflection angle of the shot image of the drop target with respect to the standard image of the drop target by the following steps:
the image analysis device M500 selects at least one first reference line in the standard image, and selects a corresponding second reference line in the captured image;
the image analysis device M500 calculates an included angle between each first reference line and the corresponding second reference line as a deflection angle of each second reference line;
when there are several second reference lines, the image analysis device M500 calculates the average of their deflection angles as the deflection angle of the captured image relative to the standard image, so that a comparatively objective and accurate deflection angle is obtained from several second reference lines. If the deflection angle of one second reference line differs greatly from those of the others, its measurement probably contains a large error, and that value may be discarded before the average is computed.
In this embodiment, the image analysis device M500 selects at least one first reference line in the standard image by determining at least two first marker points in the standard image and connecting each pair of first marker points to form a first reference line. A first marker point may be a point that is manually marked in the standard image in advance and that is easy to distinguish from its surroundings. The marker point should not be chosen too large, otherwise the accuracy of the reference line, and with it the accuracy of the calculated deflection angle of the unmanned aerial vehicle, suffers;
the image analysis device M500 selects a corresponding second reference line in the captured image, including the image analysis device M500 performing pattern matching in the captured image by using the first marker point, matching to the corresponding second marker point in the captured image, and connecting every two second marker points to serve as a second reference line.
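One way to realise this marker-point matching is normalised template matching, for example with OpenCV as sketched below; the patent does not prescribe a particular matching algorithm, so this is only an assumed illustration.

```python
import cv2
import numpy as np

def find_marker_centre(captured_gray: np.ndarray, marker_template_gray: np.ndarray):
    """Locate the centre of a first-marker template inside the captured image.

    Uses normalised cross-correlation and returns the (x, y) centre of the
    best match, i.e. the corresponding second marker point.
    """
    result = cv2.matchTemplate(captured_gray, marker_template_gray, cv2.TM_CCOEFF_NORMED)
    _, _, _, best_top_left = cv2.minMaxLoc(result)
    h, w = marker_template_gray.shape[:2]
    return (best_top_left[0] + w / 2.0, best_top_left[1] + h / 2.0)
```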
Taking fig. 3 as an example, the image analysis device M500 selects two first marker points A1 and A2 in the standard image, which are connected to form a first reference line A1A2, selects the corresponding second marker points B1 and B2 in the captured image, which are connected to form a second reference line B1B2, and calculates the angle C between the two lines. With this approach an accurate deflection angle is obtained without requiring the captured image and the standard image to have the same size or coincident centres.
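A sketch of the angle computation of Fig. 3, generalised to several reference lines with the outlier handling mentioned above; the coordinates and the outlier tolerance are illustrative values, not taken from the patent.

```python
import math

def line_angle_deg(p1, p2):
    """Angle of the line from p1 to p2, in degrees."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def deflection_angle_deg(first_lines, second_lines, outlier_tol_deg=10.0):
    """Average deflection of the second reference lines relative to the first ones.

    Each argument is a list of ((x1, y1), (x2, y2)) marker-point pairs, matched
    index by index. Angles far from the median are dropped as likely errors.
    """
    deflections = []
    for (a1, a2), (b1, b2) in zip(first_lines, second_lines):
        d = line_angle_deg(b1, b2) - line_angle_deg(a1, a2)
        d = (d + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)
        deflections.append(d)
    deflections.sort()
    median = deflections[len(deflections) // 2]
    kept = [d for d in deflections if abs(d - median) <= outlier_tol_deg]
    return sum(kept) / len(kept)

# Fig. 3 with a single reference line: A1A2 in the standard image, B1B2 in the shot image.
angle_c = deflection_angle_deg([((100, 200), (300, 200))], [((120, 180), (310, 240))])
print(round(angle_c, 1))  # ~17.5 degrees
```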
By selecting reference lines, the invention computes the deflection angle between the two images in a way that is convenient and cheap enough to run locally on the unmanned aerial vehicle, without sending data to a remote server for processing; this saves transmission time and places no great computational burden on the vehicle. In addition, the number of reference lines can be chosen sensibly and the deflection angles of several reference lines calculated, which avoids the misalignment that could result from relying on a single reference line whose computed deflection happens to contain a large error.
In this embodiment, the image capturing device M200 captures a current image of the delivery target after the material delivery is completed, and the image analyzing device M500 analyzes the current image of the delivery target, and matches a third mark point corresponding to the first mark point of the standard image in the current image.
When the wireless communication device M400 sends the current image of the placement target to the remote server, the wireless communication device M400 sends the position of the third landmark point in the current image of the placement target to the remote server.
And after receiving the current image of the launching target, the remote server updates the standard image of the launching target and marks the third mark point as the first mark point in the standard image.
Therefore, the marker points do not need to be re-labelled manually on each updated standard image; the remote server simply marks the updated standard image with the third marker points. Only the first standard image of the drop target needs to have its first marker points labelled manually.
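On the server side, the update just described amounts to replacing the stored standard image and its marker points after every drop. The sketch below is a minimal in-memory illustration with hypothetical names; a real server would persist the data and serve it to the next drone that requests the standard image.

```python
class StandardImageStore:
    """Per-target store of the current standard image and its marker points (sketch)."""

    def __init__(self):
        self._targets = {}  # target_id -> {"image": bytes, "marker_points": list}

    def update_after_drop(self, target_id, image_bytes, third_marker_points):
        """Adopt the post-drop photo as the new standard image; the reported third
        marker points become the first marker points of the updated standard image."""
        self._targets[target_id] = {
            "image": image_bytes,
            "marker_points": list(third_marker_points),
        }

    def get_standard_image(self, target_id):
        """Return the current standard image and marker points for a drop target."""
        return self._targets[target_id]
```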
In this embodiment, the image analysis device M500 is further configured to extract an indicator light region from the captured image of the target, identify the color of the indicator light region, analyze the RGB value of the color, and determine whether the RGB value of the color is within a first preset range or a second preset range.
In this embodiment, the controller M700 is further configured to receive a color recognition result of the image analysis apparatus M500 before controlling the material grabbing rack M600 to loosen the material, and if the RGB value of the color is within a first preset range, the controller M700 controls the material grabbing rack M600 to loosen the material, and if the RGB value of the color is within a second preset range, the controller M700 does not control the material grabbing rack M600 to loosen the material until detecting that the RGB value of the color is within the first preset range.
For example, the first preset range may be an RGB value range corresponding to green, and the second preset range may be an RGB value range corresponding to red. An indicator lamp may be disposed at the target, and the image analysis device M500 extracts an indicator lamp region from the captured image according to a shape or a color feature of the indicator lamp. If the indicator light shows green, it indicates that the target of delivery allows the delivery of the materials, the controller M700 may control the material grasping rack M600 to release the materials, if the indicator light shows red, it indicates that the target of delivery does not allow the delivery of the materials, and the controller M700 controls the material grasping rack M600 to continue waiting.
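The colour gate can be sketched as follows; the concrete RGB ranges used here for "green" and "red" are illustrative assumptions, since the patent only speaks of a first and a second preset range.

```python
# Illustrative ranges only: (low, high) bounds per RGB channel.
FIRST_PRESET_RANGE = ((0, 150, 0), (100, 255, 100))   # "green": release allowed
SECOND_PRESET_RANGE = ((150, 0, 0), (255, 100, 100))  # "red": hold and wait

def rgb_in_range(rgb, preset):
    low, high = preset
    return all(low[i] <= rgb[i] <= high[i] for i in range(3))

def may_release(indicator_rgb) -> bool:
    """Return True only when the indicator colour falls in the first preset range."""
    if rgb_in_range(indicator_rgb, FIRST_PRESET_RANGE):
        return True
    # In the second preset range (or any other colour) the controller keeps waiting.
    return False
```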
Therefore, through the indicator light and colour analysis, the method and the device can determine the readiness of the target location automatically, without communicating with it, which is particularly useful when communication with the target location is inconvenient or the wireless signal there is poor.
In other embodiments, further correspondences between indicator colours and status types of the drop target may be defined, so that the status of the drop target is determined by identifying the colour of the indicator. Furthermore, a liquid crystal display may be installed on the drop target to show the number of packages already delivered; the unmanned aerial vehicle can read this number directly from the display to determine how many packages have been dropped on the target, and if that number has reached a preset threshold, no further drops are made.
A weight sensor may be installed on the surface of the drop target; whenever the change in weight sensed by the sensor exceeds a preset weight threshold, the counter behind the liquid crystal display is incremented by one and the number shown on the display increases accordingly.
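A minimal sketch of the counter behind the display, assuming the weight sensor delivers a stream of readings; names, units and the threshold are illustrative.

```python
class DropCounter:
    """Increment a displayed count each time the sensed weight jumps by more than a threshold."""

    def __init__(self, weight_threshold_kg: float):
        self.weight_threshold_kg = weight_threshold_kg
        self.last_weight_kg = 0.0
        self.count = 0

    def on_weight_reading(self, weight_kg: float) -> int:
        """Feed a new weight reading and return the updated package count."""
        if weight_kg - self.last_weight_kg > self.weight_threshold_kg:
            self.count += 1  # one more package assumed to have landed
        self.last_weight_kg = weight_kg
        return self.count
```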
In the control method of the material delivery device for the marine unmanned aerial vehicle, each step may be implemented in the same way as the corresponding device function in the material delivery device described above, and this is not repeated here.
In summary, compared with the prior art, the controller of the present invention deflects the unmanned aerial vehicle according to the deflection angle of the shot image of the drop target relative to the standard image of the drop target, so that the vehicle directly faces the drop target, and adjusts the distance between the vehicle and the drop target according to the measurement data of the laser ranging device until that distance lies within the preset range. Accurate position control of the material drop is thereby achieved. The invention is particularly suitable for drops that require a precise position, for example placing the dropped material within a specified area at a specified angle or stacking the materials of several drops, and lends itself to wide popularization and application.
In this specification, the invention has been described with reference to specific embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims (8)

1. A material delivery device for a marine unmanned aerial vehicle, characterized in that the device comprises:
the laser ranging device is arranged at the bottom of the unmanned aerial vehicle and used for measuring the distance between the unmanned aerial vehicle and a throwing target;
the image acquisition equipment is used for shooting an image of the throwing target;
the positioning equipment is used for acquiring the position data of the unmanned aerial vehicle and sending the position data to the remote server;
the wireless communication equipment is arranged in the unmanned aerial vehicle and used for communicating with a remote server, receiving a standard image and a position of a throwing target from the remote server, sending position data of the unmanned aerial vehicle to the remote server, and receiving a planned path from the position of the unmanned aerial vehicle to the position of the throwing target from the remote server;
the image analysis device is arranged inside the unmanned aerial vehicle and used for comparing the shot image of the throwing target with the standard image of the throwing target and calculating the deflection angle of the shot image of the throwing target relative to the standard image of the throwing target;
the material grabbing frame is arranged at the bottom of the unmanned aerial vehicle and used for grabbing materials to be thrown;
the controller is arranged in the unmanned aerial vehicle and is used for controlling the unmanned aerial vehicle to deflect according to the deflection angle of the shot image of the throwing target relative to the standard image of the throwing target, so that the unmanned aerial vehicle directly faces the throwing target, and for controlling the distance between the unmanned aerial vehicle and the throwing target according to the measurement data of the laser ranging device, so that this distance is within a preset distance range;
the controller controls the unmanned aerial vehicle to move until it directly faces the throwing target, and when the distance between the unmanned aerial vehicle and the throwing target is within the preset distance range, the controller controls the material grabbing frame to release the material to be thrown;
the image acquisition equipment is further used for shooting a current image of the release target after the material release is finished, the wireless communication equipment sends the current image of the release target to the remote server, the remote server takes the current image of the release target as a standard image of the release target, and the updated standard image of the release target is sent to the corresponding unmanned aerial vehicle when the next unmanned aerial vehicle requests to acquire the standard image;
the image analysis device compares the shot image of the launching target with the standard image of the launching target and calculates the deflection angle of the shot image of the launching target relative to the standard image of the launching target by adopting the following steps:
the image analysis equipment selects at least one first reference line in the standard image and selects a corresponding second reference line in the shot image;
the image analysis equipment calculates the included angle between each first datum line and the corresponding second datum line as the deflection angle of each second datum line;
the image analysis device calculates an average value of the deflection angles of the respective second reference lines as the deflection angle of the captured image with respect to the standard image.
2. The material delivery device for marine unmanned aerial vehicle of claim 1, wherein the image analysis device selects at least one first reference line in the standard image, and comprises the image analysis device determining at least two first marker points in the standard image, and connecting every two first marker points as a first reference line;
and selecting a corresponding second reference line in the shot image by the image analysis equipment, wherein the image analysis equipment adopts the first mark point to perform pattern matching in the shot image, matches the first mark point with the corresponding second mark point in the shot image, and connects every two second mark points to be used as a second reference line.
3. The material delivery device for marine unmanned aerial vehicle according to claim 2, wherein the image acquisition device captures a current image of the delivery target after the material delivery is completed, the image analysis device analyzes the current image of the delivery target, and a third mark point corresponding to the first mark point of the standard image is obtained by matching in the current image;
when the wireless communication equipment sends the current image of the release target to the remote server, the wireless communication equipment sends the position of a third mark point in the current image of the release target to the remote server;
and after receiving the current image of the launching target, the remote server updates the standard image of the launching target and marks the third mark point as the first mark point in the standard image.
4. The material delivery device for marine unmanned aerial vehicle of claim 1, wherein the image analysis device is further configured to extract an indicator light region from the captured image of the delivery target, identify colors of the indicator light region, analyze RGB values of the colors, and determine whether the RGB values of the colors are within a first preset range or a second preset range.
5. The material delivery device for marine unmanned aerial vehicle of claim 4, wherein the controller is further configured to receive a color recognition result of the image analysis device before controlling the material grabbing rack to loosen the material, and if the RGB value of the color is within a first preset range, the controller controls the material grabbing rack to loosen the material, and if the RGB value of the color is within a second preset range, the controller does not control the material grabbing rack to loosen the material until the RGB value of the color is detected to be within the first preset range.
6. The material delivery device for marine unmanned aerial vehicle of claim 1, wherein the controller is further configured to control the unmanned aerial vehicle to fly to the material storage according to pre-stored position data of the material storage when receiving a delivery task from the remote server through the wireless communication device.
7. A control method of a material delivery device for a marine unmanned aerial vehicle, characterized in that the material delivery device for a marine unmanned aerial vehicle of any one of claims 1 to 6 is adopted, the method comprises the following steps:
the wireless communication equipment receives a release task from the remote server;
the positioning equipment acquires position data of the unmanned aerial vehicle and sends the position data to the remote server through the wireless communication equipment;
the wireless communication device receiving a standard image and a location of a launch target from the remote server and receiving a planned path from the location of the drone to the location of the launch target from the remote server;
the controller controls the unmanned aerial vehicle to fly to a target position according to the planned path;
the laser ranging equipment measures the distance between the unmanned aerial vehicle and a throwing target;
the controller adjusts the height of the unmanned aerial vehicle according to the measurement data of the laser ranging device, so that the distance between the unmanned aerial vehicle and the throwing target is within a preset distance range;
the image acquisition equipment shoots an image of a throwing target;
the image analysis device compares the shot image of the throwing target with the standard image of the throwing target and calculates the deflection angle of the shot image of the throwing target relative to the standard image of the throwing target;
the controller controls the unmanned aerial vehicle to deflect according to the deflection angle of the shot image of the throwing target relative to the standard image of the throwing target, so that the unmanned aerial vehicle directly faces the throwing target;
the controller controls the unmanned aerial vehicle to move until it directly faces the throwing target and the distance between the unmanned aerial vehicle and the throwing target is within the preset distance range, and then controls the material grabbing frame to release the material to be thrown.
8. The control method of the material delivery device for an offshore unmanned aerial vehicle as claimed in claim 7, wherein, after the wireless communication device receives the delivery task from the remote server, the method further comprises the following steps:
the controller controls the unmanned aerial vehicle to fly to the material storage place according to prestored position data of the material storage place;
the laser ranging equipment measures the distance between the unmanned aerial vehicle and the material to be thrown;
the controller controls the distance between the unmanned aerial vehicle and the material to be thrown according to the measurement data of the laser ranging equipment;
the controller controls the material grabbing frame to grab the material to be thrown.
CN201911362031.0A (priority date 2019-12-26, filing date 2019-12-26): Material throwing device for offshore unmanned aerial vehicle and control method; Active; granted as CN111099018B

Priority Applications (1)

CN201911362031.0A, priority date 2019-12-26, filing date 2019-12-26: Material throwing device for offshore unmanned aerial vehicle and control method

Applications Claiming Priority (1)

CN201911362031.0A, priority date 2019-12-26, filing date 2019-12-26: Material throwing device for offshore unmanned aerial vehicle and control method

Publications (2)

Publication Number Publication Date
CN111099018A (en) 2020-05-05
CN111099018B (granted) 2021-10-22

Family

ID=70425175

Family Applications (1)

CN201911362031.0A (Active, granted as CN111099018B), priority date 2019-12-26, filing date 2019-12-26: Material throwing device for offshore unmanned aerial vehicle and control method

Country Status (1)

Country Link
CN (1) CN111099018B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115779299B (en) * 2022-11-15 2024-05-03 亿航智能设备(广州)有限公司 Unmanned aerial vehicle automatic fire extinguishing system and method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107532921A (en) * 2015-03-25 2018-01-02 威罗门飞行公司 The Machine To Machine of certainty identification is maintained to aim at
CN108771803A (en) * 2018-08-10 2018-11-09 中国人民解放军火箭军工程大学 Control system and fire extinguishing system for the fire extinguishing of skyscraper rotor wing unmanned aerial vehicle
CN108820215A (en) * 2018-05-21 2018-11-16 南昌航空大学 A kind of automatic air-drop unmanned plane of autonomous searching target
CN109248390A (en) * 2018-09-14 2019-01-22 北京机械设备研究所 A kind of fire-fighting and rescue integrated system and method based on unmanned aerial vehicle platform
WO2019186713A1 (en) * 2018-03-27 2019-10-03 株式会社自律制御システム研究所 Unmanned aircraft

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101535401B1 (en) * 2015-04-01 2015-07-08 오인선 Drone type life ring dropping device
CN111770881A (en) * 2017-10-05 2020-10-13 欧弗沃克斯有限公司 Remotely controllable aviation ordnance


Also Published As

Publication number Publication date
CN111099018A (en) 2020-05-05

Similar Documents

Publication Publication Date Title
CN107358796B (en) Vehicle detection method based on unmanned aerial vehicle
CN109765930B (en) Unmanned aerial vehicle vision navigation
CN108983807B (en) Unmanned aerial vehicle fixed-point landing method and system
KR101494654B1 (en) Method and Apparatus for Guiding Unmanned Aerial Vehicle and Method and Apparatus for Controlling Unmanned Aerial Vehicle
WO2017000875A1 (en) Aircraft and obstacle avoidance method and system thereof
CN112650267B (en) Flight control method and device of aircraft and aircraft
US11430148B2 (en) Apparatus and method for pallet volume dimensioning through 3D vision capable unmanned aerial vehicles (UAV)
CN105045281A (en) Unmanned aerial vehicle flight control method and device
JP2017015704A (en) Camera unit adapted to be mounted on drone to map land, and image pickup management method by camera unit
CN113038016A (en) Unmanned aerial vehicle image acquisition method and unmanned aerial vehicle
US11924539B2 (en) Method, control apparatus and control system for remotely controlling an image capture operation of movable device
WO2017084240A1 (en) Target positioning and tracking system, device, and positioning and tracking method
WO2019061111A1 (en) Path adjustment method and unmanned aerial vehicle
CN111099018B (en) Material throwing device for offshore unmanned aerial vehicle and control method
US11372410B2 (en) Methods and apparatus for regulating a position of a drone
CN107783555B (en) Target positioning method, device and system based on unmanned aerial vehicle
CN105487550A (en) Autonomous landing system of flight device and method
WO2020147085A1 (en) Photographing control method and movable platform
JP2020138681A (en) Control system for unmanned flight vehicle
KR20170014841A (en) Providing method of putting information on the golf green
KR101949782B1 (en) Auto-checking system for overlap of picture by dron taking
CN114476063B (en) Unmanned aerial vehicle control method and system for geographical mapping
KR20210023138A (en) Apparatus and method for controlling landing of a unmanned aerial vehicle
CN110989645A (en) Target space attitude processing method based on compound eye imaging principle
KR101763294B1 (en) Shooting Control System and Method Using Thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant