GB2533140A - Drone - Google Patents


Info

Publication number
GB2533140A
GB2533140A GB1422065.1A GB201422065A GB2533140A GB 2533140 A GB2533140 A GB 2533140A GB 201422065 A GB201422065 A GB 201422065A GB 2533140 A GB2533140 A GB 2533140A
Authority
GB
United Kingdom
Prior art keywords
drone
work machine
machine
control data
docking station
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1422065.1A
Inventor
J Hanks Benjamin
Paul Forrester Adrian
M Varu Anand
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Caterpillar Inc
Original Assignee
Caterpillar Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Caterpillar Inc filed Critical Caterpillar Inc
Priority to GB1422065.1A priority Critical patent/GB2533140A/en
Publication of GB2533140A publication Critical patent/GB2533140A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0033Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by having the operator tracking the vehicle either by direct line of sight or via one or more cameras located remotely from the vehicle
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01BSOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B69/00Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
    • A01B69/007Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow
    • A01B69/008Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow automatic
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • B64U10/14Flying platforms with four distinct rotor axes, e.g. quadcopters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U80/00Transport or storage specially adapted for UAVs
    • B64U80/80Transport or storage specially adapted for UAVs by vehicles
    • B64U80/86Land vehicles
    • EFIXED CONSTRUCTIONS
    • E02HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02FDREDGING; SOIL-SHIFTING
    • E02F3/00Dredgers; Soil-shifting machines
    • E02F3/04Dredgers; Soil-shifting machines mechanically-driven
    • E02F3/28Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
    • E02F3/36Component parts
    • E02F3/42Drives for dippers, buckets, dipper-arms or bucket-arms
    • E02F3/425Drive systems for dipper-arms, backhoes or the like
    • EFIXED CONSTRUCTIONS
    • E02HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02FDREDGING; SOIL-SHIFTING
    • E02F9/00Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/26Indicating devices
    • E02F9/261Surveying the work-site to be treated
    • EFIXED CONSTRUCTIONS
    • E02HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02FDREDGING; SOIL-SHIFTING
    • E02F9/00Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/26Indicating devices
    • E02F9/264Sensors and their calibration for indicating the position of the work tool
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01BSOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B79/00Methods for working soil
    • A01B79/005Precision agriculture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U50/00Propulsion; Power supply
    • B64U50/30Supply or distribution of electrical power
    • B64U50/37Charging when not in flight

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Structural Engineering (AREA)
  • Mining & Mineral Resources (AREA)
  • Civil Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Transportation (AREA)
  • Soil Sciences (AREA)
  • Environmental Sciences (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A drone 3 is associated with a work machine 1 (e.g. a digger, backhoe, bulldozer, earth mover, fork-lift, agricultural tractor, forestry vehicle, truck or other heavy or construction vehicle). Control data (e.g. a navigation path or 3D coordinates) is determined for controlling the drone with respect to the work machine 1. The work machine may have a docking station 9 comprising a housing and cover; a latch for securing the drone; a quick-release spring-loaded thrust device for launching the drone; and data-link and charging ports. The drone may comprise sensors which capture image or audio data of an area near the work vehicle and present them via a user interface in the work machine. Thus the drone may transmit an image of an obscured area 21 near the bucket 7 of an excavator.

Description

Technical field
The present disclosure relates to a method for controlling a drone in a system comprising the drone and a work machine.
Background
Machine operators may be required to exercise much skill and judgement when performing operations in a work site in order to avoid negatively disturbing the surrounding environment. For example, an excavation operation may be performed in a construction site in order to dig a trench; however, various obstacles and hazards, such as buildings and pipelines, may be present in the vicinity of the trench, and so the machine operator must be careful not to damage them.
In such situations, it may often be necessary for several workers to work together in order to perform a task diligently, and without negatively disturbing the surrounding construction site. For example, the digging of the trench may require a machine operator to perform the digging operation and another worker to monitor the digging of the trench externally from the machine in order to help guide the machine operator (e.g. using two-way radios).
There is a need to improve the efficiency of work site operations.
Summary
According to one aspect of the present disclosure, there is provided a method for controlling a drone in a system comprising the drone and a work machine, the method comprising: determining control data for controlling the drone with respect to the work machine.
According to another aspect of the present disclosure, there is provided a work machine for communications in a communications network comprising a drone, the work machine comprising: an engine; at least one work implement; a docking station constructed and arranged to receive the drone; and a controller adapted to enable wireless communications between the work machine and the drone, the controller being adapted to determine control data for the drone with respect to the work machine.
According to another aspect of the present disclosure, there is provided a drone for communications in a communications network comprising a work machine, the drone comprising: a sensor assembly; a propulsion means; a power source; and a controller adapted to enable wireless communications between the drone and the work machine, the controller being adapted to determine control data for the drone with respect to the work machine.
Brief description of the drawings
Aspects of the present disclosure will now be described, by way of example only, with reference to the following figures, in which: Figure 1 is a side view representation of a machine and drone according to an exemplary embodiment of the present disclosure, showing the drone docked on the machine; Figure 2 is another side view of the machine and drone of Figure 1, showing the drone being undocked from the machine; Figure 3 is a representation of a docking station according to the system of Figure 1; Figure 4 is a representation of a drone according to the system of Figure 1; Figure 5 is a side view representation of a drone and machine according to another exemplary embodiment of the present disclosure; Figure 6 is another side view representation of a drone and machine according to the exemplary embodiment of Figure 5; and Figure 7 is a process diagram showing the signalling and processes that may occur in a communications network comprising the machine and drone.
Detailed description
Figure 1 is a representation of a work site, such as a construction site, having a machine 1 and a drone 3. The drone 3 may be in communication with the machine 1. The machine 1 and the drone 3 may form a part of a system or communications network.
Figure 1 shows the drone 3 being "docked" on the machine 1. Figure 2 shows the drone 3 being "undocked" from the machine 1 and hovering in a predetermined position with respect to the machine 1.
The drone 3 may comprise a sensor assembly 5, which may provide a monitoring capability so as to allow the drone 3 to capture data from the environment surrounding the drone 3 (and the machine 1) and send the captured data back to the machine 1 in substantially real-time. This may enable the drone 3 to provide the machine 1 with information about its surrounding environment. For example, the drone 3 may be provided with an image sensing capability so as to send real-time image data to the machine 1 so that an operator of the machine 1 may have a better visual perspective of the machine 1 as it interacts with its environment.
The drone 3 may be provided with an autonomous flight and "hovering" capability. This may allow the drone 3 to navigate to a predetermined region with respect to the machine 1 and stay within that region. Accordingly, the drone 3 may be capable of continuously monitoring an operation being performed by the machine 1.
As the flight of the drone 3 may be autonomous, the machine operator need not be concerned with controlling both the machine 1 as well as the drone 3. Instead, the machine operator may only need to focus on controlling the machine 1.
The drone 3 may be wirelessly "tethered" to the machine 1 so as to follow the machine 1 and stay within a predetermined region, even when the machine 1 is travelling.
In general, and as used herein, the term "work site" may be understood to be a region in which work may be performed in order to transform or develop the region. The work may sometimes be termed "construction" work. Such construction work may comprise, for example, one or more of: preparing a site, forming structures, and extracting materials such as minerals or ores from a landscape. More generally, construction work may encompass any environment in which a construction machine (as is well known in the art) may perform an operation (regardless of whether the work is intended to construct, demolish, or simply modify a landscape). For example, the construction work may comprise one or more of excavation, digging, foundation laying, demolition and building. In addition to ground-based construction work, the work site may include construction work which may take place underground or on the ocean floor. For example, the work site may include mining sites and offshore geotechnical engineering sites such as oil platforms, artificial islands and submarine pipelines. It will be appreciated that this is a non-exhaustive list and many other scenarios may be included within the definition of a work site.
In the examples described herein with reference to Figures 1 to 7, the machine 1 is a backhoe loader, however it will be appreciated that the principles of the present disclosure may also be applicable to any other type of machine. For example, the machine 1 may be any construction vehicle, heavy equipment vehicle or "off-highway" vehicle (in the sense that the machine 1 has a primary function other than for travel on a road surface, but is nevertheless capable of travelling on a road surface). The following is a non-exhaustive list of possible machines in which the principles of the present disclosure may be incorporated: articulated truck, asphalt paver, backhoe loader, cold planer, compactor, compact track and multi-terrain loader, dozer, dragline, drill, electric rope shovel, excavator, feller buncher, forest machine, fork lift, forwarder, harvester, highwall miner, hydraulic mining shovel, knuckleboom loader, material handler, motor grader, off-highway truck, pipelayer, road reclaimer, site prep tractor, skidder, skid steer loader, surface mining conveyor system, telehandler, track loader, underground mining loader, underground mining truck, wheel dozer, wheel excavator, wheel loader, wheel tractor scraper. The machine 1 may also be a marine or submarine vessel/machine having one or more work implements.
In general, the term "work implement" as used herein may refer to any implement associated with a machine so as to assist the work being performed by the machine. In this regard, it will be appreciated that each work implement associated with a particular machine may be adapted to perform a particular function or task. The work implement may comprise an attachment for a machine. As an example, the work implement may comprise an excavator bucket adapted to excavate an area of land. The following is a non-exhaustive list of possible work implements with which the principles of the present disclosure may be associated: ground engaging tool, work tool, bucket, excavator bucket, auger, backhoe, blade, broom, brushcutter, backhoe front bucket, backhoe rear bucket, compact wheel loader bucket, loader bucket, skid steer loader bucket, telehandler bucket, cold planer, compactor, coupler, delimber, felling head, backhoe loader fork, loader fork, compact wheel loader fork, skid steer loader fork, telehandler fork, grapple, hammer, harvester head, material handling arm, mulcher, multiprocessor, pulverizer, rake, ripper, saw, shear, snow blower, stump grinder, tiller, trencher and truss boom. The work implement may comprise one or more support elements such as a boom. The work implement may comprise one or more coupling elements.
As will be appreciated in the art, the machine 1 may comprise alternative or additional features according to the particular type of machine 1 being considered. For example, in the scenario whereby the machine 1 is a backhoe loader, it will be appreciated that the machine 1 will have at least four wheels, and two work implements, including a front loader bucket and an excavator/digging bucket. In another example, in the case whereby the machine 1 is a track type tractor, the machine 1 would be provided with tracks as opposed to wheels (amongst many other differences which will be appreciated by a person skilled in the art).
As used herein, "drones" may refer to unmanned vehicles that may be either autonomously or remotely controlled/piloted. Drones may often be used in given scenarios to mitigate any danger from hazards that may otherwise be apparent to a vehicle operator exposed to such a scenario. Drones may also be used for their relatively compact size and high manoeuvrability when compared to, for example, a manned machine. An example of a drone may be an Unmanned Aerial Vehicle (UAV), which may take the form of aircraft such as rotorcraft, helicopters, quadcopters, multirotors, planes, fixed-wing aircraft, and the like (generally termed "airborne" drones). Further examples of drones may be ground-based vehicles such as track type or wheeled vehicles, marine vessels such as boats, submarine vessels, amphibious vehicles and hovercraft. Other types of drone may take a robotic form having moveable limbs in order to allow travel.
As used herein, the term "hover" or "hovering" is intended to refer to a controlled manoeuvring operation by a drone in a specific region with respect to a fixed point of reference. In the example of an airborne drone being used in a construction site, the drone may be commanded to hover in a position with respect to the machine so as to monitor a construction operation. In this regard, the drone may maintain a substantially fixed coordinate position in three-dimensional space with respect to a fixed coordinate reference point associated with the machine. For non-airborne drones, for example a submarine drone, it will be appreciated that a similar meaning is implied in that the submarine drone is configured to maintain a substantially fixed coordinate position in three-dimensional space with respect to a fixed coordinate reference point associated with the machine.
The movement of the drone may be autonomous in the sense that the drone may use control logic to guide its movement, independent from any operator control (after an initial operator control input/instruction). Accordingly, the drone may navigate to a position without continuous operator input. The drone may maintain such a position with respect to the machine, irrespective of movement of the machine. In this regard, the drone may be said to be wirelessly "tethered" to the machine.
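The "tethered" behaviour described above can be illustrated with a minimal station-keeping sketch. The patent does not specify a control law; the proportional controller, function name and gain below are illustrative assumptions only.

```python
# Minimal sketch of a wireless "tether": the drone continuously steers
# toward the machine's position plus a fixed offset, so it holds station
# even while the machine travels. The proportional gain is an assumption.

def tether_step(drone_pos, machine_pos, offset, gain=0.5):
    """Return a velocity command moving the drone toward its target:
    the machine's position plus a fixed (x, y, z) offset."""
    target = tuple(m + o for m, o in zip(machine_pos, offset))
    return tuple(gain * (t - d) for t, d in zip(target, drone_pos))

# Example: hold station 5 m above and 2 m behind the machine
# (y taken as the vertical axis, as in the coordinate convention below).
cmd = tether_step(drone_pos=(0.0, 0.0, 0.0),
                  machine_pos=(10.0, 0.0, 0.0),
                  offset=(-2.0, 5.0, 0.0))
```

Calling `tether_step` once per control cycle with fresh position data yields the continuous following behaviour; a real implementation would add damping and limits.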
Referring back to Figure 1, the general construction of the machine 1 will now be described.
The machine 1 comprises an engine (not shown), a work implement 7, a docking station 9 or landing pad adapted to receive the drone 3, and a controller (not shown) adapted to enable wireless communications between the machine 1 and the drone 3. The machine 1, in this example, may be a backhoe loader having an operator cab 11, a second work implement 13, four wheels 15-1, 15-2 (of which only two are shown) and a hydraulic system (not shown) for operating various elements of the machine, including the first and second work implements 7, 13.
It will be appreciated that not all machines will require operator cabs, but instead, autonomous machines may be used whereby no such operator cab is required (e.g. the machines may be set to either autonomous or remote control).
The engine may be an internal combustion engine, such as a diesel engine, electric engine, gas engine, dual fuel engine, or a hybrid of two or more of these engine systems.
The controller may, for example, comprise one or more processors and one or more memory units so as to store data.
As shown in Figure 1, the docking station 9 is integrated or mounted on a roof 17 of the machine 1. The docking station 9 is constructed and arranged so as to receive the drone 3 and securely hold the drone in what is generally known as a "docked" arrangement (as shown in Figure 1). When the drone 3 is separated from the docking station 9, the drone 3 is said to be "undocked" (see Figure 2).
Referring to Figure 3, the docking station 9 may comprise a secure holding mechanism 23 for enabling the drone 3 to be securely held in position with respect to the machine 1 when docked. In one example, the secure holding mechanism 23 may form a type of cradle arranged to retain the drone 3 in a certain position with respect to the roof 17 of the machine 1. The secure holding mechanism 23 may comprise a housing 25 having an open-top configuration with side walls 27 that define a recessed area 29 for receiving the drone 3. The side walls 27 may enable the drone 3 to be secured in position when docked and may act to restrict any lateral movement of the drone 3 with respect to the roof 17. The docking station 9 may comprise at least one releasable latch 31 for securing the drone 3 to the docking station 9 when docked. The releasable latch 31 may selectively allow the drone 3 to be released from the docking station 9. The docking station 9 may be provided with a quick-release mechanism (not shown) for allowing the drone 3 to be launched quickly. In this regard, the quick-release mechanism may comprise a means of selectively imparting a force to the drone 3 in order to assist the drone 3 in achieving required propulsion. For example, the quick-release mechanism may comprise a spring-loaded thrust device (not shown). The spring-loaded thrust device may comprise one or more coil springs which may be pre-loaded (i.e. compressed) when the drone 3 is docked. The releasable latch may keep the coil springs compressed when the releasable latch is closed and may allow the coil springs to extend (i.e. uncompress) when opened (i.e. when the drone 3 is to be undocked).
When docked, the drone 3 may connect with the docking station 9 via a connection means 33 provided on the docking station 9. The connection means 33 may comprise one or more of a charging port 35, a data-link connection port 37, and a secondary charging port (not shown), for example, in instances whereby auxiliary functions of the drone, such as the sensor assembly 5, are powered by a power source that is separate from the power source used to achieve propulsion.
The releasable latch 31, quick-release mechanism and connection means 33 may be integrated together as a single element or alternatively, may be provided as separate and distinct features about the drone 3. Similarly, the charging port 35, data-link connection port 37, and secondary charging port may be integrated together as a single element or alternatively, may be provided as separate and distinct features about the drone 3.
The docking station 9 may comprise an antenna (transmitter, receiver and/or transceiver) (not shown) for enabling transmission and reception of wireless signals in the communications network. Alternatively, the antenna may be provided elsewhere about the machine 1.
Additionally or alternatively, the docking station 9 may be provided with a cover (not shown) that may close the housing 25 such that the open-top configuration becomes a closed-top configuration. The cover may be integrated with the docking station 9 and be actuated to open or close the docking station 9 (i.e. to thereby reveal or conceal the recessed area 29 of the docking station 9 and associated connection means 33). The cover may be retractable in that it may be retracted into the housing 25 of the docking station 9 when not in use (i.e. in a retracted configuration), and may extend therefrom when in use (i.e. in an extended configuration). Alternatively, the cover may be hinged in that it may be connected to the housing 25 via a hinged connection. When in the extended configuration, the cover may enable further securing and protection of the drone 3 when docked and/or may enable protection of the connection means 33, for example, when the drone 3 is undocked. In this regard, as well as ensuring that the drone 3 cannot inadvertently disconnect from and separate from the docking station 9, the cover may also act as a shield to provide protection from falling objects, dirt ingress and water ingress. The cover may be opened or closed independently from the docking or undocking of the drone 3.
Additionally or alternatively to the housing 25 described with reference to Figure 3, the drone 3 may be secured to the docking station 9 via the connection means 33. In this regard, the connection means 33 may comprise a rigid body that protrudes from the docking station 9. The rigid body may be constructed and arranged to mate with a corresponding recess feature provided on the drone 3. Alternatively, the connection means 33 may comprise a recess for receiving a corresponding rigid body feature provided on the drone 3.
Accordingly, the connection between the drone 3 and the docking station 9 may take a plug and socket type arrangement with a female-male configuration or a male-female configuration.
The docking station 9 may comprise the same material as that of the roof 17 of the machine 1. Alternatively or additionally, the docking station 9 may comprise a plastics material, such as Acrylonitrile Butadiene Styrene (ABS).
Referring to Figure 4, the drone 3 may be in the form of a quadcopter (also known as a quadrotor helicopter or quadrotor) having propulsion means 39 in the form of vertically oriented propellers known as rotors. The propulsion means 39 may be constructed and arranged, as is commonly known in the art, so as to provide a flight capability with directional control, and also a hovering capability.
The propulsion means 39 may be powered by a motor (not shown).
The drone 3 may also comprise a controller (not shown). The controller may be in communication with the rotors so as to control the flight of the drone 3.
The controller may also be in communication with the sensor assembly 5 of the drone 3 so as to operate the sensor assembly 5. The controller may, for example, comprise one or more processors and one or more memory units so as to store data. The one or more memory units may be removable from the controller and drone 3 to allow quick access to data stored within the memory units.
The sensor assembly 5 of the drone 3 may comprise various sensors to assist drone control, drone stabilisation and sensing of the environment surrounding the drone 3. For example, the sensor assembly 5 may comprise one or more of image sensors, proximity sensors, acoustic sensors, infrared sensors, radar antennas, electric field sensors, position sensing means (such as GPS), and AHRS (attitude and heading reference system) based devices (e.g. which may include solid-state or microelectromechanical systems gyroscopes, accelerometers and magnetometers). The sensor assembly 5 may also include an alert means for providing an alert when a condition is satisfied. For example, a condition may be the detection of an object by the proximity sensor within a specified proximity range. When the condition has been satisfied, an alert may be output by the alert means, for example, in the form of an audible signal and/or a visual signal. The alert means may also include a transmittal of data to a network element (i.e. which may include the machine 1) for outputting an alert by that network element.
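The proximity alert condition described above can be sketched as a simple check. The function name, threshold and payload fields are assumptions for illustration; the patent only states that an alert is raised when an object is detected within a specified range and that data may be sent to a network element.

```python
# Illustrative sketch of the alert condition: an object inside the
# specified proximity range triggers a local alert and produces a
# payload that could be transmitted to a network element (e.g. the
# machine). Threshold and field names are assumptions.

def check_proximity(distance_m, alert_range_m=2.0):
    """Return (alert_active, payload) for one proximity reading."""
    if distance_m <= alert_range_m:
        return True, {"type": "proximity_alert", "distance_m": distance_m}
    return False, None

active, payload = check_proximity(1.4)   # object detected at 1.4 m
```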
The propulsion means 39, controller and sensor assembly 5 may be powered by a powering means (not shown) such as a battery.
The drone 3 may be provided with a connection port 41 for communications with the connection means 33 of the docking station 9. The connection port 41 may comprise one or more of a charging connection 43, a data-link connection 45, and a secondary charging connection for communication with the charging port 35, data-link connection port 37, and secondary charging port of the docking station 9.
The drone 3 may also comprise a communications capability (not shown), for example, a transmitter, a receiver and/or a transceiver, for communicating data in the communications network.
Where the docking station 9 may be of the cradle-type, the drone 3 may comprise a body that is shaped and dimensioned so as to conform with the shape and dimensions of the docking station. For example, the docking station 9 may comprise substantially curved side walls 27 to conform with a curvature of the drone 3. The recessed area 29 of the docking station 9 may be lined with impact absorption material so as to cushion the drone 3 when it lands and docks within the docking station 9.
The drone 3 may comprise an outer shell of rigid material so as to protect the components housed within the drone 3, such as the sensor assembly 5.
The drone 3 may be made out of various materials according to the type of drone 3. For example, for an airborne drone, such as that illustrated in Figure 4, the drone 3 may be made out of a lightweight but strong material. Materials may include plastic, reinforced plastic (e.g. with carbon or other fiber reinforcement), metal alloy, and so forth. For a drone that is of the submarine type, for example, the drone may be made out of materials capable of withstanding submarine pressures, as will be known in the art.
Figure 5 is a perspective view from above of the drone 3 and machine 1, showing the drone 3 undocked from the docking station 9.
Figure 6 is an enlarged perspective view of a portion of the drone 3 when docked in the docking station 9. As shown, the latch mechanism 31 may comprise first and second latch portions 47-1, 47-2, which may fold over a portion of the drone 3 in order to clamp the drone 3 in place when docked in the docking station 9. The first and second latch portions 47-1, 47-2 may be hingedly attached to the housing 25 of the docking station 9 and may be electronically controlled via actuators (not shown).
The communications network may include one or more network elements, including the machine 1 and the drone 3.
Data may be relayed between the machine 1 and drone 3 via one or more network elements in the communications network. The one or more network elements may include servers, satellites, other machines, radio masts, base stations, and any other device having communications capability.
Data sent from the machine 1 to the drone 3 via the communications network may generally be termed "uplink" data. Data sent from the drone 3 to the machine 1 via the communications network may generally be termed "downlink" data.
Control data for the drone 3 may be wirelessly transmitted by the machine 1 to the drone 3, via the uplink, for example, in order to specify a navigation path (e.g. a flight path) of the drone 3. Sensory data, such as image data may be wirelessly transmitted by the drone 3 to the machine 1, via the downlink. The drone 3 may also continuously send AHRS-based data and other such positional data to the machine 1, via the downlink so that the current positional data of the drone 3 may be known.
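The uplink/downlink traffic described above can be sketched as two message builders. The patent does not specify any wire format; the dictionary fields and function names below are assumptions chosen purely for illustration.

```python
# Illustrative sketch of the two traffic directions: uplink carries
# control data (e.g. a navigation path) from machine to drone, downlink
# carries sensory and positional data from drone to machine.
# Field names are assumptions, not taken from the patent.

def make_uplink(waypoints):
    """Machine -> drone: control data specifying a navigation path."""
    return {"direction": "uplink", "type": "control",
            "waypoints": waypoints}

def make_downlink(image_frame, position_xyz):
    """Drone -> machine: sensory data plus current AHRS/position data."""
    return {"direction": "downlink", "type": "sensory",
            "image": image_frame, "position": position_xyz}

up = make_uplink([(0.0, 5.0, 0.0), (2.0, 5.0, 0.0)])
down = make_downlink(b"...jpeg...", (2.0, 5.0, 0.1))
```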
In general, the drone 3 may comprise different operational modes: (i) an "active" mode; (ii) a "passive" mode; and (iii) a "semi-active" mode.
In the case where the drone 3 may be operating in the active mode, the drone 3 may comprise an enhanced processing capability allowing the drone 3 to process the data received from the machine 1 and captured by the sensor assembly 5 in order to map out its own navigation path. In this regard, the drone 3 may apply control algorithms to correct its instantaneous position, for example to stay within a specified region with respect to the machine or to employ measures for collision avoidance.
In the case where the drone 3 may be operating in a passive mode, the drone 3 may comprise a limited processing capability and navigation may be instructed via the communications network only. In this regard, the drone 3 may provide continuous downlink data and receive continuous uplink data, allowing another network element (such as the machine 1) in the communications network to determine any corrections and adjustments of the instantaneous position of the drone 3. For example, the machine 1 may calculate the navigation path for the drone 3 and any required adjustments based on the downlink data received from the drone.
In the case where the drone 3 may be operating in a semi-active mode, the drone 3 may comprise the enhanced processing capability described with respect to the active mode but may only use such capability for collision avoidance. Hence, the drone 3 may normally be instructed with navigational information by another network element in the communications network but may override such instruction in order to execute collision avoidance measures.
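As an illustration only (the patent does not specify any implementation), the three operational modes might be represented as a flag governing which capabilities the drone exercises; all names below are hypothetical:

```python
from enum import Enum, auto

class DroneMode(Enum):
    ACTIVE = auto()       # drone maps out its own navigation path
    PASSIVE = auto()      # navigation instructed via the communications network only
    SEMI_ACTIVE = auto()  # network-instructed, with local collision-avoidance override

def plans_own_path(mode: DroneMode) -> bool:
    """Whether the drone uses its own processing to map out its navigation path."""
    return mode is DroneMode.ACTIVE

def may_override_for_collision(mode: DroneMode) -> bool:
    """Whether the drone may override uplinked navigation instructions
    in order to execute collision-avoidance measures."""
    return mode in (DroneMode.ACTIVE, DroneMode.SEMI_ACTIVE)
```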
The navigation of the drone 3 to a destination may initially be set by the machine operator in the form of control data. In particular, control data may be transmitted by the machine 1 to the drone 3 via the communications network.
The control data may comprise three-dimensional coordinate data with respect to a pre-defined fixed reference point. For example, the pre-defined fixed reference point may be an origin of the three-dimensional coordinate system defined in association with the machine 1. In such an example, the origin may be defined as the intersection of an x-axis, y-axis and z-axis at a point which coincides with a determined centre-point of the docking station 9 (i.e. thereby defining a (0,0,0) coordinate position in three-dimensional space). The x-axis may extend parallel to a longitudinal axis of the machine 1. The y-axis may extend parallel to a vertical axis of the machine 1. The z-axis may extend parallel to a transverse, horizontal axis of the machine 1.
Accordingly, the control data may comprise a destination position (A,B,C), where A, B and C are respective x, y and z coordinates. For example, the A, B and C coordinates may be selected so as to cause the drone 3 to navigate to a point that is approximately: (i) -300 mm along the x-axis (e.g. in a direction towards the rear of the machine); (ii) -50 mm along the y-axis (e.g. in a direction vertically downwards); and (iii) 50 mm along the z-axis (e.g. in a direction towards the right side of the machine 1 when viewing the front of the machine 1).
For example, this destination may be suitable for the drone 3 to monitor a trenching operation.
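The machine-frame convention above (origin at the centre-point of the docking station, axes as defined, values in mm) could be sketched as follows; the type and variable names are illustrative, not from the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MachineFrameCoord:
    """Position in mm relative to the docking-station origin (0, 0, 0):
    x along the machine's longitudinal axis, y along its vertical axis,
    z along its transverse, horizontal axis."""
    x: float
    y: float
    z: float

# Destination from the example: 300 mm rearward, 50 mm vertically
# downwards, 50 mm towards the machine's right side.
trench_monitoring_destination = MachineFrameCoord(x=-300.0, y=-50.0, z=50.0)
```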
Additionally, the control data may comprise angles of pitch, yaw and roll for the drone 3 with respect to a three-dimensional axis system of the drone. The control data may also include an orientation for the drone 3 with respect to the machine orientation. For example, the drone 3 may be deemed to be "front-facing" when a front of the drone 3 faces a front side of the machine 1 and "rear-facing" when the front of the drone 3 faces a rear side of the machine 1.
The actual navigation path navigated by the drone 3 to the designated coordinates may take a predetermined "shape". For example, the navigation path shape may take a generally parabolic form between the docking station 9 and the intended destination. In other examples, the drone 3 may initially be launched in a generally upward direction before then taking the shortest path to the intended destination. In other examples, the drone 3 may take the shortest, direct path to the intended destination.
The drone 3 may use a collision avoidance system so as to avoid obstructions and predetermined hazard zones in the navigation path. For example, the drone 3 may use its proximity sensors to sense objects in its vicinity so that correctional measures may be taken to adjust the navigation path to the intended destination. Additionally or alternatively, the machine 1 dimensions and current position information (including dimension and current position information of its implements) may be provided via the communications uplink so that the drone 3 may be provided with an increased awareness of the machine 1 for collision avoidance. In this regard, the drone 3 may determine its own dimensions as well as the dimensions of the machine 1 so as to avoid any collisions between the two.
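A minimal sketch of proximity-based correction, assuming obstacle positions are available as machine-frame coordinates in mm; a real collision-avoidance system would be considerably more involved:

```python
import math

def avoidance_vector(position, obstacles, min_clearance):
    """If any sensed obstacle is within min_clearance (mm) of the drone,
    return a unit vector pointing away from the nearest one; otherwise None.
    Positions are (x, y, z) tuples in the machine frame."""
    if not obstacles:
        return None
    nearest = min(obstacles, key=lambda o: math.dist(position, o))
    d = math.dist(position, nearest)
    if d == 0 or d >= min_clearance:
        return None
    # Unit vector from the obstacle towards the drone's current position.
    return tuple((p - o) / d for p, o in zip(position, nearest))
```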
The hovering of the drone 3 with respect to the destination coordinates may be within a coordinate tolerance range.
Referring back to Figure 2, such a coordinate tolerance range may define a navigation zone 19 within which the drone 3 may be permitted to move. Thus, the navigation zone 19 may be provided as a three-dimensional zone surrounding the destination coordinates. The drone 3 may be permitted to hover at any point within the navigation zone 19 and need not remain at the destination coordinates. Should the drone 3 determine that it is outside the navigation zone 19, the drone 3 may employ correctional measures to navigate back within the navigation zone 19 by determining a correctional navigation path from its current position to the destination coordinates. In this manner, if the machine 1 moves, the drone 3 may follow the machine 1.
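One way to sketch the coordinate tolerance range, assuming a simple per-axis tolerance around the destination coordinates (the class and field names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class NavigationZone:
    """Cuboid coordinate tolerance range (values in mm) surrounding the
    destination coordinates; the drone may hover anywhere inside it."""
    cx: float   # destination x
    cy: float   # destination y
    cz: float   # destination z
    tol: float  # permitted deviation per axis

    def contains(self, x: float, y: float, z: float) -> bool:
        return (abs(x - self.cx) <= self.tol
                and abs(y - self.cy) <= self.tol
                and abs(z - self.cz) <= self.tol)

    def correction(self, x: float, y: float, z: float) -> tuple:
        """Correctional vector back to the destination coordinates,
        or zero if the drone is already inside the zone."""
        if self.contains(x, y, z):
            return (0.0, 0.0, 0.0)
        return (self.cx - x, self.cy - y, self.cz - z)

# Zone around the trench-monitoring destination from the earlier example.
zone = NavigationZone(cx=-300.0, cy=-50.0, cz=50.0, tol=100.0)
```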
The control data may also comprise sensing control data, which may essentially tell the drone 3 where to "look".
More particularly, the directional-sensing data may instruct relevant sensors to focus on a particular designated monitoring region 21 for data capture once the drone 3 is in the destination position. For example, image sensors may be focussed to sense primarily in the designated monitoring region 21. Beam-forming techniques may be employed by some of the sensors, such as the radar antennas, in order to sense in the designated monitoring region 21.
In this regard, the sensing control data may define a look angle with respect to the three-dimensional axis system of the drone 3, and with respect to the three-dimensional axis system of the machine 1.
The sensing control data may comprise particular instructions in order to assist the machine operator in performing an operation. For example, the sensing control data may comprise "zoom" instructions which may enable the drone 3 to zoom in or out of a particular designated monitoring region 21 so as to allow better monitoring of an activity, based on an operator input. A "dynamic" zoom function may also be provided, for example, so as to retain a particular depth of focus with respect to a work implement as the work implement is moving.
The sensing control data may also determine which one or more sensors are to be used for performing the required monitoring operation. For example, the sensing control data may instruct use of the image sensors so that an operator may visually inspect an operation being performed by the work implement 7. In another example, the sensing control data may instruct use of the infrared sensors so that an operator may assess a heat signature of the operation.
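A minimal sketch of how sensing control data might select among the drone's sensors; the dictionary key and sensor names are illustrative assumptions:

```python
def select_sensors(sensing_control_data: dict, available: set) -> set:
    """Return the subset of the drone's available sensors that the sensing
    control data asks to activate for the monitoring operation."""
    requested = set(sensing_control_data.get("sensors", []))
    return requested & available

# Sensors assumed to be fitted to the drone for this example.
available_sensors = {"image", "infrared", "radar", "proximity"}

# Visual inspection of the work implement's operation.
active = select_sensors({"sensors": ["image"]}, available_sensors)
```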
Figures 7 and 8 provide further examples of a machine 101 and drone 103, whereby the drone 103 has been instructed to monitor an operation performed by a work implement 113. In these examples, the machine 101 is a backhoe loader and the work implement 113 is a fork lift. The drone 103 may be commanded to hover above the work implement 113 in a first navigation zone 119 and focus on a designated monitoring region 121.
It will be appreciated that the drone 3, 103 may be instructed to hover in any number of pre-determined positions with respect to the machine 1, 101 and may be preprogrammed to look in specific designated monitoring regions 21, 121.
In this manner, the drone 3, 103 may monitor the work operation and send back sensory data in substantially real-time. The sensory data may include any data associated with the drone 3, 103, including, for example, image data.
Referring back to the machine 1 of Figure 1, the operator cab 11 may be provided with operator controls including a user interface (not shown), such as a control panel, for interacting with the drone 3. The user interface may comprise a plurality of operator selection inputs, each for instructing a different operation of the drone 3. For example, the operator selection inputs may comprise pre-programmed navigation controls for the drone 3, which, when selected by the operator, instruct the drone 3 to navigate to a particular navigation zone for performing a monitoring activity. In this regard, the operator need only execute a "one-touch" selection to enable autonomous control of the drone 3.
The user interface may also comprise a display for outputting graphical data to the machine operator. For example, the display may be for outputting image data captured by the drone 3.
The user interface may also comprise one or more alert outputs, such as a visual or audible alert for outputting alerts when one or more conditions are satisfied. For example, an alert may be output when the drone 3 has detected an object in its navigation path for which collision avoidance may be required.
The user interface may comprise manual override controls for allowing the operator to assume control of the drone 3. In this regard, the user interface may also comprise navigation controls for allowing the operator to control the navigation of the drone 3.
Additionally, the user interface may comprise dedicated operation selection inputs for quick-launch, auto-dock and quick-view. For example, quick-launch may allow the drone to initially be launched and navigate to an arbitrary position to await further navigational control input from the operator (such as manual control). Auto-dock may instruct the drone 3 to automatically return to the docking station 9 and dock. Quick-view may instruct the sensor assembly 5 of the drone 3 to be used when the drone 3 is docked. For example, quick-view may be used to switch on the image sensors of the drone 3 to assist machine manoeuvring (e.g. use of image data when the machine 1 is reversing).
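These dedicated selection inputs could be sketched as a simple dispatch table; the input and command names below are placeholders, not from the patent:

```python
def handle_selection_input(input_name: str) -> str:
    """Map each dedicated operation selection input to a drone command;
    all command strings here are illustrative placeholders."""
    commands = {
        "quick-launch": "launch_and_await_operator",    # launch to an arbitrary position
        "auto-dock": "return_to_docking_station",       # fly back and dock
        "quick-view": "activate_sensors_while_docked",  # e.g. reversing camera
    }
    return commands.get(input_name, "no_operation")
```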
The user interface may be provided as a detachable control panel for enabling an operator to control the drone 3 independently from the machine 1. In this regard, the user interface may be provided with a separate power source (i.e. battery), antenna for communications in the communications network and in-panel display and controls.
Computer program instructions may be provided, which, when performed by a computer, cause the drone 3 and/or machine 1 to execute navigation, sensory control and general data transmission. In this regard, the computer may be considered as the controller provided in the machine 1 and/or the drone 3, and may comprise at least one processor and at least one memory for achieving such navigation, sensory control and general data transmission.
Industrial applicability
The system described above may be implemented by a machine 1 and drone 3. For example, the machine 1 may be a heavy equipment vehicle, such as a backhoe loader, which may comprise an internal combustion engine, such as a diesel engine. The drone 3 may be a quadcopter.
The navigation path of travel by the drone 3 may be autonomous in the sense that, following an initial operator input, the drone 3 may undock from the docking station 9 and travel to a position designated by the initial operator input without any continuous directional input from the operator of the machine 1.
An example of a method of operating the system will now be described with reference to Figure 9. In this example, the machine 1 is a backhoe loader and the drone 3 is a quadcopter drone, which is configured in the active mode.
The drone 3 may initially be docked in the docking station 9 of the machine 1 (as shown in Figure 1). The machine 1 may be performing a particular operation using the work implement 7 and the machine operator may, for example, desire a better visual perspective of the work being performed compared with the current view from the operator cab 11, which may be restricted. In this example, the work may be a digging operation using an excavator bucket.
At step 201, the machine operator may select one of a plurality of selection inputs (provided on the user interface in the operator cab 11) to enable the drone 3 to be activated. For example, the operator may select an input that causes the quadcopter to monitor the digging operation.
At step 203, based on the selected input, pre-programmed control data may be transmitted by the machine 1 to the drone 3, via the communications network. In this example, the control data is transmitted directly from the machine 1 to the drone 3.
At step 205, the drone 3 may determine a set of destination coordinates based on the control data received via the communications network.
At step 207, the drone 3 may then determine a flight path from its current, docked position to the determined destination coordinates. The drone 3 may also determine a navigation zone based on the destination coordinates. Also, the drone 3 may determine characteristics of the sensory control data in order to determine operational requirements of the sensor assembly 5.
At step 209, the drone 3 may then send a "ready-to-launch" signal to the machine 1 via the downlink.
At step 211, the machine 1 may then signal the docking station 9 to launch the drone 3, based on the ready-to-launch signal. This may cause actuation of the quick-release mechanism and thereby enable the drone 3 to be placed on an initial flight trajectory.
At step 213, the drone 3 may detect that it has been undocked from the docking station 9 and therefore may determine that it has been launched.
At step 215, the drone 3 may navigate to the destination coordinates based on the determined flight path. For example, the destination coordinates may be associated with a position towards the rear of the machine 1 and above the work implement 7, as shown in Figure 2.
At step 217, the drone 3 may configure its sensor assembly, based on the determined characteristics, and begin monitoring the determined monitoring region.
At step 219, the drone 3 may transmit sensory data to the machine 1, via the downlink, in substantially real-time.
If the monitoring by the drone 3 is no longer required, the operator may select an auto-dock input, which may cause the machine 1, at step 221, to transmit a signal to the drone 3 instructing the drone 3 to return to the docking station 9.
Then, at step 223, the drone 3 determines a new flight path to return to the docking station 9.
The drone 3 may then fly to the machine 1 and dock in the docking station 9 in a manner so that its connection port 41 may connect with the connection means 33 provided within the docking station.
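The step sequence 201-223 above can be summarised as an ordered sketch; the labels are illustrative shorthand only:

```python
def mission_sequence(auto_dock_requested: bool) -> list:
    """Ordered sketch of the example method's steps 201-223; each label is
    shorthand for the corresponding action described above."""
    steps = [
        "operator_selects_input",      # step 201
        "machine_sends_control_data",  # step 203
        "drone_resolves_destination",  # step 205
        "drone_plans_flight_path",     # step 207
        "drone_signals_ready",         # step 209
        "machine_triggers_launch",     # step 211
        "drone_detects_undock",        # step 213
        "drone_navigates",             # step 215
        "drone_configures_sensors",    # step 217
        "drone_streams_sensory_data",  # step 219
    ]
    if auto_dock_requested:
        steps += [
            "machine_sends_autodock",  # step 221
            "drone_plans_return_path", # step 223
            "drone_docks",
        ]
    return steps
```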
Whilst exemplary embodiments of the present disclosure have been described herein with reference to Figures 1 to 9, it will be appreciated that various modifications and alternatives are envisaged.
For example, whilst the docking station 9 described herein with reference to Figures 1 to 9 may be positioned on the roof 17, it will be appreciated that, for other machine examples, the docking station 9 may be provided elsewhere about the machine.
Exemplary embodiments of the present disclosure have been described herein with reference to a sensor assembly 5 including image sensors. In other embodiments, the sensor assembly 5 may incorporate other technology according to the type of scenario in which the drone 3 will be used. For example, a submarine drone may use sonar technology for object detection. In this regard, the drone 3 and/or machine 1 may also be provided with mapping algorithms to build virtual images from the object detection data obtained via the sensor assembly 5.
In the exemplary embodiments described herein with reference to Figures 1 to 9, the centre-point of the docking station 9 was used as a fixed point reference. In alternative or additional embodiments, the fixed point reference may be variable according to a desired function. For example, the fixed point reference may refer to a point on a relevant work implement 7, 13, 113 being used. Accordingly, there may be various fixed points of reference located about the machine 1, depending on the operator input which has been selected.
Although exemplary embodiments of the present disclosure have been described herein, it will be appreciated that various improvements and modifications may be incorporated without departing from the scope of the following claims.

Claims (31)

Claims

1. A method for controlling a drone in a system comprising the drone and a work machine, the method comprising: determining control data for controlling the drone with respect to the work machine.

2. A method according to claim 1, wherein the control data comprises three-dimensional destination position coordinates with respect to a three-dimensional reference position of the work machine.

3. A method according to claim 1 or 2, comprising determining a navigation path for the drone, based on the control data.

4. A method according to any preceding claim, comprising determining a navigation zone for the drone, based on the control data, wherein the navigation zone comprises a three-dimensional coordinate range within which drone movement is permitted.

5. A method according to any preceding claim, wherein the control data comprises sensory control data instructing configuration of a sensor assembly of the drone.

6. A method according to claim 5, wherein the sensory control data comprises instruction on which of a plurality of sensors are to be activated for monitoring an environment associated with the drone.

7. A method according to any preceding claim, wherein the control data comprises directional-sensing data indicative of a region in the surrounding environment to be monitored by the drone.

8. A method according to any preceding claim, comprising providing sensory data, associated with the drone, for output by the work machine.

9. A method according to claim 8, wherein the sensory data comprises at least one of image data and audio data.

10. A work machine for communications in a communications network comprising a drone, the work machine comprising: an engine; at least one work implement; a docking station constructed and arranged to receive the drone; and a controller adapted to enable wireless communications between the work machine and the drone, the controller being adapted to determine control data for the drone with respect to the work machine.

11. A work machine according to claim 10, wherein the control data comprises three-dimensional destination position coordinates with respect to a three-dimensional reference position of the work machine.

12. A work machine according to claim 10 or 11, comprising determining a navigation path for the drone, based on the control data.

13. A work machine according to any of claims 10 to 12, comprising determining a navigation zone for the drone, based on the control data, wherein the navigation zone comprises a three-dimensional coordinate range within which drone movement is permitted.

14. A work machine according to any of claims 10 to 13, wherein the docking station comprises a secure holding mechanism adapted to securely hold the drone with respect to the work machine.

15. A work machine according to claim 14, wherein the secure holding mechanism comprises a housing for receiving the drone, the housing being constructed and arranged to restrict movement of the drone when docked in the docking station.

16. A work machine according to any of claims 10 to 15, wherein the docking station comprises a releasable latch adapted to secure the drone to the docking station and selectively allow the drone to be released from the docking station.

17. A work machine according to any of claims 10 to 16, wherein the docking station comprises a quick-release mechanism adapted to selectively impart a force to the drone 3 for launching the drone onto an initial navigation path.

18. A work machine according to claim 17, wherein the quick-release mechanism comprises a spring-loaded thrust device.

19. A work machine according to any of claims 10 to 18, wherein the docking station comprises a connection means including at least one of: a charging port, a data-link connection port, and a secondary charging port.

20. A work machine according to any of claims 10 to 19, wherein the controller is provided within the docking station.

21. A work machine according to any of claims 10 to 20, wherein the docking station comprises a retractable cover adapted to selectively cover the docking station.

22. A work machine according to any of claims 10 to 21, comprising a user interface for communication with the drone.

23. A work machine according to claim 22, wherein the user interface comprises a plurality of selectable inputs, each selectable input being preconfigured with three-dimensional destination position coordinates for navigation of the drone with respect to the work machine.

24. A work machine according to claim 22 or 23, wherein the user interface comprises a display for output of data received from the drone.

25. A drone for communications in a communications network comprising a work machine, the drone comprising: a sensor assembly; a propulsion means; a power source; and a controller adapted to enable wireless communications between the drone and the work machine, the controller being adapted to determine control data for the drone with respect to the work machine.

26. A drone according to claim 25, wherein the control data comprises three-dimensional destination position coordinates with respect to a three-dimensional reference position of the work machine.

27. A drone according to claim 25 or 26, comprising determining a navigation path for the drone, based on the control data.

28. A drone according to any of claims 25 to 27, comprising determining a navigation zone for the drone, based on the control data, wherein the navigation zone comprises a three-dimensional coordinate range within which drone movement is permitted.

29. A drone according to any of claims 25 to 28, comprising a connection port for connection with a work machine.

30. A drone according to claim 29, wherein the connection port comprises at least one of: a charging connection, a data-link connection and a secondary charging connection.

31. A computer program comprising computer program instructions, which when run on a computer cause the computer to perform the method of any of claims 1 to 9.
GB1422065.1A 2014-12-11 2014-12-11 Drone Withdrawn GB2533140A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1422065.1A GB2533140A (en) 2014-12-11 2014-12-11 Drone

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1422065.1A GB2533140A (en) 2014-12-11 2014-12-11 Drone

Publications (1)

Publication Number Publication Date
GB2533140A true GB2533140A (en) 2016-06-15

Family

ID=56007665

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1422065.1A Withdrawn GB2533140A (en) 2014-12-11 2014-12-11 Drone

Country Status (1)

Country Link
GB (1) GB2533140A (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160282872A1 (en) * 2015-03-25 2016-09-29 Yokogawa Electric Corporation System and method of monitoring an industrial plant
CN106716062A (en) * 2016-11-24 2017-05-24 深圳市大疆创新科技有限公司 Flight route planning method of agricultural unmanned aerial vehicle and ground control terminal
GB2548369A (en) * 2016-03-15 2017-09-20 Jaguar Land Rover Ltd System for providing land vehicle support operations using an unmanned autonomous vehicle
EP3346347A1 (en) * 2017-01-10 2018-07-11 CNH Industrial Belgium NV Aerial vehicle systems and methods
RU2671138C1 (en) * 2017-11-27 2018-10-29 Юрий Иосифович Полевой Unmanned combat vehicle and remote control system of motion and armament of unmanned combat vehicle
CN109024722A (en) * 2018-07-16 2018-12-18 长沙南理技术推广服务有限公司 A kind of robot and its working method for excavation of earthwork
WO2019034365A1 (en) * 2017-08-15 2019-02-21 Zf Friedrichshafen Ag Control of a transportation vehicle
EP3409849A4 (en) * 2016-01-29 2019-07-10 Sumitomo (S.H.I.) Construction Machinery Co., Ltd. Excavator and autonomous flying body to fly around excavator
WO2020069747A1 (en) * 2018-10-04 2020-04-09 Volvo Construction Equipment Ab A working machine comprising an illumination system
BE1027171B1 (en) * 2019-04-03 2020-11-05 Thyssenkrupp Ind Solutions Ag Method and device for the automated operation of a material extraction system that can be used primarily in open-cast mining
EP3767040A1 (en) * 2019-07-17 2021-01-20 Kobelco Construction Machinery Co., Ltd. Work machine and work machine support server
US20210017739A1 (en) * 2019-07-17 2021-01-21 Kobelco Construction Machinery Co., Ltd. Work machine and work machine support server
US20210018937A1 (en) * 2019-07-17 2021-01-21 Kobelco Construction Machinery Co., Ltd. Work machine and work machine support server
DE102019211042A1 (en) * 2019-07-25 2021-01-28 Hochschule für Technik und Wirtschaft Dresden Procedure for monitoring the handling of goods with a loading vehicle
US10927528B2 (en) 2015-09-15 2021-02-23 Sumitomo(S.H.I.) Construction Machinery Co., Ltd. Shovel
EP3273201B1 (en) 2016-07-21 2021-06-30 Arquus Method of calculating an itinerary for an off-road vehicle
US11066000B2 (en) 2018-08-13 2021-07-20 United Parcel Service Of America, Inc. Systems, methods, and apparatuses for engaging and transporting objects
US11162241B2 (en) 2018-03-27 2021-11-02 Deere & Company Controlling mobile machines with a robotic attachment
US11254336B2 (en) * 2018-12-19 2022-02-22 Nordco Inc. Rail flaw detector
US20220100212A1 (en) * 2020-09-30 2022-03-31 Kobelco Construction Machinery Co., Ltd. Work support apparatus for work machine
US11429106B2 (en) 2019-09-17 2022-08-30 United Parcel Service Of America, Inc. Methods and systems for shifting objects
US11713117B2 (en) 2020-03-31 2023-08-01 Cnh Industrial America Llc System and method for anchoring unmanned aerial vehicles to surfaces
US20230374754A1 (en) * 2015-02-13 2023-11-23 ESCO Group LLLC Monitoring ground-engaging products for earth working equipment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB892838A (en) * 1959-09-18 1962-03-28 British Soc For Res In Agricul Apparatus for spreading liquids and powdered solids over the ground
JPH02114095A (en) * 1988-10-25 1990-04-26 Yamaha Motor Co Ltd Spraying device
GB2312056A (en) * 1996-04-09 1997-10-15 James Dalgliesh Remote flying craft with mobile control
WO2000073727A1 (en) * 1999-05-27 2000-12-07 Steadicopter Ltd. Bordered flying tool
US6246932B1 (en) * 1997-02-20 2001-06-12 Komatsu Ltd. Vehicle monitor for controlling movements of a plurality of vehicles
US20120029732A1 (en) * 2010-07-29 2012-02-02 Axel Roland Meyer Harvester with a sensor mounted on an aircraft
EP2433867A2 (en) * 2010-09-28 2012-03-28 Kabushiki Kaisha Topcon Automatic take-off and landing system
CN202896375U (en) * 2012-11-30 2013-04-24 山东电力集团公司电力科学研究院 Unmanned plane integral ground station system
DE102013019098B3 (en) * 2013-11-11 2015-01-08 Hochschule für Technik und Wirtschaft Dresden System for recording parameters of the environment and environment

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB892838A (en) * 1959-09-18 1962-03-28 British Soc For Res In Agricul Apparatus for spreading liquids and powdered solids over the ground
JPH02114095A (en) * 1988-10-25 1990-04-26 Yamaha Motor Co Ltd Spraying device
GB2312056A (en) * 1996-04-09 1997-10-15 James Dalgliesh Remote flying craft with mobile control
US6246932B1 (en) * 1997-02-20 2001-06-12 Komatsu Ltd. Vehicle monitor for controlling movements of a plurality of vehicles
WO2000073727A1 (en) * 1999-05-27 2000-12-07 Steadicopter Ltd. Bordered flying tool
US20120029732A1 (en) * 2010-07-29 2012-02-02 Axel Roland Meyer Harvester with a sensor mounted on an aircraft
EP2433867A2 (en) * 2010-09-28 2012-03-28 Kabushiki Kaisha Topcon Automatic take-off and landing system
CN202896375U (en) * 2012-11-30 2013-04-24 山东电力集团公司电力科学研究院 Unmanned plane integral ground station system
DE102013019098B3 (en) * 2013-11-11 2015-01-08 Hochschule für Technik und Wirtschaft Dresden System for recording parameters of the environment and environment

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230374754A1 (en) * 2015-02-13 2023-11-23 ESCO Group LLLC Monitoring ground-engaging products for earth working equipment
US20160282872A1 (en) * 2015-03-25 2016-09-29 Yokogawa Electric Corporation System and method of monitoring an industrial plant
US9845164B2 (en) * 2015-03-25 2017-12-19 Yokogawa Electric Corporation System and method of monitoring an industrial plant
US10927528B2 (en) 2015-09-15 2021-02-23 Sumitomo(S.H.I.) Construction Machinery Co., Ltd. Shovel
US11492783B2 (en) 2016-01-29 2022-11-08 Sumitomo(S.H.I) Construction Machinery Co., Ltd. Shovel and autonomous aerial vehicle flying around shovel
EP3409849A4 (en) * 2016-01-29 2019-07-10 Sumitomo (S.H.I.) Construction Machinery Co., Ltd. Excavator and autonomous flying body to fly around excavator
US10767347B2 (en) 2016-01-29 2020-09-08 Sumitomo(S.H.I.) Construction Machinery Co., Ltd. Shovel and autonomous aerial vehicle flying around shovel
GB2548369B (en) * 2016-03-15 2021-02-17 Jaguar Land Rover Ltd System for providing land vehicle support operations using an unmanned autonomous vehicle
GB2548369A (en) * 2016-03-15 2017-09-20 Jaguar Land Rover Ltd System for providing land vehicle support operations using an unmanned autonomous vehicle
EP3273201B1 (en) 2016-07-21 2021-06-30 Arquus Method of calculating an itinerary for an off-road vehicle
CN106716062A (en) * 2016-11-24 2017-05-24 深圳市大疆创新科技有限公司 Flight route planning method of agricultural unmanned aerial vehicle and ground control terminal
CN109655067A (en) * 2016-11-24 2019-04-19 深圳市大疆创新科技有限公司 The flight course planning method and ground control terminal of agriculture unmanned vehicle
WO2018094661A1 (en) * 2016-11-24 2018-05-31 深圳市大疆创新科技有限公司 Flight course planning method for agricultural unmanned aerial vehicle, and ground control end
US10775796B2 (en) 2017-01-10 2020-09-15 Cnh Industrial America Llc Aerial vehicle systems and methods
EP3346347A1 (en) * 2017-01-10 2018-07-11 CNH Industrial Belgium NV Aerial vehicle systems and methods
US11507112B2 (en) 2017-08-15 2022-11-22 Zf Friedrichshafen Ag Control of a transportation vehicle
WO2019034365A1 (en) * 2017-08-15 2019-02-21 Zf Friedrichshafen Ag Control of a transportation vehicle
RU2671138C1 (en) * 2017-11-27 2018-10-29 Юрий Иосифович Полевой Unmanned combat vehicle and remote control system of motion and armament of unmanned combat vehicle
US11162241B2 (en) 2018-03-27 2021-11-02 Deere & Company Controlling mobile machines with a robotic attachment
CN109024722A (en) * 2018-07-16 2018-12-18 长沙南理技术推广服务有限公司 Robot for earthwork excavation and working method thereof
US11066000B2 (en) 2018-08-13 2021-07-20 United Parcel Service Of America, Inc. Systems, methods, and apparatuses for engaging and transporting objects
US20210340734A1 (en) * 2018-10-04 2021-11-04 Volvo Construction Equipment Ab Working machine comprising an illumination system
WO2020069747A1 (en) * 2018-10-04 2020-04-09 Volvo Construction Equipment Ab A working machine comprising an illumination system
US11254336B2 (en) * 2018-12-19 2022-02-22 Nordco Inc. Rail flaw detector
BE1027171B1 (en) * 2019-04-03 2020-11-05 Thyssenkrupp Ind Solutions Ag Method and device for the automated operation of a material extraction system that can be used primarily in open-cast mining
US11560694B2 (en) * 2019-07-17 2023-01-24 Kobelco Construction Machinery Co., Ltd. Work machine and work machine support server
EP3767040A1 (en) * 2019-07-17 2021-01-20 Kobelco Construction Machinery Co., Ltd. Work machine and work machine support server
US20210017739A1 (en) * 2019-07-17 2021-01-21 Kobelco Construction Machinery Co., Ltd. Work machine and work machine support server
US20210018937A1 (en) * 2019-07-17 2021-01-21 Kobelco Construction Machinery Co., Ltd. Work machine and work machine support server
DE102019211042A1 (en) * 2019-07-25 2021-01-28 Hochschule für Technik und Wirtschaft Dresden Method for monitoring the handling of goods with a loading vehicle
US11429106B2 (en) 2019-09-17 2022-08-30 United Parcel Service Of America, Inc. Methods and systems for shifting objects
US11713117B2 (en) 2020-03-31 2023-08-01 Cnh Industrial America Llc System and method for anchoring unmanned aerial vehicles to surfaces
US20220100212A1 (en) * 2020-09-30 2022-03-31 Kobelco Construction Machinery Co., Ltd. Work support apparatus for work machine
EP3978692A1 (en) * 2020-09-30 2022-04-06 Kobelco Construction Machinery Co., Ltd. Work support apparatus for work machine
US11835970B2 (en) * 2020-09-30 2023-12-05 Kobelco Construction Machinery Co., Ltd. Unmanned aerial vehicle with work implement view and overview mode for industrial vehicles

Similar Documents

Publication Publication Date Title
GB2533140A (en) Drone
KR102659077B1 (en) Shovel
US20200340208A1 (en) Shovel and shovel management system
KR102602383B1 (en) Driving support system for construction machinery, and construction machinery
EP3346347A1 (en) Aerial vehicle systems and methods
JP7301875B2 (en) Excavator and excavator controller
JP7216472B2 (en) Working system and control method
WO2019244574A1 (en) Excavator and information processing device
US20100106344A1 (en) Unmanned land vehicle having universal interfaces for attachments and autonomous operation capabilities and method of operation thereof
KR20210106409A (en) Shovel
AU2017318911B2 (en) Image display system of work machine, remote operation system of work machine, work machine, and method for displaying image of work machine
EP3885494B1 (en) Automatic operation work machine
EP2516757B1 (en) System and method for limiting operator control of an implement
JP7342018B2 (en) Excavator
CN110485502A (en) Intelligent travel system for excavator, excavator, and control method
EP3704312B1 (en) Clamp implement for excavator
KR20170136057A (en) Remote control excavator monitoring system and method for monitoring using the system
KR100913690B1 (en) 3 dimension modeling systems and construction method of 3 dimension modeling for remote controlling of a intelligence excavator
JP7449314B2 (en) Excavator and remote operation support apparatus
US11226627B2 (en) System for modifying a spot location
JP4058401B2 (en) Work machine
JP2022154722A (en) Excavator
KR20220030098A (en) Autonomous working excavator and operation method thereof
WO2023053497A1 (en) Trajectory generation system
WO2022210613A1 (en) Shovel and shovel control device

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)