GB2567142A - Delivery system

Delivery system

Info

Publication number
GB2567142A
Authority
GB
United Kingdom
Prior art keywords
delivery
vehicle
drone
indication
destination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1715590.4A
Other versions
GB201715590D0 (en)
Inventor
Gujral Sunil
Matthew Lawrence Fletcher Henry
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cambridge Consultants Ltd
Original Assignee
Cambridge Consultants Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cambridge Consultants Ltd filed Critical Cambridge Consultants Ltd
Priority to GB1715590.4A
Publication of GB201715590D0
Publication of GB2567142A
Status: Withdrawn


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 Determining position
    • G01S 19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S 19/485 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D 1/102 Simultaneous control of position or course in three dimensions specially adapted for aircraft specially adapted for vertical take-off of aircraft
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 Determining position
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0088 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An autonomous delivery vehicle (UAV, drone) 4 comprises a means for guiding the vehicle to an area in the region of the destination, and means 5 for detecting within the region a target indication. The target indication comprises a time-varying electromagnetic radiation pattern (such as a flashing or pulsed optical light signal). The vehicle also comprises a means for identifying a person, and a means for releasing an object towards the person if they are identified as the recipient. The vehicle may also operate for the collection of an object. The time-varying electromagnetic signal may be uniquely associated with the user, location or object. An alignment indication such as a QR code may be employed to further aid directing the vehicle. Another aspect of the invention disclosed is a user device 2 suitable for facilitating delivery or collection of an object, where the device comprises means for receiving an indication that a delivery vehicle is in the vicinity, and means for emitting in response to the indication a time-varying electromagnetic signal associated with the delivery or collection of the object. Also disclosed are methods of operating the delivery vehicle and user device.

Description

Delivery system
The present invention relates to a delivery system for an autonomous vehicle and to related control systems. It relates in particular but not exclusively to a delivery system for an unmanned aerial vehicle such as, for example, a system for parcel delivery to a specific, precisely defined target location (e.g. to a customer’s hand). The invention also has particular although not exclusive relevance to identifying and approaching a target location using machine vision.
Delivery systems for unmanned aerial vehicles (UAVs) (referred to herein as ‘drones’) are being trialled in several countries and they are expected to revolutionise the way in which customers receive purchased goods (e.g. after placing an order via a website or an application). Drones are expected to achieve faster and more predictable delivery times than existing delivery systems, as they do not depend on the road network (and hence they are not affected by traffic jams, roadworks, and/or the like) and they require less manual supervision than, for example, delivery trucks. Drone based delivery is therefore convenient for customers whilst it can also potentially reduce associated shipping/handling costs for retailers (since a fleet of autonomous drones may be operated at a lower cost per parcel than road transport alternatives).
A typical drone based delivery system may employ for example dedicated docking bays / package transfer systems for landing and loading/unloading a drone. In this case, a delivery drone usually takes off from a controlled base station (e.g. a warehouse/depot) with a parcel attached to the drone and lands at another controlled base station to deliver the parcel. However, such a system requires significant ground infrastructure and it also requires recipients to travel to one of the ground stations to pick up the parcel, which is inconvenient.
Other drone based delivery systems employ drones that are not restricted to specific ground stations / infrastructure. Such drones land at a suitable location on the ground (or hover close to the ground) in order to perform a package transfer. However, this approach requires the drone to be equipped with a highly complex sense and avoid system to minimise the risk of collision between the drone and people/infrastructure near the landing locations. Moreover, it is easier to sabotage a drone that lands or hovers close to the ground, and drone rotors may also need safety guarding which may reduce the overall flight efficiency. This method of parcel delivery is likely to only ever be realised between remote locations, and is thus generally unsuitable for mainstream urban deliveries.
In some drone based delivery systems the drone is arranged to drop its payload (package) from a safe altitude and the payload then descends on a parachute or the like. However, this solution represents a more or less uncontrolled and passive delivery with no option to recall the package once delivery has been initiated. If the package is equipped with guidance technology (e.g. electronics and aerodynamic surfaces and/or the like) to facilitate more accurate targeting then the resulting packaging costs may be too high / uneconomical for certain applications. This method is also very unlikely to be suitable in a busy urban environment.
Yet another drone based delivery system employs drones equipped with a tether (controlled by e.g. an associated winch mechanism). In this case, once it is at a safe distance above its destination, the drone allows the package to descend on the tether. However, if no tether/package stabilisation is used, wind can cause the package to drift away from the target location, especially if the tether is long. The delivery is still not secure with the possibility that the delivery can be intercepted by someone other than the user for whom it was intended (especially because of the relatively large distance between the drone and the delivery location).
It can be seen therefore that drone based delivery systems are still in a relatively early phase of development and there are several areas where improvements may be needed. For example, it is undesirable for safety reasons, and very challenging for technical reasons (environment, sense, and avoid) to bring a large delivery drone down to ground level in order to complete a delivery.
Even when using a tether (or a chute) for lowering a package from a drone hovering at a safe altitude (e.g. 20-30 meters), the package may still have an ill-defined position due to the effect of the wind, even when the drone’s position is stabilised (e.g. using an optical system and/or gyroscopes). This may lead to damage to the package and/or the drone, and to reduced delivery accuracy.
Moreover, it may be important/necessary to identify the recipient in order to avoid delivery to the wrong or an unauthorised person. However, such identification may not always be possible without landing or at least lowering the drone to a lower and potentially unsafe level. There are also complexities with quickly and robustly authenticating with radio methods such as Bluetooth and Wi-Fi. In addition, the loud noise of the drone may also inconvenience people around the delivery area.
Accordingly, the present invention seeks to provide systems, devices and methods for at least partially addressing one or more of these issues.
In one aspect, the invention provides a vehicle for delivering an object to a recipient (e.g. a person) or collecting an object from a provider at a destination, the vehicle comprising: means for guiding the vehicle autonomously to an area comprising the destination; means for detecting, within the area, a target indication at the destination, wherein the means for guiding is configured to guide the vehicle autonomously into closer proximity with the destination, based on the target indication when detected; means for identifying a recipient or provider (e.g. person); and means for releasing the object towards the recipient or collecting the object from the provider when the means for identifying successfully identifies the recipient as the recipient to whom the object is to be delivered or identifies the provider as the provider from whom the object is to be collected; wherein the means for detecting is configured to detect a target indication comprising a predetermined time-varying electromagnetic radiation pattern, associated with the recipient or provider, emitted from a device.
The predetermined time-varying electromagnetic radiation pattern may comprise a time-varying light pattern (e.g. a predetermined light sequence).
The means for detecting may comprise an imaging sensor for obtaining image data and a processor for processing the obtained image data to detect the time-varying electromagnetic radiation pattern. Alternatively, or additionally, the means for detecting may be configured to obtain data from a remote imaging sensor (e.g. an imaging sensor of a device at the destination) and the means for guiding may be configured to use the image data obtained from the remote imaging sensor to guide the vehicle for closer alignment of a delivery or collection with the device at the destination.
The predetermined time-varying electromagnetic radiation pattern may be uniquely associated with at least one of: the destination; a user (e.g. a recipient / a provider); the device from which the predetermined time-varying electromagnetic radiation pattern is emitted; an object to be delivered / collected.
The means for releasing the object towards the recipient or collecting the object from the provider may be configured to release a carrier, for holding or receiving the object, towards the destination (e.g. a winch and/or the like). In this case, the carrier may be provided with an associated alignment indication, the means for detecting may be configured to detect the alignment indication, and the means for guiding may be configured to autonomously guide the vehicle to align the carrier with the device at the destination using the detected alignment indication and the detected target indication. The alignment indication may comprise at least one of a printed pattern (e.g. a QR code) and a further predetermined time-varying light pattern (e.g. a different light sequence).
The means for guiding may comprise a receiver for receiving positioning signals based on which a geographical location of the vehicle can be determined (e.g. GPS location / network triangulation signals), and may be configured to guide the vehicle to the area comprising the destination at least partly based on the positioning signals received by the receiver.
The vehicle may further comprise means for transmitting a signal indicating that the vehicle has arrived in the area comprising the destination.
The vehicle may comprise means to abort delivery when at least one of the following conditions is met: a timeout (e.g. a predetermined time has passed after initiating delivery or collection at the destination); the predetermined time-varying electromagnetic radiation pattern no longer being detected; an obstruction to the delivery or collection being detected (based on e.g. obtained image data and/or data provided by a sensor coupled to the vehicle); a battery charge falling below a predetermined level; a change in weather conditions being detected (e.g. wind speed being above a set limit); and a user-initiated abort signal being received.
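By way of illustration only (not claim language), the abort conditions listed above might be combined into a single check as in the following sketch; the field names and thresholds are illustrative assumptions, not part of the disclosure:

    from dataclasses import dataclass

    @dataclass
    class DeliveryState:
        elapsed_s: float               # time since delivery/collection was initiated
        timeout_s: float
        pattern_detected: bool         # time-varying pattern still visible?
        obstruction_detected: bool
        battery_fraction: float
        min_battery_fraction: float
        wind_speed_mps: float
        max_wind_mps: float
        user_abort_requested: bool

    def should_abort(s: DeliveryState) -> bool:
        """True when any of the abort conditions listed above is met."""
        return (s.elapsed_s > s.timeout_s
                or not s.pattern_detected
                or s.obstruction_detected
                or s.battery_fraction < s.min_battery_fraction
                or s.wind_speed_mps > s.max_wind_mps
                or s.user_abort_requested)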
In another aspect, the invention provides a user device for facilitating delivery or collection of an object by a delivery vehicle, the user device comprising: means for receiving an indication that the delivery vehicle is in the vicinity; and means for emitting, responsive to receiving the indication, a predetermined time-varying electromagnetic radiation pattern associated with said delivery or collection for detection by the delivery vehicle.
The means for receiving an indication may be configured to receive an indication comprising a user input indicating that the delivery vehicle is in the vicinity. The means for receiving an indication may be configured to receive the indication directly or indirectly from the delivery vehicle.
The predetermined time-varying electromagnetic radiation pattern may be uniquely associated with at least one of: the user device; a user; a location (e.g. a geographical or relative location of the user device); a shipment (e.g. an order number); the object to be delivered or collected; and the delivery device.
The user device may comprise means for obtaining information associated with the delivery or collection and/or a person to be identified, and the means for emitting may be configured to emit a predetermined time-varying electromagnetic radiation pattern associated with said delivery or collection based on said obtained information. In this case, the means for obtaining information may be configured to obtain information associated with said delivery or collection and/or a person to be identified from a remote server.
The means for emitting may comprise a light source (e.g. at least one of: a LED flash, an infrared light source, and a display), and the predetermined time-varying electromagnetic radiation pattern may comprise a time-varying light pattern.
The user device may comprise a cellular telephone or a portable computer device. The vehicle may comprise an unmanned and/or autonomous vehicle (for example an autonomous aerial vehicle (drone) and/or the like).
In another aspect, the invention provides a vehicle for delivering an object to or collecting an object from a destination, the vehicle comprising: means for guiding the vehicle autonomously to an area comprising the destination; means for detecting, within the area, a target indication at the destination, wherein the means for guiding is configured to guide the vehicle autonomously into closer proximity with the destination, based on the target indication when detected; and means for delivering or collecting the object at the destination; wherein the means for detecting is configured to detect a target indication comprising a predetermined time-varying electromagnetic radiation pattern emitted from a device at the destination. The vehicle may comprise means for authenticating a user (e.g. based on the predetermined time-varying electromagnetic radiation pattern emitted from the device at the destination) and the means for delivering or collecting the object may be configured to perform a delivery or collection subject to successful authentication of the user as a valid target of the delivery or collection.
In yet another aspect, the invention provides a vehicle for delivering an object to or collecting an object from a destination, the vehicle comprising: means for guiding the vehicle autonomously to an area comprising the destination; means for emitting a predetermined time-varying electromagnetic radiation pattern; means for receiving information, from a device at the destination, indicating that the device has detected the predetermined time-varying electromagnetic radiation pattern, wherein the means for guiding is configured to guide the vehicle autonomously into closer proximity with the destination, based on the received information; and means for delivering or collecting the object at the destination.
The invention also provides a user device for facilitating delivery or collection of an object by a delivery vehicle, the user device comprising: means for receiving an indication that the delivery vehicle is in the vicinity; means for detecting a predetermined time-varying electromagnetic radiation pattern emitted from the delivery vehicle; and means for providing information, to the delivery vehicle, indicating that the user device has detected the predetermined time-varying electromagnetic radiation pattern, wherein the provided information is configured to facilitate guidance of the delivery vehicle into closer proximity with the user device.
Aspects of the invention extend to corresponding systems, methods, and computer program products such as computer readable storage media having instructions stored thereon which are operable to program a programmable processor to carry out a method as described in the aspects and possibilities set out above or recited in the claims and/or to program a suitably adapted computer to provide the apparatus recited in any of the claims.
Each feature disclosed in this specification (which term includes the claims) and/or shown in the drawings may be incorporated in the invention independently of (or in combination with) any other disclosed and/or illustrated features. In particular, but without limitation, the features of any of the claims dependent from a particular independent claim may be introduced into that independent claim in any combination or individually.
Embodiments of the invention will now be described by way of example only with reference to the attached figures in which:
Figure 1 is a simplified schematic illustrating a drone based delivery system to which embodiments of the invention may be applied;
Figure 2 is a simplified block diagram illustrating the main components of the customer device (e.g. smartphone) shown in Figure 1;
Figure 3 is a simplified block diagram illustrating the main components of the drone shown in Figure 1;
Figure 4 is a simplified schematic illustrating an exemplary way in which the various subsystems of the drone interact whilst lowering a payload towards a recipient in the system shown in Figure 1;
Figure 5 is an exemplary image showing data of the sort that may be analysed in the system shown in Figure 1 for determination of the location of a recipient’s hand relative to the position of the drone;
Figure 6 is a simplified schematic illustrating exemplary parameters and calculations that may be used by a machine vision assisted drone system whilst lowering a payload towards a recipient in the system shown in Figure 1; and
Figures 7 and 8 are simplified schematic illustrations of some details of an exemplary winch mechanism that may be used by the drone shown in Figure 1.
Overview
Figure 1 schematically illustrates an unmanned aerial vehicle (UAV) (‘drone’) based delivery system 1 for delivering goods to a customer (represented by a smartphone 2) from a vendor or a warehouse 3 using a drone 4 (and/or any other suitable flight platform).
The flight platform comprises four main elements: the drone 4 itself, a companion computer (and/or a suitable controller unit), a winch system (not shown in Figure 1), and a vision system (camera 5 and/or the like).
The drone platform may be a combined fixed wing, low noise platform capable of vertical take-off, landing and sustained, efficient hover. Whilst any suitable drone and associated controller and control software may be used, in one example, the drone 4 may be an ‘off the shelf’ DJI S1000 octocopter, which is controlled using an off the shelf Pixhawk 2 flight controller running appropriate flight software (e.g. Ardupilot and/or the like). The combination of this exemplary drone and flight controller gives particularly good lift capability and the ability to tune the flight characteristics of the assembled system, as well as to accept movement input requests from, and provide feedback to, the companion computer. However, it will be appreciated that in other systems different drones, different flight controllers, and/or flight software may also be used. The companion computer in this example is a Raspberry Pi device although any other suitable controller unit may be used.
The subsystems are mounted on the drone 4. The winch (and/or any other suitable mechanism for holding/lowering/releasing a payload) is mounted on the main body (for example, under the main body on one end of two equal pultruded carbon fibre box sections). In this example, the vision system is mounted on the other end of these box sections. The companion computer, winch controller, flight computer, and power distribution boards are mounted at a suitable part of the drone 4 (e.g. above the main body of the drone 4).
In this system, a customer can place an order for an item via a website or an application on the smartphone 2. Once the order is placed, the item is packaged and loaded onto the drone 4 at the warehouse 3 (store/depot/etc.). The customer may be presented with a real-time order status via the display of the smartphone 2 and real-time progress of the delivery as the drone flies to the customer’s location, i.e. the current location of the smartphone 2 (which may be determined by, for example, GPS and/or other suitable location determination means).
The initial flight path from the warehouse 3 to the approximate location of the smartphone 2 is preferably determined based on the respective geographical locations of the start and end points, i.e. based on the known geographical location of the warehouse and the known (or approximated) geographical location of the customer/recipient (which may be determined and provided to the delivery system by the smartphone 2). It will be appreciated that the geographical location of the recipient may be determined using a GPS and/or network triangulation based method. Information identifying the location may be provided to the delivery system by the smartphone 2 (e.g. when placing the order and/or periodically and/or whenever the location changes). If GPS and/or network triangulation is not available, the postal address of the customer may be used for determining an approximate delivery location (e.g. at least for the determination of the initial flight path). It will be appreciated that the determination of the initial flight path may also take into account any obstacles and/or areas along the route that may be unsuitable or undesirable for drones, in order to avoid such obstacles and/or areas.
In this example, the drone’s flight towards the location of the customer is performed substantially autonomously based on the received (and/or continuously updated) flight parameters. Preferably, once the drone takes off (with the package), it guides itself based on GPS (or other geographical) data towards the location of the recipient of the package. For example, the recipient’s live GPS location may be uploaded to a secure server (e.g. from the time of initiating the delivery), and the drone 4 may be configured to fly towards this location and adjust its flight path as necessary. During its flight, the drone 4 may also be configured to use an appropriate machine vision / sensor system in order to avoid any obstacles.
Beneficially, when the drone 4 determines that it is in the vicinity of its target area, i.e. the current location of the smartphone 2, the recipient is notified via the smartphone 2 in order to prepare for receiving the parcel (e.g. following an appropriate confirmation via the smartphone 2). It will be appreciated that such a vicinity may be defined as a given distance/radius from the customer’s current (or last known) location or as a percentage (e.g. 80%, 90%, 95%, 98%, or 99%) of the total distance (or remaining distance from a previous update) between the warehouse 3 and the customer (smartphone 2).
When the customer is ready to receive the parcel, they are instructed to make the smartphone 2 visible from above. Specifically, the customer is requested to hold the smartphone 2 in such a manner that a light source 6 (e.g. LED flash, smartphone display, or other light source that the smartphone may be fitted with) is facing towards the drone 4 (which is hovering at a safe altitude, e.g. 30 meters above ground). Beneficially, the smartphone 2 is then configured to control the light source 6 to emit a unique light pattern or code which identifies the order and/or the customer to the drone 4 (e.g. by turning the light source 6 on/off according to a specific pattern associated uniquely (globally or locally) with the recipient user and/or order).
The drone’s 4 image processing system is configured to locate this light pattern (based on image data from the camera 5) and confirm the identity of the recipient by analysing the flashing light sequence associated with this delivery. Moreover, the image processing system is also configured to determine the precise location of the drone 4 relative to the recipient based on the detected light source 6 (e.g. by determining the location of the light pattern relative to a point (e.g. centre) of the image recorded by the camera 5 representing the location of the drone 4). The location of the light source 6 is then used to modify the drone’s 4 position, repositioning it to a location directly above the recipient’s hand using an appropriate control mechanism (e.g. a control loop, which may involve proportional/integral/derivative elements, or any combination thereof).
The drone 4 is also configured (using appropriate hardware and/or software) to determine the height difference between the drone 4 and the recipient’s hand (smartphone 2) in order to maintain a safe altitude and for facilitating delivery of the package to the recipient (to a set height or user-defined height above the ground). For example, the height difference may be determined by obtaining the drone’s 4 altitude using e.g. a barometer or laser range finder / light detection and ranging (LIDAR) approach, and/or similar.
Once the recipient has been located based on the light pattern from the smartphone 2 and the height difference has been determined, the drone 4 is configured to lower its payload 7 (not shown in Figure 1) using the associated winch / tether system. Specifically, the winch system is configured to lower the tether by the exact distance (height difference) that has been determined by the drone 4. In this example, this is achieved by a bespoke constant diameter, level-wind winch drum, controlled by a motor and encoder system that knows the exact number of rotations of the winch drum required to lower the parcel a given distance. Beneficially, this system may interact with the mechanism for determining the height difference between the recipient’s hand and the drone 4, thereby providing a closed loop control of the current tether length and target tether length, also taking into account any altitude changes of the drone 4 from the end position of the payload.
Beneficially, the camera data may also be used by the vision system to assist the drone 4 / winch system in keeping the payload directly above the recipient’s hand (and/or the light source 6) during lowering the payload. This can be achieved by using the vision system to monitor the position of the payload (for example by tracking a printed graphic on the package, an additional flashing LED on the release mechanism, etc.) relative to the recipient’s hand / smartphone 2, and then using an appropriate stabilising method (e.g. by re-positioning the drone and/or using a lightweight horizontal thrust unit above the payload) to control the location of the payload and to lower it in a substantially straight line towards the recipient.
The actively stabilised parcel (in three dimensions) offers all the benefits of a winched delivery (drone at safer altitude, no flight surfaces needed on package, no infrastructure required at delivery destination) while also solving one of the key downsides of a package suspended on a long tether (no position control of package). As it can also deliver the parcel directly to the recipient anywhere, it is significantly more desirable from a user perspective, whilst ensuring parcel security and traceability using the recipient ID system. Additionally, it is possible to verify the recipient’s identity using a unique light sequence from their smartphone.
In summary, the unique light pattern serves as a beacon signal (and may also serve as a unique identification code) for assisting the drone in approaching the intended recipient of a package. The drone is kept well away from the recipient and any ground level obstacles since the package is lowered on an approximately 20-30 metre tether, connected to the drone via a bespoke, level-wind winch system. The vision system tracks the parcel position relative to the LED and continuously manoeuvres the drone, allowing placement of the parcel safely, securely, and conveniently into the recipient’s hands.
Customer device (smartphone)
Figure 2 is a block diagram illustrating the main components of the smartphone 2 shown in Figure 1. The smartphone 2 may be a mobile (or ‘cellular’) telephone, a tablet/laptop computer, or any other type of suitable user equipment (UE). The smartphone 2 comprises a transceiver circuit 11 which is operable to transmit signals to, and to receive signals from, remote servers (e.g. using the Internet) via at least one antenna 13. Typically, the smartphone 2 also includes a user interface 15 (such as a touchscreen display, loudspeaker, microphone, etc.) which allows a user to interact with the smartphone 2. As can be seen, in this example the smartphone 2 also includes an appropriate light source 6 (for example a flash / LED and/or a display).
The operation of the light source 6 and the transceiver circuit 11 is controlled by a controller 17 in accordance with software stored in memory 19. The software includes, among other things, an operating system 21, a communication control module 22, an application module 23, a positioning module 24 (optional), and a camera module 26 (optional).
The communication control module 22 controls communications between the smartphone 2 and other communication nodes (e.g. via a base station of a telecommunication network). Such communications may comprise communications (data transmissions) relating to ordering and delivering an item from an online store.
The application module 23 is responsible for storing and operating applications (including communicating associated data between the smartphone 2 and other nodes). The applications may include for example an application for receiving goods via a drone 4. It will be appreciated that the application may be in the form of a (web) browser in which case the associated data may be stored on a remote server (website).
The positioning module 24 (if present) is responsible for obtaining information relating to the location of the smartphone 2 (such as GPS position data) and providing such location information to the other modules. In some examples, the positioning module 24 also stores and provides location data relating to the customer, such as an associated home and/or (preferred) delivery address.
The camera module 26 (if present) is operable to record and store image data and to provide such image data to other modules (and other nodes via the transceiver circuit 11). The light source 6 (e.g. a flash unit / LED) may be coupled to and controlled by the camera module 26.
Drone
Figure 3 is a block diagram illustrating the main components of the drone 4 shown in Figure 1. Whilst the components are illustrated as forming part of a single unit, it will be appreciated that the drone, the companion computer, the winch system, and the vision system may be provided as separate, independently operating units, with appropriate communication interfaces between the respective units. In this case one or more of the components and/or modules shown in Figure 3 may be present in more than one of the units, and some of the components and/or modules may be omitted from one or more units.
The drone 4 comprises a transceiver circuit 31 which is operable to transmit signals to, and to receive signals from, other nodes via one or more antennas 33. The drone 4 also comprises a winch system 35 and a camera unit 5.
The operation of the transceiver circuit 31, the winch 35, and the camera 5 is controlled by a controller 37 in accordance with software stored in memory 39. The software includes, among other things, an operating system 41, a communication control module 42, a positioning module 43, an image processing module 44, and a payload handling module 45.
The communication control module 42 controls communications with other devices (e.g. the smartphone 2 and/or remote servers). Such communications may take place either directly or via an appropriate communication network.
The positioning module 43 is responsible for obtaining information regarding the current position of the drone 4 (e.g. by GPS and/or sensors) and information representing a target position of the drone 4 (e.g. location of the recipient of a parcel), and for controlling movement (flight) of the drone 4 from the current position to the target position. The positioning module 43 is also responsible for maintaining a position (e.g. hovering) and for landing and take-off. The positioning module 43 may be coupled to appropriate sensors such as gyroscopes (not shown) and receiver/transceiver circuitry (e.g. GPS).
The image processing module 44 is responsible for obtaining image data from the camera unit 5, and for processing the image data. Specifically, the image processing module 44 is operable to determine the location of the recipient of a parcel (relative to the drone 4) based on the image data and (optionally) for identifying the recipient based on the image data (by detecting and verifying a specific light sequence in the images captured by the camera 5).
The payload handling module 45 is coupled to the winch system 35 (and/or any other suitable payload manipulating mechanism) and it is responsible for holding (e.g. during flight) and releasing a payload (e.g. a parcel) in a controlled manner (e.g. when delivering a parcel to a recipient). The payload handling module 45 is configured to interwork with other modules (such as the positioning module 43 and the image processing module 44) in order to ensure that the payload is stabilised and released at the desired location (and to the correct recipient).
In the above description, the smartphone 2 and the drone 4 are described for ease of understanding as having a number of discrete modules. Whilst these modules may be provided in this way for certain applications, for example where an existing system has been modified to implement the invention, in other applications, for example in systems designed with the inventive features in mind from the outset, these modules may be built into the overall operating system or code and so these modules may not be discernible as discrete entities.
Operation
The following is a detailed description of some of the exemplary ways in which the drone based delivery system shown in Figure 1 may be used for delivering a parcel to a customer.
Navigation to user’s vicinity
In order to instigate a delivery, the prospective recipient (customer) presses an appropriate button (e.g. a ‘Place Order’ button and/or the like) on the smartphone 2 (running an appropriate application in its application module 23). In response to this, a regular update of the smartphone’s GPS location is provided to a cloud server, and an associated security / identification light sequence is synchronised between the companion computer of the drone 4 and the smartphone 2. For example, the light sequence may be provided from the server/vendor to the smartphone 2 (as part of the order confirmation or separately). Alternatively, the light sequence may be provided at a later phase, or the light sequence may be derived (both by the drone 4 and the smartphone 2) using an appropriate algorithm and based on information associated with the order and/or the customer.
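The patent does not prescribe how such a shared sequence might be derived; one plausible sketch (Python), in which both the companion computer and the smartphone hash order details they already share, is given below. The secret, order identifier, and code length used here are illustrative assumptions:

    import hashlib

    def flash_sequence(order_id: str, shared_secret: str, bits: int = 16) -> str:
        """Derive a deterministic on/off flash code from order details.
        Both ends can compute this independently, so the sequence itself
        never needs to be transmitted over the air."""
        digest = hashlib.sha256((shared_secret + order_id).encode()).digest()
        code = "".join(format(byte, "08b") for byte in digest)
        return code[:bits]

    # e.g. flash_sequence("ORDER-12345", "k3y") -> a 16-bit on/off pattern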
In this example, the drone 4 is then ‘armed’ and switched to ‘guided mode’, in order to ensure that there is a human in the loop at least until package loading is complete and the drone 4 ascends to a predetermined altitude (i.e. to prevent the drone from unexpectedly taking off). However, it will be appreciated that in commercial operation this sequencing (loading and taking off) may be handled autonomously.
The drone 4 then takes off to navigation altitude (e.g. 20m or more). When the drone 4 reaches this altitude, it commences flight to e.g. approximately 80% of the distance to the recipient’s current location (during which its altitude may also change as appropriate). This waypoint is calculated by the companion computer (positioning module 43), which communicates with the cloud server (e.g. via a mobile communication network and/or the like). When the drone 4 reaches this waypoint, it checks for the availability of an updated location of the smartphone 2, and if necessary it redirects to travel to approximately 80% of the distance to the updated location. This process continues until the drone 4 determines that the smartphone 2 is located within approximately 10m of the drone 4, at which point the drone 4 heads to 100% of the way to the recipient’s GPS position. It will be appreciated that in other systems a different mechanism (or values other than 80% / 100%) may be used.
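The repeated 80% waypoint update described above might be implemented as in the following sketch (Python, using an equirectangular distance approximation that is adequate over delivery-scale distances; the 80% fraction and 10 m switch-over radius follow the example above, while the helper names are illustrative):

    import math

    EARTH_RADIUS_M = 6_371_000.0

    def ground_distance_m(lat1, lon1, lat2, lon2):
        """Approximate ground distance between two fixes (equirectangular)."""
        x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        y = math.radians(lat2 - lat1)
        return EARTH_RADIUS_M * math.hypot(x, y)

    def next_waypoint(drone_fix, phone_fix, fraction=0.8, final_radius_m=10.0):
        """Head ~80% of the way to the phone's latest (lat, lon) fix; once
        the phone is within ~10 m, head 100% of the way to its position."""
        if ground_distance_m(*drone_fix, *phone_fix) <= final_radius_m:
            return phone_fix
        lat = drone_fix[0] + fraction * (phone_fix[0] - drone_fix[0])
        lon = drone_fix[1] + fraction * (phone_fix[1] - drone_fix[1])
        return (lat, lon)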
It will be appreciated that the drone 4 may be configured to replot its flight course continuously, obeying low level drone air traffic navigation lanes, while taking the shortest possible route to the recipient’s current position. The recipient may beneficially be permitted a maximum distance of relocation between ordering and receiving, which may depend on the drone’s 4 flight capacity (amongst other factors). If this relocation distance has been exceeded, then the drone 4 may abort delivery (in which case the drone 4 may be configured to return to the warehouse 3) or postpone delivery (in which case the drone 4 may be configured to await a further update of the recipient’s location before attempting delivery or returning to the warehouse 3).
When the drone 4 (still at flight altitude) has reached the location that corresponds to the customer’s current location, the drone 4 is configured to reduce its altitude to a predetermined delivery height (e.g. 20-30m).
Handover to machine vision system
Once the drone 4 has arrived at a location that corresponds to the GPS coordinates of the recipient, the machine vision system (which may comprise a camera 5 and a LIDAR unit and/or the like) is activated. Preferably, the vision system is mounted on a lightweight gimbal in order to ensure that it is continuously pointing vertically downwards. At this time, the recipient’s smartphone 2 (using its application module 23) alerts the user to the parcel’s arrival and activates the light source 6 (e.g. following a press of a button) to emit a flashing security / identification light sequence on either the smartphone screen or the torch LED. The user is then instructed to make the flashing light visible to the drone 4 (in the case of the LED, they are instructed to turn their phone over).
The processing of the feed from the camera 5 is done by the companion computer (using the image processing module 44). Specifically, the image processing module 44 is configured to look for the unique flashing light sequence that is associated with this delivery.
An exemplary image is shown in Figure 5. In this example, the image processing module 44 is configured to perform the following actions on the image captured by the camera 5:
- apply a black/white threshold so that only the brightest pixels are retained;
- filter out large areas of light (e.g. larger than a predetermined threshold area), leaving only relatively small, bright objects;
- analyse the on/off state of each of these small, bright objects, and record the time history;
- if it finds that one of these small, bright objects turns on and off according to the predefined sequence (i.e. the light pattern corresponding to the recipient’s programmed unique flash sequence) then it relocates the drone 4 directly above the flash and begins the delivery of the package (as described in the next section).
It will be appreciated that any Gaussian blurring or similar processing step (typically employed during machine vision tasks) is omitted in this operation so that intensity peaks of the processed image data are not reduced.
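The detection steps above might be sketched as follows (Python with OpenCV; the brightness threshold, blob-size limit, bit period, and expected code are illustrative assumptions, and note that, as stated above, no Gaussian blur is applied):

    import time
    import cv2
    import numpy as np

    BIT_PERIOD_S = 0.1           # assumed duration of one flash bit
    MAX_BLOB_AREA = 400          # filter out large lit areas (pixels)
    EXPECTED_CODE = "10110011"   # hypothetical per-delivery flash sequence

    def bright_blobs(frame_gray):
        """Black/white threshold keeping only the brightest pixels, then
        return centroids of blobs small enough to be a phone LED/screen."""
        _, mask = cv2.threshold(frame_gray, 250, 255, cv2.THRESH_BINARY)
        n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
        return [tuple(centroids[i]) for i in range(1, n)   # label 0 = background
                if stats[i, cv2.CC_STAT_AREA] <= MAX_BLOB_AREA]

    def lit(blobs, point, radius_px=15.0):
        """True if a bright blob is currently on near the tracked point."""
        return any(np.hypot(bx - point[0], by - point[1]) <= radius_px
                   for bx, by in blobs)

    def matches_code(camera, candidate):
        """Record the candidate's on/off time history, one sample per bit
        period, and compare it with the expected delivery sequence."""
        history = ""
        for _ in EXPECTED_CODE:
            ok, frame = camera.read()            # e.g. cv2.VideoCapture
            if not ok:
                return False
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            history += "1" if lit(bright_blobs(gray), candidate) else "0"
            time.sleep(BIT_PERIOD_S)             # crude bit-rate pacing
        return history == EXPECTED_CODE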
As shown in Figure 5, the machine vision system (image processing module 44) determines the location of any bright areas that it considers as the potential location of the smartphone 2 (denoted ‘candidate target location’ in Figure 5). It will be appreciated that a bright area may be selected as a candidate target location when it has an appropriate size and/or brightness (e.g. a size and/or brightness falling within an associated size/brightness/colour range). When a number of candidate target locations (e.g. at least one) have been identified by the machine vision system in the image data, it looks for the presence of the specific light sequence associated with this delivery / recipient, at the identified location(s). In the specific example shown in Figure 5, out of three candidate target locations, the expected light sequence can be found only at one location (denoted ‘target location’).
Once the vision system (image processing module 44) has found the recipient’s flashing sequence (i.e. the target location / location of the smartphone 2), the companion computer calculates the distance that the drone 4 needs to move to be positioned directly over that location (at the desired height, e.g. 20m). This is done using the known (fixed) field of view and resolution of the camera 5 with the altitude determined by the LIDAR and the number of pixels between the centre of the frame and the flashing LED. The calculation is illustrated in Figure 6.
In this example, the vision system uses one or more of the following parameters in determining the location of the drone 4 within the frame of reference of the imaging system, e.g. relative to the smartphone 2 (target location):
Θ: camera field of view;
a: altitude (given by e.g. LIDAR, barometer, GPS, etc.);
R: resolution (number of pixels) of the camera 5;
r: number of pixels from the centre of the image to the light source 6; and
d: horizontal distance from the point directly below the drone 4 (represented by the centre of the image) to the smartphone 2 (the target location).
Based on these parameters, the vision system (image processing module 44) may perform for example the following calculations:
b = a tan Θ
d / b = 2r / R ⟹ b = dR / 2r
⟹ dR / 2r = a tan Θ
⟹ d = (2ra / R) tan Θ
Accordingly, in this example, the horizontal distance from the drone 4 (represented by the centre of the image) to the smartphone 2 (the target location) is given by twice the number of pixels from the centre of the image to the light source 6, multiplied by the drone’s current altitude (relative to the ground), divided by the resolution (number of pixels) of the camera 5, and multiplied by the tangent of the camera field of view.
The companion computer (controller 37) then calculates the necessary velocity request to send to the flight computer (positioning module 43) by using the history of the above described distance and a control loop. Once complete, the drone 4 is located directly above the user’s smartphone 2.
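As a worked illustration of this calculation and the subsequent velocity request (Python; the camera parameters and the proportional gain are assumed example values, not figures from the patent):

    import math

    def ground_offset_m(r_px, altitude_m, resolution_px, fov_rad):
        """Horizontal distance from the point directly below the drone to
        the detected light source: d = (2 r a / R) tan(theta)."""
        return (2.0 * r_px * altitude_m / resolution_px) * math.tan(fov_rad)

    # Example: 1920 px wide image, 30 degree camera field of view, drone at
    # 20 m altitude, flash detected 240 px from the image centre:
    d = ground_offset_m(240, 20.0, 1920, math.radians(30))   # ~2.9 m

    # A simple proportional term of the control loop turning the offset
    # into a velocity request for the flight computer (K_P is assumed):
    K_P = 0.4                          # (m/s) of request per metre of offset
    velocity_request_mps = K_P * d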
Winch operation
Figure 4 schematically illustrates how the payload 7 of the drone 4 is lowered towards a recipient. Specifically, once the drone 4 (and thus the parcel) is located directly above the user’s smartphone 2, the companion computer instructs the winch controller (payload handling module 45) to commence delivery of the payload from the drone 4 to the recipient.
An exemplary winch module 35 (shown in more detail in Figure 7 and Figure 8) comprises a drum 46 around which a tether 47 can be wrapped and a means of rotating this drum in a controlled fashion (e.g. a motor). For the purpose described herein, it is advantageous for this drum 46 to facilitate a level wind of the tether in order to allow the position of the motor to be directly associated with the length of the tether and thus the distance from the drone to the parcel.
This level-wind is achieved by machining a screw thread into the drum 46 into which the tether 47 sits during spooling / unspooling. This forces the tether 47 to wind evenly across the surface of the drum 46.
In this case, a nut 48 is used to assist this process. This nut 48 is partially threaded and prevented from rotating by an external casing 49, and so moves linearly at the same rate as the tether is intended to move. The tether 47 passes through a hole in this nut 48, which thus assists its level wind.
In the present example, delivery of the payload from the drone 4 to the recipient involves the following steps:
1) Homing the parcel release mechanism into the body of the drone 4 (finding its appropriate ‘zero’ position).
2) Rotating the winch motor a number of turns at a predefined speed. The number of turns is calculated from the diameter of the winch drum (a constant) and the altitude of the drone 4, as indicated by the LIDAR, and the desired delivery altitude of the parcel. For example, a suitable delivery speed may be calculated to ensure a full delivery can be made in 15 seconds (from 30m).
3) Once the motor has completed the required number of turns, it waits in this position for a predetermined duration (e.g. 10 seconds).
4) Once the predetermined time (10 seconds) has elapsed or when the drone 4 determines (e.g. using the camera 5) that the recipient has taken the parcel, the winch motor controller instructs the winch motor to return to the home position.
It will be appreciated that the above mentioned ‘required number of turns’ is not necessarily fixed: as the drone’s altitude varies over time, the LIDAR-read altitude will also change, thus updating the end position of the winch motor. This update ensures that regardless of any movement of the drone 4, the parcel is always delivered to the instructed delivery altitude (e.g. 1.1 m from the ground, though the height may be user-defined if desired).
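A sketch of the turns calculation in step 2 and the associated delivery speed (Python; the drum diameter and delivery parameters are illustrative assumptions rather than values from the patent):

    import math

    DRUM_DIAMETER_M = 0.06        # assumed constant-diameter level-wind drum

    def winch_turns(lidar_altitude_m, delivery_altitude_m=1.1):
        """Drum rotations needed to lower the parcel from the drone's
        LIDAR-read altitude to the instructed delivery altitude; recomputed
        as the altitude reading changes, per the note above."""
        drop_m = lidar_altitude_m - delivery_altitude_m
        return drop_m / (math.pi * DRUM_DIAMETER_M)

    def winch_speed_rps(lidar_altitude_m, delivery_time_s=15.0):
        """Rotation rate such that a full delivery completes in ~15 s
        (e.g. from 30 m, as in the example above)."""
        return winch_turns(lidar_altitude_m) / delivery_time_s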
Parcel release
The parcel is released manually by the recipient as the parcel is intended to be taken from mid-air. The user places one hand under the parcel, and uses the other hand to press a release button and/or the like. It will be appreciated that the drone 4 may be configured to determine whether or not the parcel has been released (e.g. due to a change in the weight of the payload and/or by using its image processing module).
In some cases, the parcel may also be released using a one handed mechanism allowing the smartphone to be continuously tracked.
Drone return
Once the parcel release mechanism returns to its home position, the drone 4 is instructed to return to the launch location (in this example, the warehouse 3), where it lands and powers down.
Beneficially, by using a long tether to lower a parcel from the drone towards a recipient (e.g. following an appropriate identification), it is possible to keep the drone at a safe altitude whilst maintaining tight control of the position of the parcel. The parcel can thus be delivered in a controlled fashion, accurately to the recipient’s hand. The recipient may also be verified by the drone (e.g. by a suitable light pattern and/or the like) prior to operating the winch system, thus increasing delivery security.
Beneficially, the above system is able to actively control the precise position, in three dimensions, of the tethered underslung load relative to a recipient / target. It is also possible to verify the identity of the recipient (and/or an associated order number) using a unique light sequence emitted by the smartphone (e.g. from which the order was placed).
It will be appreciated that the above described delivery system may be used in a number of applications, including one or more of the following (amongst others):
- delivery of emergency medical supplies (such as EpiPens, defibrillators, and/or blood);
- delivery of goods from warehouses to customers (business or private);
- delivery of business supplies; and
- delivery of food and/or other goods/items, including rapid peer-to-peer delivery of food/items (e.g. a parcel can be loaded by one person and received/unloaded by another).
Modifications and alternatives
Detailed embodiments have been described above. As those skilled in the art will appreciate, a number of modifications and alternatives can be made to the above embodiments whilst still benefiting from the inventions embodied therein.
The above description refers to delivery of a parcel / package to a specific recipient (e.g. a customer). However, it will be appreciated that the invention is also applicable to collection of an item (parcel / package) from a sender. In this case, the drone may be configured to fly to the sender without a payload and collect the package from the sender using the winch mechanism as described above, for delivery to a different location (e.g. the warehouse or to another recipient).
It will be appreciated that if the drone’s image processing module either cannot find the correct light source (smartphone) or cannot determine the correct light sequence (within a predetermined time window, e.g. 10 seconds), then the drone may be configured to obtain the smartphone’s GPS location again, re-navigate to that location and attempt to find the light source at that location. This approach beneficially allows the user to continue to move, and also allows recovery from a poor GPS signal whilst still resulting in a successful delivery.
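That recovery behaviour might look like the following sketch (Python; the time-out, attempt count, and all helper names are illustrative assumptions):

    SEARCH_TIMEOUT_S = 10.0           # predetermined search window

    def locate_recipient(drone, server, max_attempts=3):
        """Search for the delivery light sequence; on time-out, re-obtain
        the smartphone's latest GPS fix, re-navigate, and search again."""
        for _ in range(max_attempts):
            target = drone.find_flash_sequence(timeout_s=SEARCH_TIMEOUT_S)
            if target is not None:
                return target
            drone.navigate_to(server.latest_phone_fix())
        return None                   # caller may abort or postpone delivery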
The light pattern emitted by the smartphone may include information identifying a direction (in at least two dimensions, e.g. in the horizontal plane, or preferably in three dimensions) in which the drone/payload needs to move in order to position the parcel at the desired location (e.g. in the user’s hand).
It will also be appreciated that assistance information (i.e. other than the light sequence) may also be provided by the smartphone to the drone in order to allow more accurate and efficient detection of a target user / smartphone. For example, the drone may be configured to obtain information from the smartphone, e.g. information representing the direction in which the user is pointing the smartphone camera / flash. This may be achieved by, for example, extracting an appropriate vector from the smartphone (e.g. from its associated inertial measurement unit and/or the like) that can be used to guide the drone to the user more quickly (especially in the event of GPS inaccuracies). In one example, this may be achieved by instructing the user to “target” the drone in crosshairs on the smartphone’s camera feed, determining and providing (to the drone) the smartphone’s inclination and heading which, combined with the drone’s altitude reading, enables a path to the user to be established even in the absence of reliable GPS data from the smartphone.
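The geometry of that assisted approach might be sketched as follows (Python; names and sign conventions are assumptions: the phone's reported inclination is taken as an elevation angle above the horizontal, so the ground range from the user to the drone is the altitude multiplied by tan(90° − elevation)):

    import math

    def offset_user_to_drone_m(elevation_rad, heading_rad, drone_altitude_m):
        """Horizontal (north, east) vector from the user to the point
        directly below the drone, derived from the phone's inclination and
        compass heading while the user holds the drone in the on-screen
        crosshairs, combined with the drone's own altitude reading."""
        zenith = math.pi / 2 - elevation_rad           # angle from vertical
        ground_range = drone_altitude_m * math.tan(zenith)
        return (ground_range * math.cos(heading_rad),  # north component
                ground_range * math.sin(heading_rad))  # east component

The drone can negate this vector to obtain its own required move towards the user.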
In one modified example, the drone includes a light source and the smartphone (e.g. an application on the smartphone) is configured to detect the light source of the drone and identify the delivery based on an associated light sequence / pattern (corresponding to e.g. a unique order ID / user ID / delivery ID stored on the smartphone and/or information identifying the drone). The smartphone in this example may also be configured to provide appropriate data/information to the drone (via a network or via direct communication such as Bluetooth, Wi-Fi Direct, etc.) for assisting the drone in approaching the target location (i.e. the smartphone) at a predetermined altitude and for delivery of the parcel from the drone to (the hand of) the recipient.
In one example, a scanning LIDAR system may be used to image the ground across a larger area and to establish the vertical distance to the smartphone flash LED with greater confidence in the presence of drop-offs (such as balconies, etc.).
Using the flashing light from the light source of the smartphone provides a low-latency method for communicating with the drone 4, and serves the dual purpose of providing a simple object that the vision system can identify whilst also identifying the recipient. However, it will be appreciated that other information may also be communicated towards the drone using the light source. Such information may include, for example (but is not limited to):
• the smartphone camera may be used to observe the location of the parcel relative to the centre of its screen, and this information may be sent back to the drone to increase the precision of the delivery;
• the required height above ground of the delivery (either user defined or corresponding to the height of the smartphone above the ground) may be calculated using the recipient’s biometric parameters (if stored on their smartphone) or by any other suitable measurement;
• the instantaneous velocity (and heading) of the user, which may help the drone remain above the user; and
• the light sequence may also provide appropriate verification, for example to enable a secure Bluetooth or WiFi connection with the drone, which may be used for even quicker and more efficient direct communication (one possible framing of such data on the flash channel is sketched below).
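By way of illustration, one simple way to carry such auxiliary data on the flash channel is to prepend a fixed synchronisation preamble to a small, checksummed payload, one on/off state per bit period. The frame layout, bit period and payload fields below are all assumptions made for the sketch, not part of the application.

```python
BIT_PERIOD_MS = 100    # assumed flash on/off resolution

def build_flash_frame(delivery_id: int, height_cm: int, heading_deg: int):
    """Encode a sync preamble plus a small payload as a list of on/off
    states (one per BIT_PERIOD_MS), suitable for toggling the camera LED."""
    preamble = [1, 0, 1, 0, 1, 1, 0, 0]            # fixed sync pattern
    payload = (delivery_id.to_bytes(2, "big")
               + height_cm.to_bytes(2, "big")
               + heading_deg.to_bytes(2, "big"))
    bits = []
    for byte in payload:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    checksum = sum(payload) % 256                   # crude integrity check
    bits.extend((checksum >> i) & 1 for i in range(7, -1, -1))
    return preamble + bits

# e.g. build_flash_frame(0x2A17, height_cm=120, heading_deg=270) yields
# 8 preamble states + 48 payload bits + 8 checksum bits, i.e. a frame of
# 6.4 s at the assumed 100 ms bit period.
```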
In the above description the smartphone emits a light sequence in order to guide the drone towards the user (and to uniquely identify the user/delivery). However, it will be appreciated that any other suitable time-varying electromagnetic radiation pattern may be used instead of a light sequence.
In the above description the parcel is kept directly above the recipient’s hand during lowering. This is achieved by using the vision system on the drone to monitor the position of the parcel relative to the recipient’s hand / smartphone, and then using an appropriate stabilising force to control the location of the parcel. It will be appreciated that such a stabilising force may be generated by:
- translational movement of the drone in the horizontal plane so that the tether angle is adjusted to compensate for the external (wind) forces acting on the package; and/or
- a lightweight lateral thrust unit at the parcel end of the tether.
The first method is more technically challenging to realise due to the complex dynamics of tethered weights beneath airborne vehicles, but it is advantageous in reducing mechanical and electrical complexity. Additionally, the only part of the drone system that has to be abandoned in an entanglement or sabotage event is the cheap tether and parcel release clip. However, in some locations, where gusting / fast changes in wind direction are common, a sufficient control force may only be possible to achieve using the second method (possibly in combination with the first method).
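For the first method, the steady-state geometry gives a feel for the drone displacement required: against a steady horizontal wind, the tether must lean so that the horizontal component of its tension balances the wind force on the parcel. The back-of-envelope sketch below treats the parcel as a point mass on a massless, straight tether, a deliberate simplification of the complex dynamics noted above.

```python
import math

def drone_offset_for_wind(parcel_mass_kg, wind_force_n, tether_length_m,
                          g=9.81):
    """Horizontal distance the drone must hold upwind of the delivery point
    so that the tilted tether cancels a steady horizontal wind force.

    Static balance: tan(theta) = F_wind / (m * g), where theta is the
    tether angle from vertical; the drone then sits L * sin(theta) upwind
    of the point directly above the parcel.
    """
    theta = math.atan2(wind_force_n, parcel_mass_kg * g)
    return tether_length_m * math.sin(theta)

# e.g. a 2 kg parcel on a 25 m tether in a steady 5 N wind:
# drone_offset_for_wind(2.0, 5.0, 25.0) ~ 6.2 m upwind.
```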
As mentioned, in the case of entanglement (or sabotage, attempted theft, etc.) the parcel release mechanism (and tether) may be abandoned. Such a condition may be detected, for example, by observing the current load on (or detecting an overload of) the winch motor, assessing the tension on the tether, and/or by other means. In this instance, the motor may be disengaged so that the drone is able to gain altitude. In doing so, the winch may unspool the tether fully, thus detaching the tether from the drone. It will be appreciated that the system may be configured to pass through a number of stages of recovery before taking this action, including alarms and partial unspooling before attempted tether recovery, among others. It is worth noting, however, that the accurate positioning of the system reduces the risk of entanglement significantly.
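A staged-escalation sketch of this recovery logic follows; the thresholds, stage ordering and the drone/winch interfaces are all assumptions made for illustration (the application does not prescribe specific values or stages).

```python
from enum import Enum, auto

class RecoveryStage(Enum):
    ALARM = auto()            # warn off interference
    PARTIAL_UNSPOOL = auto()  # slacken the tether, then retry recovery
    ABANDON_TETHER = auto()   # free-spool and climb away

CURRENT_OVERLOAD_A = 8.0      # assumed winch motor current limit
TENSION_LIMIT_N = 60.0        # assumed tether tension limit

def overloaded(winch):
    return (winch.motor_current() >= CURRENT_OVERLOAD_A
            or winch.tether_tension() >= TENSION_LIMIT_N)

def entanglement_response(drone, winch):
    """Escalate through recovery stages while the winch reports overload."""
    if not overloaded(winch):
        return None                        # normal operation
    drone.sound_alarm()                    # stage 1: a short wait would follow
    if not overloaded(winch):
        return RecoveryStage.ALARM
    winch.unspool(metres=2.0)              # stage 2: give slack before retrying
    if winch.try_recover_tether():
        return RecoveryStage.PARTIAL_UNSPOOL
    winch.disengage_motor()                # stage 3: free-spool the drum
    drone.gain_altitude()                  # full unspool detaches the tether
    return RecoveryStage.ABANDON_TETHER
```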
As the parcel is lowered, it is possible that an offset will occur between the parcel and the smartphone (e.g. due to the wind). In this case, even though the drone is directly above the smartphone, the parcel is not. Under such conditions, it may be beneficial to track the position of the parcel and move the drone (horizontally) in order to compensate for any deviation from the desired delivery position so that the parcel is lowered directly to the user’s hand. If a lateral thrust unit is attached to the tether, then the parcel may be moved using the lateral thrust unit (in this case it may not be necessary to move the drone).
In order to facilitate tracking of the parcel (relative to the recipient’s hand), one of the following methods may be used:
• An additional LED (of a similar intensity and design to the smartphone flash) may be added to the parcel release housing (or the end of the tether) possibly with contacts to charge an included lightweight battery (which may be located in the release mechanism). This additional LED may also have an associated flash sequence (preferably different to the flash sequence of the smartphone’s light source). The vision system may be configured to search for both encoded flashes in the same image and generate a vector between the two in the aircraft frame of reference. The drone may be configured to correct its hover position based on the generated vector.
• A QR code (or other suitable graphical pattern/element) may be printed on the parcel wrap and tracked by the drone vision system alongside the smartphone’s light source. In this case, a vector may be calculated between the QR code / graphical element and the light source in the aircraft frame of reference.
• It may also be possible to use the smartphone’s camera (e.g. if the data quality is good enough), and the positioning data may be fused from both the drone’s camera and the smartphone camera, which may result in improved accuracy when positioning the parcel. A low-latency, authentication-free transmission of the smartphone’s positioning data may be provided to the drone, for example, by encoding position updates in the smartphone LED flash sequence.
• A stereoscopic pair of cameras may be used by the drone for ranging directly to the smartphone light source (e.g. instead of ranging to the ground as when using a LIDAR system).
• If the smartphone LED is lost from view then the parcel lateral position may be held relative to other features (e.g. ground features) at least for a predetermined period before aborting delivery or alerting the recipient to move / turn the smartphone.
Combining the vector output information with the current winch position allows the relative location of the parcel and smartphone to be known in three dimensions. This information may be used to make small alterations to the drone’s position and/or to control the lateral thrust unit, and hence the parcel position, during descent. An algorithm with parcel mass, tether length and wind measurements among other inputs may be used to adjust the response of the drone for optimum positioning of the parcel and to keep the feedback loop stable.
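A minimal sketch of that fusion step is shown below, assuming the vision system returns unit direction vectors to each flash in the aircraft frame (z down) and the winch reports the paid-out tether length; the simple proportional correction and all names are illustrative assumptions rather than the claimed control algorithm.

```python
import numpy as np

K_P = 0.4    # assumed proportional gain for the hover correction

def parcel_correction(dir_to_parcel, dir_to_phone, tether_out_m):
    """Compute a horizontal velocity demand that drives the descending
    parcel towards the smartphone.

    dir_to_parcel / dir_to_phone: unit vectors (aircraft frame, z down)
    from the camera to the parcel flash and the phone flash.
    tether_out_m: tether length currently paid out by the winch.
    """
    dir_to_parcel = np.asarray(dir_to_parcel, dtype=float)
    dir_to_phone = np.asarray(dir_to_phone, dtype=float)
    # Scale the parcel ray by the paid-out tether length to place the
    # parcel in 3D (range along the ray ~ tether length for a taut tether).
    parcel_pos = dir_to_parcel * tether_out_m
    # Scale the phone ray to the same vertical depth so the two positions
    # are compared in the same horizontal plane.
    depth = parcel_pos[2]
    phone_pos = dir_to_phone * (depth / max(dir_to_phone[2], 1e-6))
    error = phone_pos - parcel_pos        # parcel-to-phone offset
    return K_P * error[:2]                # horizontal (x, y) velocity demand
```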
In the embodiments described above, the smartphone and the drone each include transceiver circuitry. Typically, this circuitry will be formed by dedicated hardware circuits. However, in some embodiments, part of the transceiver circuitry may be implemented as software run by the corresponding controller.
In the above embodiments, a number of software modules were described. As those skilled in the art will appreciate, the software modules may be provided in compiled or un-compiled form and may be supplied to the smartphone or the drone as a signal over a computer network, or on a recording medium. Further, the functionality performed by part or all of this software may be performed using one or more dedicated hardware circuits.
For example, functionality and/or modules described herein may be implemented using one or more computer processing apparatus having one or more hardware computer processors programmed using appropriate software instructions to provide the required functionality (e.g. one or more computer processors forming part of the controllers described with reference to Figure 2 or 3). It will be further appreciated that all or part of these functions may be implemented in hardware as dedicated circuitry, for example using one or more dedicated integrated circuits such as an application specific integrated circuit (ASIC) or the like.
It will be appreciated that the controllers referred to in the description of the smartphone and the drone (i.e. with reference to Figures 2 and 3) may comprise any suitable controller such as, for example, an analogue or digital controller. Each controller may comprise any suitable form of processing circuitry including (but not limited to), for example: one or more hardware implemented computer processors; microprocessors; central processing units (CPUs); arithmetic logic units (ALUs); input/output (IO) circuits; internal memories / caches (program and/or data); processing registers; communication buses (e.g. control, data and/or address buses); direct memory access (DMA) functions; hardware or software implemented counters, pointers and/or timers; and/or the like.
In the above description an approximately 20-30 metre tether is used for lowering the package, thereby keeping the drone away from the recipient and any ground level obstacles. However, it will be appreciated that in certain cases it may be beneficial to use a longer tether, for example a 50m long tether (e.g. because in some countries drones are required by law to keep at least a 50m distance from (including above) persons not under direct control of the drone or its pilot). It will be appreciated therefore that an optimal flight/delivery altitude may be chosen by taking into account various factors including, for example, convenience of the user, weight/associated costs of the drone, and/or applicable legal requirements.
Whilst the above description refers to unmanned aerial vehicles (UAVs) / drones, it will be appreciated that the invention may also be applicable to any unmanned vehicles / delivery systems other than aerial ones (for example, autonomous delivery trucks). In this case the functionality of stabilising the payload and the winch mechanism may be omitted or replaced by a different payload handling mechanism.
Various other modifications will be apparent to those skilled in the art and will not be described in further detail here.

Claims (23)

1. A vehicle for delivering an object to or collecting an object from a person at a destination, the vehicle comprising:
means for guiding the vehicle autonomously to an area comprising the destination;
means for detecting, within the area, a target indication at the destination, wherein the means for guiding is configured to guide the vehicle autonomously into closer proximity with the destination, based on the target indication when detected;
means for identifying a person; and
means for releasing the object towards or collecting the object from the person when the means for identifying successfully identifies the person as the person to whom the object is to be delivered or from whom the object is to be collected;
wherein the means for detecting is configured to detect a target indication comprising a predetermined time-varying electromagnetic radiation pattern, associated with the person, emitted from a device.
2. The vehicle according to claim 1, wherein the predetermined time-varying electromagnetic radiation pattern comprises a time-varying light pattern (e.g. a predetermined light sequence).
3. The vehicle according to claim 1 or 2, wherein the means for detecting comprises an imaging sensor for obtaining image data and a processor for processing the obtained image data to detect the time-varying electromagnetic radiation pattern.
4. The vehicle according to any of claims 1 to 3, wherein the means for detecting is configured to obtain data from a remote imaging sensor (e.g. an imaging sensor of the device at the destination) and wherein the means for guiding is configured to use the image data obtained from the remote imaging sensor to guide the vehicle for closer alignment of a delivery or collection with the device at the destination.
5. The vehicle according to any of claims 1 to 4, wherein the predetermined time-varying electromagnetic radiation pattern is uniquely associated with at least one of: the destination; a user; the device from which the predetermined time-varying electromagnetic radiation pattern is emitted; an object to be delivered/collected.
6. The vehicle according to any of claims 1 to 5, wherein the means for releasing the object towards the recipient or collecting the object from the provider is configured to release a carrier, for holding or receiving the object, towards the destination (e.g. a winch and/or the like).
7. The vehicle according to claim 6, wherein the carrier is provided with an associated alignment indication, wherein the means for detecting is configured to detect the alignment indication, and wherein the means for guiding is configured to autonomously guide the vehicle to align the carrier with the device at the destination using the detected alignment indication and the detected target indication.
8. The vehicle according to claim 7, wherein the alignment indication comprises at least one of a printed pattern (e.g. a Quick Response (‘QR’) code) and a further predetermined time-varying light pattern (e.g. a different light sequence).
9. The vehicle according to any of claims 1 to 8, wherein the means for guiding comprises a receiver for receiving positioning signals based on which a geographical location of the vehicle can be determined (e.g. GPS location / network triangulation signals), and is configured to guide the vehicle to the area comprising the destination at least partly based on the positioning signals received by the receiver.
10. The vehicle according to any of claims 1 to 9, further comprising means for transmitting a signal indicating that the vehicle has arrived in the area comprising the destination.
11. The vehicle according to any of claims 1 to 10, comprising means to abort delivery when at least one of the following conditions is met: a timeout (e.g. a predetermined time has passed after initiating delivery or collection at the destination); the predetermined time-varying electromagnetic radiation pattern no longer being detected; an obstruction to the delivery or collection being detected (based on e.g. obtained image data and/or data provided by a sensor coupled to the vehicle); a battery charge falling below a predetermined level; a change in weather conditions being detected (e.g. wind speed being above a set limit); and a user-initiated abort signal being received.
12. A user device for facilitating delivery or collection of an object by a delivery vehicle, the user device comprising:
means for receiving an indication that the delivery vehicle is in the vicinity; and
means for emitting, responsive to receiving the indication, a predetermined time-varying electromagnetic radiation pattern associated with said delivery or collection for detection by the delivery vehicle.
13. The user device according to claim 12, wherein the means for receiving an indication is configured to receive an indication comprising a user input indicating that the delivery vehicle is in the vicinity.
14. The user device according to claim 12 or 13, wherein the means for receiving an indication is configured to receive the indication directly or indirectly from the delivery vehicle.
15. The user device according to any of claims 12 to 14, wherein the predetermined time-varying electromagnetic radiation pattern is uniquely associated with at least one of: the user device; a user; a location (e.g. a geographical or relative location of the user device); a shipment (e.g. an order number); the object to be delivered or collected; and the delivery device.
16. The user device according to any of claims 12 to 15, comprising means for obtaining information associated with said delivery or collection and/or a person to be identified, and wherein the means for emitting is configured to emit a predetermined time-varying electromagnetic radiation pattern associated with said delivery or collection based on said obtained information.
17. The user device according to claim 16, wherein said means for obtaining information is configured to obtain said information associated with said delivery or collection and/or a person to be identified from a remote server.
18. The user device according to any of claims 12 to 17, comprising a cellular telephone or a portable computer device.
19. The user device according to any of claims 12 to 18, wherein the means for emitting comprises a light source (e.g. at least one of: a light-emitting diode (LED) flash, an infrared light source, and a display), and wherein the predetermined time-varying electromagnetic radiation pattern comprises a time-varying light pattern.
20. A system comprising: the vehicle according to any of claims 1 to 11; and the user device according to any of claims 12 to 19.
21. A method performed by a vehicle for delivering an object to or collecting an object from a person at a destination, the method comprising:
guiding the vehicle autonomously to an area comprising the destination;
detecting, within the area, a target indication at the destination, and guiding the vehicle autonomously into closer proximity with the destination, based on the target indication when detected;
identifying a person; and
releasing the object towards or collecting the object from the person when the identifying successfully identifies the person as the person to whom the object is to be delivered or from whom the object is to be collected;
wherein the detecting a target indication comprises detecting a target indication comprising a predetermined time-varying electromagnetic radiation pattern, associated with the person, emitted from a device.
22. A method performed by a user device for facilitating delivery or collection of an object by a delivery vehicle, the method comprising:
receiving an indication that the delivery vehicle is in the vicinity; and
emitting, responsive to receiving the indication, a predetermined time-varying electromagnetic radiation pattern associated with said delivery or collection for detection by the delivery vehicle.
23. A computer implementable instructions product comprising computer implementable instructions for causing a programmable communications device to perform the method of claim 21 or 22.
GB1715590.4A 2017-09-26 2017-09-26 Delivery system Withdrawn GB2567142A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1715590.4A GB2567142A (en) 2017-09-26 2017-09-26 Delivery system

Publications (2)

Publication Number Publication Date
GB201715590D0 GB201715590D0 (en) 2017-11-08
GB2567142A true GB2567142A (en) 2019-04-10

Family

ID=60244301

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1715590.4A Withdrawn GB2567142A (en) 2017-09-26 2017-09-26 Delivery system

Country Status (1)

Country Link
GB (1) GB2567142A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113342048A (en) * 2021-06-23 2021-09-03 广州泽衡信息科技有限公司 Unmanned aerial vehicle express delivery method and system based on smart rod
US11320830B2 (en) 2019-10-28 2022-05-03 Deere & Company Probabilistic decision support for obstacle detection and classification in a working area

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11176630B2 (en) * 2017-12-21 2021-11-16 Wing Aviation Llc Dynamic UAV transport tasks
CN110389593B (en) * 2018-04-23 2022-12-02 北京京东乾石科技有限公司 Cargo throwing control method, system, equipment and storage medium of logistics unmanned aerial vehicle
CN113970753B (en) * 2021-09-30 2024-04-30 南京理工大学 Unmanned aerial vehicle positioning control method and system based on laser radar and vision detection

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160068264A1 (en) * 2014-09-08 2016-03-10 Qualcomm Incorporated Methods, Systems and Devices for Delivery Drone Security
WO2016196093A1 (en) * 2015-06-01 2016-12-08 Stoman Nicolaas Systems, methods, and apparatuses for managing aerial drone parcel transfers
US20170011333A1 (en) * 2013-12-26 2017-01-12 CyPhy Works, Inc. Unmanned delivery
US20170050747A1 (en) * 2015-08-22 2017-02-23 Olaf Wessler Method for destination approach control of unmanned aerial vehicles

Also Published As

Publication number Publication date
GB201715590D0 (en) 2017-11-08

Similar Documents

Publication Publication Date Title
GB2567142A (en) Delivery system
AU2017366437B2 (en) Landing and payload loading structures
US11827356B2 (en) Payload-release device position tracking
US11093888B1 (en) On-demand designated delivery locator
US10577124B2 (en) Method for destination approach control of unmanned aerial vehicles
US9783295B2 (en) Interaction during delivery from aerial vehicle
US10139817B2 (en) Unmanned aircraft systems and methods to interact with specifically intended objects
US9896207B2 (en) Product delivery methods and systems utilizing portable unmanned delivery aircraft
US10423167B2 (en) System and method for automated landing of an unmanned aerial vehicle
US9580173B1 (en) Translational correction of payload-release device based on tracked position
CN108885120A (en) Target position product delivery system and method
US11816999B2 (en) Unmanned aerial vehicle control system, unmanned aerial vehicle control method, and program
JP7002863B2 (en) Guidance system and guidance method
JPWO2020100945A1 (en) Mobile
KR102488641B1 (en) Method and apparatus for handling goods by unmanned aerial vehicle and autonomous robot
US20230205233A1 (en) Information processing system, method for setting release place, and non-transitory computer readable memory
JP2023036129A (en) Article delivery system and article delivery method

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)