GB2616879A - An autonomous mobile robot and automated produce transportation system - Google Patents

Info

Publication number
GB2616879A
GB2616879A
Authority
GB
United Kingdom
Prior art keywords
autonomous mobile
mobile robot
location
server
geolocation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2204094.3A
Other versions
GB202204094D0 (en)
Inventor
Acevedo Henry
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fox Robotics Ltd
Original Assignee
Fox Robotics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fox Robotics Ltd filed Critical Fox Robotics Ltd
Priority to GB2204094.3A priority Critical patent/GB2616879A/en
Publication of GB202204094D0 publication Critical patent/GB202204094D0/en
Priority to PCT/EP2023/057556 priority patent/WO2023180481A1/en
Publication of GB2616879A publication Critical patent/GB2616879A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0278Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS

Abstract

An autonomous mobile robot 100 moves dependent upon data from a plurality of sensors (e.g., cameras 126, 127 and ultrasonic sensors 120) monitoring the surroundings of the robot 100. The robot 100 has an identification unit (e.g., an RFID reader) for determining an identity of a user and a loading area 104 with one or more weight sensors for determining the weight of produce loaded onto the loading area 104. A second invention relates to a system combining the robot 100 with geolocation devices (902, fig 5) which can be used to call the robot 100 to a pick-up location.

Description

An Autonomous Mobile Robot and Automated Produce Transportation System
Field of the Invention
The present invention relates to an autonomous mobile robot and an automated produce transportation system.
Background
In the field of logistics there are still sectors in which the transportation of goods depends on human labour. This reduces the profitability of the business and the productivity of the labour force, since transportation is typically a low-skilled, low-productivity task. An example of this is fruit and vegetable picking. On many farms, pickers transport their picked produce to a collection area themselves, spending approximately 20% of their time doing so. A dedicated worker may be employed to perform all of the transportation tasks on a farm, but this is still an unproductive use of human labour.
There is therefore a need to deploy an autonomous system to facilitate the transport of goods within certain environments. While autonomous transportation systems are known, these are difficult to deploy effectively and safely in certain environments. For example, in a fruit farm the pickup locations vary greatly. The drop off point may also vary from day to day. The terrain on a farm may be rough and include numerous obstacles which may move from day to day. Additionally, there are safety considerations when an autonomous vehicle is moving and working in close proximity to humans. Any autonomous transport system must also be able to transport a sufficient quantity of produce efficiently to recoup the initial costs of the system.
Summary of the Invention
A first aspect of the invention disclosed herein provides an autonomous mobile robot comprising: a processor arrangement configured to control operation of the autonomous mobile robot; a plurality of sensors configured to monitor the surroundings of the autonomous mobile robot; an identification unit configured to determine an identity of a user loading or unloading the autonomous mobile robot; and a loading area, the loading area comprising one or more weight sensors configured to determine the weight of produce loaded onto the loading area by the identified user, wherein the processor arrangement is configured to control movement of the autonomous mobile robot dependent on data received from the plurality of sensors.
The autonomous mobile robot may further comprise a wireless communication unit, wherein the processor arrangement is configured to control the wireless communication unit to transmit data, comprising at least the identity of the user and the weight of produce loaded onto the loading area by the identified user, to a remote device. The processor arrangement may be further configured to control the wireless communication unit to transmit data comprising one or more of a current location of the autonomous mobile robot, a total load carried on the loading area, or a battery charge level.
The autonomous mobile robot may further comprise a GNSS receiver and wherein the processor arrangement may be configured to determine a current location of the autonomous mobile robot based on signals received via the GNSS receiver. The autonomous mobile robot may further comprise an inertial measurement unit and wherein the processor arrangement may be configured to determine a current location of the autonomous mobile robot based on measurements made by the inertial measurement unit. The autonomous mobile robot may further comprise one or more tracking cameras and wherein the processor arrangement may be configured to determine a current location of the autonomous mobile robot based on measurements made by the one or more tracking cameras.
The autonomous mobile robot may further comprise a memory storing a map of an operational area, the map indicating permitted pathways and known obstacles. The processor arrangement may be configured to plan a route from the current location of the autonomous mobile robot to a target location using the map.
The plurality of sensors may comprise at least one forward-facing camera and at least one rear-facing camera. The processor arrangement may be configured to: use data received from the at least one forward-facing camera and/or the at least one rear-facing camera to detect obstructions in the surroundings of the autonomous mobile robot; and when an obstruction is detected, use the map of the operational area stored in the memory to plan a route to the target location which avoids the obstruction.
The plurality of sensors may comprise one or more sonar sensors. The plurality of sensors may comprise one or more safety bumpers and wherein the autonomous mobile robot is configured to stop moving when one of the one or more safety bumpers is activated.
The autonomous mobile robot may further comprise a user interface and wherein the identification unit is disposed on or within the user interface. The user interface may comprise at least one of a touch sensitive display screen, a speaker, a microphone and one or more user inputs.
The autonomous mobile robot has a chassis, and the loading area may be mounted on top of the chassis.
A second aspect of the invention disclosed herein provides an automated produce transportation system comprising: the autonomous mobile robot of the first aspect; a plurality of geolocation devices, wherein each of the plurality of geolocation devices comprises a transceiver and is configured to determine its location; and a server configured to receive data from the plurality of geolocation devices and exchange data with the autonomous mobile robot.
Each of the geolocation devices may comprise an RFID system storing identification data. Each of the geolocation devices may comprise a first user input button, wherein activation of the first user input button causes the geolocation device to send a pickup request to the server, the pickup request including the location of the geolocation device.
Each of the geolocation devices may comprise a second user input button, wherein activation of the second user input button causes the geolocation device to send a message cancelling the pickup request to the server.
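The pickup and cancellation messages described above can be sketched as follows. This is an illustrative Python sketch only: the field names, the JSON encoding and the function name are assumptions, as the specification does not define a message format.

```python
import json


def make_pickup_message(device_id, lat, lon, cancel=False):
    """Build the message a geolocation device might send to the server.

    A press of the first user input button produces a pickup request
    carrying the device's location; a press of the second button produces
    a cancellation message. The schema here is assumed for illustration.
    """
    return json.dumps({
        "device_id": device_id,
        "type": "cancel" if cancel else "pickup",
        "location": {"lat": lat, "lon": lon},
    })
```

In a deployment, the device would transmit such a message over its transceiver (e.g. LoRa or WiFi) rather than returning a string.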
The server may be configured, in response to receiving a pickup request from a geolocation device, to send the location of the geolocation device to the autonomous mobile robot with instructions to move to the location.
The automated produce transportation system of the second aspect may comprise a plurality of autonomous mobile robots each according to the first aspect and wherein the server may be configured: in response to receiving a pickup request from a geolocation device, to select one of the plurality of autonomous mobile robots based on a distance between each of the plurality of autonomous mobile robots and the location of the geolocation device; and to send the location of the geolocation device to the selected autonomous mobile robot with instructions to move to the location.
The server may be configured to: access a map of an operational area of the plurality of autonomous mobile robots, the map indicating permitted pathways and known obstacles; and select one of the plurality of autonomous mobile robots based on an actual travel distance between each of the plurality of autonomous mobile robots and the location of the geolocation device.
The server may be further configured to select one of the plurality of autonomous mobile robots based on a total load carried by each autonomous mobile robot as determined by the one or more weight sensors.
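The server-side selection described in the preceding paragraphs (choosing a robot by travel distance while accounting for load) might be sketched as below. The `RobotState` structure, the 150 kg capacity figure and the function names are illustrative assumptions, not details taken from the specification.

```python
from dataclasses import dataclass


@dataclass
class RobotState:
    robot_id: str
    travel_distance_m: float  # actual travel distance along permitted pathways
    load_kg: float            # total load reported by the weight sensors


def select_robot(robots, max_load_kg=150.0):
    """Pick the nearest robot that still has spare carrying capacity.

    Robots at or above the (assumed) capacity limit are excluded; among
    the remainder, the one with the shortest travel distance to the
    geolocation device is selected. Returns None if no robot qualifies.
    """
    candidates = [r for r in robots if r.load_kg < max_load_kg]
    if not candidates:
        return None
    return min(candidates, key=lambda r: r.travel_distance_m)
```

Using actual travel distance along permitted pathways, rather than straight-line distance, matters on a farm where rows of plants constrain the routes available.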
Each of the geolocation devices may comprise a GNSS receiver configured to determine the location of the respective geolocation device. Alternatively, or in addition, each of the geolocation devices may comprise an ultrasonic beacon configured to exchange data with a network of stationary beacons in order to determine the location of the respective geolocation device.
A third aspect of the invention disclosed herein provides an automated produce transportation system comprising: (i) an autonomous mobile robot comprising: a plurality of sensors configured to monitor the surroundings of the autonomous mobile robot; a GNSS receiver for determining a location of the autonomous mobile robot; a wireless transceiver configured to receive a target destination; and a processor arrangement configured to control movement of the autonomous mobile robot to the target destination dependent on data received from the plurality of sensors; (ii) at least one geolocation device, wherein the at least one geolocation device comprises a transceiver and is configured to: determine its location; and use the transceiver to send a request for the autonomous mobile robot to move to its location.
The system of the third aspect may further comprise a server, wherein the at least one geolocation device may be configured to send the request for the autonomous mobile robot to move to its location to the server and the server may be configured to set one of the received locations as the target destination and send the target destination to the autonomous mobile robot.
The at least one geolocation device may be configured to send the request directly to the autonomous mobile robot as a target destination and the autonomous mobile robot may be configured to select a target destination from among a plurality of received target destinations.
A fourth aspect of the invention disclosed herein provides a method of controlling an automated produce transportation system comprising an autonomous mobile robot, a geolocation device and a server, the method comprising: the geolocation device determining its location; the geolocation device sending to the server a request for the autonomous mobile robot to move to its location; the server receiving the request for the autonomous mobile robot to move to the location of the geolocation device; the server sending a target destination to the autonomous mobile robot, the target destination comprising the received location of the geolocation device; the autonomous mobile robot using a plurality of sensors to monitor the surroundings of the autonomous mobile robot; the autonomous mobile robot using a GNSS receiver to determine a location of the autonomous mobile robot; the autonomous mobile robot using a wireless transceiver to receive the target destination from the server; and the autonomous mobile robot using a processor arrangement to control movement of the autonomous mobile robot to the target destination dependent on data received from the plurality of sensors.
The method may further comprise the geolocation device sending a pickup request to the server in response to activation of a first user input button, the pickup request including the location of the geolocation device. The method may further comprise the geolocation device sending a message cancelling the pickup request to the server in response to activation of a second user input button.
The method may further comprise the server receiving a pickup request from a geolocation device, and in response, sending the location of the geolocation device to the autonomous mobile robot with instructions to move to the location.
The automated produce transportation system of the fourth aspect may comprise a plurality of autonomous mobile robots and wherein the method may further comprise the server receiving a pickup request from the geolocation device, and in response, selecting one of the plurality of autonomous mobile robots based on a distance between each of the plurality of autonomous mobile robots and the location of the geolocation device; and sending the location of the geolocation device to the selected autonomous mobile robot with instructions to move to the location.
The method may further comprise the server accessing a map of an operational area of the plurality of autonomous mobile robots, the map indicating permitted pathways and known obstacles; and selecting one of the plurality of autonomous mobile robots based on an actual travel distance between each of the plurality of autonomous mobile robots and the location of the geolocation device.
The method may further comprise the server selecting one of the plurality of autonomous mobile robots based on a total load carried by each autonomous mobile robot as determined by one or more weight sensors disposed on each autonomous mobile robot.
The method may further comprise the autonomous mobile robot using an identification unit to determine an identity of a user loading or unloading the autonomous mobile robot. The method may further comprise the autonomous mobile robot using one or more weight sensors configured to determine the weight of produce loaded onto a loading area by the identified user.
The method may further comprise the autonomous mobile robot using the wireless transceiver to transmit data comprising at least the identity of the user and the weight of produce loaded onto the loading area by the identified user to the server.
The method may further comprise the autonomous mobile robot using the wireless transceiver to transmit data comprising one or more of a current location of the autonomous mobile robot, a total load carried on the loading area, or a battery charge level.
Brief Description of the Figures
So that the general concepts set out in the foregoing sections can be more fully understood, embodiments thereof will be described with reference to the accompanying drawings, in which:
Figure 1a shows a perspective view of an autonomous mobile robot from the front;
Figure 1b shows detail of the front section of the autonomous mobile robot of Figure 1a;
Figure 1c shows detail of a side section of the autonomous mobile robot of Figure 1a;
Figure 1d shows a perspective view of the autonomous mobile robot from the rear;
Figure 2 is an exploded view illustrating details of the loading area and weight sensors of the autonomous mobile robot of Figure 1;
Figure 3 is a schematic of the electronics of the autonomous mobile robot of Figure 1;
Figure 4 shows an embodiment of the autonomous mobile robot with three refrigeration units;
Figure 5 illustrates an automated produce transportation system comprising the autonomous mobile robot of Figure 1, a server and a plurality of geolocation devices; and
Figure 6 is a schematic illustration of the components of a geolocation device.
Detailed description
Referring to Figures 1a to 1d, an embodiment of an autonomous mobile robot (AMR) 100 is shown. The AMR 100 takes the form of a rover having a chassis 102, supported by a suspension system 106. Locomotion is provided by a plurality of wheels 108 secured to the suspension system 106. The computation box 110 housing the electronics of the AMR 100 can be seen in Figure 1a through a gap or clear panel provided in a side panel of the chassis 102. This panel can be removed to allow engineering access to the computation box 110.
The chassis 102 is constructed from ductile soft steel and four wheels 108 are mounted on two axles with the suspension 106 in a traditional rover configuration. There are no steering mechanisms, and the AMR 100 uses differential or skid steering to turn. A bent sheet iron chassis (not shown) is mounted onto the undercarriage and houses the computation box 110. The computation box 110 is an IP67-rated sealed plastic or metallic container box and houses all the electronics (described in detail with reference to Figure 3), safety and control hardware, a dedicated safety PLC, and two motor control units. The chassis 102 is not sealed, but the exposed hardware is weather-resistant.
Figure 1a shows the AMR 100 from the front, while Figure 1d shows the same AMR 100 from the rear. The chassis 102 supports object detection cameras 126, including a front object detection camera 126a and a rear object detection camera 126b, disposed at the front and rear of the chassis 102 respectively. The front of the chassis 102 also supports a tracking camera 127.
The front object detection camera 126a and the rear object detection camera 126b may be RGBD camera units and each may comprise two apertures and two separate cameras, to facilitate depth perception.
The tracking camera 127 may also comprise two separate lenses, which may be fish-eye lenses. In some examples, the tracking camera 127 has inbuilt processing capabilities such as a vision processing unit (VPU) and additional sensors, such as an inertial measurement unit. The tracking camera 127 may therefore be capable of running visual simultaneous localization and mapping (V-SLAM) algorithms to facilitate mapping and depth perception.
Figure 1b shows a close-up of the front section of the AMR 100 in which the relative positions of the front object detection camera 126a and the tracking camera 127 can be seen. However, these positions are only illustrative of one possible arrangement. Although both the object detection cameras 126 and the tracking camera 127 are shown as double-lens cameras, suitable single-lens cameras may also be employed.
A loading area 104 is mounted on top of the chassis 102. The loading area 104 may take the form of a tray table onto which produce trays can be placed. Weight sensors (also referred to herein as load cells) are integrated into the loading area 104 to enable the AMR 100 to weigh produce as it is loaded.
The chassis 102 supports a user interface 112 at the front of the AMR 100. The user interface comprises a display screen, which may be a touch sensitive screen. The user interface 112 also comprises an identification unit (not shown) which may be disposed inside the housing of the user interface 112, integrated as software in the user interface 112, or alternatively provided as a separate unit next to the user interface 112.
The identification unit may be an RFID reader and is configured to determine an identity of a user loading or unloading the AMR 100. The user interface 112 may also comprise a number of physical user inputs (not shown), for example in the form of push buttons. The physical user inputs may be configured to turn the user interface on/off and set various usage modes.
The AMR 100 has a plurality of sensors configured to monitor the surroundings of the autonomous mobile robot. Movement of the autonomous mobile robot is controlled dependent on data received from this plurality of sensors. The plurality of sensors includes the tracking camera 127, at least one ultrasonic sensor 120 (also referred to as a sonar sensor), a front safety bumper 122a and a rear safety bumper 122b. The AMR 100 is configured to stop if either of the safety bumpers is contacted while the AMR 100 is moving.
Figure 1c shows a close-up of one side of the AMR 100, in which two ultrasonic sensors 120 can be seen. The two ultrasonic sensors 120 are angled forward and backward at approximately a 45 degree angle, so as to cover the area to the side of the AMR which is not in the field of view of the object detection cameras 126. The ultrasonic sensors 120 provide proximity alerts during turning. In addition to the ultrasonic sensors 120 shown on the side of the chassis, further ultrasonic sensors may be provided on the front and rear of the AMR 100.
The AMR 100 is powered by two 24V Lithium-Ion batteries, housed near the front and rear of the chassis for even weight distribution. Each battery has its own side hatch 114 to facilitate easy battery insertion and removal. The chassis 102 also supports a floodlight 116. A similar light may be provided on the rear of the AMR 100. The side panels of the chassis have LED strips 118. The chassis 102 also supports a number of emergency stop buttons 124. These are relatively large and prominent and provided on the front, rear and both sides of the AMR.
The chassis 102 also supports an antenna 128 and a radar 130. The antenna 128 may house separate WiFi, 4G and long-range radio (LoRa) antennae.
The AMR 100 has four wheels 108 in the embodiment of Figure 1. However, the AMR 100 may be provided with any suitable number of wheels, for example six or more.
Alternatively, the AMR 100 may be provided with any suitable number of continuous tracks, such as caterpillar treads.
Figure 1d shows the AMR 100 from the rear. The rear safety bumper 122b and rear object detection camera 126b can be seen in mirrored positions to those on the front of the AMR. The AMR 100 also has an additional user interface bracket 132 to allow a further user interface to be attached. This user interface may be identical to the first user interface 112, allowing a user to interact with the AMR 100 from either end. This may be particularly useful where the AMR 100 is being used in confined spaces, such as between rows of plants.
Figure 2 is an exploded view illustrating details of the loading area 104 and weight sensors. The loading area 104 comprises three platforms 200, each configured so as to receive a tray of corresponding size. Each of the platforms 200 has a respective weighing arrangement comprising a central load cell 202 and four corner pins 204, each comprising one or more springs. The central load cell 202 performs the primary measurement of the weight loaded onto the platform 200. The platform 200 may not be evenly loaded, and the corner weight pins 204 have springs which prevent the platform from bending under an uneven load and ensure that the weight measured by the central load cell 202 is accurate.
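The weighing behaviour described above, i.e. measuring the weight added to a platform and crediting it to the identified user, can be sketched as follows. The function name, the ledger structure and the use of a before/after reading are illustrative assumptions; the specification does not prescribe how load-cell readings are processed.

```python
def record_loading(user_id, before_kg, after_kg, ledger):
    """Attribute the weight change on a platform's central load cell to a user.

    before_kg / after_kg are load-cell readings taken before and after the
    identified user places produce on the platform. Positive deltas are
    credited to that user's running total in the ledger; negative deltas
    (unloading) are reported but not credited.
    """
    delta = after_kg - before_kg
    if delta > 0:  # produce was added to the platform
        ledger[user_id] = ledger.get(user_id, 0.0) + delta
    return delta
```

In the described system, the running per-user totals would then be transmitted to the server via the wireless communication unit.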
Referring now to Figure 3, a schematic of the electronics 300 of the AMR 100 is shown. Some of the electronic components are housed in the computation box 110, while others are supported at various positions on the chassis 102 of the AMR 100. Figure 3 is highly schematic and intended to illustrate only the functional connections between elements.
The electronics 300 comprise a processor arrangement 302, a memory 304, a wireless transceiver 306, user inputs 308, a display 310, a speaker 312 and optionally a microphone 314. The processor arrangement 302 is configured to control operation of the other components of the AMR 100. The processor arrangement 302 may for instance be a microprocessor, a Digital Signal Processor (DSP), Application Specific Integrated Circuit (ASIC), Field Programmable Gate Array (FPGA) or the like. Alternatively, the processor arrangement 302 may comprise specialised processing hardware, for instance a RISC processor or programmable hardware with embedded firmware. The processor arrangement 302 may comprise multiple processors.
The display 310 may be a standard LCD display or a touch sensitive display based on capacitive or resistive sensing technology. The speaker 312 may provide audible warnings and instructions during operation of the AMR 100. It may also be possible for a remote supervisor to send voice messages through the speaker 312 to nearby users.
The optional microphone may be used to receive voice commands from a user, or to facilitate communication between a user and a supervisor of the system. The user inputs 308 may comprise a number of physical user inputs, for example in the form of push buttons. The user inputs 308, display 310, speaker 312 and microphone 314 may together form the user interface 112 of the AMR 100 shown in Figure 1. The physical user inputs 308 may be configured to turn the user interface 112 on/off and set various usage modes.
The memory 304 may comprise both a program memory storing program code (e.g. software or firmware) and main memory storing data. The processor arrangement 302 is configured to execute the program code stored in the program memory and to read, write and delete data from the main memory. The program memory may for instance be a non-volatile memory, such as a Read-Only Memory (ROM), a Flash memory or a magnetic drive memory. The main memory may for instance be a Random Access Memory (RAM), for example Static RAM (SRAM) or Dynamic RAM (DRAM), or it may comprise Flash memory, such as an SD-Card.
The wireless transceiver 306 comprises the network interfaces necessary to communicate over 3G, 4G, LoRa, WiFi or any combination of these protocols and also includes the antenna 128.
The electronics 300 comprise an ID unit 316. The ID unit 316 is configured to read information from a token, card or device in proximity to the ID unit. In some embodiments, the ID unit 316 is an RFID reader, for example employing NFC. Alternatively, the ID unit 316 may use Bluetooth. Alternatively, the ID unit 316 may be integrated into the software of the user interface 112 and may require that the user enter a PIN or alphanumeric code to identify themselves. The processor arrangement 302 uses the ID unit 316 to identify the user interacting with the AMR 100 and in particular to associate the weight of produce loaded onto the AMR with the identified user.
The electronics 300 comprise a GNSS receiver 326, an inertial measurement unit 328 and one or more tracking cameras 127. Using signals from some or all of these components, the processor arrangement 302 is able to localise the robot within a global reference frame, i.e. it can determine where it is anywhere in the world (provided there is sufficient GNSS coverage). The GNSS receiver 326 is able to communicate with all available GNSS arrays (Galileo, GPS, BeiDou, QZSS, GLONASS, etc.). The GNSS receiver 326 may be an RTK GNSS receiver. Although the GNSS receiver 326 and wireless transceiver 306 are shown as separate elements, it is possible for a single transceiver to provide the functions of both of these units. In some embodiments, the tracking camera 127 has its own on-board VPU which does all of its own processing. The tracking camera 127 therefore directly outputs various telemetry estimates, e.g. linear and rotational velocities, which are used in a localisation model to determine the AMR's position and orientation. The inertial measurement unit 328 (IMU) may comprise any suitable combination of accelerometers, gyroscopes and/or magnetometers. Magnetometer signals and the rotational velocity measured by the IMU 328 are combined with the data from the tracking camera 127 to provide a heading and orientation, while the linear acceleration signals from the IMU 328 are combined with the location data from the GNSS receiver 326 and tracking camera data to provide a 3D location.
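The combination of GNSS fixes with camera/IMU odometry described above can be illustrated with a minimal complementary-filter sketch. The blending weight, the 2D simplification and the function name are assumptions for illustration only; the patent does not disclose the actual fusion algorithm.

```python
def fuse_position(gnss_xy, odom_xy, gnss_weight=0.98):
    """Blend a GNSS fix with a position estimate from camera/IMU odometry.

    A simple complementary filter: the GNSS fix dominates (assumed weight
    0.98) while the locally smooth odometry estimate damps GNSS jitter.
    Coordinates are (x, y) in a local metric frame.
    """
    return tuple(
        gnss_weight * g + (1.0 - gnss_weight) * o
        for g, o in zip(gnss_xy, odom_xy)
    )
```

A production system would more likely use a Kalman-filter-style estimator over position, velocity and heading, but the principle of weighting an absolute fix against smooth relative odometry is the same.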
The electronics 300 comprise a plurality of weight sensors 320 configured to determine the weight of produce loaded onto the loading area. The AMR 100 may comprise several different loading plates within the loading area, and each plate may have respective weight sensors 320. Each set of weight sensors 320 may comprise multiple sensors working in combination. As the produce is weighed by the weight sensors 320, it is associated with the identified user.
The electronics 300 comprise one or more object detection cameras 126. These may comprise the front object detection camera 126a and rear object detection camera 126b previously described. The object detection cameras 126 may be RGBD cameras capable of perceiving depth. The data from the object detection cameras 126 is provided to the processor arrangement 302, which uses it to determine whether there are obstacles in the AMR's path. These obstacles may include people, empty and full trays, trees, and vehicles. Software stored in the memory 304 and executed by the processor arrangement 302 is used to determine the type of obstacle and devise a route around it. The electronics 300 comprise a plurality of ultrasonic sensors 120. These are provided on the sides of the AMR 100 so as to cover the blind spots of the object detection cameras 126. Further ultrasonic sensors may also be located on the front and/or rear of the AMR 100. The electronics 300 comprise emergency stops 322. These may include the plurality of emergency stop buttons 124 and the safety bumpers 122 previously described.
The electronics 300 comprise a number of motor controllers 324 configured to control the application of power to the wheels 108 and thereby to control the movement of the AMR 100. Finally, the electronics 300 comprise a safety unit 318, which may be a programmable logic controller. The safety unit 318 monitors the conditions of the emergency stops 322, which include at least the front safety bumper 122a, rear safety bumper 122b and emergency stop buttons 124. The safety unit 318 may also receive stop commands directly from the ultrasonic sensors 120. If the safety unit 318 receives a stop signal from any of the emergency stops or the ultrasonic sensors, then it sends a stop command to the motor controllers 324. This stop command countermands any signal sent to the motor controllers 324 by the processor 302.
The safety unit 318 also monitors the speed and acceleration of the motors via the motor controllers 324, as well as monitoring the temperature and current in the motor controllers 324. If any of these parameters exceeds a predetermined threshold set in the safety unit 318, then a safety condition is triggered and the safety unit 318 sends a stop command to the motor controllers 324. Again, this stop command countermands any signal sent to the motor controllers 324 by the processor 302. Only once the safety unit 318 has cleared the safety condition will the motor controllers 324 respond to signals from the processor 302. The safety unit 318 can inform the processor 302 whenever a stop command is received or when one of a plurality of safety conditions is triggered or cleared.
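The countermanding behaviour of the safety unit might be modelled as follows. This is a sketch under our own assumptions; the class, method and signal names are illustrative and the real safety unit is a programmable logic controller, not Python.

```python
class SafetyUnit:
    """Tracks active safety conditions and countermands processor commands."""

    def __init__(self):
        self.active_conditions = set()

    def report(self, source, triggered):
        # Record which emergency stops or threshold breaches are active;
        # a condition is cleared by reporting it with triggered=False.
        if triggered:
            self.active_conditions.add(source)
        else:
            self.active_conditions.discard(source)

    def motor_command(self, processor_command):
        # Any active condition countermands the processor's command:
        # the motor controllers receive a stop (0) until all are cleared.
        if self.active_conditions:
            return 0
        return processor_command
```

The key property is that the processor cannot override the stop: motor output only follows the processor again once every condition has been cleared.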
The processor arrangement 302 is connected to and controls the operation of each of the other components of the electronics 300, with the exception of the safety unit 318 and emergency stops 322 as described above. The memory 304 stores software which is executed by the processor arrangement 302 to control the operations of the AMR 100. Most of the AMR's control system is written in C++ and Python using the Robot Operating System (ROS) framework. The memory 304 also stores a map of the operational area, the map indicating permitted pathways and known obstacles. For example, the positions and dimensions of pathways between rows of plants may be part of the map. The locations of permanent obstructions such as the edges of polytunnels and poles may also be marked on the map. The map may be preprogrammed into the memory 304. The AMR 100 may also update the map as it operates whenever an obstacle is detected by the object detection cameras 126. An operator of the system may then confirm whether the obstacle is a permanent obstacle to be added to the map or not. Dynamic objects such as cars, people, trolleys and trays are not updated on the map, but instead are detected by the front object detection camera 126a and the rear object detection camera 126b. The memory stores routing software to allow the processor to determine a route from its present location to a target location using the map. As such, when given a target location, the processor arrangement 302 is able to plan the shortest route and update the path accordingly as new map information becomes available. The processor arrangement 302 then sends commands to the motor controllers 324 to control movement of the AMR.
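Shortest-route planning over a map of permitted pathways and known obstacles can be sketched with a breadth-first search over a grid. The application does not disclose the actual planner; the grid representation and function below are our own illustrative assumptions.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Shortest path on a grid map by breadth-first search.

    grid: 2D list, 0 = permitted pathway, 1 = known obstacle.
    Returns a list of (row, col) cells from start to goal, or None if
    no route exists (e.g. the target is walled off by obstacles).
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:          # walk back to the start
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

Because newly confirmed obstacles are simply marked in the grid, re-running the same search after a map update yields the updated route, matching the re-planning behaviour described above.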
The processor arrangement 302 uses the wireless transceiver 306 to communicate information to a remote device or control system. The processor arrangement 302 sends the telemetry data for the AMR 100, for example the AMR's battery levels, current produce load and location. The processor arrangement 302 may also send a status update whenever a stop command or a safety condition is notified to it by the safety unit 318. The processor arrangement 302 also uses the wireless transceiver 306 to receive job commands. The job commands may indicate the target location for pickup or for drop off. The job command may also indicate the identity of the user requesting pickup. The processor arrangement 302 also uses the wireless transceiver 306 to transmit the identity of the user and the weight of produce loaded onto the loading area by the identified user to a remote device or control system.
The skilled person will appreciate that certain elements, such as power sources, serial buses and other hardware, are omitted from Figure 3 for clarity. Furthermore, the skilled person will appreciate that not all of the elements shown in Figure 3 are essential for implementing the invention and that the functions of the various components are mostly independent of one another.

Usage
The AMR 100 is intended to be used to automate the logistics transportation of items, in particular where the start and end points of the transportation vary. The AMR 100 is adapted for use in outdoor environments, with rough terrain, but can equally be used in indoor settings, for example in warehouses or glasshouses. The AMR 100 finds particular use in agricultural settings in which picked produce needs to be collected from the picking point and transported to a collection point, for further processing or transportation. The AMR 100 can also be used to transport seeds, seedlings or plants between growing and storage areas.
The rover's dimensions are set such that it fits within the 1m to 2m polytunnel row configuration commonly found on farms and it is capable of carrying a load of 200kg on its table. In some embodiments the AMR 100 has a length of approximately 1.65 meters, a width of approximately 75 cm, and a height (to the loading area) of approximately 60 cm. The unladen weight of the AMR 100 is approximately 90kg and it can carry a payload of up to 200kg.
In a farm environment, a user (generally a fruit or vegetable picker) interacts with the AMR 100. The user is equipped with an ID token, card or device, for example an RFID card and uses this to log in to the AMR 100 using the identification unit 316 of the AMR. Alternatively, the user may enter an identification number into the user interface 112, for example using the touch screen display 310. After identifying the user, the AMR 100 can then associate the weight of produce loaded onto the loading area 104 with that individual. After logging into the AMR 100, the user interface 112 may communicate information to the user. For example, the display 310 may display a photograph of the user. The user interface 112 may also request confirmation of the user's identity which may be provided for example via the touch sensitive display 310 or via a user input 308.
The user interface may comprise a number of LEDs to communicate the status of the AMR, e.g. to indicate when the user has successfully logged in, to indicate that the produce has been successfully weighed and to indicate that the user has successfully logged out. This information may also be displayed via the LCD screen and/or via the speakers. The user interface may request that the user take action to log out after loading produce onto the loading area 104 or the user may automatically be logged out a predetermined time after loading has commenced. The AMR may proceed to its next waypoint or collection point when the current user indicates that loading is complete, or after the predetermined time. After successful loading of produce onto the loading area 104, the AMR 100 communicates the identity of the user and the weight of produce loaded via the wireless transceiver 306 to a remote device or controller.
During operation, the AMR 100 determines its location. The primary means for doing this is the GNSS receiver 326. The inertial measurement unit 328 and tracking cameras 127 may also contribute to determining the AMR's location. The AMR conducts a location determination using the GNSS receiver 326 at predetermined intervals. In between these intervals the inertial measurement unit 328 may be used to determine movements of the AMR. The AMR 100 then communicates its location to the remote device or controller using the wireless transceiver 306, for example using cellular data (e.g. 4G), WiFi or LoRa depending on the location and circumstances in which it is working. The AMR also periodically sends other telemetry data, such as the battery level and total load carried on each of the platforms 200 in the loading area 104. The AMR may also transmit, continuously or on demand, a camera feed from the object detection cameras 126 and/or tracking camera 127.
The AMR receives pickup job commands from the remote device via the wireless transceiver 306. In response to receiving such a command the AMR uses the map information stored in its memory to plan the quickest route to the target location of the job command. While moving towards the target location, the object detection cameras 126 are used to look for potential obstacles. If an obstacle is detected, the software stored in the memory determines the type of obstruction. If the software determines that the obstruction is likely to move within a predetermined timeframe (for example if the obstruction is a person), then the AMR may use the object detection cameras 126 to re-assess for obstructions until the predetermined timeframe has elapsed. After this, or if an obstruction is identified as one which will not move, then the obstruction is added to the stored map information and the route planning is re-run to provide an alternative route to the target location which avoids the obstacle. If no route to the target location can be found as a result of the obstacle, then the AMR may communicate that no route is possible to the controller, which can then implement a recovery behaviour. This recovery behaviour may comprise the AMR being sent to a previous known location close to the obstruction or the AMR sending a condition or alarm status to the remote device or controller to alert the operator for manual operation of the AMR.
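The obstruction-handling decision flow above can be sketched as follows. The function and the callback names are our own illustrative assumptions; re-checks are counted rather than timed to keep the sketch deterministic.

```python
def handle_obstacle(is_dynamic, recheck, add_to_map, replan, max_rechecks=3):
    """Decide what to do when an obstacle blocks the planned route.

    is_dynamic: True if the classifier expects the obstacle to move
                (e.g. a person or vehicle).
    recheck:    callable that re-runs detection; returns True while the
                obstacle is still present.
    add_to_map: callable that records the obstacle on the stored map.
    replan:     callable that re-runs route planning; returns True if an
                alternative route to the target was found.
    """
    if is_dynamic:
        # Wait out a dynamic obstacle for a bounded number of re-checks.
        for _ in range(max_rechecks):
            if not recheck():
                return "resume"          # obstacle moved away on its own
    # Static obstacle, or the dynamic one never moved: map it and re-plan.
    add_to_map()
    if replan():
        return "reroute"
    return "no_route"                    # controller triggers recovery behaviour
```

The `"no_route"` outcome corresponds to the AMR reporting to the controller that no route is possible, after which the recovery behaviour described above takes over.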
Various ancillaries may be used with or attached to the AMR too to increase its utility in various circumstances. Figure 4 shows an embodiment of the AMR wo in which three refrigeration units 400 are disposed on the loading area 104. Each refrigeration tc) unit 400 may comprise a number of shelves 402 for loading smaller trays of produce on. The refrigeration units 400 may be powered by the AMR loo power supply or may have separate power supplies. The freshness of the picked produce can be improved if it is refrigerated more quickly after picking. The AMR shown in Figure 4 provides portable refrigeration capacity which would be difficult to achieve by other means, particularly where the space between rows of plants is limited. The AMR is programmed to zero the central load cell 202 after the refrigeration units 400 have been placed onto the loading area 104 such that the weight of produce loaded by an individual user can still be accurately measured.
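The zeroing (taring) of the load cell after an ancillary is placed can be sketched as below; the class and method names are our own illustrative assumptions.

```python
class LoadCell:
    """Minimal tare model: reported weight = raw reading minus tare offset."""

    def __init__(self):
        self._raw = 0.0
        self._tare = 0.0

    def set_raw(self, value):
        # In the real system this would come from the sensor hardware.
        self._raw = value

    def zero(self):
        # Called after ancillaries (e.g. refrigeration units 400) are
        # placed, so only subsequently loaded produce is reported.
        self._tare = self._raw

    def weight(self):
        return self._raw - self._tare
```

With this scheme the weight of the refrigeration units themselves never appears in the produce weight attributed to a user.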
Broader System

The AMR 100 operates as part of a larger automated produce transportation system 900 as shown in Figure 5. The system 900 comprises the autonomous mobile robot 100 as previously described, a plurality of geolocation devices 902, a hub 904, a task management server 906, a web application user interface 908 and a database 910.
Although only a single AMR 100 is shown, the system 900 may comprise a plurality of AMRs. In a situation in which there is only a single AMR 100, the task management server 906 may be on board the AMR 100. Alternatively, one of a fleet of AMRs may act as controller for the fleet and carry the task management server 906. However, in most embodiments, the task management server 906 is provided separately as a cloud based instance. The AMRs are in constant communication with the server 906, updating it with each AMR's telemetry data (e.g. battery levels, current produce load, location, etc.). The server 906 therefore has up-to-date knowledge of each AMR's status.
Users (pickers) need to indicate to AMRs 100 that they have full trays and need to be serviced. However, the AMRs 100 will often be outside of a picker's visual or traditional remote control range and the user will not be able to call one over via those channels. To address this, each user is provided with a geolocation device 902 which is able to determine the user's geolocation and contact the server 906 via WiFi, 3G, 4G or LoRa and request a pickup at their location upon a button press on the geolocation device 902.
The server 906 may be a remotely located web server or a server located on site.
When a user pickup request is received, the server 906 evaluates the viability of the request itself and then determines which of the available AMRs is most suitable for the task based on all of the AMRs' availability status (e.g. fully-loaded AMRs are unavailable) and other factors, such as an AMR's distance to the request. The server may store a map of an operational area of the plurality of AMRs, the map indicating permitted pathways and known obstacles. Selecting the most suitable AMR can be based on an actual travel distance between each of the AMRs and the location of the geolocation device 902 making the pickup request. Alternatively the map may be stored in the database 910 to which the server 906 has access.
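The server's selection of the most suitable AMR can be sketched as follows. This is a sketch under our own assumptions: the dictionary fields, the 200kg cut-off (taken from the stated payload) and the injected distance function are illustrative, not disclosed server internals.

```python
def select_amr(amrs, pickup_location, travel_distance, max_load=200.0):
    """Pick the most suitable available AMR for a pickup request.

    amrs:            list of dicts with 'id', 'location' and 'load' (kg).
    pickup_location: location of the requesting geolocation device.
    travel_distance: callable giving the actual travel distance over the
                     map between two locations (any metric can be supplied).
    Fully loaded AMRs are treated as unavailable; returns None if no AMR
    is available.
    """
    available = [a for a in amrs if a["load"] < max_load]
    if not available:
        return None
    # Choose the AMR with the shortest actual travel distance to the request.
    return min(available,
               key=lambda a: travel_distance(a["location"], pickup_location))
```

Injecting `travel_distance` reflects the point made above: selection may use actual travel distance over the mapped pathways rather than straight-line distance.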
The pickup task is then sent by the server 906 to the selected AMR 100, which then commits to moving to the pickup location and completing the pickup task. Job cancellations are handled in a similar manner, informing the dispatched AMR that it no longer needs to complete the task and can return to an idle state.
The server 906 also implements a job queueing system to store and send out jobs to the AMRs as they become available. In some embodiments, each AMR 100 is only aware of one pickup task at a time and only has one target destination provided to it at a time.
The target destination is the live location of the geolocation device 902 which the AMR has been commanded to move to. Therefore, if the geolocation device should be moved while the AMR is moving, the AMR will still move accurately to the target destination. In some circumstances, if the geolocation device should be moved while the AMR is moving to its location, then the AMR may dynamically re-assess its planned route and change the route if necessary (e.g. if a more efficient path to the new location is possible). After completing a pickup or drop off task, the AMR 100 informs the server 906. A pickup or drop off task assigned to an AMR can be cancelled or interrupted by the server 906 and the AMR assigned to a new job if a better suited one becomes available. Similarly, with a fleet of multiple AMRs 100, if a better suited AMR 100 becomes available, then a pickup task may be reassigned to this better suited AMR.
Also included in the system 900 is a hub device 904. The hub device 904 may be similar to the geolocation devices 902. The hub device 904 also determines its location and communicates this to the server 906. The hub device 904 acts as a collection point for the produce collected by the AMR 100. The hub 904 may issue a return command to the AMR 100 via the server 906 which causes the AMR to return to the hub location. The AMR 100 may also autonomously return to the hub location when the total load carried by the AMR, as determined by the weight sensors, exceeds a predetermined threshold.
The system also includes a web application user interface 908. Produce data collected by the server 906, including the weight of produce loaded onto an AMR and a user ID associated with that produce, is displayed to the system operator via this user interface 908. Additional robot data (location, time of day, battery charge, etc.) are also presented to the operator. This information provides the operator with information about the overall status of the picking operation (e.g. daily yields per field, output per picker, etc.), as well as insights into the AMR fleet's status.
In some embodiments the web application user interface 908 has six pages, which may be accessible through respective tabs. These are the home page, the operations page, the telemetry page, the staff page, the report page, and the settings page. In addition to these pages the web application 908 keeps a side bar visible at all times that shows the identity of the current user, the status of AMRs connected and weight of produce picked in the month thus far. Additionally, the current date and time is always visible at the top of the page.
The home page shows key performance indicators. This page is organised in three sections: a collection of summative cards, weekly weights for each type of produce, and a map showing the location of all connected AMRs 100, geolocation devices 902 and hubs 904. The operations page is primarily for data input. This page comprises three tabs that handle the creation, update, and deletion of three categories of data. These categories are: Produce & Fields, Picker teams, and Produce weight data.
The telemetry page's primary focus is data visualisation. It presents data regarding the AMRs, users/pickers, and coordinators/supervisors in a visual format. These three categories are also organised in tabs. The staff page is used for user and RFID management. It is organised in two tabs: the picker tab and the coordinator tab. Each tab contains the list of staff that corresponds with the tab and upon clicking on the icon next to a name, the operator can see detailed information about the staff member selected. The picker tab slightly differs from the coordinator tab. This is because it is only on the picker tab that the user can select whether pickers work in teams or individually.
Operators of the web application 908 can create reports on the report page. The report creation is customisable and can be exported in CSV and PDF formats. Reports can be created choosing the start and end date of their report as well as selecting what headers to include in their report. The settings page allows the web application operators to control parameters for both the AMR and the web application itself.
The task management server 906 is configured to save data in various MySQL database tables in the database 910. The task management server 906 sends data to the web application UI 908 when requested. Most of the pages of the web application user interface 908 described above get data directly from the database. The task management server 906 performs SQL queries to insert, select, update, and delete the data on those pages. Certain pages like the home page and the telemetry page get some of their data directly from connected devices. Data includes GPS location, camera feed and battery information. The first tab on the telemetry page for example shows the camera feed, GPS position and battery usage of a selected AMR 100. When an operator selects an AMR, the web application 908 sends a signal to the task management server 906, and the task management server 906 in turn sends a signal to the AMR.
The processor arrangement 302 on the AMR 100 constantly receives data from the object detection cameras 126 and tracking camera 127, the GNSS receiver 326 and battery information. This information is stored temporarily in the memory 304 and updated as soon as new information is published by the AMR. As soon as the processor arrangement 302 receives a signal from the server 906, it sends the latest information it has via the wireless transceiver 306. While the AMR is selected in the web application user interface 908, it continues to send the latest information so that the camera feed, GNSS location, and battery data reflects any changes in near real time.
When the task management server 906 receives the camera, GNSS, and battery data from the AMR, it makes a temporary list and saves the data received and some additional information. The additional information includes the ID of which AMR sent the data and the time the data was received. This additional information is used by the task management server 906 to send the correct information to the web application user interface 908 and delete data that has been in memory for too long. The web application user interface 908 in turn receives the requested data from the task management server 906 and shows it using various components as appropriate.
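The server's temporary telemetry store, keyed by AMR ID and timestamped so stale data can be deleted, might be sketched as below. The class, the field layout and the 60-second expiry are our own illustrative assumptions.

```python
class TelemetryCache:
    """Temporary per-AMR telemetry store with age-based expiry."""

    def __init__(self, max_age=60.0):
        self.max_age = max_age
        self._entries = {}             # amr_id -> (timestamp, data)

    def store(self, amr_id, data, now):
        # Save the data together with which AMR sent it and when.
        self._entries[amr_id] = (now, data)

    def get(self, amr_id, now):
        # Entries that have been in memory for too long are dropped
        # rather than served to the web application.
        entry = self._entries.get(amr_id)
        if entry is None or now - entry[0] > self.max_age:
            self._entries.pop(amr_id, None)
            return None
        return entry[1]
```

Passing the clock in as `now` keeps the sketch deterministic; a real implementation would read the system time.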
On the home page the produce weight and location data come from other devices. The weight is recorded from the AMR 100 and securely stored in the database 910 by the task management server 906. Operators of the web application user interface 908 can also add to this weight manually on the operations page. The location of the AMRs 100 and the hubs 904 are periodically sent by each device. The geolocation devices 902 send their GNSS positions to the task management server 906 when a user presses a button on the device, as described below. All of these locations are stored in the task management server 906 and sent to the web application user interface 908 and displayed on a map for the operators to view.
The system 900 also provides functionality to assist users of the AMRs 100 who do not speak the local language or who have low literacy. To this end the AMR memory 304 may store various written or spoken commands in a variety of different languages. Once the AMR 100 reaches the target location, i.e. the location of the geolocation device 902 which has made the pickup request, the AMR requests that the user identify themselves. This may be done for example using a recognisable sound emitted by the speaker 312, a graphic displayed on the display 310 or by illuminating one or more of the LED strips 118. Once the users have identified themselves by presenting their RFID card/token or by inputting their pin code into the user interface 112, the user's ID data is sent by the AMR 100 to the task management server 906. The task management server 906 checks the ID data against data held in the database 910 which identifies the basic information of the user, such as language, ability with a local language (e.g. English), and whether the preferred language of the user is different to the local language. If so, the task management server 906 instructs the AMR 100 to switch to displaying labels, commands and responses on the display 310 in the preferred language of the user. If the information held on the user indicates a low level of literacy, the task management server 906 instructs the AMR 100 to provide basic spoken instructions via the speaker 312. In some embodiments, the spoken instructions are always provided in addition to the displayed text. After the user confirms that they have finished loading the AMR 100, or after a predetermined time from the user logging in, the AMR defaults back to the local language, ready for the next task.
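The per-user language and literacy handling can be reduced to a small lookup. The field names (`preferred_language`, `low_literacy`) and the function itself are our own hypothetical stand-ins for whatever the database 910 actually stores.

```python
def session_settings(user_record, local_language="en"):
    """Choose display language and whether to add spoken prompts.

    user_record: dict as might be returned from the user database, with
    optional 'preferred_language' and 'low_literacy' keys (illustrative
    field names). Falls back to the local language when no preference
    is recorded.
    """
    language = user_record.get("preferred_language") or local_language
    return {
        "display_language": language,
        "spoken_prompts": bool(user_record.get("low_literacy")),
    }
```

After the task completes (or the timeout elapses), the AMR would simply revert to `session_settings({})`, i.e. the local language with no spoken prompts.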
Geolocation Devices

Figure 6 is a schematic illustration of the components of a geolocation device 902. The geolocation device 902 comprises a microcontroller 1000, DIP switch 1002, first button 1004, second button 1006, third button 1008 and a plurality of LEDs 1010. The geolocation device 902 also comprises an LCD screen 1012, RFID system 1014, one or more geolocation units and one or more communication units.
The first button 1004 is programmed to send a pickup request to the server 906. The second button 1006 is programmed to cancel the pickup request. The third button 1008 may be programmed for a variety of purposes, for example to perform a soft reset of the geolocation device 902. The LEDs 1010 may be associated with respective buttons (1004, 1006, 1008) to indicate button selection. The LCD screen 1012 is a non-touch LCD screen for displaying basic messages to the user.
The one or more geolocation units comprise at least one of a GNSS RTK receiver 1016 and an ultrasonic beacon 1018. The ultrasonic beacon may be advantageous in areas where there is no satellite visibility. The one or more communication units comprise at least one of a LoRa module 1020, a WiFi module 1022 and a 4G SIM 1024. In some embodiments only one of these communication units may be provided in each geolocation device depending on the circumstances in which it is to be deployed. Due to the rural nature of some agricultural locations, a reliable 4G signal may not be available, but a WiFi network may exist or could be set up.
The geolocation devices 902 may also be referred to as 'smart trolleys' in reference to the trolleys used by pickers. Each geolocation device may comprise a housing (not shown) with one or more hooks to facilitate hanging the geolocation device 902 on the handle of the trolley. Alternatively, the geolocation device 902 may be configured to be worn on the clothing of the user (picker).
The hub devices 904 may be physically identical to the geolocation devices 902. The devices have a DIP switch 1002 to alternate between hub and trolley modes and adjust the behaviour accordingly. This switch is generally unavailable to the users (pickers).
List of Reference Numerals
autonomous mobile robot 100 (AMR)
chassis 102
loading area 104
suspension 106
wheels 108
computation box 110
user interface 112
side hatch 114
floodlight 116
LED strip 118
ultrasonic sensor 120
safety bumpers 122
front safety bumper 122a
rear safety bumper 122b
emergency stop button 124
object detection cameras 126
front object detection camera 126a
rear object detection camera 126b
tracking camera 127
antenna 128
radar 130
additional user interface bracket 132
platforms 200
central load cell 202
corner pins 204
electronics 300
processor arrangement 302
memory 304
wireless transceiver 306
user inputs 308
display 310
speaker 312
microphone 314
ID unit 316
tracking camera 127
weight sensors 320
object detection cameras 126
motor controllers 324
GNSS receiver 326
inertial measurement unit 328
safety unit 318
emergency stops 322
refrigeration units 400
shelves 402
system 900
geolocation devices 902
hub 904
task management server 906
web application user interface 908
database 910
microcontroller 1000
DIP switch 1002
first button 1004
second button 1006
third button 1008
LEDs 1010
LCD screen 1012
RFID system 1014
GNSS RTK receiver 1016
ultrasonic beacon 1018
LoRa module 1020
WiFi module 1022
4G SIM 1024

Claims (27)

  1. 1. An autonomous mobile robot comprising: a processor arrangement configured to control operation of the autonomous mobile robot; a plurality of sensors configured to monitor the surroundings of the autonomous mobile robot; an identification unit configured to determine an identity of a user loading or unloading the autonomous mobile robot; and a loading area, the loading area comprising one or more weight sensors configured to determine the weight of produce loaded onto the loading area by the identified user, wherein the processor arrangement is configured to control movement of the autonomous mobile robot dependent on data received from the plurality of sensors.
  2. 2. An autonomous mobile robot according to claim 1, further comprising a wireless communication unit, wherein the processor arrangement is configured to control the wireless communication unit to transmit data comprising at least the identity of the user and the weight of produce loaded onto the loading area by the identified user to a remote device.
  3. 3. An autonomous mobile robot according to claim 2, wherein the processor arrangement is further configured to control the wireless communication unit to transmit data comprising one or more of a current location of the autonomous mobile robot, a total load carried on the loading area, or a battery charge level.
  4. 4. An autonomous mobile robot according to any of claims 1 to 3, further comprising a GNSS receiver and wherein the processor arrangement is configured to: determine a current location of the autonomous mobile robot based on signals received via the GNSS receiver.
  5. 5. An autonomous mobile robot according to any of claims 1 to 4, further comprising an inertial measurement unit and wherein the processor arrangement is configured to: determine a current location of the autonomous mobile robot based on measurements made by the inertial measurement unit.
  6. 6. An autonomous mobile robot according to any of claims 1 to 5, further comprising one or more tracking cameras and wherein the processor arrangement is configured to: determine a current location of the autonomous mobile robot based on measurements made by the one or more tracking cameras.
  7. 7. An autonomous mobile robot according to any of claims 4 to 6, wherein the autonomous mobile robot further comprises a memory storing a map of an operational area, the map indicating permitted pathways and known obstacles.
  8. 8. An autonomous mobile robot according to claim 7, wherein the processor arrangement is configured to plan a route from the current location of the autonomous mobile robot to a target location using the map.
  9. 9. An autonomous mobile robot according to claim 7 or claim 8, wherein the plurality of sensors comprise at least one forward facing camera and at least one rear facing camera.
  10. 10. An autonomous mobile robot according to claim 9, wherein the processor arrangement is configured to: use data received from the at least one forward facing camera and/or the at least one rear facing camera to detect obstructions in the surroundings of the autonomous mobile robot; and when an obstruction is detected, use the map of the operational area stored in the memory to plan a route to the target location which avoids the obstacle.
  11. 11. An autonomous mobile robot according to any of claims 1 to 10, wherein the plurality of sensors comprise one or more sonar sensors.
  12. 12. An autonomous mobile robot according to any of claims 1 to 11, wherein the plurality of sensors comprise one or more safety bumpers and wherein the autonomous mobile robot is configured to stop moving when one of the one or more safety bumpers is activated.
  13. 13. An autonomous mobile robot according to any of claims 1 to 11, further comprising a user interface and wherein the identification unit is disposed on or within the user interface.
  14. 14. An autonomous mobile robot according to claim 13, wherein the user interface comprises at least one of a touch sensitive display screen, a speaker, a microphone and one or more user inputs.
  15. 15. An autonomous mobile robot according to any of claims 1 to 14, wherein the autonomous mobile robot has a chassis and wherein the loading area is mounted on top of the chassis.
  16. 16. An automated produce transportation system comprising: the autonomous mobile robot of any of claims 1 to 15; a plurality of geolocation devices, wherein each of the plurality of geolocation devices comprises a transceiver and is configured to determine its location; and a server configured to receive data from the plurality of geolocation devices and exchange data with the autonomous mobile robot.
  17. 17. An automated produce transportation system according to claim 16, wherein each of the geolocation devices comprises an RFID system storing identification data.
  18. 18. An automated produce transportation system according to claim 16 or claim 17, wherein each of the geolocation devices comprises a first user input button, wherein activation of the first user input button causes the geolocation device to send a pickup request to the server, the pickup request including the location of the geolocation device.
  19. 19. An automated produce transportation system according to claim 18, wherein each of the geolocation devices comprises a second user input button, wherein activation of the second user input button causes the geolocation device to send a message cancelling the pickup request to the server.
  20. 20. An automated produce transportation system according to any of claims 18 or 19, wherein the server is configured, in response to receiving a pickup request from a geolocation device, to send the location of the geolocation device to the autonomous mobile robot with instructions to move to the location.
21. An automated produce transportation system according to any of claims 18 or 19, wherein the system comprises a plurality of autonomous mobile robots each according to any of claims 1 to 15 and wherein the server is configured:
    in response to receiving a pickup request from a geolocation device, to select one of the plurality of autonomous mobile robots based on a distance between each of the plurality of autonomous mobile robots and the location of the geolocation device; and
    to send the location of the geolocation device to the selected autonomous mobile robot with instructions to move to the location.
22. An automated produce transportation system according to claim 21, wherein the server is configured to:
    access a map of an operational area of the plurality of autonomous mobile robots, the map indicating permitted pathways and known obstacles; and
    select one of the plurality of autonomous mobile robots based on an actual travel distance between each of the plurality of autonomous mobile robots and the location of the geolocation device.
23. An automated produce transportation system according to claim 21 or claim 22, wherein the server is further configured to select one of the plurality of autonomous mobile robots based on a total load carried by each autonomous mobile robot as determined by the one or more weight sensors.
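Claims 21 and 23 have the server select a robot by its distance to the pickup point and by its sensed load. A minimal sketch of that dispatch rule is given below; straight-line distance stands in for the map-based travel distance of claim 22, and the names and load threshold are illustrative assumptions rather than details from the application:

```python
import math
from dataclasses import dataclass


@dataclass
class Robot:
    robot_id: str
    location: tuple          # (x, y) position reported by the robot
    total_load_kg: float     # load reported by the on-board weight sensors


def select_robot(robots, pickup_location, max_load_kg=200.0):
    """Pick the nearest robot that still has spare load capacity.

    Robots at or above `max_load_kg` (an assumed threshold) are skipped,
    per claim 23; the remainder are ranked by distance, per claim 21.
    """
    def distance(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    candidates = [r for r in robots if r.total_load_kg < max_load_kg]
    if not candidates:
        return None  # every robot is fully loaded
    return min(candidates, key=lambda r: distance(r.location, pickup_location))
```

Replacing `distance` with a shortest-path query over the map of permitted pathways would yield the "actual travel distance" selection of claim 22.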
24. An automated produce transportation system according to any of claims 16 to 23, wherein each of the geolocation devices comprises a GNSS receiver configured to determine the location of the respective geolocation device and/or wherein each of the geolocation devices comprises an ultrasonic beacon configured to exchange data with a network of stationary beacons in order to determine the location of the respective geolocation device.
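The ultrasonic-beacon alternative of claim 24 implies estimating a position from measured ranges to stationary beacons at known locations. The application does not specify an algorithm; one conventional approach, shown purely as an illustration, is closed-form 2-D trilateration from three beacons, obtained by subtracting pairs of circle equations to get a linear system:

```python
def trilaterate(beacons, distances):
    """Solve for (x, y) given three beacon positions and range measurements.

    Subtracting the third circle equation from the first two linearises the
    problem into a 2x2 system. Assumes the beacons are not collinear and
    the ranges are noise-free; a real system would use a least-squares fit
    over more beacons.
    """
    (x1, y1), (x2, y2), (x3, y3) = beacons
    d1, d2, d3 = distances

    a1, b1 = 2.0 * (x1 - x3), 2.0 * (y1 - y3)
    c1 = d3**2 - d1**2 + x1**2 - x3**2 + y1**2 - y3**2
    a2, b2 = 2.0 * (x2 - x3), 2.0 * (y2 - y3)
    c2 = d3**2 - d2**2 + x2**2 - x3**2 + y2**2 - y3**2

    det = a1 * b2 - a2 * b1  # zero iff the beacons are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```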
25. An automated produce transportation system comprising:
    (i) an autonomous mobile robot comprising:
        a plurality of sensors configured to monitor the surroundings of the autonomous mobile robot;
        a GNSS receiver for determining a location of the autonomous mobile robot;
        a wireless transceiver configured to receive a target destination; and
        a processor arrangement configured to control movement of the autonomous mobile robot to the target destination dependent on data received from the plurality of sensors;
    (ii) at least one geolocation device, wherein the at least one geolocation device comprises a transceiver and is configured to:
        determine its location; and
        use the transceiver to send a request for the autonomous mobile robot to move to its location.
26. An automated produce transportation system according to claim 25, wherein the system further comprises a server, wherein the at least one geolocation device is configured to send the request for the autonomous mobile robot to move to its location to the server and wherein the server is configured to set one of the received locations as the target destination and send the target destination to the autonomous mobile robot.
27. A method of controlling an automated produce transportation system comprising an autonomous mobile robot, a geolocation device and a server, the method comprising:
    the geolocation device determining its location;
    the geolocation device sending to the server a request for the autonomous mobile robot to move to its location;
    the server receiving the request for the autonomous mobile robot to move to the location of the geolocation device;
    the server sending a target destination to the autonomous mobile robot, the target destination comprising the received location of the geolocation device;
    the autonomous mobile robot using a plurality of sensors to monitor the surroundings of the autonomous mobile robot;
    the autonomous mobile robot using a GNSS receiver to determine a location of the autonomous mobile robot;
    the autonomous mobile robot using a wireless transceiver to receive the target destination from the server; and
    the autonomous mobile robot using a processor arrangement to control movement of the autonomous mobile robot to the target destination dependent on data received from the plurality of sensors.
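The method of claim 27 reduces to a device-to-server-to-robot hand-off of a target destination. The control flow can be modelled as a toy simulation; the class names are invented for illustration, and sensing, obstacle avoidance, and path planning are elided:

```python
class RobotSim:
    """Minimal robot stand-in: holds a location and a target destination."""

    def __init__(self, location):
        self.location = location
        self.target = None

    def receive_target(self, target):
        # In the claimed method this arrives over the wireless transceiver.
        self.target = target

    def step(self):
        # Real movement would be gated on data from the sensor suite and
        # the GNSS-derived position; this sketch jumps straight to the target.
        if self.target is not None:
            self.location = self.target


class ServerSim:
    """Server stand-in: turns a pickup request into a target destination."""

    def __init__(self, robot):
        self.robot = robot

    def handle_pickup_request(self, device_location):
        # The location reported by the geolocation device becomes the
        # robot's target destination, as in claims 26 and 27.
        self.robot.receive_target(device_location)
```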
GB2204094.3A 2022-03-23 2022-03-23 An autonomous mobile robot and automated produce transportation system Pending GB2616879A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2204094.3A GB2616879A (en) 2022-03-23 2022-03-23 An autonomous mobile robot and automated produce transportation system
PCT/EP2023/057556 WO2023180481A1 (en) 2022-03-23 2023-03-23 An autonomous mobile robot and automated material transportation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2204094.3A GB2616879A (en) 2022-03-23 2022-03-23 An autonomous mobile robot and automated produce transportation system

Publications (2)

Publication Number Publication Date
GB202204094D0 GB202204094D0 (en) 2022-05-04
GB2616879A true GB2616879A (en) 2023-09-27

Family

ID=81344813

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2204094.3A Pending GB2616879A (en) 2022-03-23 2022-03-23 An autonomous mobile robot and automated produce transportation system

Country Status (2)

Country Link
GB (1) GB2616879A (en)
WO (1) WO2023180481A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170364073A1 (en) * 2016-06-21 2017-12-21 Keith Alan Guy Modular Robotic System
US20190094857A1 (en) * 2017-09-22 2019-03-28 Erik Jertberg Semi-autonomous farm transport vehicle for picked produce
US20190287051A1 (en) * 2016-12-02 2019-09-19 Starship Technologies Oü System and method for securely delivering packages to different delivery recipients with a single vehicle
GB2576800A (en) * 2019-02-06 2020-03-04 Richmond Design And Marketing Ltd Self-propelled baggage dolly, baggage handling system, baggage handling facility, and related apparatus and method
AU2020103104A4 (en) * 2019-10-29 2021-01-07 Cloud Farming Pty Ltd Systems and methods for semi-autonomous produce transport
GB2586217A (en) * 2019-08-01 2021-02-17 Arrival Ltd A system and method for operating an autonomous mobile robot based on payload sensing

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11209887B1 (en) * 2018-03-20 2021-12-28 Amazon Technologies, Inc. Dynamic allocation of power from multiple sources in an autonomous mobile device

Also Published As

Publication number Publication date
GB202204094D0 (en) 2022-05-04
WO2023180481A1 (en) 2023-09-28

Similar Documents

Publication Publication Date Title
US10846656B2 (en) System and method for determining and controlling status and location of an object
US20210009391A1 (en) Recharging apparatus and method
CN108198024B (en) Information processing method and device, electronic device and storage medium
CA2514523C (en) Material handling system and method using autonomous mobile drive units and movable inventory trays
US20180373269A1 (en) Systems and methods using a backup navigational tool for unmanned aerial vehicles delivering merchandise
US11480953B2 (en) Autonomous broadcasting system for self-driving vehicle
US20040010339A1 (en) Material handling system and method using mobile autonomous inventory trays and peer-to-peer communications
US10330480B1 (en) Deployable sensors
EP3627461A1 (en) Information processing method and apparatus, electronic device, and storage medium
GB2535804A (en) Determining a position of an agent
CN104050729A (en) System and Method for Gathering Video Data Related to Operation of an Autonomous Industrial Vehicle
US10322802B1 (en) Deployable sensors
CN210162256U (en) Unmanned aerial vehicle with keep away barrier function
GB2616879A (en) An autonomous mobile robot and automated produce transportation system
US20210247493A1 (en) Non-destructive kit mounting system for driverless industrial vehicles
RU2295218C1 (en) System for informational servicing of agricultural enterprise using precise crop farming process
WO2020248185A1 (en) Autonomous mobile robot with adjustable display screen
US20240051750A1 (en) Mobile storage system handling storage devices on a floor area
RU2787095C1 (en) Robotic all-terrain complex for products storage and disposal
US20240140710A1 (en) Transportation assemblies providing a puzzle based storage system
Komáromi et al. Possibilities of using driverless handling robots in intralogistics