WO2023146451A1 - Improved operation for a robotic work tool system - Google Patents

Improved operation for a robotic work tool system

Info

Publication number
WO2023146451A1
Authority
WO
WIPO (PCT)
Prior art keywords
robotic
area
work tool
processed
image
Application number
PCT/SE2022/050930
Other languages
French (fr)
Inventor
Arvi JONNARTH
Original Assignee
Husqvarna Ab
Application filed by Husqvarna Ab
Publication of WO2023146451A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D34/00 Mowers; Mowing apparatus of harvesters
    • A01D34/006 Control or measuring arrangements
    • A01D34/008 Control or measuring arrangements for automated or remotely controlled operation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 Control system inputs
    • G05D1/24 Arrangements for determining position or orientation
    • G05D1/243 Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/40 Control within particular dimensions
    • G05D1/43 Control of position or course in two dimensions
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60 Intended control result
    • G05D1/648 Performing a task within a working area or space, e.g. cleaning
    • G05D1/6484 Performing a task within a working area or space, e.g. cleaning by taking into account parameters or characteristics of the working area or space, e.g. size or shape
    • G05D1/6486 Performing a task within a working area or space, e.g. cleaning by taking into account parameters or characteristics of the working area or space, e.g. size or shape by taking into account surface condition, e.g. soiled
    • E FIXED CONSTRUCTIONS
    • E01 CONSTRUCTION OF ROADS, RAILWAYS, OR BRIDGES
    • E01H STREET CLEANING; CLEANING OF PERMANENT WAYS; CLEANING BEACHES; DISPERSING OR PREVENTING FOG IN GENERAL; CLEANING STREET OR RAILWAY FURNITURE OR TUNNEL WALLS
    • E01H5/00 Removing snow or ice from roads or like surfaces; Grading or roughening snow or ice
    • E01H5/04 Apparatus propelled by animal or engine power; Apparatus propelled by hand with driven dislodging or conveying levelling elements, conveying pneumatically for the dislodged material

Definitions

  • the robotic lawnmower 100 may also or alternatively comprise deduced reckoning sensors 180.
  • the deduced reckoning sensors may be odometers, accelerometers or other deduced reckoning sensors.
  • the deduced reckoning sensors are comprised in the propulsion device, wherein deduced reckoning navigation may be provided by knowing the current supplied to a motor and the time the current is supplied, which gives an indication of the speed and thereby the distance travelled by the corresponding wheel.
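  • As a non-limiting illustration, the following is a minimal Python sketch (not from the patent) of this kind of motor-based deduced reckoning, assuming a simple proportional current-to-speed model; the calibration constant and track width are illustrative assumptions:

    import math

    # Illustrative, assumed calibration: wheel speed modelled as proportional
    # to motor current; heading follows from the differential wheel speeds.
    K_SPEED_PER_AMP = 0.15   # m/s per ampere (hypothetical constant)
    TRACK_WIDTH = 0.4        # distance between wheel sides, metres (assumed)

    def dead_reckon(pose, left_current, right_current, dt):
        """Advance (x, y, heading) by integrating wheel speeds over dt seconds."""
        x, y, heading = pose
        v_left = K_SPEED_PER_AMP * left_current
        v_right = K_SPEED_PER_AMP * right_current
        v = (v_left + v_right) / 2.0               # forward speed
        omega = (v_right - v_left) / TRACK_WIDTH   # turn rate
        return (x + v * math.cos(heading) * dt,
                y + v * math.sin(heading) * dt,
                heading + omega * dt)

    # Example: equal current on both sides for one second moves straight ahead.
    pose = dead_reckon((0.0, 0.0, 0.0), left_current=2.0, right_current=2.0, dt=1.0)
    print(pose)  # (0.3, 0.0, 0.0)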
  • the robotic lawnmower 100 is, in some embodiments, further configured to have at least one magnetic field sensor 170 arranged to detect a magnetic field, for detecting the boundary wire and/or for receiving (and possibly also sending) information to/from a signal generator (as discussed with reference to figure 1).
  • the sensors 170 may be connected to the controller 110, possibly via filters and an amplifier, and the controller 110 may be configured to process and evaluate any signals received from the sensors 170.
  • the sensor signals are caused by the magnetic field being generated by the control signal being transmitted through the boundary wire. This enables the controller 110 to determine whether the robotic lawnmower 100 is close to or crossing the boundary wire, or inside or outside an area enclosed by the boundary wire.
  • the robotic lawnmower 100 is in some embodiments arranged to operate according to a map application representing one or more work areas (and possibly the surroundings of the work area(s)) stored in the memory 120 of the robotic lawnmower 100.
  • the map application may be generated or supplemented as the robotic lawnmower 100 operates or otherwise moves around in the work area 205.
  • the map application includes one or more start regions and one or more goal regions for each work area.
  • the map application also includes one or more transport areas.
  • the map application is in some embodiments stored in the memory 120 of the robotic working tool(s) 100.
  • the map application is stored in the server (referenced 240 in figure 2).
  • maps are stored both in the memory 120 of the robotic working tool(s) 100 and in the server, wherein the maps may be the same maps or show subsets of features of the area.
  • the robotic working tool 100 also comprises at least one camera or other image sensor 190 for enabling image analysis, such as segmentation, of a path navigated in a work area.
  • the image sensor(s) 190 is comprised in or combined with the optical navigation sensor(s) 185.
  • Figure 2 shows a robotic work tool system 200 in some embodiments.
  • the schematic view is not to scale.
  • the robotic work tool system 200 comprises one or more robotic work tools 100 according to the teachings herein.
  • the operational area 205 shown in figure 2 is simplified for illustrative purposes.
  • the robotic work tool system comprises a boundary 220 that may be virtual and/or electromechanical.
  • An example of an electromechanical border is one generated by a magnetic field generated by a control signal being transmitted through a boundary wire, which magnetic field is sensed by sensors 170 in the robotic work tool 100.
  • An example of a virtual border is one defined by coordinates and navigated using a location-based navigation system, such as a GPS (or RTK) system.
  • the robotic lawnmower 100 is configured to navigate using an RTK system as this provides a high accuracy and enables for very precise steering.
  • the robotic work tool system 200 further comprises a station 210 possibly at a station location.
  • a station location may alternatively or additionally indicate a service station, a parking area, a charging station or a safe area where the robotic work tool may remain for a time period between or during operation sessions.
  • the robotic work tool(s) is exemplified by a robotic lawnmower, whereby the robotic work tool system may be a robotic lawnmower system or a system comprising a combination of robotic work tools, one being a robotic lawnmower, but the teachings herein may also be applied to other robotic work tools adapted to operate within a work area.
  • the one or more robotic working tools 100 of the robotic work tool system 200 are arranged to operate in an operational area 205, which in this example comprises a first work area 205A and a second work area 205B connected by a transport area TA.
  • an operational area may comprise a single work area or one or more work areas, possibly arranged adjacent for easy transition between the work areas, or connected by one or more transport paths or areas, also referred to as corridors.
  • the operational area 205 is in this application exemplified as a garden, but can also be other work areas as would be understood, such as a (part of a) neighbourhood, a sports complex or an airfield to mention a few examples.
  • the garden may contain a number of obstacles and/or objects, for example a number of trees, stones, slopes and houses or other structures.
  • the robotic work tool is arranged or configured to traverse and operate in work areas that are not essentially flat, but contain terrain that is of varying altitude, such as undulating, comprising hills or slopes or such.
  • the ground of such terrain is not flat and it is not straightforward how to determine an angle between a sensor mounted on the robotic work tool and the ground.
  • the robotic work tool is also or alternatively arranged or configured to traverse and operate in a work area that contains obstacles that are not easily discerned from the ground. Examples of such are grass or moss covered rocks, roots or other obstacles that are close to ground and of a similar colour or texture as the ground.
  • the robotic work tool is also or alternatively arranged or configured to traverse and operate in a work area that contains obstacles that are overhanging, i.e. obstacles that may not be detectable from the ground up, such as low hanging branches of trees or bushes. Such a garden is thus not simply a flat lawn to be mowed or similar, but a work area of unpredictable structure and characteristics.
  • the operational area or any of its work areas 205 exemplified with referenced to figure 2 may thus be such a non-uniform area as disclosed in this paragraph that the robotic work tool is arranged to traverse and/or operate in.
  • the robotic working tool(s) 100 is arranged to navigate in one or more work areas 205A, 205B, possibly connected by a transport area TA.
  • the robotic working tool system 200 may alternatively or additionally comprise or be arranged to be connected to a server 240, such as a cloud service, a cloud server application or a dedicated server 240.
  • the connection to the server 240 may be direct from the robotic working tool 100, indirect from the robotic working tool 100 via the service station 210, and/or indirect from the robotic working tool 100 via user equipment (not shown).
  • a server may be implemented in a number of ways utilizing one or more controllers 240A and one or more memories 240B that may be grouped in the same server or over a plurality of servers.
  • Figure 3A shows a schematic view of a robotic working tool 100, such as one disclosed in relation to figures 1A and 1B, which is configured to operate in a robotic working tool system 200 as disclosed in relation to figure 2.
  • the robotic work tool will be exemplified as a robotic lawnmower 100, as in Figures 1A, 1B and 2.
  • the robotic lawnmower 100 is configured to operate in the operating area 205 and to cut the grass therein. This is illustrated in figure 3A as the robotic lawnmower 100 navigating over uncut grass (illustrated by dotted ground) and leaving behind a trail of cut grass (illustrated by white area(s)).
  • the trail of cut grass is (substantially) of a width w corresponding to the width of the grass cutting device 160 as discussed in relation to figure 1B.
  • the robotic lawnmower 100 is in this example arranged with a camera (or other image sensor 190) in the front of the robotic lawnmower 100, the camera 190 having a field of view (indicated and referenced FOV).
  • the camera 190 enables for the robotic lawnmower 100 to detect an edge (referenced E) between cut grass and uncut grass.
  • the robotic lawnmower 100 will be able to see the edge as it turns and continues operating in the opposite direction.
  • the robotic lawnmower 100 is configured (through its controller 110) to detect the edge E based on image(s) received through the camera(s) 190 utilizing image processing.
  • the image processing is based on Artificial Intelligence, such as machine learning.
  • the controller performing the image processing is the controller 110 of the robotic lawnmower 100.
  • the controller performing the image processing is the controller of the camera(s) 190 (not shown explicitly, but considered to be part of the controller 110 of the robotic lawnmower 100 for the teachings herein).
  • the controller performing the image processing is the controller of the camera(s) 190 in combination with the controller 110 of the robotic lawnmower 100.
  • the robotic lawnmower 100 is thus configured to detect where an edge is between cut grass and uncut grass, which enables the robotic lawnmower to navigate in an efficient manner by placing the robotic lawnmower 100 (knowing the location of the grass cutting device 160 relative to the body of the robotic lawnmower 100, and its width w) so that any trail of cut grass is adjacent to or slightly overlapping a previous one.
  • the robotic lawnmower 100 is thus configured to navigate the robotic lawnmower based on the detected edge E.
  • the robotic lawnmower 100 is configured to navigate based on the detected edge E so that the (extension of the) grass cutting device 160 overlaps the edge E of a previously cut area of grass (such as a cut trail).
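  • As a non-limiting sketch of how such edge-overlapping navigation could be steered (Python; the gain, deck width and overlap margin are illustrative assumptions, and the edge offset is taken to come from the image processing described above):

    # Hypothetical proportional controller: steer so that the edge of the
    # cutting deck (width w) overlaps the previously cut area by a margin.
    CUTTER_WIDTH = 0.25   # w, metres (illustrative)
    OVERLAP = 0.03        # desired overlap with the previous trail, metres
    STEER_GAIN = 1.5      # proportional steering gain (illustrative)

    def steering_command(edge_offset_m: float) -> float:
        """edge_offset_m: lateral distance from the robot centreline to the
        detected cut/uncut edge (positive = edge lies to the left).
        Returns a turn-rate command in rad/s (positive = turn left)."""
        # The deck edge sits half a deck-width from the centreline; aim to
        # place it OVERLAP metres past the detected edge into the cut area.
        desired_offset = CUTTER_WIDTH / 2.0 - OVERLAP
        return STEER_GAIN * (edge_offset_m - desired_offset)

    print(steering_command(0.20))  # edge far to the left -> turn left (positive)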
  • the robotic lawnmower 100 is configured to navigate based on the detected edge E so that any missed string or other narrow passage of uncut grass is minimized or cut, such as by taking corrective action.
  • the robotic lawnmower 100 is configured to take the corrective action by turning to further overlap the edge or to ensure that the edge is overlapped, by turning towards the cut area. This can be utilized to accommodate or correct for any navigational error resulting in the uncut string of grass.
  • the robotic lawnmower 100 is configured to take the corrective action by returning to traverse over the uncut string of grass to ensure that it is cut.
  • a string or narrow passage of uncut grass may be any area of uncut grass having a width less than the width w of the grass cutting device.
  • a string of uncut grass is an area of grass that is between two cut areas.
  • the string of uncut grass is an area of grass that should have been cut according to an operating schedule of the robotic lawnmower 100. In other words, it is an area that has already been marked or noted as covered as the robotic lawnmower executes a scheduled operation.
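  • A minimal sketch of how such a string could be detected on a coverage grid (Python; the grid representation, cell size and cutter width are assumptions for illustration, not taken from the patent):

    import numpy as np

    CELL = 0.05           # coverage-grid resolution, metres (assumed)
    CUTTER_WIDTH = 0.25   # w, metres (illustrative)

    def find_strings(row: np.ndarray) -> list:
        """Return (start, end) index pairs of uncut runs in one grid row
        (True = cut) that are narrower than the cutter width and bounded by
        cut cells on both sides, i.e. strings of uncut grass."""
        strings, i = [], 0
        while i < len(row):
            if not row[i]:                        # start of an uncut run
                j = i
                while j < len(row) and not row[j]:
                    j += 1
                width = (j - i) * CELL
                bounded = i > 0 and j < len(row)  # cut areas on both sides
                if bounded and width < CUTTER_WIDTH:
                    strings.append((i, j))
                i = j
            else:
                i += 1
        return strings

    row = np.array([True] * 10 + [False] * 3 + [True] * 10)  # 15 cm uncut strip
    print(find_strings(row))  # [(10, 13)]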
  • Figure 3C shows an example of such a string (referenced E’) and that the robotic lawnmower 100 takes corrective action (as indicated by the arrow).
  • Figure 3D shows an example of a robotic lawnmower 100 having camera(s) 190 mounted in the front as well as having camera(s) 190 mounted in the back of the robotic lawnmower 100.
  • with cameras facing both front and rear, a corrective action can of course be taken sooner than with only one camera, but in some embodiments the front and rear-facing cameras can also be utilized by the robotic lawnmower to provide feedback.
  • the feedback is in some embodiments related to the navigation of the robotic lawnmower 100, and the feedback can be used to further improve the navigation of the robotic lawnmower 100.
  • Figure 3E shows an example of a different operating pattern, in this example being to follow an expanding or imploding pattern, such as following the edges of the operating area 205, or simply a rectangle, square or even a circle - all being known operating patterns - which is followed so that edges or strings between cut areas are minimized.
  • Figure 3F shows an example of a different operating pattern, in this example being a random or semi-random pattern, which is followed so that edges or strings between cut areas are minimized.
  • Figure 4A shows a schematic view of an example operating area 205, possibly one such as discussed in relation to figure 2, wherein a robotic work tool 100 operates and gathers image data, such as images, through at least one camera.
  • the images are at first gathered for training purposes, and later for operating purposes.
  • while the description herein will be focussed on a single robotic work tool gathering the images in a single operating area, in some embodiments more than one robotic work tool gathers the images, and in some embodiments the images are gathered in/from different operating areas. It should also be noted that the images may be gathered during operation, during a training session, or even a combination of such, where an operating session for one robotic work tool may be seen as a training session for another robotic work tool.
  • the training is performed in a factory operating area, and in some alternative or additional embodiments, the training is performed during operation for one or more customers. In embodiments where several robotic work tools are utilized, the training may be shared.
  • in figure 4A it is illustrated how the robotic work tool 100 navigates an operating area gathering images through the camera(s) 190.
  • a portion of an image may be referred to as an object or as comprising an object, without there actually being a physical object.
  • An object is therefore an image object representing a portion of an image having some features that may be classified, and more specifically that can be classified as processed (cut) area, unprocessed (uncut) area and other (area-not-to-be-processed).
  • the terminology of object or portion of image, and portion of image representing an area will be used interchangeably herein.
  • Figure 4B shows a schematic view of an example image captured or gathered through a camera 190 as in the example situation in figure 4A. Illustrated in figure 4B is the trail of cut grass (illustrated with diagonal lines), the uncut grass (illustrated as a dotted area), the house, the pathway and also the sky (or other background, referenced BG) extending above the ground. In case the camera(s) 190 is not angled solely at the ground, the sky (or other background) will also be part of the image(s) captured.
  • the angle between the camera and the ground can be determined based on a known mounting angle and a current angle of the robotic work tool (detectable through the use of deduced reckoning sensors such as inertial measurement units) and possibly also based on a known elevation which may be known from the map application.
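  • A minimal sketch of the flat-ground geometry this implies (Python; a simple pinhole model with illustrative camera height, mounting pitch and field of view, all assumptions):

    import math

    CAM_HEIGHT = 0.30               # camera height above ground, m (assumed)
    MOUNT_PITCH = math.radians(20)  # downward mounting angle (assumed)
    VFOV = math.radians(45)         # vertical field of view (assumed)
    IMAGE_ROWS = 480

    def ground_distance(row: int, robot_pitch: float = 0.0) -> float:
        """Distance along level ground to the point seen at a given image row
        (row 0 = top); math.inf for rays at or above the horizon (sky)."""
        row_angle = (row - IMAGE_ROWS / 2) / IMAGE_ROWS * VFOV
        depression = MOUNT_PITCH + robot_pitch + row_angle
        if depression <= 0:
            return math.inf         # ray never hits the ground (background)
        return CAM_HEIGHT / math.tan(depression)

    print(round(ground_distance(240), 2))  # centre row: about 0.82 m ahead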
  • the robotic work tool is configured to perform image analysis on the received image(s) and to detect (including classification) objects, i.e. different areas, in the image(s), possibly through segmentation, object classification or other image processing technique.
  • Figure 4C shows a schematic view of a segmentation of the camera view of Figure 4B according to some example embodiments of the teachings herein.
  • the objects, i.e. the areas, detected are indicated as CG for the trail of cut grass, UG for the area of uncut grass, and NG as in not grass for the house, the pathway and also for the sky.
  • the house and the pathway are examples of objects that are not grass and also should not be cut - as indicated by NC.
  • the objects are classified as Processed (P), Unprocessed (U) and Not-to-be-processed (N) corresponding to CG, UG and NG respectively.
  • in embodiments where the robotic work tool is a robotic snow remover, the classifications correspond to snow removed, snow not removed, and not snow respectively.
  • the robotic work tool is a robotic floor grinder.
  • the classifications correspond to floor processed, floor not processed, and area-not-to-process, for example not floor, respectively.
  • the robotic work tool is a robotic gravel rake.
  • the classifications correspond to gravel area raked, gravel area not raked, and gravel-area-not-to-process, for example not gravel path/area, respectively.
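  • These per-tool class triplets could, as a small illustrative sketch (Python, all names assumed), be kept in a single table so the same three-class model applies to any of the embodiments:

    # Hypothetical mapping from tool type to the three area classes
    # Processed / Unprocessed / Not-to-be-processed named in the text.
    AREA_CLASSES = {
        "lawnmower":     ("cut grass", "uncut grass", "not grass"),
        "snow_remover":  ("snow removed", "snow not removed", "not snow"),
        "floor_grinder": ("floor processed", "floor not processed", "not floor"),
        "gravel_rake":   ("gravel raked", "gravel not raked", "not gravel"),
    }

    P, U, N = range(3)  # class indices shared by all tool types
    print(AREA_CLASSES["lawnmower"][U])  # uncut grass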
  • a robotic work tool need not be able to detect all classes of physical objects in order to perform a more efficient operation; the robotic work tool need only be trained to detect which areas are processed and which are not in order to be able to follow the edge between processed and unprocessed areas. This enables a faster and more efficient training of the robotic work tool (or rather the AI for the robotic work tool) and also a more accurate usage and detection of image objects, i.e. areas.
  • the robotic work tool may also, in some embodiments, be trained to determine what is not grass, which will further enable for a more efficient operation as edges to areas where there is no grass may be followed more efficiently and without wasting time or breaking up a pattern by attempting to operate in areas where there is nothing to process, such as a pathway for a lawnmower.
  • the robotic work tool is thus trained based on images where processed areas are indicated and where unprocessed areas are indicated.
  • the robotic work tool may be configured to determine whether an area is processed or not based also on other inputs, such as by determining the height of the grass being cut. This can be used to provide feedback for the training - especially when being trained in the field. The training may thus also be performed based on feedback.
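  • A minimal sketch of how cutter-motor load could provide such feedback (Python; the idle current, load-to-height constant and thresholds are illustrative assumptions, not calibration data from the patent):

    IDLE_CURRENT = 0.8    # cutter motor current on cut grass, A (assumed)
    AMPS_PER_CM = 0.25    # extra current per cm of grass height (assumed)
    CUT_HEIGHT_CM = 4.0   # target cutting height (assumed)

    def estimated_height_cm(motor_current: float) -> float:
        return max(0.0, (motor_current - IDLE_CURRENT) / AMPS_PER_CM) + CUT_HEIGHT_CM

    def label_feedback(predicted_label: str, motor_current: float) -> bool:
        """True if the motor-load evidence agrees with the image-based
        processed/unprocessed classification of the area under the mower."""
        tall = estimated_height_cm(motor_current) > CUT_HEIGHT_CM + 1.0
        return (predicted_label == "unprocessed") == tall

    # The camera said "processed" but the motor is loaded as if cutting tall
    # grass: disagreement, which can be fed back into training.
    print(label_feedback("processed", motor_current=2.5))  # False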
  • the height is the height of gravel being raked.
  • the height is the height of snow being removed.
  • Figure 5A shows a general flowchart according to a method of the teachings herein for use in a robotic working tool system, and especially during operation, wherein the method comprises receiving at least one image from the image sensor; detecting at least one object in the received at least one image; classifying the at least one (image) object as processed or unprocessed, wherein an object classified as processed represents a processed area and an object classified as unprocessed represents an unprocessed area; detecting an edge between a processed area and an unprocessed area; and navigating the robotic work tool so that the work tool overlaps the edge.
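  • A minimal end-to-end sketch of this operating method (Python; `segment` stands in for the trained model and `drive` for the propulsion interface, both assumptions, as are the pixel scale and gains):

    import numpy as np

    PROCESSED, UNPROCESSED, NOT_TO_PROCESS = 0, 1, 2

    def edge_offset(labels: np.ndarray, px_per_m: float):
        """Lateral offset (m) of the processed/unprocessed edge in the image
        row closest to the robot, relative to the image centre; None if no
        edge is visible."""
        row = labels[-1]
        for col in range(len(row) - 1):
            if {row[col], row[col + 1]} == {PROCESSED, UNPROCESSED}:
                return (col + 0.5 - len(row) / 2) / px_per_m
        return None

    def control_step(image, segment, drive, px_per_m=200.0):
        labels = segment(image)                    # per-pixel class map
        offset = edge_offset(labels, px_per_m)
        if offset is None:
            drive(forward=0.3, turn=0.0)           # no edge: keep the pattern
        else:
            drive(forward=0.3, turn=1.5 * offset)  # steer tool onto the edge

    # Example with stand-ins for the model and the propulsion device.
    fake_model = lambda img: np.array([[PROCESSED] * 50 + [UNPROCESSED] * 50])
    control_step(None, fake_model, lambda forward, turn: print(forward, turn))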
  • a robotic work tool system may thus in some embodiments be configured to perform the method according to figure 5A as discussed above for example in relation to figures 3A, 3B, 3C, 3D, 3E, and 3F.
  • Figure 5B shows a general flowchart according to a method of the teachings herein for use in a robotic working tool or in a server connected to the robotic work tool, and especially for training the robotic work tool, wherein the method comprises receiving at least one image, wherein the image comprises at least one or more of a processed area and an unprocessed area; detecting at least one object in the received at least one image; classifying the at least one (image) object as processed or unprocessed; comparing to see if the classified object is as indicated and training accordingly.
  • the robotic work tool is configured to navigate using RTK. This allows for knowing or determining a very exact position of the robotic work tool in an operational area, for example navigating according to a map. This allows for a highly efficient data collection as regards gathering the image data, which can be used both for initial data collection and for continued training.
  • the received images are of a known angle relative to the robotic work tool, and by knowing the more or less exact position of the robotic work tool, the area covered by the image may also be determined relative to the position of the robotic work tool.
  • by mapping this area of the image to an area in the map and based on an operating schedule where areas are noted to be processed or not, it can be determined if the area in the image(s) is in a processed area or an unprocessed area, or whether the area in the image covers both processed and unprocessed areas.
  • the extension of such processed or unprocessed areas may be determined in the image(s) based on the map and the exact location of the robotic work tool.
  • the image data collected in this manner may thus be gathered during operation for subsequent training, for ongoing training or for training of another robotic work tool. Additionally or alternatively, the image data may be collected for initial training.
  • data collection is performed possibly during operation so as to collect image data and perform automatic segmentation labelling or object detection, by dividing the image(s) into segments of processed work area, non-processed work area, and possibly non-work area.
  • the three categories would correspond to cut grass, uncut grass, and non-grass respectively.
  • the third category includes everything that would not be considered as part of the work area surface, such as obstacles, humans, sky, buildings etc.
  • initially, the whole work area only contains non-processed surfaces and does not contain any processed area at all. In some embodiments, this should be confirmed by a human operator or some other method, such as by querying an operating schedule. As discussed in the above, during operation, the position of the tool is recorded, and a map is updated that includes information about what part of the work area has been processed (the path travelled).
  • the image(s) is annotated by being mapped against a specific location or area in the map based on the position of the camera(s) relative the robotic work tool and the accurate position of the robotic work tool.
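  • A minimal sketch of this automatic annotation (Python; the coverage-grid representation, cell size and the idea of reusing the camera's ground footprint are illustrative assumptions):

    import numpy as np

    CELL = 0.05  # coverage-grid resolution, metres (assumed)

    def mark_processed(grid, pose_xy, width_m):
        """Mark a cutter-width square under the tool as processed, so the
        grid records the path travelled."""
        r = int(width_m / 2 / CELL)
        cx, cy = int(pose_xy[0] / CELL), int(pose_xy[1] / CELL)
        grid[max(0, cy - r):cy + r + 1, max(0, cx - r):cx + r + 1] = True

    def label_points(grid, ground_points_xy):
        """Label ground points seen by the camera using the coverage grid:
        1 = processed, 0 = unprocessed. No human annotation needed."""
        return [int(grid[int(y / CELL), int(x / CELL)])
                for x, y in ground_points_xy]

    grid = np.zeros((200, 200), dtype=bool)        # a 10 m x 10 m work area
    mark_processed(grid, pose_xy=(5.0, 5.0), width_m=0.25)
    print(label_points(grid, [(5.0, 5.0), (8.0, 8.0)]))  # [1, 0]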
  • the data can be used to train the artificial intelligence as discussed in the above, where the image data is used to train for example a convolutional neural network (CNN) in the task of work area segmentation, to detect or predict which parts of the image belong to the three aforementioned categories.
  • Examples of CNNs that can be used are U-Net, DeepLab, Mask R-CNN etc.
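  • A minimal PyTorch sketch of such a segmentation setup, using a deliberately tiny stand-in network rather than U-Net, DeepLab or Mask R-CNN (image size, class count and training details are illustrative assumptions):

    import torch
    import torch.nn as nn

    NUM_CLASSES = 3  # processed / unprocessed / not-to-be-processed

    # Tiny stand-in for a segmentation CNN; a real system would use one of
    # the architectures named above. Maps RGB to per-pixel class logits.
    model = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(16, NUM_CLASSES, kernel_size=1),
    )
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # One training step on a fake batch: images (B,3,H,W) and labels (B,H,W)
    # in {0,1,2}, e.g. produced by the automatic labelling sketched above.
    images = torch.randn(4, 3, 64, 64)
    labels = torch.randint(0, NUM_CLASSES, (4, 64, 64))

    logits = model(images)            # (B, NUM_CLASSES, H, W)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(float(loss))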
  • the artificial intelligence can thereafter be used to improve the navigation of the robotic work tool.
  • the improved navigation is based on enabling the robotic work tool to find the edge between processed and non-processed work area to efficiently cover the whole work area.
  • the robotic work tool is equipped with a forward-facing camera from which the images are fed into the CNN, and the edge between processed/non-processed work area is predicted or detected by the CNN and used to guide the robot in covering the work area.
  • the robotic work tool could for example move in a zig-zag pattern and on each pass adjust the distance to the previous pass based on where the edge between processed/non-processed area is located.
  • the robot could also move in a circular pattern by first covering the outer boundary. From the second rotation and onwards the robot can use the predicted edge to control its pass while it spirals inwards towards the center of the work area. (This could also be executed by starting from the center and spiraling outwards.)
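  • In either pattern, the next pass can be planned from the observed edge rather than from odometry alone; a minimal Python sketch (deck width and overlap margin are illustrative assumptions):

    # Hypothetical pass planning: place the next pass relative to the edge
    # the camera actually saw, so positioning drift does not accumulate as
    # missed strings between passes.
    CUTTER_WIDTH = 0.25  # w, metres (illustrative)
    OVERLAP = 0.03       # overlap into the previous pass, metres (assumed)

    def next_pass_target(observed_edge_y: float, advance_sign: int) -> float:
        """Lateral coordinate of the next pass centreline.
        observed_edge_y: where the previous pass's cut/uncut edge was seen (m).
        advance_sign: +1 or -1, the side toward which the pattern advances."""
        return observed_edge_y + advance_sign * (CUTTER_WIDTH / 2.0 - OVERLAP)

    # Odometry alone would plan the pass at 0.250 m; the camera saw the real
    # edge at 0.22 m, so the plan is pulled in to avoid leaving a string.
    print(next_pass_target(observed_edge_y=0.22, advance_sign=+1))  # ~0.315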
  • a robotic work tool may thus in some embodiments be configured to perform the method according to figure 5B as discussed above for example in relation to figures 4 A, 4B and 4C.
  • a server that trains the artificial intelligence to be used by the robotic work tool
  • a server may thus in some embodiments be configured to perform the method according to figure 5B.
  • the robotic work tool is being trained through the server.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Environmental Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)
  • Automatic Tool Replacement In Machine Tools (AREA)

Abstract

A robotic working tool system comprising a robotic working tool comprising at least one image sensor, a work tool and a controller, wherein the controller is configured to receive at least one image from the image sensor; detect at least one object in the received at least one image; classify the at least one object as processed or unprocessed, wherein an object classified as processed represents a processed area and an object classified as unprocessed represents an unprocessed area; detect an edge between a processed area and an unprocessed area; and navigate the robotic work tool so that the work tool overlaps the edge.

Description

IMPROVED OPERATION FOR A ROBOTIC WORK TOOL SYSTEM
TECHNICAL FIELD
This application relates to a robotic work tool and in particular to a system and a method for providing an improved operation through improved navigation for robotic work tools, such as lawnmowers, in such a system.
BACKGROUND
Automated or robotic work tools such as robotic lawnmowers are becoming increasingly more popular and the demands for an efficient and high-quality operation are rising. For example, for robotic lawnmowers the demands for a fast and efficient operation that at the same time is of even quality are increasing, such as mowing the lawn in a short time without leaving uncut sections or strings of grass. Similar problems exist for robotic snow removers.
Thus, there is a need for an improved operation for robotic work tools.
SUMMARY
As the inventors have realized, there are several ways for a robotic work tool to navigate a work area in order to cover it completely. One way is to move randomly. This requires very few perception sensors but on the other hand requires the robot to move a long distance to cover the work area in full. A second alternative is to plan a path based on geographical coordinates and navigate the robot based on this path. This approach significantly decreases the distance needed for covering the complete work area. However, this requires accurate positioning of the robotic work tool to efficiently navigate along the path. This is typically achieved using RTK-GNSS which requires an open sky. The inventor suggests the use of a camera for predicting or detecting the edge between processed and non-processed work area, and describes how to use this prediction to efficiently cover a work area with limited or nonexistent positioning accuracy during robotic operation.
It is therefore an object of the teachings of this application to overcome or at least reduce those problems by providing a robotic working tool system comprising a robotic working tool comprising at least one image sensor, a work tool and a controller, wherein the controller is configured to receive at least one image from the image sensor; detect at least one object in the received at least one image; classify the at least one object as processed or unprocessed, wherein an object classified as processed represents a processed area and an object classified as unprocessed represents an unprocessed area; detect an edge between a processed area and an unprocessed area; and navigate the robotic work tool so that the work tool overlaps the edge.
In some embodiments the controller is further configured to detect a string of unprocessed area, and take corrective action.
In some embodiments the corrective action is to navigate closer to the processed area.
In some embodiments corrective action is to navigate the robotic work tool to the string of unprocessed area.
In some embodiments the controller is further configured to: classify objects as not-to-be-processed representing an area not-to-be-processed; detect a second edge between an un-processed area and an area not-to-be-processed; and navigate the robotic work tool so that the work tool overlaps the second edge.
In some embodiments the controller is further configured to determine a height in a current area, and to provide feedback based on the height.
In some embodiments the robotic work tool is a robotic lawnmower.
In some embodiments the height is the height of grass being cut.
In some embodiments the robotic work tool is a robotic snow remover.
In some embodiments the height is the height of snow being removed.
In some embodiments the robotic work tool is a robotic floor grinder.
In some embodiments the robotic work tool is a robotic gravel rake.
In some embodiments the height is the height of gravel being raked.
In some embodiments the robotic work tool comprises an RTK navigation sensor, wherein the controller is configured to: determine a path travelled by the robotic work tool; determine an image area covered by the image sensor(s); determine a location of the robotic work tool; map the image area to a map area based on the location of the robotic work tool; determine if the image area is of processed and/or unprocessed areas based on the travelled path.
It is also an object of the teachings of this application to overcome or at least reduce those problems by providing a method for robotic working tool system comprising a robotic working tool comprising at least one image sensor and a work tool, wherein the method comprises: receiving at least one image from the image sensor; detecting at least one object in the received at least one image; classifying the at least one object as processed or unprocessed, wherein an object classified as processed represents a processed area and an object classified as unprocessed represents an unprocessed area; detecting an edge between a processed area and an unprocessed area; and navigating the robotic work tool so that the work tool overlaps the edge.
In some embodiments the robotic work tool comprises an RTK navigation sensor, and wherein the method comprises: determining a path travelled by the robotic work tool; determining an image area covered by the image sensor(s); determining a location of the robotic work tool; mapping the image area to a map area based on the location of the robotic work tool; determining if the image area is of processed and/or unprocessed areas based on the travelled path.
It is also an object of the teachings of this application to overcome or at least reduce those problems by providing a method for training a robotic work tool, wherein the method comprises: receiving at least one image, wherein the image comprises at least one or more of a processed area and an unprocessed area; detecting at least one object in the received at least one image; classifying the at least one object as processed or unprocessed; comparing to see if the classified object is as indicated and training accordingly.
In some embodiments the robotic work tool comprises an RTK navigation sensor, wherein the method comprises: determining a path travelled by the robotic work tool; determining an image area covered by the image sensor(s); determining a location of the robotic work tool; mapping the image area to a map area based on the location of the robotic work tool; determining if the image area is of processed and/or unprocessed areas based on the travelled path.
It is also an object of the teachings of this application to overcome or at least reduce those problems by providing a server configured to perform a method as described herein.
It should be noted that for the purpose of detecting the edge the inventors have realized that it is enough to classify an image, or part thereof, as being of a processed or an unprocessed area, which enables a very fast training leading to a reliable classification. In order to enable operation in more complicated areas, the inventors have further realized that it suffices to classify all other objects or portions of an image simply as areas not-to-be-processed, which for embodiments where the robotic work tool is a robotic lawnmower simply corresponds to objects not being grass. What kind of object it is may not be of interest for the purpose of detecting an edge of uncut grass (or unplowed snow).
Further embodiments and aspects are as in the attached patent claims and as discussed in the detailed description.
Other features and advantages of the disclosed embodiments will appear from the following detailed disclosure, from the attached dependent claims as well as from the drawings. Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the [element, device, component, means, step, etc.]" are to be interpreted openly as referring to at least one instance of the element, device, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will be described in further detail under reference to the accompanying drawings in which:
Figure 1A shows an example of a robotic lawnmower according to some embodiments of the teachings herein; Figure 1B shows a schematic view of the components of an example of a robotic work tool being a robotic lawnmower according to some example embodiments of the teachings herein;
Figure 2 shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
Figure 3A shows a schematic view of a robotic work tool according to some example embodiments of the teachings herein;
Figure 3B shows a schematic view of a robotic work tool according to some example embodiments of the teachings herein;
Figure 3C shows a schematic view of a robotic work tool according to some example embodiments of the teachings herein;
Figure 3D shows a schematic view of a robotic work tool according to some example embodiments of the teachings herein;
Figure 3E shows a schematic view of a robotic work tool according to some example embodiments of the teachings herein;
Figure 3F shows a schematic view of a robotic work tool according to some example embodiments of the teachings herein;
Figure 4A shows a schematic view of a robotic work tool system and how it is trained according to some example embodiments of the teachings herein;
Figure 4B shows a schematic view of a camera view for a robotic work tool system as in Figure 4A according to some example embodiments of the teachings herein;
Figure 4C shows a schematic view of a segmentation of the camera view of Figure 4B according to some example embodiments of the teachings herein;
Figure 5A shows a corresponding flowchart for a method according to some example embodiments of the teachings herein; and Figure 5B shows a corresponding flowchart for a method according to some example embodiments of the teachings herein.
DETAILED DESCRIPTION
The disclosed embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Like reference numbers refer to like elements throughout.
It should be noted that even though the description given herein will be focused on robotic lawnmowers, the teachings herein may also be applied to robotic ball collectors, robotic mine sweepers, robotic farming equipment, or other robotic work tools where a work tool is to be safeguarded against accidentally extending beyond or too close to the edge of the robotic work tool.
Figure 1A shows a perspective view of a robotic work tool 100, here exemplified by a robotic lawnmower 100, having a body 140 and a plurality of wheels 130 (only one side is shown). The robotic work tool 100 may be a multi-chassis type or a mono-chassis type (as in figure 1A). A multi-chassis type comprises more than one main body part that are movable with respect to one another. A mono-chassis type comprises only one main body part.
It should be noted that robotic lawnmowers may be of different sizes, where the size ranges from merely a few decimetres for small garden robots, to more than 1, 1.5 or even over 2 metres for large robots arranged to service for example airfields.
It should be noted that even though the description herein is focussed on the example of a robotic lawnmower, the teachings may equally be applied to other types of robotic work tools, such as robotic watering tools, robotic golfball collectors, and robotic mulchers to mention a few examples. It should also be noted that more than one robotic working tool may be set to operate in a same operational area, and that all of these robotic working tools need not be of the same type. It should also be noted that the robotic work tool is a self-propelled robotic work tool, capable of autonomous navigation within a work area, where the robotic work tool propels itself across or around the work area in a pattern (random or predetermined).
Figure 1B shows a schematic overview of the robotic work tool 100, also exemplified here by a robotic lawnmower 100. In this example embodiment the robotic lawnmower 100 is of a mono-chassis type, having a main body part 140. The main body part 140 substantially houses all components of the robotic lawnmower 100. The robotic lawnmower 100 has a plurality of wheels 130. In the exemplary embodiment of figure 1B the robotic lawnmower 100 has four wheels 130, two front wheels and two rear wheels. At least some of the wheels 130 are drivably connected to at least one electric motor 155. It should be noted that even if the description herein is focused on electric motors, combustion engines may alternatively be used, possibly in combination with an electric motor. In the example of figure 1B, each of the wheels 130 is connected to a respective electric motor 155, but it would also be possible with two or more wheels being connected to a common electric motor 155, for driving the wheels 130 to navigate the robotic lawnmower 100 in different manners. The wheels 130, the motor 155 and possibly the battery 150 are thus examples of components making up a propulsion device. By controlling the motors 155, the propulsion device may be controlled to propel the robotic lawnmower 100 in a desired manner, and the propulsion device will therefore be seen as synonymous with the motor(s) 155.
It should be noted that wheels 130 driven by electric motors are only one example of a propulsion system; other variants, such as caterpillar tracks, are possible.
The robotic lawnmower 100 also comprises a controller 110 and a computer readable storage medium or memory 120. The controller 110 may be implemented using instructions that enable hardware functionality, for example, by using executable computer program instructions in a general-purpose or special-purpose processor that may be stored on the memory 120 to be executed by such a processor. The controller 110 is configured to read instructions from the memory 120 and execute these instructions to control the operation of the robotic lawnmower 100 including, but not being limited to, the propulsion and navigation of the robotic lawnmower. The controller 110 in combination with the electric motor 155 and the wheels 130 forms the base of a navigation system (possibly comprising further components) for the robotic lawnmower, enabling it to be self-propelled as discussed under figure 1A.
The controller 110 may be implemented using any suitable, available processor or Programmable Logic Circuit (PLC). The memory 120 may be implemented using any commonly known technology for computer-readable memories such as ROM, FLASH, DDR, or some other memory technology.
The robotic lawnmower 100 is further arranged with a wireless communication interface 115 for communicating with a server, and in some embodiments, also with other devices, such as a personal computer, a smartphone, the charging station, and/or other robotic work tools. Examples of such wireless communication technologies are Bluetooth®, WiFi® (IEEE 802.11b), Global System Mobile (GSM) and LTE (Long Term Evolution), to name a few. The robotic lawnmower 100 is thus arranged to communicate with a server (referenced 240 in figure 2) for providing information regarding status, location, and/or progress of operation, as well as for receiving commands or settings from the server.
The robotic lawnmower 100 also comprises a grass cutting device 160, such as a rotating blade 160 driven by a cutter motor 165. It should be noted that the robotic lawnmower 100 may comprise more than one rotating blade working together. The grass cutting device 160 enables cutting grass at a width (referenced w in figure 3A). The grass cutting device is one example of a work tool 160 for a robotic work tool 100.
In some embodiments, the robotic lawnmower 100 is also configured to determine a load exerted on the cutter motor 165 for determining the height of the grass being cut. In alternative embodiments, the robotic lawnmower is configured to determine the height of the grass in other manners, such as through image analysis.
In another embodiment, the robotic work tool 100 is a robotic snow remover (or snow blower), and the work tool 160 corresponds to the augers of the snow remover.
Similarly to the robotic lawnmower, the robotic snow remover 100 may, in some embodiments, be arranged to determine the height (or rather the depth) of the snow being removed in similar manners. The robotic lawnmower 100 may further comprise at least one navigation sensor, such as an optical navigation sensor, an ultrasound sensor, a beacon navigation sensor and/or a satellite navigation sensor 185. The optical navigation sensor may be a camera-based sensor and/or a laser-based sensor. The beacon navigation sensor may be a Radio Frequency receiver, such as an Ultra Wide Band (UWB) receiver or sensor, configured to receive signals from a Radio Frequency beacon, such as a UWB beacon. Alternatively or additionally, the beacon navigation sensor may be an optical receiver configured to receive signals from an optical beacon. The satellite navigation sensor may be a GPS (Global Positioning System) device or other Global Navigation Satellite System (GNSS) device. In embodiments where the robotic lawnmower 100 is arranged with a navigation sensor, the magnetic sensors 170, as will be discussed below, are optional. In embodiments relying (at least partially) on a navigation sensor, the work area may be specified as a virtual work area in a map application stored in the memory 120 of the robotic lawnmower 100. The virtual work area may be defined by a virtual boundary. A physical border may also be used to define a work area 205.
The robotic lawnmower 100 may also or alternatively comprise deduced reckoning sensors 180. The deduced reckoning sensors may be odometers, accelerometers or other deduced reckoning sensors. In some embodiments, the deduced reckoning sensors are comprised in the propulsion device, wherein deduced reckoning navigation may be provided by knowing the current supplied to a motor and the time the current is supplied, which gives an indication of the speed and thereby the distance for the corresponding wheel.
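By way of illustration, the current-based deduced reckoning described above may be sketched as follows, assuming a calibrated (here entirely hypothetical) mapping from motor current to wheel speed; integrating the estimated speed over the time the current is supplied yields a distance estimate.

```python
# Hedged sketch of current-based deduced reckoning: the mapping from motor
# current to wheel speed is assumed to come from a calibration table
# (hypothetical here), and distance is the time integral of the speed.

def estimate_distance(current_samples, dt: float, current_to_speed) -> float:
    """Integrate estimated wheel speed over time to get distance travelled.

    current_samples:  iterable of motor current readings [A], one per dt
    dt:               sampling period [s]
    current_to_speed: calibrated function mapping current [A] -> speed [m/s]
    """
    return sum(current_to_speed(i) * dt for i in current_samples)

# Example with a made-up linear calibration (0.2 m/s per ampere):
distance = estimate_distance([1.0, 1.2, 1.1], dt=0.1,
                             current_to_speed=lambda i: 0.2 * i)
print(f"{distance:.3f} m")  # 0.066 m
```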
For enabling the robotic lawnmower 100 to navigate with reference to a boundary wire emitting a magnetic field caused by a control signal transmitted through the boundary wire, the robotic lawnmower 100 is, in some embodiments, further configured to have at least one magnetic field sensor 170 arranged to detect the magnetic field, for detecting the boundary wire and/or for receiving (and possibly also sending) information to/from a signal generator. In some embodiments, the sensors 170 may be connected to the controller 110, possibly via filters and an amplifier, and the controller 110 may be configured to process and evaluate any signals received from the sensors 170. The sensor signals are caused by the magnetic field being generated by the control signal being transmitted through the boundary wire. This enables the controller 110 to determine whether the robotic lawnmower 100 is close to or crossing the boundary wire, or inside or outside an area enclosed by the boundary wire.
As mentioned above, the robotic lawnmower 100 is in some embodiments arranged to operate according to a map application representing one or more work areas (and possibly the surroundings of the work area(s)) stored in the memory 120 of the robotic lawnmower 100. The map application may be generated or supplemented as the robotic lawnmower 100 operates or otherwise moves around in the work area 205. In some embodiments, the map application includes one or more start regions and one or more goal regions for each work area. In some embodiments, the map application also includes one or more transport areas.
As discussed above, the map application is in some embodiments stored in the memory 120 of the robotic working tool(s) 100. In some embodiments the map application is stored in the server (referenced 240 in figure 2). In some embodiments maps are stored both in the memory 120 of the robotic working tool(s) 100 and in the server, wherein the maps may be the same maps or show subsets of the features of the area.
As will be discussed with reference to figures 3A, 3B, 3C, 3D, 3E, 3F, 4A, 4B and 4C, the robotic working tool 100 also comprises at least one camera or other image sensor 190 for enabling image analysis, such as segmentation, of a path navigated in a work area. In some embodiments the image sensor(s) 190 is comprised in or combined with the optical navigation sensor(s) 185.
Figure 2 shows a robotic work tool system 200 in some embodiments. The schematic view is not to scale. The robotic work tool system 200 comprises one or more robotic work tools 100 according to the teachings herein. It should be noted that the operational area 205 shown in figure 2 is simplified for illustrative purposes. The robotic work tool system comprises a boundary 220 that may be virtual and/or electromechanical. An example of an electromechanical border is one defined by the magnetic field generated by a control signal being transmitted through a boundary wire, which magnetic field is sensed by the sensors 170 in the robotic work tool 100. An example of a virtual border is one defined by coordinates and navigated using a location-based navigation system, such as a GPS (or RTK) system.
In some embodiments, the robotic lawnmower 100 is configured to navigate using an RTK system, as this provides high accuracy and enables very precise steering.
The robotic work tool system 200 further comprises a station 210, possibly at a station location. A station location may alternatively or additionally indicate a service station, a parking area, a charging station or a safe area where the robotic work tool may remain for a time period between or during operation sessions.
As with figures 1A and 1B, the robotic work tool(s) is exemplified by a robotic lawnmower, whereby the robotic work tool system may be a robotic lawnmower system or a system comprising a combination of robotic work tools, one being a robotic lawnmower, but the teachings herein may also be applied to other robotic work tools adapted to operate within a work area.
The one or more robotic working tools 100 of the robotic work tool system 200 are arranged to operate in an operational area 205, which in this example comprises a first work area 205A and a second work area 205B connected by a transport area TA. However, it should be noted that an operational area may comprise a single work area or several work areas, possibly arranged adjacent to one another for easy transition between the work areas, or connected by one or more transport paths or areas, also referred to as corridors. In the following, work areas and operational areas will be referred to interchangeably, unless specifically indicated.
The operational area 205 is in this application exemplified as a garden, but can also be other work areas as would be understood, such as a (part of a) neighbourhood, a sports complex or an airfield to mention a few examples.
As discussed above, the garden may contain a number of obstacles and/or objects, for example a number of trees, stones, slopes and houses or other structures.
In some embodiments the robotic work tool is arranged or configured to traverse and operate in work areas that are not essentially flat, but contain terrain of varying altitude, such as undulating terrain comprising hills or slopes. The ground of such terrain is not flat, and it is not straightforward to determine an angle between a sensor mounted on the robotic work tool and the ground. The robotic work tool is also or alternatively arranged or configured to traverse and operate in a work area that contains obstacles that are not easily discerned from the ground. Examples of such obstacles are grass- or moss-covered rocks, roots or other obstacles that are close to the ground and of a similar colour or texture as the ground. The robotic work tool is also or alternatively arranged or configured to traverse and operate in a work area that contains obstacles that are overhanging, i.e. obstacles that may not be detectable from the ground up, such as low-hanging branches of trees or bushes. Such a garden is thus not simply a flat lawn to be mowed or similar, but a work area of unpredictable structure and characteristics. The operational area, or any of its work areas 205 exemplified with reference to figure 2, may thus be such a non-uniform area as disclosed in this paragraph that the robotic work tool is arranged to traverse and/or operate in.
As shown in figure 2, the robotic working tool(s) 100 is arranged to navigate in one or more work areas 205A, 205B, possibly connected by a transport area TA.
The robotic working tool system 200 may alternatively or additionally comprise or be arranged to be connected to a server 240, such as a cloud service, a cloud server application or a dedicated server 240. The connection to the server 240 may be direct from the robotic working tool 100, indirect from the robotic working tool 100 via the service station 210, and/or indirect from the robotic working tool 100 via user equipment (not shown).
As a skilled person would understand, a server, a cloud server or a cloud service may be implemented in a number of ways, utilizing one or more controllers 240A and one or more memories 240B that may be grouped in the same server or distributed over a plurality of servers.
In the following, several embodiments of how the robotic work tool 100 may be adapted will be disclosed. It should be noted that all embodiments may be combined in any combination, providing a combined adaptation of the robotic work tool.
Figure 3A shows a schematic view of a robotic working tool 100, such as one disclosed in relation to figures 1A and 1B, which is configured to operate in a robotic working tool system 200 as disclosed in relation to figure 2. In the following example, the robotic work tool will be exemplified as a robotic lawnmower 100, as in figures 1A, 1B and 2.
The robotic lawnmower 100 is configured to operate in the operating area 205 and to cut the grass therein. This is illustrated in figure 3A as the robotic lawnmower 100 navigating over uncut grass (illustrated by dotted ground) and leaving behind a trail of cut grass (illustrated by white area(s)). The trail of cut grass is (substantially) of a width w corresponding to the width of the grass cutting device 160 as discussed in relation to figure 1B. The robotic lawnmower 100 is in this example arranged with a camera (or other image sensor 190) in the front of the robotic lawnmower 100, the camera 190 having a field of view (indicated and referenced FOV). As is shown in figure 3B, also being a schematic view of example embodiments of the teachings herein, showing the robotic lawnmower 100 of figure 3A at a later stage, the camera 190 enables the robotic lawnmower 100 to detect an edge (referenced E) between cut grass and uncut grass. As the camera 190 in this example is in the front, the robotic lawnmower 100 will be able to see the edge as it turns and continues operating in the opposite direction.
As would be obvious, even though the examples herein are discussed based on a zig-zag pattern of operation, any pattern or other navigation may be used.
As will be discussed in greater detail with reference to figures 4A, 4B and 4C, the robotic lawnmower 100 is configured (through its controller 110) to detect the edge E based on image(s) received through the camera(s) 190 utilizing image processing. The image processing is based on Artificial Intelligence, such as machine learning. In some embodiments, the controller performing the image processing is the controller 110 of the robotic lawnmower 100. In some embodiments, the controller performing the image processing is the controller of the camera(s) 190 (not shown explicitly, but considered to be part of the controller 110 of the robotic lawnmower 100 for the teachings herein). In some embodiments, the controller performing the image processing is the controller of the camera(s) 190 in combination with the controller 110 of the robotic lawnmower 100.
The robotic lawnmower 100 is thus configured to detect where an edge is between cut grass and uncut grass, which enables the robotic lawnmower to navigate in an efficient manner by placing the robotic lawnmower 100 (knowing the location of the grass cutting device 160 relative to the body of the robotic lawnmower 100, and its width w) so that any trail of cut grass is adjacent to or slightly overlapping a previous one. The robotic lawnmower 100 is thus configured to navigate based on the detected edge E. In some embodiments, and in some usages and situations, the robotic lawnmower 100 is configured to navigate based on the detected edge E so that the (extension of the) grass cutting device 160 overlaps the edge E of a previously cut area of grass (such as a cut trail). In some alternative or additional embodiments, and in some alternative or additional usages and situations, the robotic lawnmower 100 is configured to navigate based on the detected edge E so that any missed string or other narrow passage of uncut grass is minimized or cut, such as by taking corrective action. In some such embodiments the robotic lawnmower 100 is configured to take the corrective action by turning towards the cut area, to further overlap the edge or to ensure that the edge is overlapped. This can be utilized to accommodate or correct for any navigational error resulting in the uncut string of grass. In some alternative or additional such embodiments the robotic lawnmower 100 is configured to take the corrective action by returning to traverse over the uncut string of grass to ensure that it is cut.
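The edge-following behaviour described above may, purely as an illustrative sketch, be expressed as a proportional steering rule: the detected edge E is kept at a lateral offset such that the grass cutting device 160 (of width w) slightly overlaps the previously cut trail. All names, the gain and the sign conventions below are assumptions for illustration.

```python
# Illustrative proportional steering sketch for edge following. The patent
# only requires that the cutter overlaps the detected edge E; the desired
# overlap value and gain are made-up parameters.

def steering_correction(edge_offset_m: float, desired_overlap_m: float,
                        cutter_half_width_m: float, gain: float = 1.5) -> float:
    """Return a steering rate [rad/s] nudging the mower so that the cutter
    edge overlaps the previously cut trail by desired_overlap_m.

    edge_offset_m: lateral distance from the mower centreline to the detected
                   edge E, positive towards the cut side.
    """
    # The cutter edge on the cut side sits cutter_half_width_m from centre;
    # we want it desired_overlap_m past the detected edge E.
    target_offset = cutter_half_width_m - desired_overlap_m
    error = edge_offset_m - target_offset
    return gain * error  # steer towards the cut area when error > 0
```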
In the teachings herein, a string or narrow passage of uncut grass may be any area of uncut grass having a width less than the width w of the grass cutting device. Alternatively or additionally, a string of uncut grass is an area of grass that is between two cut areas. In some such embodiments, the string of uncut grass is an area of grass that should have been cut according to an operating schedule of the robotic lawnmower 100. In other words, it is an area that has already been marked or noted as covered while the robotic lawnmower executes a scheduled operation.
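Before turning to figure 3C, a small illustrative sketch of this definition: scanning one row of a cut/uncut mask, any run of uncut cells bounded by cut cells on both sides and narrower than the cutter width w is flagged as a string. The cell size and label values (0 = cut, 1 = uncut) are assumptions.

```python
# Minimal sketch of the "string" definition above. Labels and cell size are
# illustrative assumptions, not part of the original disclosure.

def find_strings(row, cell_m: float, w: float):
    """Yield (start_idx, end_idx) of uncut runs narrower than w, cut on both sides."""
    runs, start = [], None
    for i, label in enumerate(row):
        if label == 1 and start is None:
            start = i
        elif label != 1 and start is not None:
            runs.append((start, i)); start = None
    for s, e in runs:
        bounded = s > 0 and e < len(row)          # cut area on both sides
        if bounded and (e - s) * cell_m < w:
            yield s, e

print(list(find_strings([0, 0, 1, 1, 0, 0], cell_m=0.05, w=0.25)))  # [(2, 4)]
```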
Figure 3C shows an example of such a string (referenced E’) and that the robotic lawnmower 100 takes corrective action (as indicated by the arrow).
Figure 3D shows an example of a robotic lawnmower 100 having camera(s) 190 mounted in the front as well as camera(s) 190 mounted in the back of the robotic lawnmower 100. In an embodiment where cameras face both forwards and rearwards, corrective action can of course be taken sooner than with only one camera, but in some embodiments the use of front- and rear-facing cameras can also be utilized by the robotic lawnmower to provide feedback. The feedback is in some embodiments related to the navigation of the robotic lawnmower 100 and can be used to further improve the navigation of the robotic lawnmower 100.
Figure 3E shows an example of a different operating pattern, in this example being to follow an expanding or imploding pattern, such as following the edges of the operating area 205, or simply a rectangle, square or even a circle - all being known operating patterns - which is followed so that edges or strings between cut areas are minimized.
Figure 3F shows an example of a different operating pattern, in this example being a random or semi-random pattern, which is followed so that edges or strings between cut areas are minimized.
Figure 4A shows a schematic view of an example operating area 205, possibly one such as discussed in relation to figure 2, wherein a robotic work tool 100 operates and gathers image data, such as images, through at least one camera.
The images are at first gathered for training purposes, and later for operating purposes.
It should be noted that even though the description herein is focused on a single robotic work tool gathering the images in a single operating area, in some embodiments more than one robotic work tool gathers the images, and in some embodiments the images are gathered in/from different operating areas. It should also be noted that the images may be gathered during operation, during a training session, or a combination of such, where an operating session for one robotic work tool may be seen as a training session for another robotic work tool.
It may thus not be the same robotic work tool that gathers the images for training as the one that gathers the images during operation.
In some embodiments the training is performed in a factory operating area, and in some alternative or additional embodiments, the training is performed during operation for one or more customers. In embodiments where several robotic work tools are utilized, the training may be shared. Returning to figure 4A, it is illustrated how the robotic work tool 100 navigates an operating area gathering images through the camera(s) 190. In this example there are some objects in the field of view (FOV) of the camera(s) 190, namely a trail of cut grass (and thus an edge), a house referenced H, and a pathway referenced PW. It should be noted that for the purpose herein a portion of an image may be referred to as an object or as comprising an object, without there actually being a physical object. An object is therefore an image object representing a portion of an image having some features that may be classified, and more specifically that can be classified as processed (cut) area, unprocessed (uncut) area and other (area-not-to-be-processed). The terminology of object, portion of image, and portion of image representing an area will be used interchangeably herein.
Figure 4B shows a schematic view of an example image captured or gathered through a camera 190 as in the example situation in figure 4A. Illustrated in figure 4B are the trail of cut grass (illustrated with diagonal lines), the uncut grass (illustrated as a dotted area), the house, the pathway and also the sky (or other background, referenced BG) extending above the ground. In case the camera(s) 190 is not angled solely at the ground, the sky (or other background) will also be part of the image(s) captured.
If the angle of the camera relative to the ground is known, the background may be ignored. In some embodiments the angle between the camera and the ground can be determined based on a known mounting angle and a current angle of the robotic work tool (detectable through the use of deduced reckoning sensors such as inertial measurement units), and possibly also based on a known elevation, which may be known from the map application.
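As an illustrative sketch of ignoring the background, the image row of the horizon can be computed from the total camera pitch (mounting angle plus chassis pitch from the inertial measurement unit) using a pinhole camera model; pixels above that row may then be disregarded. The parameter values below are made up for illustration.

```python
import math

# Hedged sketch: locate the horizon row in the image from the camera pitch,
# assuming a pinhole camera. fy and cy are made-up intrinsics.

def horizon_row(pitch_rad: float, fy: float, cy: float) -> int:
    """Image row of the horizon for a pinhole camera.

    pitch_rad: camera tilt below horizontal (mount angle + chassis pitch)
    fy, cy:    vertical focal length and principal point [pixels]
    """
    # A ray towards the horizon is pitched up by pitch_rad in the camera
    # frame, so it projects above the principal point.
    return max(0, int(round(cy - fy * math.tan(pitch_rad))))

row = horizon_row(pitch_rad=math.radians(10.0), fy=800.0, cy=360.0)
# Everything above `row` can be skipped by the segmentation/edge logic.
print(row)  # 219
```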
As noted above, the robotic work tool is configured to perform image analysis on the received image(s) and to detect (including classifying) objects, i.e. different areas, in the image(s), possibly through segmentation, object classification or another image processing technique.
Figure 4C shows a schematic view of a segmentation of the camera view of figure 4B according to some example embodiments of the teachings herein. In figure 4C the objects, i.e. the areas, detected are indicated as CG for the trail of cut grass, UG for the area of uncut grass, and NG, as in not grass, for the house, the pathway and also the sky. As is also indicated in figure 4C, the house and the pathway are examples of objects that are not grass and also should not be cut, as indicated by NC. In general, and for other employments, the objects are classified as Processed (P), Unprocessed (U) and Not-to-be-processed (N), corresponding to CG, UG and NG respectively. For the embodiments relating to snow removers, the classifications correspond to snow removed, snow not removed, and not snow, respectively.
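Purely for illustration, the generic three-class convention (P/U/N) and the detection of the edge E as the border between processed and unprocessed pixels in a segmentation mask may be sketched as follows; the label values and mask layout are assumptions.

```python
import numpy as np

# Sketch of the three-class convention above and of locating the edge E as
# pixels where a processed cell borders an unprocessed cell. Label values
# are illustrative assumptions.

P, U, N = 0, 1, 2  # processed, unprocessed, not-to-be-processed

def edge_mask(seg: np.ndarray) -> np.ndarray:
    """Boolean mask of P pixels that touch a U pixel (4-neighbourhood)."""
    edge = np.zeros(seg.shape, dtype=bool)
    p = seg == P
    u = seg == U
    edge[:-1, :] |= p[:-1, :] & u[1:, :]   # U below
    edge[1:, :]  |= p[1:, :]  & u[:-1, :]  # U above
    edge[:, :-1] |= p[:, :-1] & u[:, 1:]   # U to the right
    edge[:, 1:]  |= p[:, 1:]  & u[:, :-1]  # U to the left
    return edge

seg = np.array([[0, 0, 1],
                [0, 1, 1],
                [2, 1, 1]])
print(edge_mask(seg).astype(int))  # marks (0,1) and (1,0)
```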
In some embodiments the robotic work tool is a robotic floor grinder. For such embodiments relating to a floor grinder, the classifications correspond to floor processed, floor not processed, and area-not-to-process, for example not floor, respectively.
In some embodiments the robotic work tool is a robotic gravel rake. For some such embodiments relating to a gravel rake (or other rake), the classifications correspond to gravel area raked, gravel area not raked, and gravel-area-not-to-process, for example not a gravel path/area, respectively.
The inventors of the present teachings have realized that a robotic work tool need not be able to detect all classes of physical objects in order to perform a more efficient operation, and that the robotic work tool need only be trained to detect which areas are processed and which are not in order to be able to follow the edge between processed and unprocessed areas. This enables faster and more efficient training of the robotic work tool (or rather the AI for the robotic work tool) and also more accurate usage and detection of image objects, i.e. areas.
In addition, the robotic work tool may also, in some embodiments, be trained to determine what is not grass, which will further enable more efficient operation, as edges to areas where there is no grass may be followed more efficiently and without wasting time or breaking up a pattern by attempting to operate in areas where there is nothing to process, such as a pathway for a lawnmower.
The robotic work tool is thus trained based on images where processed areas are indicated and where unprocessed areas are indicated. As noted above the robotic work tool may be configured to determine whether an area is processed or not based also on other inputs, such as by determining the height of the grass being cut. This can be used to provide feedback for the training - especially when being trained in the field. The training may thus also be performed based on feedback.
For some embodiments relating to a gravel rake (or other rake), the height is the height of gravel being raked.
For some embodiments relating to a snow remover, the height is the height of snow being removed.
Figure 5A shows a general flowchart of a method according to the teachings herein for use in a robotic working tool system, especially during operation, wherein the method comprises: receiving at least one image from the image sensor; detecting at least one object in the received at least one image; classifying the at least one (image) object as processed or unprocessed, wherein an object classified as processed represents a processed area and an object classified as unprocessed represents an unprocessed area; detecting an edge between a processed area and an unprocessed area; and navigating the robotic work tool so that the work tool overlaps the edge.
A robotic work tool system may thus in some embodiments be configured to perform the method according to figure 5A as discussed above for example in relation to figures 3A, 3B, 3C, 3D, 3E, and 3F.
Figure 5B shows a general flowchart of a method according to the teachings herein for use in a robotic working tool or in a server connected to the robotic work tool, especially for training the robotic work tool, wherein the method comprises: receiving at least one image, wherein the image comprises at least one or more of a processed area and an unprocessed area; detecting at least one object in the received at least one image; classifying the at least one (image) object as processed or unprocessed; and comparing to see if the classified object is as indicated, and training accordingly. As would be known, there are many ways to train the artificial intelligence based on the detected objects and how they correspond to the fed images with known areas.
In some embodiments, for example such where a robotic work tool provides the images, the robotic work tool is configured to navigate using RTK. This allows for knowing or determining a very exact position of the robotic work tool in an operational area, for example navigating according to a map. This allows for highly efficient data collection as regards gathering the image data, which can be used both for initial data collection and for continued training.
In such embodiments the received images are of a known angle relative to the robotic work tool, and by knowing the more or less exact position of the robotic work tool, the area covered by the image may also be determined relative to the position of the robotic work tool. By mapping this area of the image to an area in the map, and based on an operating schedule where areas are noted as processed or not, it can be determined whether the area in the image(s) is in a processed area or an unprocessed area, or whether the area in the image covers both processed and unprocessed areas. Furthermore, the extension of such processed or unprocessed areas may be determined in the image(s) based on the map and the exact location of the robotic work tool.
In some embodiments, it may be assumed that all areas are unprocessed, and the map is used to note where the robotic work tool has processed, that is, to determine a path travelled.
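A minimal sketch of such a coverage map follows: starting from an all-unprocessed grid, a swath of the work tool width w around each recorded position is marked as processed. The grid resolution and the sample path are illustrative assumptions.

```python
import numpy as np

# Sketch of the coverage map idea above: all cells start as unprocessed
# (False), and a swath of width w around each RTK position sample is marked
# as processed (True). Resolution and path values are made up.

def update_coverage(grid: np.ndarray, x: float, y: float, w: float, cell_m: float):
    """Mark cells within w/2 of position (x, y) [metres] as processed."""
    r = w / 2.0
    i0, i1 = int((y - r) / cell_m), int((y + r) / cell_m) + 1
    j0, j1 = int((x - r) / cell_m), int((x + r) / cell_m) + 1
    grid[max(i0, 0):i1, max(j0, 0):j1] = True

grid = np.zeros((100, 100), dtype=bool)            # 5 m x 5 m at 5 cm cells
for x, y in [(1.0, 1.0), (1.1, 1.0), (1.2, 1.0)]:  # sampled RTK path
    update_coverage(grid, x, y, w=0.24, cell_m=0.05)
print(grid.sum(), "cells marked as processed")
```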
The image data collected in this manner may thus be gathered during operation for subsequent training, for ongoing training or for training of another robotic work tool. Additionally or alternatively, the image data may be collected for initial training.
As a precursor to or as a supplement to the training, data collection is performed, possibly during operation, so as to collect image data and perform automatic segmentation labelling or object detection, by dividing the image(s) into segments of processed work area, non-processed work area, and possibly non-work area. For example, in the case of lawnmowing the three categories would correspond to cut grass, uncut grass, and non-grass respectively. (To clarify, the third category includes everything that would not be considered part of the work area surface, such as obstacles, humans, sky, buildings etc.)
In order to improve the quality of the data collected, in some embodiments it is assumed that the whole work area contains only non-processed surfaces and no processed area at all. In some embodiments, this should be confirmed by a human operator or some other method, such as by querying an operating schedule. As discussed above, during operation the position of the tool is recorded, and a map is updated that includes information about what part of the work area has been processed (the path travelled).
The image(s) is annotated by being mapped against a specific location or area in the map, based on the position of the camera(s) relative to the robotic work tool and the accurate position of the robotic work tool.
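An illustrative sketch of this automatic annotation is given below. It assumes a calibrated camera-to-ground transform is available (here a placeholder function passed as an argument) together with the coverage grid and horizon row discussed above; all names are assumptions.

```python
import numpy as np

# Hedged sketch of automatic annotation: each ground pixel is mapped to a map
# position via an assumed camera-to-ground transform (placeholder argument),
# and labelled from the coverage grid. Pixels above the horizon row get the
# non-work-area label N.

P, U, N = 0, 1, 2

def auto_label(h, w_px, pixel_to_ground, coverage, cell_m, horizon):
    """Build an (h, w_px) label image from the coverage map (see text above)."""
    labels = np.full((h, w_px), N, dtype=np.uint8)
    for v in range(horizon, h):                  # ground region only
        for u_px in range(w_px):
            x, y = pixel_to_ground(u_px, v)      # robot pose + calibration
            i, j = int(y / cell_m), int(x / cell_m)
            if 0 <= i < coverage.shape[0] and 0 <= j < coverage.shape[1]:
                labels[v, u_px] = P if coverage[i, j] else U
    return labels
```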
As the data is collected, it can be used to train the artificial intelligence as discussed above, where the image data is used to train, for example, a convolutional neural network (CNN) in the task of work area segmentation, to detect or predict which parts of the image belong to the three aforementioned categories. Examples of CNNs that can be used are U-Net, DeepLab and Mask R-CNN.
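For illustration, a minimal training loop for the three-class segmentation task is sketched below in Python with PyTorch. A small stand-in network is used instead of a full U-Net or DeepLab, and the batch is a random placeholder for the auto-labelled data; only the overall structure (per-pixel cross-entropy over the classes P, U and N) reflects the text above.

```python
import torch
import torch.nn as nn

# Minimal training sketch for three-class work area segmentation. The toy
# encoder stands in for a real U-Net/DeepLab; data loading is a placeholder.

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 1),                      # 3 logits per pixel: P, U, N
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

images = torch.rand(4, 3, 64, 64)             # placeholder image batch
labels = torch.randint(0, 3, (4, 64, 64))     # per-pixel class indices

for step in range(10):                        # tiny demonstration loop
    optimizer.zero_grad()
    logits = model(images)                    # (B, 3, H, W)
    loss = criterion(logits, labels)          # per-pixel cross-entropy
    loss.backward()
    optimizer.step()
```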
The artificial intelligence can thereafter be used to improve the navigation of the robotic work tool. The improved navigation is based on enabling the robotic work tool to find the edge between processed and non-processed work area to efficiently cover the whole work area.
In some embodiments, the robotic work tool is equipped with a forward-facing camera from which the images are fed into the CNN. The edge between processed and non-processed work area is predicted or detected by the CNN, and the edge is used to guide the robot in covering the work area.
As discussed above, the robotic work tool could for example move in a zig-zag pattern and on each pass adjust the distance to the previous pass based on where the edge between processed and non-processed area is located. The robot could also move in a circular pattern by first covering the outer boundary. From the second rotation onwards, the robot can use the predicted edge to control its pass while it spirals inwards towards the centre of the work area. (This could also be executed by starting from the centre and spiralling outwards.)
As discussed, the teachings herein allow the robot to cover the work area with minimal overlap between the already processed area and the processing mechanism of the robotic work tool, thus increasing its efficiency. A robotic work tool may thus in some embodiments be configured to perform the method according to figure 5B, as discussed above for example in relation to figures 4A, 4B and 4C.
As noted, in some embodiments, it is a server that trains the artificial intelligence to be used by the robotic work tool, and a server may thus in some embodiments be configured to perform the method according to figure 5B. In such embodiments it is to be understood that the robotic work tool is being trained through the server.

Claims

1. A robotic working tool system comprising a robotic working tool comprising at least one image sensor (190), a work tool (160) and a controller, wherein the controller is configured to receive at least one image from the image sensor (190); detect at least one object in the received at least one image; classify the at least one object as processed or unprocessed, wherein an object classified as processed represents a processed area and an object classified as unprocessed represents an unprocessed area; detect an edge between a processed area and an unprocessed area; and navigate the robotic work tool (100) so that the work tool (160) overlaps the edge.
2. The system according to claim 1, wherein the controller is further configured to detect a string of unprocessed area, and take corrective action.
3. The system according to claim 2, wherein the corrective action is to navigate closer to the processed area.
4. The system according to claim 2, wherein the corrective action is to navigate the robotic work tool to the string of unprocessed area.
5. The system according to any preceding claim, wherein the controller is further configured to: classify objects as not-to-be-processed, representing an area not-to-be-processed; detect a second edge between an unprocessed area and an area not-to-be-processed; and navigate the robotic work tool (100) so that the work tool (160) overlaps the second edge.
6. The system according to any preceding claim, wherein the controller is further configured to determine a height in a current area, and to provide feedback based on the height.
7. The system according to any preceding claim, wherein the robotic work tool is a robotic lawnmower.
8. The system according to claims 6 and 7, wherein the height is the height of grass being cut.
9. The system according to any of claims 1 to 6, wherein the robotic work tool is a robotic snow remover.
10. The system according to claims 6 and 9, wherein the height is the height of snow being removed.
11. The system according to any of claims 1 to 6, wherein the robotic work tool is a robotic floor grinder.
12. The system according to any of claims 1 to 6, wherein the robotic work tool is a robotic gravel rake.
13. The system according to claims 6 and 12, wherein the height is the height of gravel being raked.
14. The system according to any preceding claim, wherein the robotic work tool comprises an RTK navigation sensor, and wherein the controller is configured to: determine a path travelled by the robotic work tool; determine an image area covered by the image sensor(s); determine a location of the robotic work tool; map the image area to a map area based on the location of the robotic work tool; determine if the image area is of processed and/or unprocessed areas based on the travelled path.
15. A method for a robotic working tool system comprising a robotic working tool comprising at least one image sensor (190) and a work tool (160), wherein the method comprises: receiving at least one image from the image sensor (190); detecting at least one object in the received at least one image; classifying the at least one object as processed or unprocessed, wherein an object classified as processed represents a processed area and an object classified as unprocessed represents an unprocessed area; detecting an edge between a processed area and an unprocessed area; and navigating the robotic work tool (100) so that the work tool (160) overlaps the edge.
16. The method according to claim 15, wherein the robotic work tool comprises an RTK navigation sensor, and wherein the method comprises: determining a path travelled by the robotic work tool; determining an image area covered by the image sensor(s); determining a location of the robotic work tool; mapping the image area to a map area based on the location of the robotic work tool; determining if the image area is of processed and/or unprocessed areas based on the travelled path.
17. A method for training a robotic work tool, wherein the method comprises: receiving at least one image, wherein the image comprises at least one or more of a processed area and an unprocessed area; detecting at least one object in the received at least one image; classifying the at least one object as processed or unprocessed; and comparing to see if the classified object is as indicated, and training accordingly.
18. The method according to claim 17, wherein the robotic work tool comprises an RTK navigation sensor, and wherein the method comprises: determining a path travelled by the robotic work tool; determining an image area covered by the image sensor(s); determining a location of the robotic work tool; mapping the image area to a map area based on the location of the robotic work tool; determining if the image area is of processed and/or unprocessed areas based on the travelled path.
19. A server (240) configured to perform the method of claim 17 or 18.
PCT/SE2022/050930 2022-01-31 2022-10-14 Improved operation for a robotic work tool system WO2023146451A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE2250085A SE2250085A1 (en) 2022-01-31 2022-01-31 Improved operation for a robotic work tool system
SE2250085-4 2022-01-31

Publications (1)

Publication Number Publication Date
WO2023146451A1 (en)

Family

ID=84044277

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2022/050930 WO2023146451A1 (en) 2022-01-31 2022-10-14 Improved operation for a robotic work tool system

Country Status (2)

Country Link
SE (1) SE2250085A1 (en)
WO (1) WO2023146451A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4211921A (en) * 1978-02-03 1980-07-08 Iseki Agricultural Machinery Mfg. Co. Ltd. Sensor for use in controlling operation of mobile farming machine
JPH0759407A (en) * 1993-08-25 1995-03-07 Seibutsukei Tokutei Sangyo Gijutsu Kenkyu Suishin Kiko Traveling controller of automatic traveling working car
WO1998046065A1 (en) * 1997-04-16 1998-10-22 Carnegie Mellon University Agricultural harvester with robotic control
WO2007109624A2 (en) * 2006-03-17 2007-09-27 Irobot Corporation Robot confinement
US20120265391A1 (en) * 2009-06-18 2012-10-18 Michael Todd Letsky Method for establishing a desired area of confinement for an autonomous robot and autonomous robot implementing a control system for executing the same
EP3232290A1 (en) * 2014-12-11 2017-10-18 Yanmar Co., Ltd. Work vehicle
CN210255452U (en) * 2019-08-20 2020-04-07 苏州蝼蚁智能建筑科技有限公司 Terrace grinds dust absorption robot
CN210561937U (en) * 2019-04-03 2020-05-19 新疆大学 Intelligent snow sweeping robot based on multi-perception interaction

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009027396A1 (en) * 2009-07-01 2011-01-13 Robert Bosch Gmbh Autonomous mobile platform for surface treatment and surface treatment
DE102015119865B4 (en) * 2015-11-17 2023-12-21 RobArt GmbH Robot-assisted processing of a surface using a robot
WO2020090038A1 (en) * 2018-10-31 2020-05-07 本田技研工業株式会社 Autonomous work machine
WO2020142495A1 (en) * 2018-12-31 2020-07-09 Abb Schweiz Ag Multiple robot and/or positioner object learning system and method
CN112286175A (en) * 2019-07-09 2021-01-29 苏州宝时得电动工具有限公司 Automatic working system, intelligent snow sweeping robot and control method thereof
WO2021037116A1 (en) * 2019-08-27 2021-03-04 南京德朔实业有限公司 Self-propelled mowing system, and method of performing supplementary mowing operation on missed regions

Also Published As

Publication number Publication date
SE2250085A1 (en) 2023-08-01

Similar Documents

Publication Publication Date Title
CN112703881B (en) Intelligent mower, control method and system thereof and storage medium
CN113766825A (en) Energy-saving lawn maintenance vehicle
US20200404846A1 (en) Autonomous navigation system and the vehicle made therewith
WO2023146451A1 (en) Improved operation for a robotic work tool system
WO2022203562A1 (en) Improved navigation for a robotic work tool
CN114937258A (en) Control method for mowing robot, and computer storage medium
EP4381926A1 (en) Improved operation for a robotic work tool
US20240182074A1 (en) Operation for a robotic work tool
EP4368004A1 (en) Improved operation and installation for a robotic work tool
EP4332716A2 (en) Mapping objects encountered by a robotic garden tool
SE2250246A1 (en) Improved mapping for a robotic work tool system
WO2023167617A1 (en) Improved operation for a robotic lawnmower system
EP4268565B1 (en) Improved navigation for a robotic work tool system
EP4379489A1 (en) Improved definition of boundary for a robotic work tool
WO2023121528A1 (en) Improved navigation for a robotic work tool system
SE2151613A1 (en) Improved navigation for a robotic work tool system
WO2023068976A1 (en) Improved navigation for a robotic work tool system
US20230359221A1 (en) Navigation for a robotic work tool system
SE2250247A1 (en) Improved navigation for a robotic work tool system
US20220151146A1 (en) Energy Efficient Robotic Lawn Mower
SE546034C2 (en) Improved navigation for a robotic work tool system
SE2150497A1 (en) Improved obstacle handling for a robotic work tool
SE545376C2 (en) Navigation for a robotic work tool system
SE2250557A1 (en) Navigation for a robotic work tool system
SE545472C2 (en) System and method for navigating a robotic work tool

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22797526

Country of ref document: EP

Kind code of ref document: A1