SE2250834A1 - Improved determination of pose for a robotic work tool

Improved determination of pose for a robotic work tool

Info

Publication number
SE2250834A1
Authority
SE
Sweden
Prior art keywords
work tool
robotic work
operational area
controller
robotic
Prior art date
Application number
SE2250834A
Inventor
Odi Dahan
Sergey Liflandsky
Original Assignee
Husqvarna Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Husqvarna Ab filed Critical Husqvarna Ab
Priority to SE2250834A priority Critical patent/SE2250834A1/en
Priority to PCT/SE2023/050369 priority patent/WO2024010503A1/en
Publication of SE2250834A1 publication Critical patent/SE2250834A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 Control system inputs
    • G05D1/24 Arrangements for determining position or orientation
    • G05D1/243 Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
    • G05D1/2435 Extracting 3D information
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D75/00 Accessories for harvesters or mowers
    • A01D75/28 Control mechanisms for harvesters or mowers when moving on slopes; Devices preventing lateral pull
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D34/00 Mowers; Mowing apparatus of harvesters
    • A01D34/006 Control or measuring arrangements
    • A01D34/008 Control or measuring arrangements for automated or remotely controlled operation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 Control system inputs
    • G05D1/22 Command input arrangements
    • G05D1/221 Remote-control arrangements
    • G05D1/222 Remote-control arrangements operated by humans
    • G05D1/224 Output arrangements on the remote controller, e.g. displays, haptics or speakers
    • G05D1/2244 Optic
    • G05D1/2245 Optic providing the operator with a purely computer-generated representation of the environment of the vehicle, e.g. virtual reality
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 Control system inputs
    • G05D1/24 Arrangements for determining position or orientation
    • G05D1/246 Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/40 Control within particular dimensions
    • G05D1/46 Control of position or course in three dimensions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/579 Depth or shape recovery from multiple images from motion
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00 Specific applications of the controlled vehicles
    • G05D2105/15 Specific applications of the controlled vehicles for harvesting, sowing or mowing in agriculture or forestry
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2107/00 Specific environments of the controlled vehicles
    • G05D2107/20 Land use
    • G05D2107/23 Gardens or lawns
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2109/00 Types of controlled vehicles
    • G05D2109/10 Land vehicles
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2111/00 Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
    • G05D2111/10 Optical signals
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2111/00 Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
    • G05D2111/60 Combination of two or more signals
    • G05D2111/63 Combination of two or more signals of the same type, e.g. stereovision or optical flow
    • G05D2111/65 Combination of two or more signals of the same type, e.g. stereovision or optical flow taken successively, e.g. visual odometry or optical flow

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A robotic work tool system comprising a robotic work tool (100) arranged to operate in an operational area (205), the operational area (205) having a surface that is at least partially irregular, the robotic work tool comprising a controller (110), a memory (120), three or more wheels (130) and deduced reckoning sensors (180), the memory (120) being configured to store a map application (120A) and to store data regarding the positions of the wheels (130) on the robotic work tool (100), wherein the controller (110) is configured to: receive sensor input from the deduced reckoning sensors (180); determine a curvature (C) for the surface of the operational area (205); and determine a pose based on the sensor input, wherein the pose is determined based on an analysis of the sensor input, the analysis being constrained by the positions of the three or more wheels being on the surface of the operational area, and wherein the controller is further configured to determine that a wheel is on the surface taking into account the curvature (C) of the surface.

Description

TECHNICAL FIELD This application relates to a robotic work tool, such as a lawnmower, and a method for providing an improved determination of a pose for the robotic work tool, and in particular to a robotic work tool, such as a lawnmower, and a method for providing an improved determination of a pose for the robotic work tool utilizing deduced reckoning, and in particular to utilizing a mono camera as a visual odometry device.
BACKGROUND Automated or robotic work tools such as robotic lawnmowers are becoming increasingly popular, and so is the use of robotic work tools in various types of operational areas.
Such operational areas, in particular for robotic work tools being robotic lawnmowers, often include irregular surfaces such as slopes. Navigation techniques relying on odometry, such as deduced reckoning, suffer from drawbacks in such irregular areas, as the assumptions made in various odometry-based navigation techniques presume a flat operating surface. Furthermore, navigation techniques relying on visual odometry, such as Visual Simultaneous Localization and Mapping (V-SLAM), also suffer from drawbacks in that advanced cameras are needed for establishing a scale of an observed area, as without a proper scale the interpretation is difficult to match to a map. Such advanced cameras increase the cost of the robotic work tool.
Thus, there is a need for an improved manner of providing advanced navigational functionality utilizing odometry-based navigation while still utilizing cheap or uncomplicated sensors, such as a mono camera.
SUMMARY It is therefore an object of the teachings of this application to overcome or at least reduce those problems by providing a robotic work tool system comprising a robotic work tool arranged to operate in an operational area, the operational area having a surface that is at least partially irregular, the robotic work tool comprising a controller, a memory, three or more wheels and deduced reckoning sensors, the memory being configured to store a map application and to store data regarding the positions of the wheels on the robotic work tool, wherein the controller is configured to: receive sensor input from the deduced reckoning sensors; determine a curvature for the surface of the operational area; and determine a pose based on the sensor input, wherein the pose is determined based on an analysis of the sensor input, the analysis being constrained by the positions of the three or more wheels being on the surface of the operational area, wherein the controller is further configured to determine that a wheel is on the surface taking into account the curvature of the surface.
In some embodiments the robotic work tool further comprises a visual odometry sensor being a mono camera, wherein the sensor input comprises at least one image, and wherein the controller is further configured to determine the pose based on the at least one image, wherein the pose is determined based on an analysis of the image.
In some embodiments the controller is further configured to determine the curvature based on the Gauss-Bonnet theorem.
In some embodiments the pose includes rotation and translation.
In some embodiments an initial pose is determined based on the sensor input.
In some embodiments the controller is further configured to determine a curvature for the surface of the operational area based on the received sensor input in combination with mapping the operational area.
In some embodiments the controller is further configured to determine a curvature for the surface of the operational area based on the map application.
In some embodiments the map application comprises indications of the curvature.
In some embodiments the sensor input is stored in the memory, and the sensor input is received from the deduced reckoning sensors through having been stored in the memory as part of the map application during a mapping of the operational area.
In some embodiments the robotic work tool is a robotic lawnmower.
It is also an object of the teachings of this application to overcome the problems by providing a method for use in a robotic work tool system comprising a robotic work tool arranged to operate in an operational area, the operational area having a surface that is at least partially irregular, the robotic work tool comprising a controller, a memory, three or more wheels and deduced reckoning sensors, the memory being configured to store a map application and to store data regarding the positions of the wheels on the robotic work tool, wherein the method comprises: receiving sensor input from the deduced reckoning sensors; determining a curvature for the surface of the operational area; and determining a pose based on the sensor input, wherein the pose is determined based on an analysis of the sensor input, the analysis being constrained by the positions of the three or more wheels being on the surface of the operational area, wherein the method further comprises determining that a wheel is on the surface taking into account the curvature of the surface.
It is also an object of the teachings of this application to overcome the problems by providing a computer-readable medium carrying computer instructions that when loaded into and executed by a controller of a robotic work tool enables the robotic work tool to implement the method according to herein.
Further embodiments and aspects are as in the attached patent claims and as discussed in the detailed description.
Other features and advantages of the disclosed embodiments will appear from the following detailed disclosure, from the attached dependent claims as well as from the drawings. Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the [element, device, component, means, step, etc.]" are to be interpreted openly as referring to at least one instance of the element, device, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
BRIEF DESCRIPTION OF THE DRAWINGS The invention will be described in further detail under reference to the accompanying drawings, in which: Figure 1A shows a schematic view of the components of an example of a robotic work tool being a robotic lawnmower according to some example embodiments of the teachings herein; Figure 1B shows a schematic side view of an example of a robotic work tool operating on a flat surface; Figure 1C shows a schematic side view of an example of a robotic work tool operating on an irregular surface; Figure 2 shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein; Figure 3 shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein; Figure 4A shows a schematic view of a graph of geodesics according to some example embodiments of the teachings herein; Figure 4B shows a schematic view of a graph of geodesics according to some example embodiments of the teachings herein; Figure 5 shows a corresponding flowchart for a method according to some example embodiments of the teachings herein; and Figure 6 shows a schematic view of a computer-readable medium carrying computer instructions that, when loaded into and executed by a controller of a robotic work tool, enables the robotic work tool to implement the teachings herein.
DETAILED DESCRIPTION The disclosed embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Like reference numbers refer to like elements throughout.
It should be noted that even though the description given herein is focused on robotic lawnmowers, the teachings herein may also be applied to robotic ball collectors, robotic mine sweepers, robotic farming equipment, or other robotic work tools.
Figure 1A shows a schematic overview of a robotic work tool 100, here exemplified by a robotic lawnmower 100. The robotic work tool 100 may be of a multi-chassis type or a mono-chassis type (as in figure 1A). A multi-chassis type comprises more than one main body part that are movable with respect to one another. A mono-chassis type comprises only one main body part.
It should be noted that robotic lawnmowers may be of different sizes, where the size ranges from merely a few decimetres for small garden robots to even more than 1 metre for large robots arranged to service, for example, airfields.
It should be noted that even though the description herein is focussed on the example of a robotic lawnmower, the teachings may equally be applied to other types of robotic work tools, such as robotic watering tools, robotic golf ball collectors, and robotic mulchers to mention a few examples.
In some embodiments, and as will be discussed below, the robotic work tool is a semi-controlled or at least supervised autonomous work tool, such as farming equipment or large lawnmowers, for example riders or tractors being autonomously controlled.
It should also be noted that the robotic work tool is a self-propelled robotic work tool, capable of autonomous navigation within an operational area, where the robotic work tool propels itself across or around the operational area in a pattern (random or predetermined).
The robotic work tool 100, exemplified as a robotic lawnmower 100, has a main body part 140, possibly comprising a chassis 140 and an outer shell 140A, and a plurality of wheels 130 (in this example four wheels 130, but other numbers of wheels are also possible, such as three or six).
The main body part 140 substantially houses all components of the robotic lawnmower 100. At least some of the wheels 130 are drivably connected to at least one electric motor 155 powered by a battery 150. It should be noted that even if the description herein is focused on electric motors, combustion engines may alternatively be used, possibly in combination with an electric motor. In the example of figure 1, each of the wheels 130 is connected to a common or to a respective electric motor 155 for driving the wheels 130 to navigate the robotic lawnmower 100 in different manners.
The wheels, the motor 155 and possibly the battery 150 are thus examples of components making up a propulsion device. By controlling the motors 155, the propulsion device may be controlled to propel the robotic lawnmower 100 in a desired manner, and the propulsion device will therefore be seen as synonymous with the motor(s) 155. It should be noted that wheels 130 driven by electric motors is only one example of a propulsion system, and other variants are possible, such as caterpillar tracks.
The robotic lawnmower 100 also comprises a controller 110 and a computer readable storage medium or memory 120. The controller 110 may be implemented using instructions that enable hardware functionality, for example, by using executable computer program instructions in a general-purpose or special-purpose processor that may be stored on the memory 120 to be executed by such a processor. The controller 110 is configured to read instructions from the memory 120 and execute these instructions to control the operation of the robotic lawnmower 100 including, but not being limited to, the propulsion and navigation of the robotic lawnmower.
The controller 110 in combination with the electric motor 155 and the wheels 130 forms the base of a navigation system (possibly comprising further components) for the robotic lawnmower, enabling it to be self-propelled as discussed.
The controller 110 may be implemented using any suitable, available processor or Programmable Logic Circuit (PLC). The memory 120 may be implemented using any commonly known technology for computer-readable memories such as ROM, FLASH, DDR, or some other memory technology.
The robotic lawnmower 100 is further arranged with a wireless communication interface 115 for communicating with other devices, such as a server, a personal computer, a smartphone, the charging station, and/or other robotic work tools. Examples of such wireless communication technologies are Bluetooth®, WiFi® (IEEE 802.11b), Global System Mobile (GSM) and LTE (Long Term Evolution), to name a few. The robotic lawnmower 100 may be arranged to communicate with a user equipment (not shown, but regarded as an example of a connected device, as is a server) as discussed in relation to figure 2 below for providing information regarding status, location, and progress of operation to the user equipment as well as receiving commands or settings from the user equipment. Alternatively or additionally, the robotic lawnmower 100 may be arranged to communicate with a server (referenced 240 in figure 2) for providing information regarding status, location, and progress of operation as well as receiving commands or settings.
The robotic lawnmower 100 also comprises a work tool 160, which in the example of the robotic lawnmower 100 is a grass cutting device 160, such as a rotating blade 160/2 driven by a cutter motor 160/1. In embodiments where the robotic work tool 100 is exemplified as an automatic grinder, the work tool 160 is a rotating grinding disc.
For enabling the robotic lawnmower 100 to navigate with reference to a wire, such as a boundary wire or a guide wire, emitting a magnetic field caused by a control signal transmitted through the wire, the robotic lawnmower 100 is, in some embodiments, configured to have at least one magnetic field sensor 170 arranged to detect the magnetic field, to detect the wire, and/or to receive (and possibly also send) information to/from a signal generator. In some embodiments, such a magnetic boundary is used to provide a border (not shown explicitly in figure 2, but deemed to be included in the boundary 220) enclosing an operational area (referenced 205 in figure 2).
In some embodiments the robotic lawnmower 100 comprises a satellite signal navigation sensor 175 configured to provide navigational information (such as position) based on receiving one or more signals from a satellite, possibly in combination with receiving a signal from a base station. In some embodiments the satellite navigation sensor is a GPS (Global Positioning System) device or other Global Navigation Satellite System (GNSS) device. In some embodiments the satellite navigation sensor 175 is an RTK sensor. This enables the robotic work tool to operate in an operational area bounded by a virtual border (not shown explicitly in figure 2 but deemed to be included in the boundary 220).
The robotic lawnmower 100 also comprises deduced reckoning sensors 180. The deduced reckoning sensors may be odometers, accelerometers or other deduced reckoning sensors. In some embodiments, the robotic work tool comprises a visual odometry sensor 185, possibly comprised in or connected to the deduced reckoning sensors 180. In some embodiments, the visual odometry sensor is a mono camera 185, where mono relates to a camera having a single Field-Of-View (FOV), in contrast to a stereo camera. Such cameras have the benefit of being cheap.
In some embodiments, the deduced reckoning sensors are comprised in the propulsion device, wherein deduced reckoning navigation may be provided by knowing the current supplied to a motor and the time during which the current is supplied, which gives an indication of the speed and thereby the distance covered by the corresponding wheel.
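As a minimal illustration only of this current-and-time form of deduced reckoning, the sketch below estimates the distance covered by one wheel; the linear current-to-speed constant k_speed is a hypothetical calibration parameter, not something specified in the application.

```python
def wheel_distance(current_a: float, duration_s: float,
                   k_speed: float = 0.05) -> float:
    """Crude deduced-reckoning distance for one wheel.

    current_a  -- current supplied to the wheel motor (A)
    duration_s -- time during which the current is supplied (s)
    k_speed    -- assumed motor constant mapping current to wheel
                  speed (m/s per A); purely illustrative
    """
    speed = k_speed * current_a   # estimated wheel speed (m/s)
    return speed * duration_s     # estimated distance (m)
```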
The deduced reckoning sensors 180, especially in combination with the visual odometry sensor 185, enable the robot to operate according to a map of the operational area. In some such embodiments, the navigation is based on SLAM, and in some embodiments, where a visual odometry sensor (such as a camera) 185 is utilized, the navigation is based on V-SLAM.
The robotic lawnmower 100 is in some embodiments arranged to operate according to a map application (indicated in figure 2 and referenced 120A) representing one or more operational areas (and possibly the surroundings of the operational area(s)) as well as features of the operational area(s), stored in the memory 120 of the robotic lawnmower 100. In some embodiments, the map is also or alternatively stored in the memory of a server (referenced 240 in figure 2). The map application may be generated or supplemented as the robotic lawnmower 100 operates or otherwise moves around in the operational area. In some embodiments, the map application is downloaded, possibly from the server. In some embodiments, the map application also includes one or more transport areas. The robotic work tool 100 is arranged to navigate according to the map based on the deduced reckoning sensors 180.
In some embodiments the robotic work tool is arranged or configured to traverse and operate in operational areas that are not essentially flat, but contain terrain of varying altitude, such as undulating terrain comprising hills or slopes. The ground of such terrain is not flat, and it is not straightforward how to determine an angle between a sensor mounted on the robotic work tool and the ground. The robotic work tool is also or alternatively arranged or configured to traverse and operate in an operational area that contains obstacles that are not easily discerned from the ground. Examples of such are grass- or moss-covered rocks, roots or other obstacles that are close to the ground and of a similar colour or texture as the ground. The robotic work tool is also or alternatively arranged or configured to traverse and operate in an operational area that contains obstacles that are overhanging, i.e. obstacles that may not be detectable from the ground up, such as low hanging branches of trees or bushes. Such a garden is thus not simply a flat lawn to be mowed or similar, but an operational area of unpredictable structure and characteristics. The operational area exemplified with reference to figure 2 may thus be such a non-uniform operational area, as disclosed in this paragraph, that the robotic work tool is arranged to traverse and/or operate in.
Figure 1B shows a situation where the robotic work tool 100 is navigating a flat surface of an operational area utilizing a (mono) camera 185. Knowing one or more angles α, β of the Field-Of-View (FOV) of the camera 185, as well as knowing the height of the camera's placement from the ground, enables an accurate determination of the length L of the area covered by the FOV, which in turn enables an easy determination of any scale S in the FOV. It should be noted that it is not necessary to know exactly the height and/or the angles α, β indicated in figure 1B; the same may be determined based on other measurements. However, the disclosure herein will focus on these measurements to illustrate how a problem is overcome.
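A minimal sketch of this flat-ground geometry follows, assuming the angles α and β are measured between the vertical and the near and far edges of the FOV; the 0.25 m mounting height and the 480-row image are illustrative values only, not taken from the application.

```python
import math

def ground_coverage(h: float, alpha: float, beta: float) -> float:
    """Length L of flat ground covered by the camera FOV, for a
    camera at height h whose near and far FOV edges make angles
    alpha and beta with the vertical (angles in radians)."""
    return h * (math.tan(beta) - math.tan(alpha))

# Example: camera mounted 0.25 m above flat ground
L = ground_coverage(0.25, math.radians(30), math.radians(60))
scale = L / 480  # metres of ground per image row, for a 480-row image
```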
Figure 1C shows a situation where the robotic work tool 100 is navigating an irregular surface of an operational area utilizing a (mono) camera 185. Knowing the measurements will not enable the robotic work tool 100 to determine a proper scale S' in the FOV, as the measurements no longer correlate. Also, simply determining the angle of the robotic work tool 100 (utilizing for example a gyro or other IMU) will not solve the problem, as such a determination also requires that the surface slopes at a constant gradient, and not be irregular as in figure 1C. In the situation of figure 1C, the robotic work tool 100 will thus not be able, or will at least experience difficulties, to determine a current pose of the robotic work tool 100, the pose including the current orientation as well as the location of the robotic work tool 100. This reduces the accuracy of any SLAM-based navigation, as SLAM navigation is dependent on determining poses for the robotic work tool 100 over time.
Figure 2 shows a robotic work tool system 200 in some embodiments. The schematic view is not to scale. The robotic work tool system 200 comprises one or more robotic work tools 100 according to the teachings herein arranged to operate in one or more operational areas 205, possibly bounded by a boundary 220. It should be noted that the operational area 205 shown in figure 2 is simplified for illustrative purposes.
The view of the operational area 205 is also intended to be an illustration or graphical representation of the map application 120A discussed in the above.
A server 240 is shown as an optional connected device for the robotic work tool 100 to communicate with, possibly for receiving maps or map updates. The server 240 comprises a controller 240A for controlling the operation of the server 240, a memory 240B for storing instructions and data relating to the operation of the server 240, and a communication interface 240C for enabling the server 240 to communicate with other entities, such as the robotic work tool 100 and/or a user equipment such as a mobile phone. The controller, the memory and the communication interface may be of similar types as discussed in relation to figure 1 for the robotic work tool 100.
As with figure 1, the robotic work tool(s) is exemplified by a robotic lawnmower, whereby the robotic work tool system may be a robotic lawnmower system or a system comprising a combination of robotic work tools, one being a robotic lawnmower, but the teachings herein may also be applied to other robotic work tools adapted to operate within an operational area.
As is shown in figure 2, there may be obstacles, such as houses, structures or trees to mention a few examples, in the operational area 205. In figure 2 such obstacles are indicated and referenced H (as in house). There may also be one or more irregularities in the surface of the operational area, which are exemplified in figure 2 as two slopes S1, S2. It should be noted that any processing may be done in any, some or all of the controller 110 of the robotic work tool 100 and/or the controller 240A of the server 240, and that the processing may also be done partially in one controller 110/240A for supplemental processing in the other controller 110/240A. This is indicated in figure 2 in that a dashed arrow is shown between the server 240 and the robotic work tool 100, indicating that information may be passed freely between them for (partial) processing.
The inventors have realized that by knowing at least certain features of the geometry of the robotic work tool 100, certain constraints can be put on the determination of the pose, based on the sensor input received from the deduced reckoning sensors 180, and possibly from the camera 185.
As discussed in the above, the scale is difficult to determine when the robotic work tool is operating on an irregular surface. As the scale is difficult to determine, so is the pose of the robotic work tool, and as the pose is difficult to determine, it will also be difficult to correlate the movements of the robotic work tool 100 to a map application. For example, if an incorrect orientation of the robotic work tool 100 is determined, any sensor input, for example relating to wheel turn counts, will result in a translation of the robotic work tool 100 in the wrong direction. Furthermore, and as the inventors have realized, an orientation in a direction pointing away from the surface will result in the robotic work tool having a determined position that is impossible.
Figure 3 shows a situation where a robotic work tool 100 is navigating an irregular surface. The pose P of the robotic work tool 100 is to be determined based on the deduced reckoning sensors 180, and/or the camera 185, even when operating on an irregular surface.
As mentioned above, the inventors have realized that by knowing at least some features of the geometry of the robotic work tool 100, certain constraints may be put on the determinations. In the example of figure 3, the two geometry features are the locations WP1, WP2 of the wheels 130 of the robotic work tool 100. The constraint put on the determinations based on these geometry features is that the wheels have to be on the surface.
As discussed in the above, the robotic work tool 100 is arranged to operate according to a map application 120A. The map application 120A is in some embodiments generated by the robotic work tool 100 itself, for example through V-SLAM, and in some embodiments downloaded from the server 240.
As the controller 110 determines the pose P of the robotic work tool (the pose including the location as well as the orientation), the pose is correlated to the map application 120A.
As mentioned in the above, the correlation is made based on the constraints. In order to be able to do this, utilizing only input from the deduced reckoning sensors 180 and/or possibly the (mono) camera 185, the inventors have proposed to enable the controller 110 to determine a curvature C at a current position or region in the map application 120A. The curvature C is in some embodiments determined by the controller 110 based on map features. The curvature C is in some embodiments determined by the controller 110 by being retrieved from the map application 120A, whereby the curvature of a position or region is stored in the map application 120A.
As mentioned in the above, the pose P includes the position as well as the orientation of the robotic work tool 100. In some embodiments the position and the orientation are represented by a rotation matrix R and a translation vector T, from which the current orientation and position of a pose are determined based on a previous pose.
The robotic work tool 100 is thus enabled or configured to, for the purpose of locating the robotic work tool 100 in the map application 120A, estimate the pose of the robot by such a rotation matrix R and a translation vector T, the rotation matrix R representing a total rotation from some initial or previous pose orientation and the translation vector T representing a total translation relative to some initial or previous pose position.
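The following sketch shows how such a pose [R, T] is typically applied and chained; the function names are illustrative and not taken from the application.

```python
import numpy as np

def apply_pose(R: np.ndarray, T: np.ndarray, p_body: np.ndarray) -> np.ndarray:
    """Map a point from the robot's body frame into map/world
    coordinates using the accumulated pose [R, T]."""
    return R @ p_body + T

def compose(R1, T1, R2, T2):
    """Chain an incremental odometry step [R2, T2] onto the total
    pose [R1, T1]; rotations are 3x3 matrices."""
    return R1 @ R2, R1 @ T2 + T1
```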
The controller 110 is thus enabled to determine a pose of the robotic work tool based on the curvature C of the region of the operational area 205 that the robotic work tool 100 is currently in, and by applying the constraints inferred by the geometry of the robotic work tool 100. In some embodiments the curvature is determined based on the Gauss-Bonnet theorem.
Figure 4A shows a schematic view of a graph of a determined curvature for a position or region of a map application. As discussed above, the robotic work tool is configured to determine the pose of the robotic work tool relative to real-world coordinates, and (as would be understood by a skilled person) to verify, by solving a system of equations, whether the constraints are met; in this example to verify that the wheels 130 of the robotic work tool 100 are tangent to the surface. The problem is relatively simple in the planar case, but its generalization to curved or irregular surfaces is rather involved. If the constraint, in this example the tangency condition of the wheels, is not satisfied, the robotic work tool is configured to solve an optimization problem on pose refinement.
In order to solve this optimization problem, the controller 110 is configured to determine a rotation matrix R' and a translation vector T', close to R and T respectively, that satisfy the tangency condition and the nonholonomic constraint that arises from the motion on the surface, as will be described later. This new estimate is then fed into a Kalman filter. In case of an accurate mapping of the environment, such a correction is expected to be correct on average and improve the estimate.
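As an illustration of this fusion step, a minimal Kalman measurement update is sketched below, treating the refined pose (flattened into a vector z) as a direct measurement of the odometry state; the identity measurement model is an assumption made for brevity.

```python
import numpy as np

def kalman_update(x, P, z, R_meas):
    """Minimal Kalman measurement update with an identity measurement
    model: x, P are the odometry state and covariance, z is the
    refined pose estimate and R_meas its assumed covariance."""
    S = P + R_meas                      # innovation covariance
    K = P @ np.linalg.inv(S)            # Kalman gain
    x_new = x + K @ (z - x)             # corrected state
    P_new = (np.eye(len(x)) - K) @ P    # corrected covariance
    return x_new, P_new
```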
In one example, the distance d1 from one (rear) wheel to a center line (referenced "axis" in figure 1A) of the robotic work tool 100 is 223 mm, and the distance d2 from another (front) wheel to the center line of the robotic work tool 100 is 150 mm. The distance a between the wheel axles is 331 mm and the radius r of a wheel is 119 mm. In this example, the points (referenced WP in figure 3) of contact of the wheels with the ground (or at least the lowest points on the wheels) are expressed in the coordinate system of the robotic work tool 100 as p1 = (0, d1, -r) for the right rear wheel, p2 = (0, -d1, -r) for the left rear wheel, p3 = (a, d2, -r) for the front right wheel, and p4 = (a, -d2, -r) for the front left wheel.
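These contact points, and the mapping Pi = R pi + T used further below, can be written out directly; the sketch uses the example dimensions converted to metres.

```python
import numpy as np

d1, d2, a, r = 0.223, 0.150, 0.331, 0.119  # example dimensions (m)

# Contact points p1..p4 in the robot's own coordinate system
p_wheels = np.array([
    [0.0,  d1, -r],   # p1: right rear wheel
    [0.0, -d1, -r],   # p2: left rear wheel
    [a,    d2, -r],   # p3: front right wheel
    [a,   -d2, -r],   # p4: front left wheel
])

def contact_points_world(R: np.ndarray, T: np.ndarray) -> np.ndarray:
    """World coordinates P_i = R p_i + T of the wheel contact points
    for a pose given by rotation matrix R and translation vector T."""
    return (R @ p_wheels.T).T + T
```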
Assuming that the robotic work tool 100 does not start operating in an irregular area, an initial pose is determined based on the sensor input from the deduced reckoning sensors 180 and/or the camera 185, hereafter referred to as the (visual) odometry. This is a fair assumption, as the robotic work tool most likely starts to operate from a charging station located in a flat area. A relative rotation and translation are updated continuously, or at (regular) intervals, by the controller 110 during operation in the operational area 205.
In the following, the relative rotation matrix from the initial position will be denoted R and the relative translation vector from the initial position will be denoted T. This enables the robotic work tool 100 to localize the pose in the real-world coordinate system, or rather the coordinate system of the map application 120A.
In addition, consecutive sensor input, such as camera or image frames, is in some embodiments utilized to create a 3-dimensional point cloud of matched feature points in the consecutive sensor input, or in consecutive image frames. When the scale of the translation T is known, it is possible to localize the point cloud in 3D space in global 3D coordinates, which will be referred to as the world coordinate system, in which the area of operation of the robotic work tool is most naturally represented, i.e. the coordinate system of the map application 120A. When a sufficiently large point cloud is collected, it becomes possible to extrapolate the area by spline approximations.
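A sketch of the spline step follows, using SciPy's smoothing bivariate spline as one possible realization (the application does not prescribe a particular spline library); the random point cloud is placeholder data.

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

# point_cloud: (N, 3) array of triangulated feature points in world
# coordinates, accumulated from consecutive frames
point_cloud = np.random.rand(500, 3)  # placeholder data

# Fit Z as a smooth function of (X, Y); a single low-order patch is
# described by a handful of coefficients instead of hundreds of points
surf = SmoothBivariateSpline(point_cloud[:, 0], point_cloud[:, 1],
                             point_cloud[:, 2], kx=3, ky=3)

def Z(x, y):
    """Spline approximation of the surface height at (x, y)."""
    return surf.ev(x, y)
```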
The spline approximation enables more compact storage of the geometry of the operating area, where the representation of hundreds of points can be replaced by storing only 6 to 10 coefficients per patch. Such a representation gives on average a more accurate representation of the geometry of the area in which the robot operates.
In situations where the operating environment of the robot is unchanging and the robot performs an action repetitively at the same location, this knowledge of the surface geometry can be used. It is also possible to boost on average the precision of the position estimation by enforcing the geometry constraints. Suppose that in global coordinates the point cloud patch is approximated by the surface (X, Y, Z(X, Y)). In addition, suppose that the total pose transformation of the robot is given by [R, T]. Then the coordinates of the wheel contact points in the real world are given by $P_i = R p_i + T$ for $i = 1, \dots, 4$. Rewriting the components, we obtain a system of equations

$Z(P_{1,x}(R,T), P_{1,y}(R,T)) = P_{1,z}(R,T)$
$Z(P_{2,x}(R,T), P_{2,y}(R,T)) = P_{2,z}(R,T)$
$Z(P_{3,x}(R,T), P_{3,y}(R,T)) = P_{3,z}(R,T)$
$Z(P_{4,x}(R,T), P_{4,y}(R,T)) = P_{4,z}(R,T)$

We assume that the R, T computed by the visual odometry are close to the optimum. We can therefore solve this system of equations by a gradient descent method once the rotation matrix R is parametrized by quaternions to enforce the constraint on the rotation matrix. We therefore solve the minimization problem $\min f(R(q), T)$, where q and T are free parameters and f is a measure of the violation of the tangency conditions (e.g. the sum of squared residuals of the system above), by gradient descent, with the initial point $R(q_0), T_0$ estimated by the visual odometry. The process should converge to the minimum fast, as the visual odometry provides a good initial guess.
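A sketch of this refinement follows, with Z and p_wheels as defined in the sketches above; SciPy's BFGS quasi-Newton minimizer is used in place of plain gradient descent, and the unit-quaternion constraint is enforced by normalizing q inside the cost function.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def refine_pose(q0, T0, Z, p_wheels):
    """Refine the visual-odometry pose [R(q0), T0] so that the wheel
    contact points lie on the surface Z(x, y) (tangency condition)."""
    def f(params):
        q, T = params[:4], params[4:]
        R = Rotation.from_quat(q / np.linalg.norm(q)).as_matrix()
        P = (R @ p_wheels.T).T + T            # contact points in world
        res = Z(P[:, 0], P[:, 1]) - P[:, 2]   # tangency residuals
        return float(np.sum(res ** 2))

    out = minimize(f, np.concatenate([q0, T0]), method="BFGS")
    q = out.x[:4] / np.linalg.norm(out.x[:4])
    return Rotation.from_quat(q).as_matrix(), out.x[4:]
```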
In order to impose or apply the various constraints, such as a motion constraint, the equation for the geodesics GD shown in figure 4A, the estimated translation vector T, and the angles are estimated. For each patch of surface, we store the corresponding equation in a symbolic representation, as well as a numerical one. This makes it easy to construct and solve the equations for the geodesic lines connecting the points, which will be used to enforce the constraint related to the motion on the surface. This can be done by letting $r(u,v) = (x(u,v), y(u,v), z(u,v))$ be a regular parametrization of a patch of regular surface. In such a case we have two linearly independent tangent vectors

$r_u = (x_u(u,v), y_u(u,v), z_u(u,v))$ and $r_v = (x_v(u,v), y_v(u,v), z_v(u,v))$.

The coefficients $E = r_u \cdot r_u$, $F = r_u \cdot r_v$, $G = r_v \cdot r_v$ are called the coefficients of the first fundamental form, or the metric. The Christoffel symbols can then be computed by

$\Gamma^1_{11} = \frac{G E_u - 2F F_u + F E_v}{2(EG - F^2)}$, $\Gamma^2_{11} = \frac{2E F_u - E E_v - F E_u}{2(EG - F^2)}$,
$\Gamma^1_{12} = \frac{G E_v - F G_u}{2(EG - F^2)}$, $\Gamma^2_{12} = \frac{E G_u - F E_v}{2(EG - F^2)}$,
$\Gamma^1_{22} = \frac{2G F_v - G G_u - F G_v}{2(EG - F^2)}$, $\Gamma^2_{22} = \frac{E G_v - 2F F_v + F G_u}{2(EG - F^2)}$.

Using the estimated initial and final points, we can find the unique shortest geodesic connecting the points by numerically solving the geodesic equation

$\ddot{u}^k + \Gamma^k_{ij} \dot{u}^i \dot{u}^j = 0$.

There is a theorem that guarantees the existence of such a geodesic. We then obtain the geodesic triangle as it appears in figure 4A. This triangle is in fact a geodesic triangle, and we use the Gauss-Bonnet theorem to impose a constraint that relates the angles of the triangle.
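The sketch below derives the metric coefficients and Christoffel symbols symbolically for an illustrative Monge patch r(u, v) = (u, v, Z(u, v)); the example height function is arbitrary, and the resulting symbols can be substituted into the geodesic equation for numerical integration.

```python
import sympy as sp

u, v = sp.symbols('u v', real=True)
Z_expr = sp.Rational(1, 10) * u**2 + sp.Rational(1, 20) * u * v  # example patch

r = sp.Matrix([u, v, Z_expr])
ru, rv = r.diff(u), r.diff(v)

# First fundamental form (the metric)
E, F, G = ru.dot(ru), ru.dot(rv), rv.dot(rv)
g = sp.Matrix([[E, F], [F, G]])
ginv = g.inv()
coords = (u, v)

# Christoffel symbols Gamma^k_ij = 1/2 g^{kl} (d_i g_jl + d_j g_il - d_l g_ij)
Gamma = [[[sp.simplify(sum(ginv[k, l] * (g[j, l].diff(coords[i])
                                         + g[i, l].diff(coords[j])
                                         - g[i, j].diff(coords[l])) / 2
                           for l in range(2)))
           for j in range(2)] for i in range(2)] for k in range(2)]
```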
The Gauss-Bonnet theorem implies that for a geodesic triangle T on a smooth 2-dimensional Riemannian manifold it holds that

$\alpha + \beta + \gamma = \pi + \iint_T K \, dA$,

where alpha, beta and gamma are the angles of the triangle, and K is the Gauss curvature, which for the surface (X, Y, Z(X, Y)) can be computed by the formula

$K = \frac{Z_{XX} Z_{YY} - Z_{XY}^2}{(1 + Z_X^2 + Z_Y^2)^2}$.

In the general case we can also compute K from the Christoffel symbols. Knowing the angles allows the rotation and the translation to be refined. To find the third vertex of the triangle, we compute the intersection point of the two geodesics at the initial positions which are normal to the direction of motion, as described in figure 4B. The angle theta can be inferred from the rotation matrix R. If we assume that on average the base angles of the geodesic triangle are equal, then the direction angle of the translation vector T is

$\varphi = \frac{\theta}{2} - \frac{1}{2} \iint_T K \, dA$.

This imposes an additional constraint on the translation vector T and relates it to the rotation and the total motion on the curved surface. This is a natural generalization of the planar case, which is the special case of this formula for K = 0. Imposing this constraint and fusing this estimation with the visual odometry result using a Kalman filter will on average improve the positioning accuracy (a small numeric sketch of the curvature formula and this direction constraint follows after the flowchart summary below).

Figure 5 shows a flowchart for a general method according to herein. The method is for use in a robotic work tool as in figure 1A, in a manner as discussed above in relation to figures 2, 3, 4A and 4B, namely for use in a robotic work tool system comprising a robotic work tool 100 arranged to operate in an operational area 205, the operational area 205 having a surface that is at least partially irregular, the robotic work tool comprising a controller 110, a memory 120, three or more wheels 130 and deduced reckoning sensors 180, the memory 120 being configured to store a map application 120A and to store data regarding the positions WP of the wheels 130 on the robotic work tool 100. The method comprises a controller of the robotic work tool system receiving 510 sensor input from the deduced reckoning sensors 180 and determining 520 a curvature C for the surface of the operational area 205. The method also comprises determining 530 a pose based on the sensor input, wherein the pose is determined 540 based on an analysis of the sensor input, the analysis being constrained 540 by the positions of the three or more wheels being on the surface of the operational area, wherein the method further comprises determining 550 that constraints are met, such as that a wheel is on the surface taking into account the curvature (C) of the surface.
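Returning to the Gauss-Bonnet relations above, here is a numeric sketch of the two quantities used there, the Monge-patch Gauss curvature and the reconstructed direction-angle constraint; the sign convention of the curvature term in the direction formula is an assumption, chosen to be consistent with the planar limit K = 0.

```python
def gauss_curvature(Zx, Zy, Zxx, Zyy, Zxy):
    """Gauss curvature of the patch (X, Y, Z(X, Y)) from the first-
    and second-order partial derivatives of Z."""
    return (Zxx * Zyy - Zxy ** 2) / (1.0 + Zx ** 2 + Zy ** 2) ** 2

def translation_direction(theta, K_integral):
    """Direction angle of the translation vector T given the heading
    change theta and the integral of K over the geodesic triangle;
    reduces to theta / 2 in the planar case (K_integral = 0)."""
    return 0.5 * (theta - K_integral)
```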
In some embodiments, and as discussed in the above, the robotic work tool may further comprise a visual odometry sensor being a camera 185, in which case the sensor input comprises at least one image. In such embodiments, the method comprises receiving 515 image data and determining the pose based on the at least one image, wherein the pose is determined based on an analysis of the image.
In some embodiments, and as discussed in the above, the method comprises determining the curvature based on the Gauss-Bonnet theorem.
In some embodiments, and as discussed in the above, the robotic work tool is configured to map the area as it is traversed. In such embodiments the method comprises determining the curvature (C) for the surface of the operational area (205) based on the received sensor input in combination with mapping the operational area.
In some embodiments, and as also discussed in the above, the robotic work tool is also configured to retrieve the map from the memory, and in such embodiments the method comprises determining the curvature (C) for the surface of the operational area 205 based on the map application 120A. In some such embodiments the map application (120A) comprises indications of the curvature (C), and the controller simply determines the curvature based on the (pre-calculated) indications, which may have been calculated when making the map or prior to downloading the map.
In some such embodiments, the sensor input may also be, at least supplementally, stored in the memory. In such cases the method comprises receiving sensor input from the deduced reckoning sensors through having been stored in the memory 120 as part of the map application 120A, stored as part of a mapping of the operational area 205. As discussed herein and assumed under figure 5, the determinations are made by the robotic work tool 100. In such embodiments the controller is the controller 110 of the robotic work tool 100.
In some embodiments, and as also discussed in the above, some processing may be done by the server and in such embodiments the controller is the controller 240A of the server 240.
And, in some embodiments the controller is the controller 110 of the robotic work tool 100 for performing some of the processing and the controller 240A of the server 240 for performing some of the processing in a shared processing, where some tasks are performed by one controller and the remaining tasks by the other controller.
Figure 6 shows a schematic view of a computer-readable medium 600 carrying computer instructions 610 that, when loaded into and executed by a controller of a device, such as a robotic work tool 100 or a server 240, enable the device to implement the teachings herein. In the example of figure 6, the device will be exemplified as the robotic work tool 100. The computer-readable medium 600 may be tangible, such as a hard drive or a flash memory, for example a USB memory stick or a cloud server. Alternatively, the computer-readable medium 600 may be intangible, such as a signal carrying the computer instructions enabling the computer instructions to be downloaded through a network connection, such as an internet connection. In the example of figure 6, a computer-readable medium 600 is shown as being a hard drive or computer disc 600 carrying computer-readable computer instructions 610, being inserted in a computer disc reader 620. The computer disc reader 620 may be part of a cloud server 630, or other server, or the computer disc reader 620 may be connected to a cloud server 630, or other server. The cloud server 630 may be part of the internet or at least connected to the internet. The cloud server 630 may alternatively be connected through a proprietary or dedicated connection. In one example embodiment, the computer instructions are stored at a remote server 630 and downloaded to the memory 120 of the robotic work tool 100 for being executed by the controller 110.
The computer disc reader 620 may also or alternatively be connected to (or possibly inserted into) a robotic work tool 100 for transferring the computer-readable computer instructions 610 to a controller of the robotic work tool 100 (presumably via a memory of the robotic work tool 100).
Figure 6 shows both the situation when a robotic work tool 100 receives the computer-readable computer instructions 610 via a server connection and the situation when another robotic work tool 100 receives the computer-readable computer instructions 610 through a wired interface. This enables computer-readable computer instructions 610 to be downloaded into a robotic work tool 100, thereby enabling the robotic work tool 100 to operate according to and implement the invention as disclosed herein.

Claims (12)

1. A robotic work tool system comprising a robotic work tool (100) arranged to operate in an operational area (205), the operational area (205) having a surface that is at least partially irregular, and the robotic work tool comprising a controller (110), a memory (120), three or more wheels (130), deduced reckoning sensors (180), the memory (120) being configured to store a map application (120A) and to store data regarding the positions of the wheels (130) on the robotic work tool (100), and wherein the controller (110) is configured to: receive sensor input from the deduced reckoning sensors (180); determine a curvature (C) for the surface of the operational area (205); determine a pose based on the sensor input, wherein the pose is determined based on an analysis of the sensor input, the analysis being constrained by the positions of the three or more wheels being on the surface of the operational area, wherein the controller is further configured to determine that a wheel is on the surface taking into account the curvature (C) of the surface.
2. The robotic work tool system (300) according to claim 1, wherein the robotic work tool further comprises a visual odometry sensor being a mono camera (185) and wherein the sensor input comprises at least one image, and wherein the controller is further configured to determine the pose based on the at least one image, wherein the pose is determined based on an analysis of the image.
3. The robotic work tool system (300) according to claim 1 or 2, wherein the controller is further configured to determine the curvature based on the Gauss-Bonnet theorem.
4. The robotic work tool system (300) according to any preceding claim, wherein the pose includes rotation and translation.
5. The robotic work tool system (300) according to any preceding claim, wherein an initial pose is determined based on the sensor input.
6. The robotic work tool system (300) according to any preceding claim, wherein the controller is further configured to determine a curvature (C) for the surface of the operational area (205) based on the received sensor input in combination with mapping the operational area.
7. The robotic work tool system (300) according to any of claims 1 to 5, wherein the controller is further configured to determine a curvature (C) for the surface of the operational area (205) based on the map application (120A).
8. The robotic work tool system (300) according to claim 7, wherein the map application comprises indications of the curvature (C).
9. The robotic work tool system (300) according to any preceding claim, wherein the sensor input is stored in the memory and the sensor input is received from the deduced reckoning sensors through having been stored in the memory (120) as part of the map application (120A) stored as part of a mapping of the operational area (205).
10. The robotic work tool system (300) according to any preceding claim, wherein the robotic work tool (100) is a robotic lawnmower (100).
11. A method for use in a robotic work tool system comprising a robotic work tool (100) arranged to operate in an operational area (205), the operational area (205) having a surface that is at least partially irregular, and the robotic work tool comprising a controller (110), a memory (120), three or more wheels (130), deduced reckoning sensors (180), the memory (120) being configured to store a map application (120A) and to store data regarding the positions of the wheels (130) on the robotic work tool (100), and wherein the method comprises: receiving sensor input from the deduced reckoning sensors (180); determining a curvature (C) for the surface of the operational area (205); determining a pose based on the sensor input, wherein the pose is determined based on an analysis of the sensor input, the analysis being constrained by the positions of the three or more wheels being on the surface of the operational area, wherein the method further comprises determining that a wheel is on the surface taking into account the curvature (C) of the surface.
12. A computer-readable medium (600) carrying computer instructions (610) that when loaded into and executed by a controller (110) of a robotic work tool (100) enables the robotic work tool (100) to implement the method according to claim 11.
SE2250834A 2022-07-04 2022-07-04 Improved determination of pose for a robotic work tool SE2250834A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE2250834A SE2250834A1 (en) 2022-07-04 2022-07-04 Improved determination of pose for a robotic work tool
PCT/SE2023/050369 WO2024010503A1 (en) 2022-07-04 2023-04-21 Improved determination of pose for a robotic work tool

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE2250834A SE2250834A1 (en) 2022-07-04 2022-07-04 Improved determination of pose for a robotic work tool

Publications (1)

Publication Number Publication Date
SE2250834A1 true SE2250834A1 (en) 2024-01-05

Family

ID=86286150

Family Applications (1)

Application Number Title Priority Date Filing Date
SE2250834A SE2250834A1 (en) 2022-07-04 2022-07-04 Improved determination of pose for a robotic work tool

Country Status (2)

Country Link
SE (1) SE2250834A1 (en)
WO (1) WO2024010503A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4694639A (en) * 1985-12-30 1987-09-22 Chen Sheng K Robotic lawn mower
WO2007051972A1 (en) * 2005-10-31 2007-05-10 Qinetiq Limited Navigation system
US10037038B2 (en) * 2006-03-17 2018-07-31 Irobot Corporation Lawn care robot
US20190054621A1 (en) * 2017-08-16 2019-02-21 Franklin Robotics, Inc. Inertial Collision Detection Method For Outdoor Robots
US20200108499A1 (en) * 2015-03-18 2020-04-09 Irobot Corporation Localization and Mapping Using Physical Features
US20200409382A1 (en) * 2014-11-10 2020-12-31 Carnegie Mellon University Intelligent cleaning robot
US20210089034A1 (en) * 2019-09-25 2021-03-25 Husqvarna Ab Propulsion Control Arrangement, Robotic Tool, Method of Propelling Robotic Tool, and Related Devices
US20210289695A1 (en) * 2017-04-18 2021-09-23 Husqvarna Ab Method for Detecting Lifting of a Self-Propelled Tool from the Ground
US20220066456A1 (en) * 2016-02-29 2022-03-03 AI Incorporated Obstacle recognition method for autonomous robots

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2692667B1 (en) * 1992-06-18 1994-08-19 Alsthom Cge Alcatel Displacement measurement device for a vehicle, in particular for an all-terrain robot, and vehicle equipped with such a device.
PT106439A (en) * 2012-07-11 2014-01-13 Introsys Integration For Robotic Systems Integracao De Sist S Roboticos S A ROBOTIZED AND SELF-EMPLOYED ALL-O-VEHICLE
US11199853B1 (en) * 2018-07-11 2021-12-14 AI Incorporated Versatile mobile platform
SE544298C2 (en) * 2020-04-14 2022-03-29 Husqvarna Ab Robotic work tool system and method for defining a working area

Also Published As

Publication number Publication date
WO2024010503A1 (en) 2024-01-11

Similar Documents

Publication Publication Date Title
US20220253063A1 (en) Autonomous machine navigation and training using vision system
EP4386508A2 (en) Autonomous machine navigation using reflections from subsurface objects
EP4075229B1 (en) Improved installation for a robotic work tool
US20230195121A1 (en) Path generating method, program, path generating device, and autonomous mobile body
SE2250834A1 (en) Improved determination of pose for a robotic work tool
WO2022203562A1 (en) Improved navigation for a robotic work tool
EP4381925A1 (en) Improved detection of a solar panel for a robotic work tool
US20240199080A1 (en) Definition of boundary for a robotic work tool
EP4368004A1 (en) Improved operation and installation for a robotic work tool
EP4379489A1 (en) Improved definition of boundary for a robotic work tool
US20240182074A1 (en) Operation for a robotic work tool
US20230350421A1 (en) Navigation for a robotic work tool system
WO2023244150A1 (en) Improved navigation for a robotic work tool system
EP4397167A2 (en) Autonomous machine navigation and training using vision system
SE2151275A1 (en) Improved navigation for a robotic work tool system
SE2250557A1 (en) Navigation for a robotic work tool system
SE546019C2 (en) Improved mapping for a robotic work tool system
WO2023121528A1 (en) Improved navigation for a robotic work tool system
SE545472C2 (en) System and method for navigating a robotic work tool