WO2024010503A1 - Improved determination of a pose for a robotic work tool - Google Patents

Improved determination of a pose for a robotic work tool

Info

Publication number
WO2024010503A1
Authority
WO
WIPO (PCT)
Prior art keywords
work tool
robotic work
operational area
controller
robotic
Prior art date
Application number
PCT/SE2023/050369
Other languages
English (en)
Inventor
Odi Dahan
Sergey LIFLANDSKY
Original Assignee
Husqvarna Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Husqvarna Ab filed Critical Husqvarna Ab
Publication of WO2024010503A1 publication Critical patent/WO2024010503A1/fr

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20Control system inputs
    • G05D1/24Arrangements for determining position or orientation
    • G05D1/243Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
    • G05D1/2435Extracting 3D information
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01DHARVESTING; MOWING
    • A01D75/00Accessories for harvesters or mowers
    • A01D75/28Control mechanisms for harvesters or mowers when moving on slopes; Devices preventing lateral pull
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01DHARVESTING; MOWING
    • A01D34/00Mowers; Mowing apparatus of harvesters
    • A01D34/006Control or measuring arrangements
    • A01D34/008Control or measuring arrangements for automated or remotely controlled operation
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20Control system inputs
    • G05D1/22Command input arrangements
    • G05D1/221Remote-control arrangements
    • G05D1/222Remote-control arrangements operated by humans
    • G05D1/224Output arrangements on the remote controller, e.g. displays, haptics or speakers
    • G05D1/2244Optic
    • G05D1/2245Optic providing the operator with a purely computer-generated representation of the environment of the vehicle, e.g. virtual reality
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20Control system inputs
    • G05D1/24Arrangements for determining position or orientation
    • G05D1/246Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/40Control within particular dimensions
    • G05D1/46Control of position or course in three dimensions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/579Depth or shape recovery from multiple images from motion
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30Map- or contour-matching
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00Specific applications of the controlled vehicles
    • G05D2105/15Specific applications of the controlled vehicles for harvesting, sowing or mowing in agriculture or forestry
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2107/00Specific environments of the controlled vehicles
    • G05D2107/20Land use
    • G05D2107/23Gardens or lawns
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2109/00Types of controlled vehicles
    • G05D2109/10Land vehicles
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2111/00Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
    • G05D2111/10Optical signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2111/00Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
    • G05D2111/60Combination of two or more signals
    • G05D2111/63Combination of two or more signals of the same type, e.g. stereovision or optical flow
    • G05D2111/65Combination of two or more signals of the same type, e.g. stereovision or optical flow taken successively, e.g. visual odometry or optical flow

Definitions

  • This application relates to a robotic work tool, such as a lawnmower, and to a method for providing an improved determination of a pose for the robotic work tool, in particular utilizing deduced reckoning, and more particularly utilizing a mono camera as a visual odometry device.
  • Automated or robotic work tools, such as robotic lawnmowers, are becoming increasingly popular, and so is the use of robotic work tools in various types of operational areas.
  • Such operational areas, in particular for robotic work tools that are robotic lawnmowers, often include irregular surfaces such as slopes.
  • Utilizing navigation techniques relying on odometry, such as deduced reckoning, in such irregular areas suffers from drawbacks, as the assumptions made in various odometry-based navigation techniques are based on a flat operating surface.
  • For navigation techniques relying on visual odometry, such as Visual Simultaneous Localization and Mapping (V-SLAM), advanced cameras are needed for establishing a scale of an observed area, as without a proper scale the interpretation is difficult to match to a map.
  • Such advanced cameras increase the cost of the robotic work tool.
  • a robotic work tool system comprising a robotic work tool arranged to operate in an operational area, the operational area having a surface that is at least partially irregular, and the robotic work tool comprising a controller, a memory, three or more wheels, and deduced reckoning sensors, the memory being configured to store a map application and to store data regarding the positions of the wheels on the robotic work tool, and wherein the controller is configured to: receive sensor input from the deduced reckoning sensors; determine a curvature for the surface of the operational area; and determine a pose based on the sensor input, wherein the pose is determined based on an analysis of the sensor input, the analysis being constrained by the requirement that the three or more wheels are positioned on the surface of the operational area, wherein the controller is further configured to determine that a wheel is on the surface taking into account the curvature of the surface.
  • the robotic work tool further comprises a visual odometry sensor being a mono camera and wherein the sensor input comprises at least one image, and wherein the controller is further configured to determine the pose based on the at least one image, wherein the pose is determined based on an analysis of the image.
  • the controller is further configured to determine the curvature based on the Gauss-Bonnet theorem.
  • the pose includes rotation and translation.
  • an initial pose is determined based on the sensor input.
  • the controller is further configured to determine a curvature for the surface of the operational area based on the received sensor input in combination with mapping the operational area.
  • the controller is further configured to determine a curvature for the surface of the operational area based on the map application.
  • the map application comprises indications of the curvature.
  • the sensor input is stored in the memory, and the sensor input from the deduced reckoning sensors is received by way of having been stored in the memory as part of the map application during a mapping of the operational area.
  • the robotic work tool is a robotic lawnmower.
  • a method for use in a robotic work tool system comprising a robotic work tool arranged to operate in an operational area, the operational area having a surface that is at least partially irregular, and the robotic work tool comprising a controller, a memory, three or more wheels, and deduced reckoning sensors, the memory being configured to store a map application and to store data regarding the positions of the wheels on the robotic work tool, wherein the method comprises: receiving sensor input from the deduced reckoning sensors; determining a curvature for the surface of the operational area; and determining a pose based on the sensor input, wherein the pose is determined based on an analysis of the sensor input, the analysis being constrained by the requirement that the three or more wheels are positioned on the surface of the operational area, wherein the method further comprises determining that a wheel is on the surface taking into account the curvature of the surface.
  • Figure 1A shows a schematic view of the components of an example of a robotic work tool being a robotic lawnmower according to some example embodiments of the teachings herein;
  • Figure 1B shows a schematic side-view of an example of a robotic work tool operating on a flat surface;
  • Figure 1C shows a schematic side-view of an example of a robotic work tool operating on an irregular surface;
  • Figure 2 shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
  • Figure 3 shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
  • Figure 4A shows a schematic view of a graph of geodesics according to some example embodiments of the teachings herein;
  • Figure 4B shows a schematic view of a graph of geodesics according to some example embodiments of the teachings herein;
  • Figure 5 shows a corresponding flowchart for a method according to some example embodiments of the teachings herein.
  • Figure 6 shows a schematic view of a computer-readable medium carrying computer instructions that when loaded into and executed by a controller of a robotic work tool, enables the robotic work tool to implement the teachings herein.
  • Figure 1A shows a schematic overview of a robotic work tool 100, here exemplified by a robotic lawnmower 100.
  • the robotic work tool 100 may be a multi-chassis type or a mono-chassis type (as in figure 1A).
  • a multi-chassis type comprises more than one main body part, the parts being movable with respect to one another.
  • a mono-chassis type comprises only one main body part.
  • the robotic lawnmower may be of different sizes, ranging from merely a few decimetres for small garden robots to more than 1 metre for large robots arranged to service, for example, airfields.
  • other examples of robotic work tools are robotic watering tools, robotic golf ball collectors, and robotic mulchers, to mention a few.
  • the robotic work tool is a semi-controlled or at least supervised autonomous work tool, such as farming equipment or large lawnmowers, for example riders or autonomously controlled tractors.
  • the robotic work tool is a self-propelled robotic work tool, capable of autonomous navigation within an operational area, where the robotic work tool propels itself across or around the operational area in a pattern (random or predetermined).
  • the robotic work tool 100, exemplified as a robotic lawnmower 100, has a main body part 140, possibly comprising a chassis 140 and an outer shell 140A, and a plurality of wheels 130 (in this example four wheels 130, but other numbers of wheels are also possible, such as three or six).
  • the main body part 140 substantially houses all components of the robotic lawnmower 100. At least some of the wheels 130 are drivably connected to at least one electric motor 155 powered by a battery 150. It should be noted that even if the description herein is focused on electric motors, combustion engines may alternatively be used, possibly in combination with an electric motor. In the example of figure 1, each of the wheels 130 is connected to a common or to a respective electric motor 155 for driving the wheels 130 to navigate the robotic lawnmower 100 in different manners. The wheels, the motor 155 and possibly the battery 150 are thus examples of components making up a propulsion device.
  • the propulsion device may be controlled to propel the robotic lawnmower 100 in a desired manner, and the propulsion device will therefore be seen as synonymous with the motor(s) 155.
  • wheels 130 driven by electric motors is only one example of a propulsion system and other variants are possible such as caterpillar tracks.
  • the robotic lawnmower 100 also comprises a controller 110 and a computer readable storage medium or memory 120.
  • the controller 110 may be implemented using instructions that enable hardware functionality, for example, by using executable computer program instructions in a general-purpose or special-purpose processor that may be stored on the memory 120 to be executed by such a processor.
  • the controller 110 is configured to read instructions from the memory 120 and execute these instructions to control the operation of the robotic lawnmower 100 including, but not being limited to, the propulsion and navigation of the robotic lawnmower.
  • the controller 110 in combination with the electric motor 155 and the wheels 130 forms the base of a navigation system (possibly comprising further components) for the robotic lawnmower, enabling it to be self-propelled as discussed.
  • the controller 110 may be implemented using any suitable, available processor or Programmable Logic Circuit (PLC).
  • PLC Programmable Logic Circuit
  • the memory 120 may be implemented using any commonly known technology for computer-readable memories such as ROM, FLASH, DDR, or some other memory technology.
  • the robotic lawnmower 100 is further arranged with a wireless communication interface 115 for communicating with other devices, such as a server, a personal computer, a smartphone, the charging station, and/or other robotic work tools. Examples of such wireless communication technologies are Bluetooth®, WiFi® (IEEE 802.11b), GSM (Global System for Mobile communications) and LTE (Long Term Evolution), to name a few.
  • the robotic lawnmower 100 may be arranged to communicate with a user equipment (not shown, but regarded as an example of a connected device, similar to a server) as discussed in relation to figure 2 below, for providing information regarding status, location, and progress of operation to the user equipment as well as receiving commands or settings from the user equipment. Alternatively or additionally, the robotic lawnmower 100 may be arranged to communicate with a server (referenced 240 in figure 2) for providing information regarding status, location, and progress of operation as well as receiving commands or settings.
  • the robotic lawnmower 100 also comprises a work tool 160, which in the example of the robotic lawnmower 100 is a grass cutting device 160, such as a rotating blade 160/2 driven by a cutter motor 160/1.
  • another example of a work tool 160 is a rotating grinding disc.
  • For enabling the robotic lawnmower 100 to navigate with reference to a wire, such as a boundary wire or a guide wire, emitting a magnetic field caused by a control signal transmitted through the wire, the robotic lawnmower 100 is, in some embodiments, configured to have at least one magnetic field sensor 170 arranged to detect the magnetic field, for detecting the wire and/or for receiving (and possibly also sending) information to/from a signal generator.
  • a magnetic boundary is used to provide a border (not shown explicitly in figure 2, but deemed to be included in the boundary 220) enclosing an operational area (referenced 205 in figure 2).
  • the robotic lawnmower 100 comprises a satellite signal navigation sensor 175 configured to provide navigational information (such as position) based on receiving one or more signals from a satellite - possibly in combination with receiving a signal from a base station.
  • the satellite navigation sensor is a GPS (Global Positioning System) device or other Global Navigation Satellite System (GNSS) device.
  • the satellite navigation sensor 175 is an RTK (Real Time Kinematic) sensor. This enables the robotic work tool to operate in an operational area bounded by a virtual border (not shown explicitly in figure 2 but deemed to be included in the boundary 220).
  • the robotic lawnmower 100 also comprises deduced reckoning sensors 180.
  • the deduced reckoning sensors may be odometers, accelerometers or other deduced reckoning sensors.
  • the robotic work tool comprises a visual odometry sensor 185, possibly comprised in or connected to the deduced reckoning sensors 180.
  • the visual odometry sensor is a mono-camera 185, wherein mono relates to a camera having a single Field-Of-View (FOV) in contrast to a stereo camera. Such cameras have the benefit of being cheap.
  • the deduced reckoning sensors are comprised in the propulsion device, wherein a deduced reckoning navigation may be provided by knowing the current supplied to a motor and the time the current is supplied, which will give an indication of the speed and thereby distance for the corresponding wheel.
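  • As an illustration of such a deduced reckoning update, the sketch below implements a simple differential-drive dead-reckoning step from per-wheel travelled distances; the function name, the mid-point heading update and the flat-ground assumption are illustrative choices, not taken from the patent, and the travelled distances could come from wheel encoders or be estimated from motor current and time as described above.

```python
import math

def dead_reckon_step(x, y, heading, d_left, d_right, track_width):
    """Update a planar pose estimate from per-wheel travelled distances.

    Assumes a flat surface, which is exactly the limitation that the
    rest of the disclosure addresses.
    """
    d_center = 0.5 * (d_left + d_right)           # distance of the robot centre
    d_theta = (d_right - d_left) / track_width    # change in heading (rad)
    x += d_center * math.cos(heading + 0.5 * d_theta)
    y += d_center * math.sin(heading + 0.5 * d_theta)
    heading = (heading + d_theta) % (2.0 * math.pi)
    return x, y, heading

# Example: left wheel travelled 0.10 m, right wheel 0.12 m, track width 0.45 m
print(dead_reckon_step(0.0, 0.0, 0.0, 0.10, 0.12, 0.45))
```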
  • the deduced reckoning sensors 180 enable the robot to operate according to a map of the operational area.
  • the navigation is based on SLAM, and in some embodiments, where a visual odometry sensor (such as a camera) 185 is utilized, the navigation is based on V-SLAM.
  • the robotic lawnmower 100 is in some embodiments arranged to operate according to a map application (indicated in figure 2 and referenced 120A) representing one or more operational areas (and possibly the surroundings of the operational area(s)) as well as features of the operational area(s), stored in the memory 120 of the robotic lawnmower 100.
  • the map is also or alternatively stored in the memory of a server (referenced 240 in figure 2).
  • the map application may be generated or supplemented as the robotic lawnmower 100 operates or otherwise moves around in the operational area.
  • the map application is downloaded, possibly from the server.
  • the map application also includes one or more transport areas.
  • the robotic work tool 100 is arranged to navigate according to the map based on the deduced reckoning sensors 180.
  • the robotic work tool is arranged or configured to traverse and operate in operational areas that are not essentially flat, but contain terrain of varying altitude, such as undulating terrain comprising hills or slopes.
  • the ground of such terrain is not flat and it is not straightforward how to determine an angle between a sensor mounted on the robotic work tool and the ground.
  • the robotic work tool is also or alternatively arranged or configured to traverse and operate in an operational area that contains obstacles that are not easily discerned from the ground. Examples of such are grass or moss-covered rocks, roots or other obstacles that are close to ground and of a similar colour or texture as the ground.
  • the robotic work tool is also or alternatively arranged or configured to traverse and operate in an operational area that contains obstacles that are overhanging, i.e. obstacles that extend over the ground without resting on it.
  • the operational area exemplified with reference to figure 2 may thus be such a non-uniform operational area as disclosed in this paragraph, which the robotic work tool is arranged to traverse and/or operate in.
  • Figure 1B shows a situation where the robotic work tool 100 is navigating a flat surface of an operational area utilizing a (mono) camera 185. Knowing one or more angles α, β of the Field-Of-View (FOV) of the camera 185, as well as knowing the height of the camera's placement from the ground, enables an accurate determination of the length L of the area covered by the FOV, which in turn enables an easy determination of any scale S in the FOV. It should be noted that it is not necessary to know exactly the height and/or the angles α, β indicated in figure 1B, as the same may be determined based on other measurements. However, the disclosure herein will focus on these measurements to illustrate how a problem is overcome.
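  • A minimal sketch of this flat-ground computation is given below; the angle convention (FOV edge angles measured below the horizontal) and the example numbers are assumptions made for illustration, since the text only states that some FOV angles and the mounting height are known.

```python
import math

def fov_ground_length(camera_height_m, near_angle_deg, far_angle_deg):
    """Length L of ground covered by the camera FOV on a flat surface.

    near_angle_deg / far_angle_deg are the angles of the lower and upper
    FOV edges below the horizontal (assumed convention). On flat ground
    each edge ray hits the surface at h / tan(angle) in front of the camera.
    """
    near = camera_height_m / math.tan(math.radians(near_angle_deg))
    far = camera_height_m / math.tan(math.radians(far_angle_deg))
    return far - near

# Camera 0.3 m above ground, FOV edges at 60 and 25 degrees below horizontal
L = fov_ground_length(0.3, 60.0, 25.0)
print(f"covered ground length L = {L:.2f} m")  # metres per image extent -> scale S
```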
  • Figure 1C shows a situation where the robotic work tool 100 is navigating an irregular surface of an operational area utilizing a (mono) camera 185. Knowing the measurements will not enable the robotic work tool 100 to determine a proper scale S' in the FOV, as the measurements no longer correlate. Also, simply determining the angle of the robotic work tool 100 (utilizing for example a gyro or other IMU) will not solve the problem, as such a determination would also require that the surface slopes uniformly rather than being irregular as in figure 1C. In the situation of figure 1C, the robotic work tool 100 will thus not be able to determine - or will at least experience difficulties in determining - a current pose of the robotic work tool 100, the pose including the current orientation as well as the location of the robotic work tool 100. This will reduce the accuracy of any SLAM-based navigation, as the SLAM navigation is dependent on determining poses for the robotic work tool 100 over time.
  • Figure 2 shows a robotic work tool system 200 in some embodiments.
  • the schematic view is not to scale.
  • the robotic work tool system 200 comprises one or more robotic work tools 100 according to the teachings herein arranged to operate in one or more operational areas 205 possibly bounded by a boundary 220. It should be noted that the operational area 205 shown in figure 2 is simplified for illustrative purposes.
  • the view of the operational area 205 is also intended to be an illustration or graphical representation of the map application 120A discussed in the above.
  • a server 240 is shown as an optional connected device for the robotic work tool 100 to communicate with - possibly for receiving maps or map updates.
  • the server 240 comprises a controller 240A for controlling the operation of the server 240, a memory 240B for storing instructions and data relating to the operation of the server 240 and a communication interface 240C for enabling the server 240 to communicate with other entities, such as the robotic work tool 100, and/or a User Equipment such as a mobile phone.
  • the controller, the memory and the communication interface may be of similar types as discussed in relation to figure 1 for the robotic work tool 100.
  • the robotic work tool(s) is exemplified by a robotic lawnmower, whereby the robotic work tool system may be a robotic lawnmower system or a system comprising a combination of robotic work tools, one being a robotic lawnmower, but the teachings herein may also be applied to other robotic work tools adapted to operate within an operational area.
  • as shown in figure 2, there may be obstacles such as houses, structures and trees, to mention a few examples, in the operational area 205.
  • such obstacles are indicated and referenced H (as in house).
  • any processing may be done in any, some or all of the controller 110 of the robotic work tool 100 and/or the controller 240A of the server 240, and the processing may also be done partially in one controller 110/240A with supplemental processing in the other controller 110/240A.
  • This is indicated in figure 2 in that a dashed arrow is shown between the server 240 and the robotic work tool 100, indicating that information may be passed freely between them for (partial) processing.
  • the inventors have realized that by knowing - at least certain features of - the geometry of the robotic work tool 100, certain restraints can be put on the determination of the pose, based on the sensor input received from the deduced reckoning sensors 180, and possibly from the camera 185.
  • the scale is difficult to determine when the robotic work tool is operating on an irregular surface.
  • As the scale is difficult to determine, so is the pose of the robotic work tool, and as the pose is difficult to determine, it will also be difficult to correlate the movements of the robotic work tool 100 to a map application. For example, if an incorrect orientation of the robotic work tool 100 is determined, any sensor input, for example relating to wheel turn counts, will result in a translation of the robotic work tool 100 in the wrong direction. Furthermore, and as the inventors have realized, an orientation in a direction pointing away from the surface will result in the robotic work tool having a determined position that is impossible.
  • Figure 3 shows a situation where a robotic work tool 100 is navigating an irregular surface.
  • the pose P of the robotic work tool 100 is to be determined based on the deduced reckoning sensors 180, and/or the camera 185, even when operating on an irregular surface.
  • the inventors have realized that by knowing - at least some features of - the geometry of the robotic work tool 100, certain constraints may be put on the determinations.
  • two such features of the geometry are the wheel positions WP1, WP2 of the wheels 130 of the robotic work tool 100.
  • the constraints put on the determinations based on these geometry features are that the wheels have to be on the surface.
  • the robotic work tool 100 is arranged to operate according to a map application 120A.
  • the map application 120A is in some embodiments generated by the robotic work tool 100 itself, in some embodiments downloaded from the server 240, or in some embodiments generated during operation, such as through V-SLAM.
  • as the controller 110 determines the pose P of the robotic work tool (the pose including the location as well as the orientation), the pose is correlated to the map application 120A.
  • the correlation is made based on the constraints.
  • the inventors have proposed to enable the controller 110 to determine a curvature C at a current position or region in the map application 120A.
  • the curvature C is in some embodiments determined by the controller 110 based on map features.
  • the curvature C is in some embodiments determined by the controller 110 by being retrieved from the map application 120A, whereby the curvature of a position or region is stored in the map application 120A.
  • the pose P includes the position as well as the orientation of the robotic work tool 100.
  • the position and the orientation are represented by a rotation matrix R and a translation vector T, from which the current orientation and position of a pose are determined based on a previous pose.
  • the robotic work tool 100 is thus enabled or configured to - for the purpose of locating the robotic work tool 100 in the map application 120A - estimate the pose of the robot by such a rotation matrix R and a translation vector T.
  • the rotation matrix R representing a total rotation from some initial or previous pose orientation
  • the translation vector T representing a total translation relative to some initial or previous pose position.
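  • The sketch below shows how such a pose, represented by a rotation matrix R and a translation vector T, can be updated by composing the previous pose with a relative rotation and translation reported by the odometry; the frame conventions and the example numbers are assumptions made for illustration.

```python
import numpy as np

def compose_pose(R_prev, t_prev, R_rel, t_rel):
    """Apply a relative rotation/translation (e.g. from the odometry) to a
    previous pose, giving the new pose expressed in the same world frame."""
    R_new = R_prev @ R_rel
    t_new = R_prev @ t_rel + t_prev
    return R_new, t_new

# Start at the initial pose (identity), then apply a 10-degree yaw and a
# 0.5 m forward step reported by the odometry.
yaw = np.radians(10.0)
R_rel = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                  [np.sin(yaw),  np.cos(yaw), 0.0],
                  [0.0,          0.0,         1.0]])
t_rel = np.array([0.5, 0.0, 0.0])
R, t = compose_pose(np.eye(3), np.zeros(3), R_rel, t_rel)
print(R, t)
```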
  • the controller 110 is thus enabled to determine a pose of the robotic work tool, based on the curvature C of the current region that the robotic work tool 100 is in of the operational area 205 and by applying the constraints inferred by the geometry of the robotic work tool 100.
  • the curvature is determined based on the Gauss-Bonnet theorem.
  • Figure 4 shows a schematic view of a graph of a determined curvature for a position or region of a map application.
  • the robotic work tool is configured to determine the pose of the robotic work tool relative to real-world coordinates, and (as would be understood by a skilled person) to verify, by solving a system of equations, whether the constraints are met.
  • the robotic work tool is configured to solve an optimization problem on pose refinement.
  • the controller 110 is configured to determine a rotation matrix R' and a translation vector T' that are close to R and T respectively, and that satisfy the tangency condition and the nonholonomic constraint that arises from the motion on the surface, as will be described later. This new estimate is then fed into a Kalman filter. In case of an accurate mapping of the environment, such a correction is expected to be correct on average and to improve the estimate.
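  • A minimal sketch of such a pose refinement is given below, treating the wheel-on-surface (tangency) condition as a soft penalty around the odometry estimate; the parametrisation, the penalty weight and the surface representation z = surface_z(x, y) are illustrative assumptions, and the refined pose would then be fed into the Kalman filter as described. The wheel contact offsets can be obtained from the geometry given just below.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def refine_pose(R_est, t_est, wheel_offsets, surface_z, weight=100.0):
    """Find a pose (R', T') close to the estimate (R, T) whose wheel
    contact points lie on the surface z = surface_z(x, y)."""
    rvec0 = Rotation.from_matrix(R_est).as_rotvec()
    x0 = np.concatenate([rvec0, t_est])

    def cost(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        t = x[3:]
        # stay close to the odometry estimate ...
        c = np.sum((x[:3] - rvec0) ** 2) + np.sum((t - t_est) ** 2)
        # ... while penalising wheel contact points that leave the surface
        for w in wheel_offsets:                 # wheel contact points, body frame
            p = R @ np.asarray(w) + t           # contact point in world frame
            c += weight * (p[2] - surface_z(p[0], p[1])) ** 2
        return c

    res = minimize(cost, x0, method="BFGS")
    return Rotation.from_rotvec(res.x[:3]).as_matrix(), res.x[3:]
```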
  • the distance d1 from one (rear) wheel to a center line (referenced "axis" in figure 1A) of the robotic work tool 100 is 223 mm;
  • the distance d2 from another (front) wheel to the center line of the robotic work tool 100 is 150 mm;
  • the distance a between the wheel axles is 331 mm and the radius r of a wheel is 119 mm.
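  • Using the dimensions above, the sketch below computes the wheel contact points in a body frame and checks the on-surface constraint; the choice of body frame (x forward, y left, z up, origin between the axles at hub height) and the tolerance are assumptions made for illustration.

```python
import numpy as np

# Geometry from the text above, in metres
D1, D2, A, R_WHEEL = 0.223, 0.150, 0.331, 0.119

WHEEL_OFFSETS = np.array([
    [ A / 2,  D2, -R_WHEEL],   # front-left contact point
    [ A / 2, -D2, -R_WHEEL],   # front-right contact point
    [-A / 2,  D1, -R_WHEEL],   # rear-left contact point
    [-A / 2, -D1, -R_WHEEL],   # rear-right contact point
])

def wheels_on_surface(R, t, surface_z, tol=0.01):
    """Check the constraint that all wheel contact points lie on the
    surface z = surface_z(x, y), within a tolerance in metres."""
    contacts = (R @ WHEEL_OFFSETS.T).T + t      # contact points in world frame
    gaps = contacts[:, 2] - surface_z(contacts[:, 0], contacts[:, 1])
    return bool(np.all(np.abs(gaps) < tol)), gaps

# A level pose at the correct ride height satisfies the constraint on flat ground
ok, gaps = wheels_on_surface(np.eye(3), np.array([0.0, 0.0, R_WHEEL]),
                             lambda x, y: np.zeros_like(x))
print(ok, gaps)
```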
  • an initial pose is determined based on the sensor input from the deduced reckoning sensors 180, and/or the camera 185, hereafter referred to as the (visual) odometry. This is a fair assumption as the robotic work tool most likely starts to operate from a charging station that is located in a flat area.
  • a relative rotation and translation are updated continuously or at (regular) intervals by the controller 110 during operation in the operational area 205.
  • R is the relative rotation matrix from the initial position;
  • T is the relative translation vector from the initial position.
  • consecutive sensor inputs, such as camera or image frames, are in some embodiments utilized to create a 3-dimensional point cloud of matched feature points. When the scale of the translation T is known, it is possible to localize the point cloud in 3D space in global 3D coordinates, referred to herein as the world coordinate system, in which the area of operation of the robotic work tool is most naturally represented, i.e. the coordinate system of the map application 120A.
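  • The sketch below shows how such a point cloud can be placed in world coordinates once the scale of the translation is known; the frame names and the example scale are illustrative assumptions.

```python
import numpy as np

def points_to_world(points_cam, R_wc, t_wc, scale):
    """Place a point cloud of matched feature points, expressed in the
    camera frame, into world (map) coordinates.

    With a mono camera the translation is only known up to scale, so the
    recovered scale is applied to the camera-frame points before the
    rigid transform. R_wc / t_wc describe the camera pose in the world frame.
    """
    return (R_wc @ (scale * np.asarray(points_cam)).T).T + t_wc

# Two triangulated feature points, half a metre apart in camera units,
# mapped into the world frame with a recovered scale of 0.8
pts = [[0.0, 0.0, 2.0], [0.5, 0.0, 2.0]]
print(points_to_world(pts, np.eye(3), np.array([1.0, 2.0, 0.0]), 0.8))
```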
  • the spline approximation enables more compact storage of the geometry of the operating area, where the representation of hundreds of points can be replaced by storing only 6 to 10 coefficients for a patch.
  • Such a representation gives on average a more accurate representation of the geometry of the area in which the robot operates.
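  • A sketch of such a compact patch representation is given below, using a plain least-squares polynomial fit as a stand-in for the spline approximation; degree 2 gives 6 coefficients and degree 3 gives 10, matching the 6 to 10 coefficients per patch mentioned above, while the polynomial basis itself is an assumption.

```python
import numpy as np

def fit_patch(points, degree=2):
    """Least-squares fit of a polynomial patch z = f(x, y) to 3D points."""
    pts = np.asarray(points, dtype=float)
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    terms = [x ** i * y ** j
             for i in range(degree + 1) for j in range(degree + 1 - i)]
    A = np.column_stack(terms)
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs  # 6 (degree 2) or 10 (degree 3) numbers instead of hundreds of points

# Fit a patch to noisy samples of the surface z = 0.05*x^2 + 0.1*y
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(200, 2))
z = 0.05 * xy[:, 0] ** 2 + 0.1 * xy[:, 1] + rng.normal(0, 0.005, 200)
print(fit_patch(np.column_stack([xy, z])))
```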
  • From the equation for the geodesics GD shown in figure 4, the estimated translation vector T and the angles are estimated.
  • we store the corresponding equation in a symbolic representation, as well as a numerical one. This allows us to easily construct and solve the equations for the geodesic lines connecting the points, which will be used to enforce the constraint related to the motion on the surface. This can be done by letting r(u,v) = (x(u,v), y(u,v), z(u,v)) be a regular parametrization of a patch of a regular surface. In such a case we have two linearly independent tangent vectors r_u = ∂r/∂u and r_v = ∂r/∂v.
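  • The sketch below illustrates such a symbolic-plus-numerical representation with sympy: the tangent vectors r_u, r_v and the first fundamental form coefficients E, F, G (the ingredients of the geodesic equations) are computed symbolically for an example patch and then turned into fast numerical functions; the example height function is an illustrative assumption.

```python
import sympy as sp

u, v = sp.symbols("u v", real=True)

# Example parametrization r(u, v) = (x, y, Z(x, y)) of a surface patch,
# here with the illustrative height function Z = u**2/20 + v/10
r = sp.Matrix([u, v, sp.Rational(1, 20) * u ** 2 + sp.Rational(1, 10) * v])

# The two linearly independent tangent vectors r_u and r_v
r_u = r.diff(u)
r_v = r.diff(v)

# Coefficients of the first fundamental form used in the geodesic equations
E = sp.simplify(r_u.dot(r_u))
F = sp.simplify(r_u.dot(r_v))
G = sp.simplify(r_v.dot(r_v))
print(E, F, G)

# A numerical version can be obtained with sympy.lambdify for fast evaluation
E_num = sp.lambdify((u, v), E)
print(E_num(1.0, 0.0))
```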
  • the Gauss-Bonnet theorem implies that for a geodesic triangle T on a smooth 2-dimensional Riemannian manifold, α + β + γ = π + ∬_T K dA, where α, β and γ are the angles of the triangle, and K is the Gauss curvature, which for the surface (X, Y, Z(X,Y)) can be computed by the formula K = (Z_XX·Z_YY − Z_XY²) / (1 + Z_X² + Z_Y²)².
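  • A small sympy sketch of this Gauss curvature computation for a surface given as a height function Z(X, Y) is shown below; the example height function is an illustrative assumption.

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)

def gauss_curvature(Z):
    """Gauss curvature K of the graph surface (x, y, Z(x, y)),
    using the closed-form expression given above."""
    Zx, Zy = Z.diff(x), Z.diff(y)
    Zxx, Zyy, Zxy = Z.diff(x, 2), Z.diff(y, 2), Z.diff(x, y)
    return sp.simplify((Zxx * Zyy - Zxy ** 2) / (1 + Zx ** 2 + Zy ** 2) ** 2)

# Illustrative height function: a shallow paraboloid-like bump
Z = sp.Rational(1, 20) * (x ** 2 + y ** 2)
K = gauss_curvature(Z)
print(K)                                  # symbolic expression
print(sp.lambdify((x, y), K)(0.0, 0.0))   # numeric value at the origin
```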
  • the angle theta can be inferred from the rotation matrix R. If we assume that on average the base angles of the geodesic triangle are equal, then the direction angle of the translation vector T can be derived from the angle sum given above.
  • Figure 5 shows a flowchart for a general method according to the teachings herein.
  • the method is for use in a robotic work tool as in figure 1A in a manner as discussed above in relation to figures 2, 3, 4A and 4B, namely for use in a robotic work tool system comprising a robotic work tool 100 arranged to operate in an operational area 205, the operational area 205 having a surface that is at least partially irregular, and the robotic work tool comprising a controller 110, a memory 120, three or more wheels 130, and deduced reckoning sensors 180, the memory 120 being configured to store a map application 120A and to store data regarding the positions WP of the wheels 130 on the robotic work tool 100.
  • the method comprises a controller of the robotic work tool system receiving 510 sensor input from the deduced reckoning sensors 180 and determining 520 a curvature C for the surface of the operational area 205.
  • the method also comprises determining 530 a pose based on the sensor input, wherein the pose is determined 540 based on an analysis of the sensor input, the analysis being constrained 540 by the requirement that the three or more wheels are positioned on the surface of the operational area, wherein the method further comprises determining 550 that constraints are met, such as that a wheel is on the surface taking into account the curvature (C) of the surface.
  • the robotic work tool may further comprise a visual odometry sensor being a camera 185, in which case the sensor input comprises at least one image.
  • the method comprises receiving image data 515 and determining the pose based on the at least one image, wherein the pose is determined based on an analysis of the image.
  • the method comprises determining the curvature based on the Gauss-Bonnet theorem.
  • the robotic work tool is configured to map the area as it is traversed.
  • the method comprises determining the curvature (C) for the surface of the operational area (205) based on the received sensor input in combination with mapping the operational area.
  • the robotic work tool is also configured to retrieve the map from the memory, and in such embodiments the method comprises determining the curvature (C) for the surface of the operational area 205 based on the map application 120A.
  • the map application (120A) comprises indications of the curvature (C) and the controller simply determines the curvature based on the (pre-calculated) indications, which may have been calculated when making the map or prior to downloading the map.
  • the sensor input may also be - at least supplementally - stored in the memory.
  • the method comprises receiving sensor input from the deduced reckoning sensors through having been stored in the memory 120 as part of the map application 120A stored as part of a mapping of the operational area 205.
  • the determinations are made by the robotic work tool 100.
  • the controller is the controller 110 of the robotic work tool 100.
  • some processing may be done by the server and in such embodiments the controller is the controller 240A of the server 240.
  • the controller may also be the controller 110 of the robotic work tool 100 for performing some of the processing and the controller 240A of the server 240 for performing the rest, in a shared processing arrangement where some tasks are performed by one controller and the remaining tasks by the other controller.
  • Figure 6 shows a schematic view of a computer-readable medium 600 carrying computer instructions 610 that when loaded into and executed by a controller of a device, such as a robotic work tool 100 or a server 240, enables the device to implement the teachings herein.
  • the device will be exemplified as the robotic work tool 100.
  • the computer-readable medium 600 may be tangible such as a hard drive or a flash memory, for example a USB memory stick or a cloud server.
  • the computer-readable medium 600 may be intangible such as a signal carrying the computer instructions enabling the computer instructions to be downloaded through a network connection, such as an internet connection.
  • a computer-readable medium 600 is shown as being a hard drive or computer disc 600 carrying computer-readable computer instructions 610, being inserted in a computer disc reader 620.
  • the computer disc reader 620 may be part of a cloud server 630 - or other server - or the computer disc reader 620 may be connected to a cloud server 630 - or other server.
  • the cloud server 630 may be part of the internet or at least connected to the internet.
  • the cloud server 630 may alternatively be connected through a proprietary or dedicated connection.
  • the computer instructions are stored at a remote server 630 and be downloaded to the memory 120 of the robotic work tool 100 for being executed by the controller 110.
  • the computer disc reader 620 may also or alternatively be connected to (or possibly inserted into) a robotic work tool 100 for transferring the computer-readable computer instructions 610 to a controller of the robotic work tool 100 (presumably via a memory of the robotic work tool 100).
  • Figure 6 shows both the situation where a robotic work tool 100 receives the computer-readable computer instructions 610 via a server connection and the situation where another robotic work tool 100 receives the computer-readable computer instructions 610 through a wired interface. This enables computer-readable computer instructions 610 to be downloaded into a robotic work tool 100, thereby enabling the robotic work tool 100 to operate according to and implement the invention as disclosed herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A robotic work tool system comprising a robotic work tool (100) arranged to operate in an operational area (205), the operational area (205) having a surface that is at least partially irregular, and the robotic work tool comprising a controller (110), a memory (120), three or more wheels (130), and deduced reckoning sensors (180), the memory (120) being configured to store a map application (120A) and to store data regarding the positions of the wheels (130) on the robotic work tool (100), and the controller (110) being configured to: receive sensor input from the deduced reckoning sensors (180); determine a curvature (C) for the surface of the operational area (205); and determine a pose based on the sensor input, wherein the pose is determined based on an analysis of the sensor input, the analysis being constrained by the fact that the position of the three or more wheels is on the surface of the operational area, the controller being further configured to determine that a wheel is on the surface taking into account the curvature (C) of the surface.
PCT/SE2023/050369 2022-07-04 2023-04-21 Improved determination of a pose for a robotic work tool WO2024010503A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE2250834A SE2250834A1 (en) 2022-07-04 2022-07-04 Improved determination of pose for a robotic work tool
SE2250834-5 2022-07-04

Publications (1)

Publication Number Publication Date
WO2024010503A1 true WO2024010503A1 (fr) 2024-01-11

Family

ID=86286150

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2023/050369 WO2024010503A1 (fr) 2022-07-04 2023-04-21 Improved determination of a pose for a robotic work tool

Country Status (2)

Country Link
SE (1) SE2250834A1 (fr)
WO (1) WO2024010503A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5410478A (en) * 1992-06-18 1995-04-25 Alcatel Alsthom Compagnie Generale D'electricite Apparatus for measuring the displacement of a vehicle, in particular an all-terrain robot, and a vehicle equipped with such apparatus
EP2874037B1 (fr) * 2012-07-11 2020-02-12 Introsys - Integration for Robotic Systems-Integração de Sistemas Robóticos, S.A. Robotic and autonomous all-terrain vehicle
SE2050416A1 (en) * 2020-04-14 2021-10-15 Husqvarna Ab Robotic working tool system and method for defining a working area
US11199853B1 (en) * 2018-07-11 2021-12-14 AI Incorporated Versatile mobile platform

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4694639A (en) * 1985-12-30 1987-09-22 Chen Sheng K Robotic lawn mower
WO2007051972A1 (fr) * 2005-10-31 2007-05-10 Qinetiq Limited Navigation system
EP2013671B1 (fr) * 2006-03-17 2018-04-25 iRobot Corporation Lawn care robot
US20200409382A1 (en) * 2014-11-10 2020-12-31 Carnegie Mellon University Intelligent cleaning robot
US9630319B2 (en) * 2015-03-18 2017-04-25 Irobot Corporation Localization and mapping using physical features
US11927965B2 (en) * 2016-02-29 2024-03-12 AI Incorporated Obstacle recognition method for autonomous robots
SE541866C2 (en) * 2017-04-18 2020-01-02 Husqvarna Ab Method for detecting lifting of a self-propelled robotic tool and a self-propelled robotic tool
US20190054621A1 (en) * 2017-08-16 2019-02-21 Franklin Robotics, Inc. Inertial Collision Detection Method For Outdoor Robots
SE544545C2 (en) * 2019-09-25 2022-07-12 Husqvarna Ab Propulsion control arrangement, robotic tool, method of propelling robotic tool and related devices

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5410478A (en) * 1992-06-18 1995-04-25 Alcatel Alsthom Compagnie Generale D'electricite Apparatus for measuring the displacement of a vehicle, in particular an all-terrain robot, and a vehicle equipped with such apparatus
EP2874037B1 (fr) * 2012-07-11 2020-02-12 Introsys - Integration for Robotic Systems-Integração de Sistemas Robóticos, S.A. Robotic and autonomous all-terrain vehicle
US11199853B1 (en) * 2018-07-11 2021-12-14 AI Incorporated Versatile mobile platform
SE2050416A1 (en) * 2020-04-14 2021-10-15 Husqvarna Ab Robotic working tool system and method for defining a working area

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BURG VAN DER J. ET AL.: "Anti-lock braking and traction control concept for all-terrain robotic vehicles", Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, 20-25 April 1997, New York, IEEE, vol. 14, pages 1400-1405, XP000774380, ISBN: 978-0-7803-3613-1 *

Also Published As

Publication number Publication date
SE2250834A1 (en) 2024-01-05

Similar Documents

Publication Publication Date Title
US11334082B2 (en) Autonomous machine navigation and training using vision system
US20230236604A1 (en) Autonomous machine navigation using reflections from subsurface objects
ES2923218T3 (es) Autonomous navigation system and vehicle manufactured therewith
EP4075229B1 (fr) Improved installation for a robotic work tool
WO2024010503A1 (fr) Improved determination of a pose for a robotic work tool
EP4068040A1 (fr) Improved operation for a robotic work tool
WO2022203562A1 (fr) Improved navigation for a robotic work tool
US20230069475A1 (en) Autonomous machine navigation with object detection and 3d point cloud
US20240180072A1 (en) Detection of a solar panel for a robotic work tool
EP4368004A1 (fr) Improved operation and installation for a robotic work tool
US20240176350A1 (en) Definition of boundary for a robotic work tool
US20240182074A1 (en) Operation for a robotic work tool
EP4268565A1 (fr) Improved navigation for a robotic work tool system
WO2023244150A1 (fr) Improved navigation for a robotic work tool system
US20230195121A1 (en) Path generating method, program, path generating device, and autonomous mobile body
SE545472C2 (en) System and method for navigating a robotic work tool
WO2023163624A1 (fr) Improved mapping for a robotic work tool system
WO2023121528A1 (fr) Improved navigation for a robotic work tool system
SE2151275A1 (en) Improved navigation for a robotic work tool system
WO2023146451A1 (fr) Improved operation for a robotic work tool system