WO2023136758A1 - Method and arrangement for controlling a robotic tool - Google Patents

Method and arrangement for controlling a robotic tool

Info

Publication number
WO2023136758A1
Authority
WO
WIPO (PCT)
Prior art keywords
robotic tool
sensor
correlation
sensor arrangement
surroundings
Application number
PCT/SE2022/051064
Other languages
French (fr)
Inventor
Marcus Homelius
Arvi JONNARTH
Abdelbaki Bouguerra
Original Assignee
Husqvarna Ab
Application filed by Husqvarna Ab filed Critical Husqvarna Ab
Publication of WO2023136758A1 publication Critical patent/WO2023136758A1/en


Classifications

    • A01D34/008: Mowers; control or measuring arrangements for automated or remotely controlled operation
    • G01S13/865: Combination of radar systems with lidar systems
    • G01S13/867: Combination of radar systems with cameras
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S7/40: Means for monitoring or calibrating (radar)
    • G01S7/497: Means for monitoring or calibrating (lidar)
    • G01S17/06: Systems determining position data of a target
    • G05D1/246: Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
    • G05D1/43: Control of position or course in two dimensions
    • G06F11/008: Reliability or availability analysis
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G05D2109/10: Types of controlled vehicles; land vehicles
    • G05D2111/50: Internal signals, i.e. from sensors located in the vehicle, e.g. from compasses or angular sensors



Abstract

The present disclosure relates to a method for controlling an autonomous robotic tool (1) comprising a plurality of sensor arrangements (9, 11, 13) configured to acquire data concerning the surroundings of the robotic tool. The method involves acquiring data (41) from a group comprising at least a first, a second and a third sensor arrangement covering a common overlapped field of the robotic tool surroundings, determining a correlation measure (43) for each pair of sensor arrangements in the group, and, if the correlation measures related to one sensor arrangement in the group fall short of a threshold, tagging (45) that sensor arrangement as unreliable. This allows the robotic tool to independently verify the status of its different sensor arrangements and to initiate measures if one sensor arrangement is found to be unreliable.

Description

METHOD AND ARRANGEMENT FOR CONTROLLING A ROBOTIC TOOL
Technical field
The present disclosure relates to a method for controlling an autonomous robotic tool, the tool comprising a plurality of sensor arrangements configured to acquire data concerning the surroundings of the robotic tool.
Background
Autonomous robotic tools are increasingly used in different applications, for instance as robotic lawn mowers. The use of several different sensor arrangements allows for a more capable robotic tool that can operate safely and efficiently while detecting features in its vicinity.
One general problem associated with robotic tools of this kind is verifying that they operate correctly, especially since in many cases there is no user nearby to check that the robotic tool behaves as expected and intended.
Summary
One object of the present disclosure is therefore to obtain a robotic tool with improved reliability. This object is achieved with a robotic tool carrying out a method as defined in claim 1. More specifically, in a method of the initially mentioned kind, data is acquired from a group comprising at least a first, a second and a third sensor arrangement covering a common overlapped field of the robotic tool surroundings. A correlation measure is determined for each pair of sensor arrangements in the group, and if the correlation measures related to one sensor arrangement in the group fall short of a threshold, that sensor arrangement is tagged as unreliable. This means that the robotic tool itself can verify the operation of its sensors by, for instance, letting two sensors determine the status of a third.
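As an illustration, this pairwise check can be sketched in a few lines of Python. The Pearson correlation measure, the fixed threshold and the sensor names are assumptions made for the sketch only; the disclosure leaves the concrete correlation measure and threshold open.

```python
from itertools import combinations

def pearson(a, b):
    # Plain Pearson correlation between two equal-length signals.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (da * db)

def tag_unreliable(signals, threshold=0.5):
    """signals maps a sensor name to its readings over the common field.
    A sensor is tagged when every pairwise correlation involving it
    falls short of the threshold."""
    low = {name: 0 for name in signals}
    for (na, sa), (nb, sb) in combinations(signals.items(), 2):
        if pearson(sa, sb) < threshold:
            low[na] += 1
            low[nb] += 1
    return {n for n, c in low.items() if c == len(signals) - 1}

# Two sensors agree, the third (a dirty camera, say) does not:
tagged = tag_unreliable({
    "camera": [5, 1, 4, 2, 9, 0],
    "lidar":  [1, 2, 3, 4, 5, 6],
    "radar":  [1.1, 2.0, 3.2, 3.9, 5.1, 6.0],
})
# tagged == {"camera"}
```

Here the LIDAR and RADAR readings correlate strongly with each other but not with the camera, so the camera alone is singled out, mirroring the two-verify-a-third idea above.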
Therefore, if for example a camera, a LIDAR, and a RADAR are used, the latter two can determine that the camera is not operating properly, perhaps being obscured by dirt or leaves. If so, the robotic tool may initiate procedures to deal with the problem, resulting in a more reliable robotic tool.
The present disclosure also considers a method for controlling an autonomous robotic tool, the tool comprising a plurality of sensor arrangements configured to acquire data concerning the surroundings of the robotic tool. The method includes acquiring data from at least a first and a second sensor arrangement covering a common overlapped field of the robotic tool surroundings, determining a correlation measure for the first and second sensor arrangements, and if the correlation falls short of a threshold, tagging the first and second sensor arrangements as unreliable. This provides a result even if only two sensor arrangements are used, in which case it is not possible to determine which one of them is faulty.
Regardless of which of the above approaches is used, at least one sensor signal may be time-shifted prior to determining a correlation. This may compensate for different area coverages as the robotic tool moves.
A counter may be associated with each sensor arrangement, the counter being incremented each time the sensor arrangement is tagged as unreliable, and an action being initiated once the counter reaches a threshold.
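A sketch of how such a counter could look follows; the threshold value and the callback mechanism are illustrative assumptions, not part of the disclosure.

```python
class ReliabilityCounter:
    """Per-sensor counter that triggers an action callback the moment a
    sensor has been tagged as unreliable a threshold number of times."""

    def __init__(self, threshold=3, on_threshold=print):
        self.threshold = threshold
        self.on_threshold = on_threshold  # e.g. raise a fault signal
        self.counts = {}

    def tag(self, sensor):
        # Called each time the correlation check tags this sensor.
        self.counts[sensor] = self.counts.get(sensor, 0) + 1
        if self.counts[sensor] == self.threshold:
            self.on_threshold(sensor)

    def reset(self, sensor):
        # E.g. after the sensor has been cleaned or recalibrated.
        self.counts[sensor] = 0
```

With `threshold=2` and a list-appending callback, for example, the action would fire only on the second tag of the same sensor, so a single transient disturbance does not trigger any measure.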
The present disclosure further considers an autonomous robotic tool configured to carry out the above methods. Generally, the robotic tool then comprises a processor and is configured to carry out the steps of the method.
The sensors with which the autonomous robotic tool is equipped may include sensors in the group comprising cameras, ultrasound sensors, RADARs, and LIDARs, for instance.
Brief description of the drawings
Fig 1 illustrates a robotic tool under operation.
Fig 2 schematically illustrates a control system for a robotic tool.
Fig 3 illustrates schematically the sensor verification according to an example of the present disclosure.
Fig 4 shows a flow chart illustrating a method according to the present disclosure.
Detailed description
The present disclosure relates to robotic tools typically capable of autonomous operation. Robotic garden tools are primarily considered, such as robotic lawn mowers. However, the method of verifying sensors in a robotic system to be disclosed is applicable to general robotic and/or autonomous systems that move and/or carry out operations, such as cleaning robots or even transport vehicles. Fig 1 illustrates a robotic tool 1 under operation. In the illustrated case the robotic tool is an autonomous lawn mower 1, comprising an implement carrier 3 and devised to move using wheels 5 driven by one or more motors in the implement carrier 3. An implement 7, in the illustrated case a cutting deck, is connected to the implement carrier. The lawn mower 1 thus moves over a lawn while cutting grass, and this operation may be autonomous. In early types of robotic lawn mowers, the working area was limited by a buried cable which the robotic lawn mower could detect so as to remain within the working area, moving therein in a random or semi-random fashion to process the area as a whole over time.
It has been suggested to develop more sophisticated control strategies allowing the robotic tool 1 to actually read its environment in order to navigate, avoid obstacles, find a charging station, etc. To that end, the robotic tool 1 is provided with a number of sensors 9, 11, 13. The present disclosure is concerned with such developed robotic tools. The sensors may include cameras, RADARs (radio detection and ranging), LIDARs (light detection and ranging), ultrasonic sensors, etc.
In addition to this, the robotic tool may comprise different kinds of navigation sensors, such as GPS and RTK (real-time kinematic) sensors and inertial measurement units. The present disclosure is, however, mostly concerned with the former group, which are configured to acquire data concerning the surroundings of the robotic tool 1, typically objects 15 nearby.
In an autonomous system it is desired to verify the correct data acquisition of the sensors 9, 11, 13. For instance, a camera could be covered by cut grass, dust or falling leaves, making its output unreliable and unsuitable as a basis for robotic tool control.
Fig 2 schematically illustrates a control system 19 for a robotic tool. The control system 19 receives data from sensor systems such as a camera 9, a LIDAR 11 and a RADAR 13. Based on data therefrom, and on data originating from other subsystems, a processor 21 of the control system 19, optionally using other stored data 23, controls steering and processing features 25 of the robotic tool, such as wheel motors, steering, and operation of a cutting deck. Returning to fig 1, the different sensors 9, 11, 13 cover most of the surrounding area close to the robotic tool 1. A camera 9, for instance, may cover a sector 27 in front of the robotic tool 1. A LIDAR 11 may cover a 360-degree sector 29 around the robotic tool 1, and the same applies to a RADAR sensor 13, also covering an entirely surrounding sector 31, but possibly with a different range. There may exist some blind spots 33 in the close vicinity of the robotic tool 1, but this does not substantially influence the method to be disclosed.
As is shown in fig 1, the sectors 27, 29, 31 are mutually overlapping to some extent, and this is used to provide a verification of the sensors 9, 11, 13. Fig 3 illustrates schematically the sensor verification according to an example of the present disclosure.
Data from the different sensors 9, 11, 13 are pair-wise correlated in correlation modules 35. For instance, the module 35 at the top of fig 3 correlates data from the camera 9 and the RADAR 13 and outputs a correlation result. This result may be tested by comparison with a threshold in a threshold unit 37. The threshold may be specific to the involved pair of sensors 9, 13 and may also depend on other factors, such as the robotic tool's 1 velocity. It is also possible to let the threshold depend on outputs from the parallel correlation units 35 testing the other pairs of sensors 11, 13; 9, 11. This makes it possible to single out one individual, disturbed sensor. Thus, a correlation measure may be determined for each pair of sensor arrangements in a group of sensors. If the correlation measures related to one sensor arrangement in the group fall short of a threshold, that sensor may be tagged as unreliable.
In its simplest form, only two sensor arrangements are used, and too low a correlation between them may indicate that the system as a whole is not functioning correctly.
As is evident from fig 1, features of the surroundings, e.g. object 15, will be detected at different instants by the different sensors 9, 11, 13. Therefore, returning to fig 3, a time shift unit 39 may be introduced in each parallel comparison between sensors, introducing a positive or negative time shift such that the correlation is made with time-adjusted data. This also means that the sensor sectors 27, 29, 31 need not overlap instantaneously. It is sufficient if they cover a common area surrounding the robotic tool 1 within a period of time.
The above ‘units’ are typically realized with software executed on a digital signal processor or the like.
Fig 4 shows a flow chart illustrating a basic method according to the present disclosure.
Thus, in a first step, data is acquired 41 from a group comprising at least a first, a second and a third sensor arrangement covering a common overlapped field of the robotic tool surroundings. A correlation measure is determined 43 for each pair of sensor arrangements in the group. If the correlation measures related to one sensor arrangement in the group fall short of a threshold, that sensor is tagged 45 as unreliable.
Expressed differently, for each overlapped area, the observations of each sensor relating to that area are collected. If all sensors agree on the acquired observation, the robotic tool proceeds with its tasks without further investigation.
However, if the sensors do not agree, the system uses 'voting' to identify the sensor or sensors not agreeing with the majority of the other sensors. This can be done by pairwise correlating the sensors as described above. The system then tags the disagreeing sensor as unreliable. This requires three or more sensor arrangements; with only two, the system instead determines that the correlation is too low and flags that an error is likely.
It is further possible to check whether a deviating sensor is disturbed by a closer object that would have prevented it from seeing the common observations, that is, the sensor might be occluded by another, closer object that does not allow it to see the common objects observed by the other sensors. If this is the case, there is no need to tag the sensor as unreliable. Instead, the test is repeated at another location, where the previously occluding object is unlikely to be a problem.
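This occlusion check can be approximated with a simple range comparison; the use of single ranges rather than full object lists, and the margin value, are assumptions of the sketch.

```python
def should_tag(deviating_range, consensus_range, margin=0.2):
    """Decide whether a deviating sensor should be tagged as unreliable.
    If it reports a return clearly nearer than the feature the other
    sensors agree on, a closer object may be occluding its view; in that
    case do nothing now and repeat the test once the tool has moved."""
    if deviating_range is not None and deviating_range < consensus_range - margin:
        return False  # plausibly occluded, no tagging
    return True  # should have seen the common feature, tag it
```

A sensor that saw nothing at all (`None`) cannot claim occlusion and is tagged, while a sensor reporting a much nearer return is excused until the tool has moved past the suspected obstacle.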
It is possible to count the instances where a sensor is tagged as unreliable and to initiate a measure if that counter exceeds a threshold. If the sensor does see a common feature, but misplaces it with a consistent error, it is possible to correct the sensor. This can be done physically, e.g. by redirecting a camera, or by adapting the robotic tool's 1 control software to take the misplacement into account.
A complete algorithm, where three or more sensors are positioned so that their fields of view intersect, can be summarized as follows. Such an intersection is called a Common Observation Region, COR.
The main steps include:
a) For each COR, collect the observations of each sensor belonging to that COR.
b) If all sensors agree on the extracted observation, then do nothing. Otherwise go to step c).
c) Use voting to identify the sensor or sensors not agreeing with the majority of the other sensors.
d) Check if the deviating sensor is observing closer objects that would have prevented it from seeing the common observations through occlusion, i.e., the sensor might be seeing another object that occludes the common objects observed by the other sensors.
e) If the sensor is being occluded, then do nothing. The test can be repeated once the tool has moved. When there is no occlusion, proceed to the next step.
f) If the sensor should have observed the common feature, but did not, then mark it and associate a counter with that sensor.
g) If one sensor keeps deviating for a certain predefined number of times, as determined by the counter, then:
g1) If the sensor does see the common features but is misplacing them with a consistent error, correct the pose of that sensor using the accumulated deviations.
g2) Otherwise, the sensor likely behaves in an unpredictable way, which should generate a warning about its behavior.
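Steps a) to g) could be sketched as a single pass per COR as follows. The median-based consensus, the numeric tolerances and the action strings are illustrative assumptions; the disclosure does not prescribe them.

```python
from statistics import median

def process_cor(observations, counters, deviation_log,
                max_strikes=3, tol=0.3):
    """One pass of steps a)-g) for a single Common Observation Region.
    observations: sensor name -> measured range to the common feature,
    or None when the sensor saw nothing. Returns an action per sensor
    that deviates from the majority."""
    actions = {}
    ranges = {s: r for s, r in observations.items() if r is not None}
    if not ranges:
        return actions  # no consensus available this pass
    consensus = median(ranges.values())  # b)/c) majority view
    for sensor, r in observations.items():
        if r is not None and abs(r - consensus) <= tol:
            continue  # agrees with the majority: do nothing
        if r is not None and r < consensus - tol:
            # d)/e) a nearer object may occlude the common feature
            actions[sensor] = "retest-after-moving"
            continue
        # f) the sensor should have seen the feature but did not
        counters[sensor] = counters.get(sensor, 0) + 1
        deviation_log.setdefault(sensor, []).append(
            r - consensus if r is not None else None)
        if counters[sensor] < max_strikes:
            actions[sensor] = "marked"
            continue
        # g) persistent deviation: consistent offset -> pose correction,
        # otherwise warn about unpredictable behaviour
        devs = [d for d in deviation_log[sensor] if d is not None]
        if len(devs) == counters[sensor] and max(devs) - min(devs) < tol:
            actions[sensor] = "correct-pose"  # g1)
        else:
            actions[sensor] = "warn"  # g2)
    return actions
```

Calling this repeatedly with the same blind sensor eventually escalates from "marked" to "warn", while a sensor that consistently misplaces the feature by the same offset is routed to pose correction instead.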
This information can be used in different ways. There may be measures available to reset or clean a sensor, for instance. It is also possible to calibrate one sensor with the help of two or more other sensors. As another option, a fault signal may be produced, which warns a user locally or remotely about a possible fault with the robotic tool 1.
The present invention is not limited to the above described examples and can be altered and varied in different ways within the scope of the appended claims.

Claims

CLAIMS
1. A method for controlling an autonomous robotic tool (1), the tool comprising a plurality of sensor arrangements (9, 11, 13) configured to acquire data concerning the surroundings of the robotic tool, characterized by the following steps: acquiring data (41) from a group comprising at least a first, a second and a third sensor arrangement covering a common overlapped field of the robotic tool surroundings, determining a correlation measure (43) for each pair of sensor arrangements in the group, and if the correlation measures related to one sensor arrangement in the group fall short of a threshold, tagging (45) that sensor arrangement as unreliable.
2. A method for controlling an autonomous robotic tool, the tool comprising a plurality of sensor arrangements configured to acquire data concerning the surroundings of the robotic tool, characterized by the following steps: acquiring data (41) from at least a first and a second sensor arrangement covering a common overlapped field of the robotic tool surroundings, determining a correlation measure (43) for the first and second sensor arrangements, and if the correlation falls short of a threshold, tagging (45) the first and second sensor arrangements as unreliable.
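The correlation-and-threshold logic of claims 1 and 2 can be illustrated with a small sketch. The choice of Pearson correlation, the function name, and the threshold value are assumptions for illustration; the claims do not prescribe a particular correlation measure.

```python
import numpy as np

def tag_unreliable(signals, threshold=0.5):
    """Tag sensor arrangements as unreliable from pairwise correlations.

    signals: dict name -> 1-D array of samples over the common overlapped
    field of view. With three or more arrangements (as in claim 1), an
    arrangement is tagged when every correlation it takes part in falls
    short of the threshold; with exactly two (as in claim 2), both are
    tagged, since the faulty one cannot be singled out.
    Returns the set of names tagged as unreliable.
    """
    names = list(signals)
    corr = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            # Pearson correlation coefficient for the pair (a, b).
            corr[(a, b)] = np.corrcoef(signals[a], signals[b])[0, 1]

    if len(names) == 2:
        (_, c), = corr.items()
        return set(names) if c < threshold else set()

    unreliable = set()
    for name in names:
        cs = [c for pair, c in corr.items() if name in pair]
        if all(c < threshold for c in cs):
            unreliable.add(name)
    return unreliable
```

With three arrangements, the two healthy ones still correlate well with each other, so only the outlier has all of its pairwise correlations below the threshold.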
3. Method according to any of claims 1 or 2, wherein at least one sensor signal is time-shifted prior to determining a correlation.
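The time-shifting of claim 3 compensates for sensors that sample the same field at different instants (differing latencies or trigger times). A hedged sketch, assuming a simple search over integer sample lags and Pearson correlation as the measure:

```python
import numpy as np

def best_lag_correlation(x, y, max_lag=10):
    """Correlate two sensor signals after compensating a time shift.

    Slides y relative to x by up to max_lag samples in either direction
    and keeps the lag giving the highest Pearson correlation, so that a
    pure timing offset between sensors does not depress the correlation
    measure before the reliability threshold is applied.
    Returns (best_lag, best_corr).
    """
    best = (0, -1.0)
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:len(x) + lag], y[-lag:]
        if len(a) < 2:
            continue  # too few overlapping samples to correlate
        c = np.corrcoef(a, b)[0, 1]
        if c > best[1]:
            best = (lag, c)
    return best
```

Two copies of the same signal offset by a few samples would correlate poorly at lag zero but near-perfectly once the shift is removed.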
4. Method according to any of the preceding claims, wherein a counter is associated with each sensor arrangement, the counter being incremented each time the sensor arrangement is tagged as unreliable, and an action is initiated once the counter reaches a threshold.
5. An autonomous robotic tool (1) comprising a plurality of sensor arrangements (9, 11, 13) configured to acquire data concerning the surroundings of the robotic tool, characterized by being configured to: acquire data (41) from a group comprising at least a first, a second and a third sensor arrangement covering a common overlapped field of the robotic tool surroundings, determine a correlation measure (43) for each pair of sensor arrangements in the group, and if the correlation measures related to one sensor arrangement in the group fall short of a threshold, tag (45) that sensor arrangement as unreliable.
6. An autonomous robotic tool comprising a plurality of sensor arrangements configured to acquire data concerning the surroundings of the robotic tool, characterized by being configured to: acquire data (41) from at least a first and a second sensor arrangement covering a common overlapped field of the robotic tool surroundings, determine a correlation measure (43) for the first and second sensor arrangements, and if the correlation falls short of a threshold, tag (45) the first and second sensor arrangements as unreliable.
7. Autonomous robotic tool according to any of claims 5 or 6, wherein at least one sensor signal is time-shifted prior to determining a correlation.
8. Autonomous robotic tool according to any of claims 5-7, wherein the sensors include sensors in the group comprising cameras, ultrasound sensors, RADARs, and LIDARs.
9. Autonomous robotic tool according to any of claims 5-8, wherein a counter is associated with each sensor arrangement, the counter being incremented each time the sensor arrangement is tagged as unreliable, and the robotic tool is configured to initiate an action once the counter reaches a threshold.
PCT/SE2022/051064 2022-01-17 2022-11-14 Method and arrangment for controlling a robotic tool WO2023136758A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE2250029A SE545938C2 (en) 2022-01-17 2022-01-17 Method and device for determining if a sensor arrangement is unreliable
SE2250029-2 2022-01-17

Publications (1)

Publication Number Publication Date
WO2023136758A1 true WO2023136758A1 (en) 2023-07-20

Family

ID=84360103

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2022/051064 WO2023136758A1 (en) 2022-01-17 2022-11-14 Method and arrangment for controlling a robotic tool

Country Status (2)

Country Link
SE (1) SE545938C2 (en)
WO (1) WO2023136758A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190049958A1 (en) * 2017-08-08 2019-02-14 Nio Usa, Inc. Method and system for multiple sensor correlation diagnostic and sensor fusion/dnn monitor for autonomous driving application
US20190302793A1 (en) * 2018-04-03 2019-10-03 Sharkninja Operating, Llc Time of flight sensor arrangement for robot navigation and methods of localization using same
US11119478B2 (en) * 2018-07-13 2021-09-14 Waymo Llc Vehicle sensor verification and calibration

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008111727A1 (en) * 2007-03-14 2008-09-18 Electronics And Telecommunications Research Institute Method and apparatus for transmitting sensor status of radio frequency identification tag
US11036230B1 (en) * 2016-03-03 2021-06-15 AI Incorporated Method for developing navigation plan in a robotic floor-cleaning device
US10317905B2 (en) * 2017-08-10 2019-06-11 RavenOPS, Inc. Autonomous robotic technologies for industrial inspection
SE1751546A1 (en) * 2017-12-14 2019-05-14 Husqvarna Ab A robotic working tool and work area boundary sensing method
US10884119B2 (en) * 2018-06-08 2021-01-05 Ford Global Technologies, Llc Object tracking in blind-spot
AU2018438158B2 (en) * 2018-08-28 2024-05-30 Techtronic Cordless Gp An autonomous lawn mower and a system for navigating thereof

Also Published As

Publication number Publication date
SE545938C2 (en) 2024-03-19
SE2250029A1 (en) 2023-07-18

Similar Documents

Publication Publication Date Title
Bac et al. Performance evaluation of a harvesting robot for sweet pepper
US11660749B2 (en) Using object observations of mobile robots to generate a spatio-temporal object inventory, and using the inventory to determine monitoring parameters for the mobile robots
EP3366100B1 (en) Robotic garden tool
EP3373097B1 (en) Robotic mower with object detection system
US11493930B2 (en) Determining changes in marker setups for robot localization
WO2016097891A1 (en) Robotic vehicle for detecting gps shadow zones
Chateau et al. Automatic guidance of agricultural vehicles using a laser sensor
Bostelman et al. Navigation performance evaluation for automatic guided vehicles
Milella et al. RFID‐assisted mobile robot system for mapping and surveillance of indoor environments
CN115933674A (en) Obstacle detouring method and device for robot and storage medium
US20210370960A1 (en) Systems and methods for monitoring an operation of one or more self-driving vehicles
WO2023136758A1 (en) Method and arrangment for controlling a robotic tool
CN115880673B (en) Obstacle avoidance method and system based on computer vision
WO2021176031A1 (en) Method and system for determining visibility region of different object types for an autonomous vehicle
US20230056652A1 (en) Robotic Grasping Via RF-Visual Sensing And Learning
CN110989623A (en) Ground unmanned operation equipment, method and device for controlling movement of ground unmanned operation equipment, and storage medium
CN113701767B (en) Triggering method and system for map updating
CN113739819A (en) Verification method and device, electronic equipment, storage medium and chip
CN105922258B (en) A kind of Omni-mobile manipulator autonomous navigation method based on iGPS
US11826906B2 (en) Method for eliminating misjudgment of reflective light and optical sensing system
WO2014074026A1 (en) A method for navigation and joint coordination of automated devices
US20240195354A1 (en) Machines and methods for monitoring photovoltaic systems
US12036668B2 (en) Method for eliminating misjudgment of reflective light and optical sensing system
Yang et al. Cliff-sensor-based Low-level Obstacle Detection for a Wheeled Robot in an Indoor Environment
Blanke et al. Autonomous robot supervision using fault diagnosis and semantic mapping in an orchard

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22806022

Country of ref document: EP

Kind code of ref document: A1