SE545938C2 - Method and device for determining if a sensor arrangement is unreliable - Google Patents

Method and device for determining if a sensor arrangement is unreliable

Info

Publication number
SE545938C2
Authority
SE
Sweden
Prior art keywords
sensor
robotic tool
sensor arrangement
correlation
surroundings
Prior art date
Application number
SE2250029A
Other languages
Swedish (sv)
Other versions
SE2250029A1 (en)
Inventor
Abdelbaki Bouguerra
Arvi Jonnarth
Marcus Homelius
Original Assignee
Husqvarna Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Husqvarna Ab filed Critical Husqvarna Ab
Priority to SE2250029A priority Critical patent/SE545938C2/en
Priority to PCT/SE2022/051064 priority patent/WO2023136758A1/en
Publication of SE2250029A1 publication Critical patent/SE2250029A1/en
Publication of SE545938C2 publication Critical patent/SE545938C2/en

Links

Classifications

    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01DHARVESTING; MOWING
    • A01D34/00Mowers; Mowing apparatus of harvesters
    • A01D34/006Control or measuring arrangements
    • A01D34/008Control or measuring arrangements for automated or remotely controlled operation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865Combination of radar systems with lidar systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • G05D1/246
    • G05D1/43
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/008Reliability or availability analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G05D2109/10
    • G05D2111/50

Abstract

The present disclosure relates to a method for controlling an autonomous robotic tool (1) comprising a plurality of sensor arrangements (9, 11, 13) configured to acquire data concerning the surroundings of the robotic tool. The method involves the steps of acquiring data (41) from a group comprising at least a first, a second and a third sensor arrangement covering a common overlapped field of the robotic tool surroundings, determining a correlation measure (43) for each pair of sensor arrangements in the group, and, if the correlation measures related to one sensor arrangement in the group fall short of a threshold, tagging (45) that sensor arrangement as unreliable. This allows the robotic tool to independently verify the status of its different sensor arrangements and to initiate measures if one sensor arrangement is found to be unreliable.

Description

METHOD AND DEVICE FOR DETERMINING IF A SENSOR ARRANGEMENT IS UNRELIABLE Technical field The present disclosure relates to a method for controlling an autonomous robotic tool, the tool comprising a plurality of sensor arrangements configured to acquire data concerning the surroundings of the robotic tool.
Background Autonomous robotic tools are used increasingly in different applications, for instance as robotic lawn mowers. The use of several different sensor arrangements allows the provision of a more capable robotic tool that can operate in a safe and efficient manner while detecting features in its vicinity.
One general problem associated with robotic tools of this kind is verifying that they operate correctly, especially since in many cases there is no user nearby checking that the robotic tool behaves as expected and intended.
Summary One object of the present disclosure is therefore to obtain a robotic tool with improved reliability. This object is achieved with a robotic tool carrying out a method as defined in claim 1. More specifically, in a method of the initially mentioned kind, data is acquired from a group comprising at least a first, a second and a third sensor arrangement covering a common overlapped field of the robotic tool surroundings. A correlation measure is determined for each pair of sensor arrangements in the group, and, if the correlation measures related to one sensor arrangement in the group fall short of a threshold, that sensor arrangement is tagged as unreliable. This means that the robotic tool itself can verify the operation of its sensors, for instance by letting two sensors determine the status of a third. Therefore, if for example a camera, a LIDAR, and a RADAR are used, the latter two can determine that the camera does not operate properly, perhaps being obscured by dirt or leaves, for instance. If so, the robotic tool may initiate different procedures to deal with this problem, resulting in a more reliable robotic tool.
The present disclosure also considers a method for controlling an autonomous robotic tool, the tool comprising a plurality of sensor arrangements configured to acquire data concerning the surroundings of the robotic tool. The method includes acquiring data from at least a first and a second sensor arrangement covering a common overlapped field of the robotic tool surroundings, determining a correlation measure for the first and second sensor arrangements, and, if the correlation falls short of a threshold, tagging the first and second sensor arrangements as unreliable. This provides a result even if only two sensor arrangements are used and it is not possible to determine which one of them is faulty.
Regardless of which of the above approaches is used, at least one sensor signal may be time-shifted prior to determining a correlation. This may compensate for different area coverages as the robotic tool moves.
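As a sketch of such time-shift compensation, one option is to slide one signal against the other over a range of integer lags and correlate at each, keeping the lag that maximizes agreement. This is an assumed approach for illustration; the disclosure does not prescribe how the shift is found:

```python
import numpy as np

def best_lag_correlation(a, b, max_lag=10):
    """Correlate signal b against signal a over integer lags in
    [-max_lag, max_lag] and return the (lag, correlation) pair with
    the highest Pearson correlation. A negative best lag indicates
    that b is delayed relative to a."""
    best_lag, best_corr = 0, -np.inf
    n = len(a)
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a_seg, b_seg = a[lag:], b[:n - lag]
        else:
            a_seg, b_seg = a[:n + lag], b[-lag:]
        c = np.corrcoef(a_seg, b_seg)[0, 1]
        if c > best_corr:
            best_lag, best_corr = lag, c
    return best_lag, best_corr
```

Correlating at the best lag rather than at zero lag compensates for one sensor observing the same area slightly later than another as the tool moves.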
A counter may be associated with each sensor arrangement, the counter being incremented each time the sensor arrangement is tagged as unreliable, and an action being initiated once the counter reaches a threshold.
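The per-sensor counter can be sketched as follows. The class name, the threshold value, and the callback are illustrative choices, not taken from the disclosure:

```python
class SensorMonitor:
    """Counts unreliable-taggings per sensor and fires an action
    callback once a sensor's counter reaches the threshold."""

    def __init__(self, sensors, threshold=3, on_exceed=None):
        self.counters = {name: 0 for name in sensors}
        self.threshold = threshold
        self.on_exceed = on_exceed or (lambda name: None)

    def tag(self, sensor):
        """Record one unreliable-tagging of the given sensor."""
        self.counters[sensor] += 1
        if self.counters[sensor] == self.threshold:
            # Action fires once, when the threshold is first reached,
            # e.g. warn the user or schedule a cleaning procedure.
            self.on_exceed(sensor)
```

Counting before acting avoids reacting to a single spurious disagreement, such as a momentary occlusion.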
The present disclosure further considers an autonomous robotic tool configured to carry out the above methods. Generally, the robotic tool then comprises a proces- sor and is configured to carry out the steps of the method.
The sensors with which the autonomous robotic tool is equipped may include sensors in the group comprising cameras, ultrasound sensors, RADARs, and LIDARs, for instance.
Brief description of the drawings Fig 1 illustrates a robotic tool under operation. Fig 2 schematically illustrates a control system for a robotic tool.
Fig 3 illustrates schematically the sensor verification according to an example of the present disclosure.
Fig 4 shows a flow chart illustrating a method according to the present disclosure.
Detailed description The present disclosure relates to robotic tools typically capable of autonomous operation. Robotic garden tools are primarily considered, such as robotic lawn mowers. However, the method of verifying sensors in a robotic system to be disclosed is applicable to general robotic and/or autonomous systems that move and/or carry out operations, such as cleaning robots or even transport vehicles.
Fig 1 illustrates a robotic tool 1 under operation. In the illustrated case the robotic tool is an autonomous lawn mower 1, comprising an implement carrier 3 and devised to move using wheels 5 driven by one or more motors in the implement carrier 3. An implement 7 is connected to the implement carrier, in the illustrated case a cutting deck. The lawn mower 1 thus moves over a lawn while cutting grass, and this operation may be autonomous.

In early types of robotic lawn mowers, a working area was limited by a buried cable, which could be detected by the robotic lawn mower such that it could remain within the working area, moving therein in a random or semi-random fashion to process the area as a whole over time. It has been suggested to develop more sophisticated control strategies allowing the robotic tool 1 to actually read its environment in order to navigate as well as to avoid obstacles, find a charging station, etc. To that end, the robotic tool 1 is provided with a number of sensors 9, 11, 13. The present disclosure is concerned with such developed robotic tools. The sensors may include cameras, RADARs (radio detection and ranging), LIDARs (light detection and ranging), ultrasonic sensors, etc. In addition, the robotic tool may comprise different kinds of navigation sensors such as GPS and RTK (real-time kinematic) sensors and inertial measurement units. The present disclosure is, however, mostly concerned with the former group, which are configured to acquire data concerning the surroundings of the robotic tool 1, typically objects 15 nearby.

In an autonomous system it is desired to verify the correct data acquisition of the sensors 9, 11, 13. For instance, a camera could be covered by cut grass, dust or falling leaves, making its output unreliable and unsuitable as a basis for robotic tool control.
Fig 2 schematically illustrates a control system 19 for a robotic tool. The control system 19 receives data from sensor systems such as a camera 9, a LIDAR 11 and a RADAR 13. Based on data therefrom, and on data originating from other subsystems, a processor 21 of the control system 19, optionally using other stored data 23, controls steering and processing features 25 of the robotic tool, such as wheel motors, steering, and operation of a cutting deck, for instance.
Returning to fig 1, the different sensors 9, 11, 13 cover most of the surrounding area close to the robotic tool 1. A camera 9, for instance, may cover a sector 27 in front of the robotic tool 1. A LIDAR 11 may cover a 360-degree sector 29 around the robotic tool 1, and the same applies to a RADAR sensor 13 also covering an entirely surrounding sector 31, but possibly with a different range. There may exist some blind spots 33 in the close vicinity of the robotic tool 1, but this does not substantially influence the method to be disclosed.
As is shown in fig 1, the sectors 27, 29, 31 are mutually overlapping to some extent and this is used to provide a verification of the sensors 9, 11, 13. Fig 3 illustrates schematically the sensor verification according to an example of the present disclosure.
Data from different sensors 9, 11, 13 are pair-wise correlated in correlation modules 35. For instance, the module 35 at the top of fig 3 correlates data from the camera 9 and the RADAR 13 and outputs a correlation result. This correlation result may be tested by comparison with a threshold in a threshold unit 37. The threshold may be specific to the involved pair of sensors 9, 13 and may also depend on other features, such as the robotic tool's 1 velocity, for instance. It is also possible to let the threshold depend on outputs from the parallel correlation units 35 testing other pairs of sensors 11, 13; 9, 11. This makes it possible to single out one individual, disturbed sensor. Thus, a correlation measure may be determined for each pair of sensor arrangements in a group of sensors. If the correlation measures related to one sensor arrangement in the group fall short of a threshold, that sensor may be tagged as unreliable. In its simplest form, only two sensor arrangements are used, and a too low correlation between those two may indicate that the system as a whole is not functioning correctly.
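The pair-wise correlation and thresholding can be sketched as follows. This is a minimal illustration assuming the sensor data are 1-D numeric signals over the overlapped field and that Pearson correlation serves as the correlation measure; the disclosure leaves the measure itself unspecified:

```python
import numpy as np

def pairwise_correlations(signals):
    """Pearson correlation for every pair of named sensor signals."""
    names = list(signals)
    corr = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            corr[(a, b)] = np.corrcoef(signals[a], signals[b])[0, 1]
    return corr

def tag_unreliable(signals, threshold=0.5):
    """Tag a sensor as unreliable when every correlation involving it
    falls short of the threshold, i.e. the other sensors agree with
    each other but not with this one."""
    corr = pairwise_correlations(signals)
    unreliable = []
    for name in signals:
        involved = [c for pair, c in corr.items() if name in pair]
        if involved and all(c < threshold for c in involved):
            unreliable.append(name)
    return unreliable
```

With three sensors this realizes the two-verify-the-third idea: a camera whose signal correlates poorly with both the LIDAR and the RADAR, while those two still correlate well with each other, is singled out.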
As is evident from fig 1 features of the surroundings, e.g. object 15, will be detected at different instants by the different sensors 9, 11, 13. Therefore, returning to fig 3, a time shift unit 39 may be introduced in each parallel comparison between different sensors, which introduces a time shift, positive or negative, such that the correlation is made with time adjusted data.
This also means that the sensor sectors 27, 29, 31 need not be instantaneously overlapping. lt is sufficient if they overlap a common area surrounding the robotic tool 1 within a period of time.
The above 'units' are typically realized with software executed on a digital signal processor or the like.
Fig 4 shows a flow chart illustrating a basic method according to the present disclosure.
Thus, in a first step, data is acquired (41) from a group comprising at least a first, a second and a third sensor arrangement covering a common overlapped field of the robotic tool surroundings. A correlation measure is determined (43) for each pair of sensor arrangements in the group. If the correlation measures related to one sensor arrangement in the group fall short of a threshold, that sensor is tagged as unreliable.
Expressed differently, for each overlapped area, the observations of each sensor relating to that area are collected. If all sensors agree on the acquired observation, then the robotic tool proceeds with its tasks without further investigation.
However, if the sensors do not agree, 'voting' is used to identify the sensor or sensors not agreeing with the majority of the other sensors. This can be done by pairwise correlating the sensors as described above. The system tags the disagreeing sensor as unreliable. This requires three or more sensor arrangements. With only two, the system instead determines that the correlation is too low and flags that an error is likely. It is further possible to check whether a deviating sensor is disturbed by closer objects that would have prevented it from seeing the common observations through occlusion; that is, the sensor might be occluded by another, closer object, not allowing it to see the common objects observed by the other sensors. If this is the case, there is no need to tag the sensor as unreliable. The test is then repeated at another location where the previously occluding object is not likely a problem. It is possible to count instances where a sensor is tagged as unreliable and to initiate a measure if that counter exceeds a threshold. If the sensor does see a common feature, but misplaced with a consistent error, then it is possible to correct the sensor. This can be done physically, e.g. by redirecting a camera, or by adapting the robotic tool's 1 control software to take the misplacement into account.
A complete algorithm where three or more sensors are positioned so that their fields of view intersect can be summarized as follows. Such an intersection is called a Common Observation Region, COR.
The main steps include:
a) For each COR, collect the observations of each sensor belonging to that COR.
b) If all sensors agree on the extracted observation, then do nothing. Otherwise go to step c).
c) Use voting to identify the sensor or sensors not agreeing with the majority of the other sensors.
d) Check if the deviating sensor is observing closer objects that would have prevented it from seeing the common observations through occlusion, i.e. the sensor might be seeing another object that occludes the common objects observed by the other sensors.
e) If the sensor is being occluded, then do nothing. The test can be repeated once the tool has moved. When there is no occlusion, proceed to the next step.
f) If the sensor should have observed the common feature, but did not, then mark it and associate a counter with that sensor.
g) If one sensor keeps deviating for a certain predefined number of times, as determined by the counter, then:
g1) If the sensor does see the common features but misplaces them with a consistent error, then correct the pose of that sensor using the accumulated deviations.
g2) Otherwise, the sensor likely behaves in an unpredictable way, which should generate a warning about its behavior.
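One pass of steps a)–g) over a single COR can be sketched as follows. The helper predicates `is_occluded` and `has_consistent_offset` are hypothetical stand-ins for the occlusion and misplacement checks described above, and the observation values are simplified to directly comparable labels:

```python
from collections import Counter

def check_common_observation_region(observations, counters,
                                    is_occluded, has_consistent_offset,
                                    max_deviations=5):
    """One pass over a COR. `observations` maps sensor name -> observation.
    Returns a list of (sensor, action) pairs; `counters` tracks how many
    times each sensor has deviated. Assumed helpers:
      is_occluded(sensor) -> bool
      has_consistent_offset(sensor) -> bool
    """
    votes = Counter(observations.values())
    majority, count = votes.most_common(1)[0]
    if count == len(observations):              # a)-b): all sensors agree
        return []
    actions = []
    for sensor, obs in observations.items():    # c): voting finds deviators
        if obs == majority:
            continue
        if is_occluded(sensor):                 # d)-e): occluded, retry later
            continue
        counters[sensor] = counters.get(sensor, 0) + 1  # f): mark deviation
        if counters[sensor] >= max_deviations:          # g): persistent deviator
            if has_consistent_offset(sensor):
                actions.append((sensor, "correct_pose"))  # g1)
            else:
                actions.append((sensor, "warn"))          # g2)
    return actions
```

In a real system the agreement test would compare extracted features within tolerances rather than exact labels; the control flow, however, follows the listed steps.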
This information can be used in different ways. There may be possible measures to reset or clean a sensor, for instance. It is also possible to calibrate one sensor with the help of two or more other sensors. As another option, a fault signal may be produced, which warns a user locally or remotely about a possible fault with the robotic tool. The present invention is not limited to the above described examples and can be altered and varied in different ways within the scope of the appended claims.

Claims (9)

Claims
1. A method for controlling an autonomous robotic tool (1), the tool comprising a plurality of sensor arrangements (9, 11, 13) configured to acquire data concerning the surroundings of the robotic tool characterized by the following steps: acquiring data (41) from a group comprising at least a first, a second and a third sensor arrangement covering a common overlapped field of the robotic tool surroundings, determining a correlation (43) measure for each pair of sensor arrangements in the group, and if the correlation measures related to one sensor arrangement in the group fall short of a threshold, tagging (45) that sensor arrangement as unreliable.
2. A method for controlling an autonomous robotic tool, the tool comprising a plurality of sensor arrangements configured to acquire data concerning the surroundings of the robotic tool characterized by the following steps: acquiring data (41) from at least a first, and a second sensor arrangement covering a common overlapped field of the robotic tool surroundings, determining a correlation (43) measure for the first and second sensor arrangements, and if the correlation falls short of a threshold, tagging (45) the first and second sensor arrangements as unreliable.
3. Method according to any of claims 1 or 2, wherein at least one sensor signal is time-shifted prior to determining a correlation.
4. Method according to any of the preceding claims, wherein a counter is associated with each sensor arrangement, the counter being incremented each time the sensor arrangement is tagged as unreliable, and an action is initiated once the counter reaches a threshold.
5. An autonomous robotic tool (1) comprising a plurality of sensor arrangements (9, 11, 13) configured to acquire data concerning the surroundings of the robotic tool characterized by being configured to acquire data (41) from a group comprising at least a first, a second and a third sensor arrangement covering a common overlapped field of the robotic tool surroundings, determine a correlation (43) measure for each pair of sensor arrangements in the group, and if the correlation measures related to one sensor arrangement in the group fall short of a threshold, tag (45) that sensor arrangement as unreliable.
6. An autonomous robotic tool comprising a plurality of sensor arrangements configured to acquire data concerning the surroundings of the robotic tool characterized by being configured to: acquire data (41) from at least a first, and a second sensor arrangement covering a common overlapped field of the robotic tool surroundings, determine a correlation (43) measure for the first and second sensor arrangements, and if the correlation falls short of a threshold, tag (45) the first and second sensor arrangements as unreliable.
7. Autonomous robotic tool according to any of claims 5 or 6, wherein at least one sensor signal is time-shifted prior to determining a correlation.
8. Autonomous robotic tool according to any of claims 5-7, wherein the sensors include sensors in the group comprising cameras, ultrasound sensors, RADARs, and LIDARs.
9. Autonomous robotic tool according to any of claims 5-8, wherein a counter is associated with each sensor arrangement, the counter being incremented each time the sensor arrangement is tagged as unreliable, and the robotic tool is configured to initiate an action once the counter reaches a threshold.
SE2250029A 2022-01-17 2022-01-17 Method and device for determining if a sensor arrangement is unreliable SE545938C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE2250029A SE545938C2 (en) 2022-01-17 2022-01-17 Method and device for determining if a sensor arrangement is unreliable
PCT/SE2022/051064 WO2023136758A1 (en) 2022-01-17 2022-11-14 Method and arrangment for controlling a robotic tool

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE2250029A SE545938C2 (en) 2022-01-17 2022-01-17 Method and device for determining if a sensor arrangement is unreliable

Publications (2)

Publication Number Publication Date
SE2250029A1 SE2250029A1 (en) 2023-07-18
SE545938C2 true SE545938C2 (en) 2024-03-19

Family

ID=84360103

Family Applications (1)

Application Number Title Priority Date Filing Date
SE2250029A SE545938C2 (en) 2022-01-17 2022-01-17 Method and device for determining if a sensor arrangement is unreliable

Country Status (2)

Country Link
SE (1) SE545938C2 (en)
WO (1) WO2023136758A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008111727A1 (en) * 2007-03-14 2008-09-18 Electronics And Telecommunications Research Institute Method and apparatus for transmitting sensor status of radio frequency identification tag
US20190049962A1 (en) * 2017-08-10 2019-02-14 RavenOPS, Inc. Autonomous robotic technologies for industrial inspection
US20190302793A1 (en) * 2018-04-03 2019-10-03 Sharkninja Operating, Llc Time of flight sensor arrangement for robot navigation and methods of localization using same
US20190377086A1 (en) * 2018-06-08 2019-12-12 Ford Global Technologies, Llc Object tracking in blind-spot
US20200383265A1 (en) * 2017-12-14 2020-12-10 Husqvarna Ab A Robotic Working Tool and Work Area Boundary Sensing Method
US11036230B1 (en) * 2016-03-03 2021-06-15 AI Incorporated Method for developing navigation plan in a robotic floor-cleaning device
US20210176915A1 (en) * 2018-08-28 2021-06-17 Techtronic Cordless Gp An autonomous lawn mower and a system for navigating thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10551838B2 (en) * 2017-08-08 2020-02-04 Nio Usa, Inc. Method and system for multiple sensor correlation diagnostic and sensor fusion/DNN monitor for autonomous driving application
US11119478B2 (en) * 2018-07-13 2021-09-14 Waymo Llc Vehicle sensor verification and calibration


Also Published As

Publication number Publication date
SE2250029A1 (en) 2023-07-18
WO2023136758A1 (en) 2023-07-20

Similar Documents

Publication Publication Date Title
EP3366100B1 (en) Robotic garden tool
Freitas et al. A practical obstacle detection system for autonomous orchard vehicles
AU2010257460A1 (en) Autonomous cutting element for sculpting grass
WO2016097891A1 (en) Robotic vehicle for detecting gps shadow zones
Chateau et al. Automatic guidance of agricultural vehicles using a laser sensor
Singh et al. Comprehensive automation for specialty crops: Year 1 results and lessons learned
Freitas et al. A low-cost, practical localization system for agricultural vehicles
CN112051841B (en) Obstacle boundary generation method and device
CN111604898B (en) Livestock retrieval method, robot, terminal equipment and storage medium
WO2022208219A1 (en) Agricultural analysis robotic systems and methods thereof
Velasquez et al. Multi-sensor fusion based robust row following for compact agricultural robots
Milella et al. RFID‐assisted mobile robot system for mapping and surveillance of indoor environments
Juman et al. An integrated path planning system for a robot designed for oil palm plantations
CN115063762A (en) Method, device and equipment for detecting lane line and storage medium
SE545938C2 (en) Method and device for determining if a sensor arrangement is unreliable
CN115880673B (en) Obstacle avoidance method and system based on computer vision
CN110989623A (en) Ground unmanned operation equipment, method and device for controlling movement of ground unmanned operation equipment, and storage medium
CN113701767B (en) Triggering method and system for map updating
CN113768420A (en) Sweeper and control method and device thereof
Liu et al. Vision-based Vineyard Navigation Solution with Automatic Annotation
Krause et al. AI-TEST-FIELD-An Agricultural Test Environment for Semantic Environment Perception with Respect to Harsh And Changing Environmental Conditions
WO2014074026A1 (en) A method for navigation and joint coordination of automated devices
CN105922258B (en) A kind of Omni-mobile manipulator autonomous navigation method based on iGPS
CN117054444B (en) Method and system for pipeline detection
Yairi Covisibility-based map learning method for mobile robots