SE544414C2 - Robotic tool and method of processing a work area partly during night using structured light - Google Patents
Robotic tool and method of processing a work area partly during night using structured light
- Publication number
- SE544414C2 (application SE2050216A)
- Authority
- SE
- Sweden
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D34/00—Mowers; Mowing apparatus of harvesters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/40—Control within particular dimensions
- G05D1/43—Control of position or course in two dimensions
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2101/00—Details of software or hardware architectures used for the control of position
- G05D2101/10—Details of software or hardware architectures used for the control of position using artificial intelligence [AI] techniques
- G05D2101/15—Details of software or hardware architectures used for the control of position using artificial intelligence [AI] techniques using machine learning, e.g. neural networks
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Environmental Sciences (AREA)
- Aviation & Aerospace Engineering (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Manipulator (AREA)
- Harvester Elements (AREA)
Abstract
A robotic work tool (1) comprises a camera (5), a positioning device, and a map recorder. The robotic work tool (1) comprises a light source (6) emitting structured light in a light pattern. The camera (5) records image data comprising a projection of the emitted light pattern onto objects in front of the camera (5). The map recorder obtains spatial data representing said objects based on the light pattern, the recorded projection of the light pattern, and position data from the positioning device, and stores a virtual map of a work area (2) based on said spatial data.

A method of operating a robotic work tool (1) in a work area (2) comprises the steps of emitting a light pattern, detecting light reflected by one or more objects, comparing the emitted light pattern with the detected light, generating geometric data about characteristics of the one or more objects, and combining the generated data with position information into a map of the work area (2).
Description
ROBOTIC TOOL AND METHOD OF PROCESSING A WORK AREA PARTLY DURING NIGHT USING STRUCTURED LIGHT

Field of the invention

The present invention relates to a robotic work tool comprising a camera, a positioning device, and a map recorder.
The invention also relates to a method of operating a robotic work tool in a work area.
Background

EP2354878 A1 discloses a robotic mower with a structured light sensor for interpreting and measuring objects in its environment, so that a boundary of a work area may be designated and maintained.
Although this and other examples of robotic work tools using structured light are known in the art, there may be some problems and limitations in their use.
Often the light sensors are used for determining the boundaries of the work area, which means that the boundaries need to be physical boundaries, from which structured light may be reflected. This may be the case for robots used indoors, where the existing walls are natural barriers. For robots used outdoors, there may be a need for providing dedicated physical barriers, which may be cumbersome and time-consuming.
Also, especially in the case of robotic work tools for outdoor use, there is the combined problem of keeping track of the boundaries, as well as avoiding obstacles, which may have a varying character. Further, there is a need for navigating and performing the work efficiently in the work area, while avoiding obstacles.
Summary

It is an object of the present invention to solve, or at least mitigate, parts or all of the above-mentioned problems. To this end, there is, in a first aspect of the disclosure, provided a robotic work tool comprising a camera, a positioning device, and a map recorder, wherein the robotic work tool further comprises a light source configured to emit structured light in a light pattern and the camera is configured to record image data comprising a projection of the emitted light pattern onto objects in front of the camera, and the map recorder is configured to obtain spatial data representing said objects based on the light pattern, the recorded projection of the light pattern onto said objects, and position data from the positioning device, and store a virtual map of a work area based on said spatial data.

The advantage of this is that a generated map may include the exact positions, size and extent of obstacles, such as barriers, stones, pits, shrubbery, as well as the boundaries of the work area.

In an embodiment, the robotic work tool has a positioning device which utilizes a GPS system.
Hereby, the positioning will be independent of the route travelled by the robotic work tool.

In a further embodiment the robotic work tool has a positioning device which utilizes an RTK system.
Thereby the positioning will be particularly accurate.

In an even further embodiment, the light pattern includes one or more of random dot patterns, shadow lines and laser light.
Other types of structured light illumination may be used in combination with the mentioned types.

In an embodiment the robotic work tool is further configured to use the virtual map for navigation in the work area and around obstacles.

In a further embodiment the robotic work tool is further configured to obtain said spatial data representing said objects at least partly during darkness.
Using structured light for drawing up the virtual map during darkness has the effect of increasing the depth perception of the camera.

In an even further embodiment, the robotic tool is configured to obtain said spatial data representing said objects exclusively during darkness.

In a still further embodiment, the robotic work tool is further configured to work at least partly during darkness.

In yet another embodiment the robotic work tool is further configured to navigate during daylight using the virtual map stored during darkness.

In an embodiment the robotic work tool is further configured to obtain said spatial data representing said objects at least partly while the robotic work tool works in the area.
Hereby the virtual map may be updated periodically or continuously during work, so that new obstacles or the absence of old obstacles will be integrated into the virtual map.

In another embodiment the robotic work tool is configured to capture, while navigating during daylight, imaging data of said objects, and based on said imaging data, associate the imaging data with said spatial data representing said objects.

In a further embodiment the robotic work tool is configured to navigate, during daylight, based on the spatial data associated with the captured imaging data.
Hereby, information obtained during darkness may be used for an improved navigation when the robotic tool is working, although the work is performed during daytime.

In an even further embodiment the robotic work tool is configured to create the virtual map by combining the spatial data obtained during darkness, and the image data from the camera during daylight.

In an embodiment the robotic work tool is an outdoor tool.

In a further embodiment the robotic work tool is a lawn mower.

In a second aspect of the disclosure, a method of operating a robotic work tool in a work area comprises the steps of emitting a light pattern, detecting light reflected by one or more objects, comparing the emitted light pattern with the detected light, generating geometric data about e.g. size, shape and/or other characteristics of the one or more objects, and combining the generated data with position information, thereby generating a map of the work area.

The method may be performed with a wide variety of robotic work tools, e.g., but not limited to, a work tool in accordance with the preceding claims.

It is noted that embodiments of the invention may be embodied by all possible combinations of features recited in the claims. Further, it will be appreciated that the various embodiments described for the device are all combinable with the method as defined in accordance with the second aspect of the present invention, and vice versa.
Brief description of the drawings

The above, as well as additional objects, features and advantages of the present invention, will be better understood through the following illustrative and non-limiting detailed description of preferred embodiments of the present invention, with reference to the appended drawings, where the same reference numerals will be used for similar elements, wherein:

Fig. 1 is a diagrammatic view in perspective of a robotic work tool in a work area;

Fig. 2 is a schematic view of the principles of structured light illumination of an object; and

Fig. 3 is a flow chart illustrating the steps of the method.
All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the embodiments, wherein other parts may be omitted.
Detailed description of the exemplary embodiments

Fig. 1 illustrates a robotic work tool 1, which is intended for work in a work area 2. The type of work and the type of work area 2 may be any of a wide variety, but often the work area is outdoors, such as a lawn, a driveway, or a sports field etc. The type of work performed may be lawn mowing, leaf collection, snow blowing, golf ball collection etc. However, robotic work tools 1 may be used indoors as well, e.g. for cleaning purposes.
Typically, the work area 2 has boundaries 3, which are sometimes demarcated by physical objects, such as walls, fences, etc. Sometimes the demarcation is performed by an electric cable, which emits a magnetic field, which in turn may be sensed by sensor means in the robotic work tool, in accordance with the knowledge of the skilled person. Also, there is usually a base station 4, in order to provide a charging point for the battery in the robotic work tool.

In order to avoid the placement of an electric cable in or on the ground, which may be cumbersome, the work area 2 may be defined by satellite navigation, e.g. GPS (Global Positioning System), RTK (Real Time Kinematic), or some other positioning system, preferably with as high a precision as possible. Often a local beacon is provided in the base station 4 or elsewhere, as a complement to the GPS signals, so that a higher degree of precision in the positioning may be obtained. The robotic work tool 1 will hereby be able to decide its position in the work area 2 as well as the boundaries 3 of the work area 2.
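The disclosure does not spell out how the beacon's complementary signal is combined with the satellite fix; as a minimal, purely illustrative sketch (the names and the simple planar offset model are assumptions, not the patent's method), a differential correction could compare the beacon's satellite fix with its known surveyed position and subtract the observed error from the tool's fix:

```python
# Hypothetical sketch of a differential (DGPS/RTK-style) correction using
# a local beacon at a known position. Names and the planar offset model
# are illustrative assumptions, not taken from the patent.
from dataclasses import dataclass

@dataclass
class Position:
    east: float   # metres in a local ground frame
    north: float  # metres in a local ground frame

def corrected_position(tool_fix: Position,
                       beacon_fix: Position,
                       beacon_surveyed: Position) -> Position:
    """Subtract the error observed at the beacon from the tool's raw fix."""
    err_east = beacon_fix.east - beacon_surveyed.east
    err_north = beacon_fix.north - beacon_surveyed.north
    return Position(tool_fix.east - err_east, tool_fix.north - err_north)
```

Because the beacon and the tool see largely the same atmospheric and satellite errors over a garden-sized area, subtracting the beacon's error removes most of the shared bias, which is what yields the higher positioning precision mentioned above.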
Fig. 1 is only a schematic illustration of the work area 2, and in practice there may be any number of obstacles within the area, such as rocks, branches, pits, shrubbery, garden furniture, toys that have been left behind etc. The obstacles may in some cases be permanent, and in some cases temporary. While the boundaries 3 of the work area 2 are more or less predefined, the obstacles within the area 2 are not known beforehand, and they may change over time.

In order to be able to process the work area 2 properly, despite any obstacles therein, the robotic work tool 1 may be provided with a camera 5. Known methods of image recognition may be used, in order to determine the type and size of obstacles encountered, already before the robotic work tool 1 makes physical contact therewith. An advantageous route may be planned for avoiding the obstacles and for performing the work efficiently.
However, often the known methods for image recognition are insufficient for determining the depth and structure of a detected obstacle. In this disclosure the camera 5 is supplemented with a light source 6 emitting structured light.
Structured light may be provided in many different forms. For example, random dot patterns, line patterns, shadow lines, or laser light may be used. The wavelength of the light may vary in different applications. Visible light as well as light outside the visible spectrum may be used. As long as the camera 5 is able to detect and process the emitted light, any suitable wavelength may be used. It should be noted that if structured light is used in daylight, difficulties may be encountered in detecting the reflected light, since daylight, be it direct or indirect sunlight, is composed of a wide spectrum of wavelengths, which may interfere with the wavelength of the structured light. Hence structured light is advantageously used during night-time, although depending on the wavelengths used, daytime use is also possible.
The general principle of structured light is that light with a known structure, such as lines, dots etc. is emitted from a light source 6 towards an object. This is shown in Fig. 2. The surface structure of the object, as well as its geometrical extension, will distort the light pattern when it is reflected, and a camera 5 may capture the reflected light pattern. By comparing the light pattern emitted from the light source 6 with the light pattern captured by the camera 5, the size, extension, structure, and details of the object may be calculated.

In this disclosure structured light is combined with the use of an accurate positioning system, so that a virtual map of the work area 2 may be created. The boundaries 3 of the work area, as well as any obstacles therein, may be registered in the virtual map. The exact size and position of each obstacle may be registered, and in some embodiments, even the type of obstacle may be determined, which in turn determines the minimum distance to be kept between the obstacle and the robotic work tool 1 in order to avoid damages either to the obstacle or to the robotic work tool 1.

In order to get as accurate information about each obstacle as possible, a mapping of the work area 2 is often performed at night, when structured light may be used most efficiently. The virtual map obtained may, however, be used in the daytime as well as at night.
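The calculation described in connection with Fig. 2 is, in its simplest form, a triangulation: the light source 6 projects a feature (e.g. a dot) in a known direction, the camera 5 observes where that feature lands, and the shift between the expected and the observed image position encodes depth. A minimal sketch of this standard relation follows (the function name and the pinhole-camera simplification are illustrative assumptions):

```python
# Classic structured-light triangulation for a rectified projector-camera
# pair: depth equals focal length times baseline divided by disparity.
def depth_from_disparity(focal_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Depth of the surface point that reflected a projected feature.

    focal_px     -- camera focal length, in pixels
    baseline_m   -- distance between light source 6 and camera 5, metres
    disparity_px -- shift of the feature between the expected and the
                    observed image position, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("feature unmatched or at infinity")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 8 cm baseline, 14 px shift -> 4.0 m depth.
print(depth_from_disparity(700.0, 0.08, 14.0))
```

Repeating this over every dot or line of the pattern yields the dense spatial data from which the size, extension and structure of the object are recovered.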
One option is to create the virtual map at night, when structured light may be used easily and efficiently in the darkness, and then let the robotic work tool 1 perform its work, e.g. lawn mowing, during daytime. One advantage of that would be that dew, in some climates, is less prevalent during daytime. Also, the tolerance for noise, if any, from the robotic work tool 1 is higher during daytime. The camera 5 may be used during daytime for navigation, i.e. the camera 5 may be used for providing image data, which may be compared with data in the virtual map. Alternatively, the positioning system may be used for navigating in parallel with the camera 5 or instead of the camera 5. By the term "darkness" is intended, herein, an illuminance of approximately less than 400 lux, below which illuminance structured light is particularly efficient in mapping the geometry of the work area.
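As a minimal sketch of how the 400 lux criterion could gate the behaviour (the sensor reading and function names are hypothetical; the patent only defines the threshold):

```python
# Mode selection based on the "darkness" definition above: below roughly
# 400 lux, structured-light mapping is preferred; otherwise the camera
# image data and/or the positioning system are used for navigation.
DARKNESS_LUX_THRESHOLD = 400.0  # from the description above

def choose_mode(ambient_lux: float) -> str:
    """Return the operating mode for the current ambient illuminance."""
    if ambient_lux < DARKNESS_LUX_THRESHOLD:
        return "structured_light_mapping"
    return "camera_and_position_navigation"
```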
Another option is to create the virtual map at night and use it for navigating during work both at night and during daytime. As mentioned above, the camera 5 and/or the positioning system may be used for navigation, possibly in combination. Any updating of the virtual map may take place during darkness, with efficient use of structured light.
Yet another option is to update the map with image data collected by the camera during daytime, as well as regular updates during night-time. This updating may take place as a dedicated procedure, but it may also take place repeatedly while the robotic work tool 1 is working in the work area 2. In this way the virtual map may be more accurate, since it is updated more often with any new obstacles or with removal of previously recorded obstacles. The work tool 1 may hence work more efficiently and with a lower risk of encountering new obstacles.

In order to create a virtual map, the robotic work tool will go through a process, which is outlined in the flow chart of Fig. 3.
To begin with, the present position of the robotic work tool 1 is determined in step 7. The robotic work tool 1 will compare the present position with information of the boundaries 3 of the work area 2 in step 8, in order to determine whether the robotic work tool 1 is within the designated work area 2. Information of the boundaries 3 may be obtained e.g. from the presence of physical bars, such as a fence, the presence of an electric boundary cable emitting a magnetic field, or from GPS coordinates, which have been stored in the robotic work tool 1. The boundaries 3 of the work area 2 will be recorded in the virtual map in the robotic work tool 1 in step 9.
At the same time, the camera 5 may register image data, either from reflected structured light or from general ambient light. The registered image data may be compared with stored image data, for identification of obstacles in step 10. If an obstacle is encountered, its position and properties, such as its size, surface structure and details, may be recorded in the virtual map. In some cases, the robotic work tool 1 may be able to deduce the type of obstacle from the recorded data, and it may then be possible to decide on a suitable behaviour of the robotic work tool 1 close to the obstacle. For example, the robotic work tool 1 may be able to pass closer to shrubbery with flexible twigs than to non-resilient obstacles such as rocks.
When an obstacle has not been identified in step 10, the steps 7-10 may be iterated a number of times, and the virtual map is updated with the boundaries 3 of the work area 2 as well as the obstacles found therein. The general process for the recordal of the virtual map is basically the same regardless of whether structured light or ambient light is used for detecting obstacles.
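A sketch of this loop, covering steps 7-10 of Fig. 3, is given below. The helper callables and the dictionary-based map are illustrative assumptions; the patent does not prescribe any particular data structure or API.

```python
from typing import Callable, Optional, Tuple

Point = Tuple[float, float]  # position in the work area, e.g. metres

def record_virtual_map(get_position: Callable[[], Point],
                       within_boundary: Callable[[Point], bool],
                       detect_obstacle: Callable[[], Optional[dict]],
                       iterations: int) -> dict:
    """Iterate steps 7-10 and accumulate a simple virtual map."""
    vmap = {"boundary_points": [], "obstacles": []}
    for _ in range(iterations):
        pos = get_position()                     # step 7: present position
        if not within_boundary(pos):             # step 8: boundary check
            vmap["boundary_points"].append(pos)  # step 9: record boundary 3
            continue
        obstacle = detect_obstacle()             # step 10: image comparison
        if obstacle is not None:                 # size, surface, details...
            vmap["obstacles"].append({**obstacle, "position": pos})
    return vmap
```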
When a virtual map has been established in the robotic work tool 1, it may be used for guidance of the robotic work tool 1 during its work. Hence the mapping procedure and the work procedure may be separate procedures performed at separate time periods.
Another option is to perform an update procedure of the virtual map at the same time as the robotic work tool 1 is working in the work area 2. A first version of the virtual map has been created during an initial mapping procedure, and whenever the robotic work tool 1 encounters an obstacle or a boundary 3, a comparison will be made with the virtual map, in order to verify that the obstacle or boundary 3 is the same as before, with regard to its position as well as to its physical properties.
By the same token, when the virtual map defines the presence of a boundary 3 or an obstacle, a comparison will be made with the present image data registered by the camera 5. In this way the virtual map may be regularly updated with new information about the obstacles and boundaries 3. There may be new obstacles, such as fallen branches or molehills, and previous obstacles may have been removed, either intentionally by an operator or unintentionally, such as by the wind.
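A minimal sketch of such an update step follows; the obstacle representation and the matching tolerance are assumptions chosen for illustration only.

```python
import math

POSITION_TOLERANCE_M = 0.25  # assumed matching radius, not from the patent

def update_obstacles(stored: list, observed: dict) -> None:
    """Match an observed obstacle against the stored map, else add it.

    Each obstacle is assumed to be a dict with at least a "position"
    key holding an (east, north) tuple in metres.
    """
    ox, oy = observed["position"]
    for entry in stored:
        sx, sy = entry["position"]
        if math.hypot(ox - sx, oy - sy) < POSITION_TOLERANCE_M:
            entry.update(observed)   # same obstacle: refresh its properties
            return
    stored.append(observed)          # new obstacle, e.g. a fallen branch
```

Obstacles in the stored map that are repeatedly not re-observed could, by the same logic, be aged out, covering the case where an operator or the wind has removed them.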
The invention has mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.

In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality.
Claims (15)
1. A robotic work tool (1) comprising a camera (5), a positioning device, and a map recorder, wherein the robotic work tool (1) further comprises a light source (6) configured to emit structured light in a light pattern and the camera (5) is configured to record image data comprising a projection of the emitted light pattern onto objects in front of the camera (5), and the map recorder is configured to obtain spatial data representing said objects based on the light pattern, the recorded projection of the light pattern onto said objects, and position data from the positioning device, and store a virtual map of a work area (2) based on said spatial data, wherein the robotic work tool is further configured to obtain said spatial data representing said objects at least partly during darkness.
2. A robotic work tool (1) according to claim 1, wherein the positioning device utilizes a GPS system.
3. A robotic work tool (1) according to claim 1 or claim 2, wherein the positioning device utilizes an RTK system.
4. A robotic work tool (1) according to any of the preceding claims, wherein the light pattern includes one or more of random dot patterns, shadow lines and laser light.
5. A robotic work tool (1) according to any of claims 1 to 4, further configured to use the virtual map for navigation in the work area (2) and around obstacles.
6. A robotic work tool (1) according to any of claims 1 to 5, wherein the robotic tool is configured to obtain said spatial data representing said objects exclusively during darkness.
7. A robotic work tool (1) according to any of claims 1 to 6, further configured to work at least partly during darkness.
8. A robotic work tool (1) according to any of claims 1 to 7, further configured to navigate during daylight using the virtual map stored during darkness.
9. A robotic work tool (1) according to any of the preceding claims, further configured to obtain said spatial data representing said objects at least partly while the robotic work tool (1) works in the area (2).
10. A robotic work tool (1) according to any of the preceding claims, wherein the robotic work tool (1) is configured to capture, while navigating during daylight, imaging data of said objects, and based on said imaging data, associate the imaging data with said spatial data representing said objects.
11. A robotic work tool (1) according to claim 10, wherein the robotic work tool (1) is configured to navigate, during daylight, based on the spatial data associated with the captured imaging data.
12. A robotic work tool (1) according to claim 10 or claim 11, further configured to create the virtual map by combining the spatial data obtained during darkness, and the image data from the camera (5) during daylight.
13. A robotic work tool (1) according to any of the preceding claims, wherein the work tool (1) is an outdoor tool.
14. A robotic work tool (1) according to any of the preceding claims, wherein the work tool (1) is a lawn mower.
15. Method of operating a robotic work tool (1) in a work area (2), comprising:
emitting a light pattern;
detecting light reflected by one or more objects;
comparing the emitted light pattern with the detected light;
generating geometric data about e.g. size, shape and/or other characteristics of the one or more objects; and
combining the generated data with position information, thereby generating a map of the work area (2), wherein said data representing said objects is obtained at least partly during darkness.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE2050216A SE544414C2 (en) | 2020-02-27 | 2020-02-27 | Robotic tool and method of processing a work area partly during night using structured light |
Publications (2)
Publication Number | Publication Date |
---|---|
SE2050216A1 SE2050216A1 (en) | 2021-08-28 |
SE544414C2 true SE544414C2 (en) | 2022-05-17 |
Family
ID=77745391
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
SE2050216A SE544414C2 (en) | 2020-02-27 | 2020-02-27 | Robotic tool and method of processing a work area partly during night using structured light |
Country Status (1)
Country | Link |
---|---|
SE (1) | SE544414C2 (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2354878A1 (en) * | 2010-02-02 | 2011-08-10 | Deere & Company | Method for regenerating a boundary containing a mobile robot |
AU2013263851A1 (en) * | 2010-12-30 | 2014-01-09 | Irobot Corporation | Mobile robot system |
US9423797B2 (en) * | 2011-04-19 | 2016-08-23 | Lg Electronics Inc. | Robot cleaner and remote monitoring system and method of the same |
JP2017050018A (en) * | 2010-12-30 | 2017-03-09 | アイロボット コーポレイション | Movable robot system |
WO2019012534A1 (en) * | 2017-07-12 | 2019-01-17 | Guardian Optical Technologies Ltd. | Visual, depth and micro-vibration data extraction using a unified imaging device |
US20190120633A1 (en) * | 2017-10-17 | 2019-04-25 | AI Incorporated | Discovering and plotting the boundary of an enclosure |
CN110275540A (en) * | 2019-07-01 | 2019-09-24 | 湖南海森格诺信息技术有限公司 | Semantic navigation method and its system for sweeping robot |
Also Published As
Publication number | Publication date |
---|---|
SE2050216A1 (en) | 2021-08-28 |