GB2563347A - Method for a working region acquisition of at least one working region of an autonomous service robot - Google Patents
Method for a working region acquisition of at least one working region of an autonomous service robot
- Publication number
- GB2563347A GB2563347A GB1813548.3A GB201813548A GB2563347A GB 2563347 A GB2563347 A GB 2563347A GB 201813548 A GB201813548 A GB 201813548A GB 2563347 A GB2563347 A GB 2563347A
- Authority
- GB
- United Kingdom
- Prior art keywords
- working region
- service robot
- autonomous service
- map
- unit
- Prior art date
- Legal status
- Withdrawn
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D34/00—Mowers; Mowing apparatus of harvesters
- A01D34/006—Control or measuring arrangements
- A01D34/008—Control or measuring arrangements for automated or remotely controlled operation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Environmental Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- Manipulator (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
A method for a working region acquisition of at least one working region of an autonomous service robot 10a, in particular of an autonomous lawn mower, by means of at least one capture unit 12a and a generating unit 14a, wherein in a first method step 16a, before a working operation of the autonomous service robot, at least one visual media file of the at least one working region is captured by the at least one capture unit, and in a second method step 18a a map of the at least one working region is generated by the generating unit from the at least one visual media file, wherein in a further method step, on detection of a change in the at least one working region, a notice for a renewed generation of the map is automatically output to an operator. Also disclosed is an autonomous service robot adapted to carry out the method.
Description
Method for a working region acquisition of at least one working region of an autonomous service robot
Prior art
There has already been proposed a method for a working region acquisition of at least one working region of an autonomous service robot.
Disclosure of the invention
There is proposed a method for a working region acquisition of at least one working region of an autonomous service robot, in particular of an autonomous lawn mower, by means of at least one capture unit and a generating unit, wherein in a first method step, before a working operation of the autonomous service robot, at least one visual media file of the at least one working region is captured by the at least one capture unit, and in a second method step a map of the at least one working region is generated by the generating unit from the at least one visual media file. A "working region acquisition" is to be understood in this context in particular as a process in which a working region, in particular a working region of an autonomous service robot, is at least partly recognised. Preferably, it is to be understood in particular as a process in which a working region is in particular virtually acquired. Particularly preferably, it is to be understood in particular as a process in which a working region is at least two-dimensionally acquired and in particular a delimitation of the working region is set. Furthermore, a "working region" is to be understood in this context in particular as a region which defines an area to be worked on by the autonomous service robot. In addition, an "autonomous service robot" is to be understood in this context in particular as an apparatus which performs work at least partly automatically, such as in particular automatically begins it, automatically ends it and/or automatically selects at least one parameter, such as in particular a route parameter, and/or a reversal point etc. Preferably, it is to be understood as an apparatus which is at least partly provided for rendering a service to an operator and/or to humans in general. Preferably, it is to be understood in particular as an apparatus which moves automatically, at least for performing work, or moves along, in particular autonomously, in a preset working region. Particularly preferably, the apparatus is provided for working on a working region and in particular the area, such as for example for sweeping, vacuuming, cleaning and/or mowing a lawn situated on the area. In this case, various autonomous service robots which are considered appropriate by a person skilled in the art are conceivable, such as in particular autonomous sweeping machines, autonomous vacuum cleaners, autonomous swimming pool cleaning machines, autonomous floor wiping robots and/or particularly preferably autonomous lawn mowers. In principle, however, other autonomous service robots which are considered appropriate by a person skilled in the art are also conceivable. A "capture unit" is to be understood in this context in particular as a unit which is provided for a visual acquisition. Preferably, it is to be understood in particular as a unit which has a sensor, such as in particular at least one image sensor, which is provided for a capture of at least one piece of visual information. Particularly preferably, it is to be understood in particular as a unit which is provided for a capture of individual images and/or image sequences and/or films. Furthermore, a "generating unit" is to be understood in this context in particular as a unit which is provided for a generation of a map. Preferably, it is to be understood as a unit which is provided for a generation of a map from at least one visual media file. Preferably, the generating unit has at least one computing unit.
Particularly preferably, the generating unit is formed separately from the autonomous service robot. Alternatively or additionally, the generating unit could also form a part of the autonomous service robot. In this case, a "computing unit" is to be understood in particular as a unit with an information input, an information processing and/or an information output. Advantageously, the computing unit has at least one processor, a memory, input and output means, further electrical components, an operating program, regulating routines, control routines and/or calculating routines. Preferably, the components of the computing unit are arranged on a common printed circuit board and/or advantageously in a common housing. A "working operation of the autonomous service robot" is to be understood in this context in particular as an operating state of the autonomous service robot in which the autonomous service robot performs effective work. Preferably, it is to be understood as an operating state in which the autonomous service robot is provided for performing core work. In this case, "core work" is to be understood in particular as work which consists in particular in a service to be performed by the autonomous service robot. A "visual media file" is to be understood in this context in particular as a file which comprises at least visual information which is present in particular in digital form. Preferably, it is to be understood in particular as a file which comprises in particular at least image and/or video information. Particularly preferably, it is to be understood in particular as an image file and/or a video file. In principle, however, other visual media files which are considered appropriate by a person skilled in the art are also conceivable. Furthermore, a "map" is to be understood in this context in particular as a, preferably digital, representation of a region and/or an environment. Preferably, it is to be understood in particular as a topographical map which comprises in particular information on heights, scale, ground conditions and/or possible obstacles. "Provided" is to be understood in this context in particular as specially programmed, designed and/or equipped. That an object is provided for a particular function is to be understood in particular to mean that the object fulfils and/or performs this particular function in at least one application state and/or operating state. A working region acquisition can be effected advantageously simply by the method according to the invention. In particular, a working region acquisition can thereby be effected even before a working operation of the autonomous service robot. Furthermore, a working region acquisition can be effected from visual media files, whereby in particular a laying or an arranging of a physical boundary can be avoided. As a result, in particular an expenditure of labour can in turn be kept low. In addition, a working region acquisition can thereby in turn be effected without changes to a working region.
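For illustration only, the two method steps described above can be pictured as a small data flow from the capture unit to the generating unit. The following Python sketch is not part of the patent; the folder name, file layout and map structure are assumptions made purely for the example.

```python
# Minimal sketch of the two method steps as a data flow; all names are
# illustrative and not taken from the patent.
from dataclasses import dataclass, field
from pathlib import Path

@dataclass
class VisualMediaFile:
    path: Path            # image or video file captured before operation
    kind: str             # "image" or "video"

@dataclass
class WorkingRegionMap:
    resolution_m: float                            # metres per grid cell
    height: list = field(default_factory=list)     # coarse height values per cell
    obstacles: list = field(default_factory=list)  # (x, y) cells flagged as obstacles

def capture_unit(folder: Path) -> list[VisualMediaFile]:
    """First method step: collect the visual media files of the working region."""
    files = []
    for p in sorted(folder.glob("*")):
        if p.suffix.lower() in {".jpg", ".png"}:
            files.append(VisualMediaFile(p, "image"))
        elif p.suffix.lower() in {".mp4", ".avi"}:
            files.append(VisualMediaFile(p, "video"))
    return files

def generating_unit(media: list[VisualMediaFile]) -> WorkingRegionMap:
    """Second method step: turn the media files into a map.

    A real implementation would run bundle adjustment / structure from motion /
    visual SLAM here; this stub only records that a map was derived.
    """
    return WorkingRegionMap(resolution_m=0.1)

if __name__ == "__main__":
    media_files = capture_unit(Path("garden_photos"))   # hypothetical folder
    region_map = generating_unit(media_files)
    print(f"{len(media_files)} media files -> map at {region_map.resolution_m} m/cell")
```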
Furthermore, it is proposed that in the first method step, before a first putting into operation of the autonomous service robot, at least one visual media file of the at least one working region is captured by the capture unit. A "first putting into operation" is to be understood in this context in particular as a first putting into operation of the autonomous service robot after a purchase and/or after a resetting and/or after a change of a working region. A particularly advantageous working region acquisition can thereby be effected. Furthermore, a working region acquisition can thereby be effected even before a first putting into operation.
In addition, it is proposed that in the first method step, before a working operation of the autonomous service robot, at least one visual media file of the at least one working region is captured by the at least one capture unit independent of the autonomous service robot. "Independent" is to be understood in this context in particular to mean that the capture unit is formed separately from the autonomous service robot. Preferably, it is to be understood to mean that the capture unit works or functions independently of the autonomous service robot. Particularly preferably, it is to be understood in particular to mean that the capture unit is formed by a stand-alone apparatus. The at least one visual media file of the at least one working region can thereby be captured independently of the autonomous service robot. In particular, the visual media file can be captured even before a first putting into operation.
It is furthermore proposed that the method has a further method step, in which a boundary of the at least one working region is determined in the map of the at least one working region by the generating unit. A "boundary" is to be understood in this context in particular as at least one material and/or immaterial line which delimits a working region. Preferably, it is to be understood in particular as at least one line which completely delimits a working region. Particularly preferably, the boundary consists of one or more lines, each of which is closed. A boundary of the at least one working region can thereby be set advantageously automatically. Preferably, delimitations which are already present, in particular logical delimitations, can thereby be determined as boundaries. In particular, obstacles, steep slopes in the ground and/or other conditions which are considered appropriate by a person skilled in the art can be acquired by the generating unit and already be identified and subsequently determined as logical boundaries.
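For illustration only, the following sketch shows one way such logical boundaries could be derived from a height grid and an obstacle mask; the slope threshold, cell size and grid contents are arbitrary example values, not values from the patent.

```python
# Illustrative sketch: derive the workable area (and thus the boundary of the
# working region) from local slope and detected obstacles.
import numpy as np

def determine_boundary(height: np.ndarray, obstacles: np.ndarray,
                       cell_size_m: float = 0.1,
                       max_slope: float = 0.35) -> np.ndarray:
    """Return a boolean mask of cells the robot may work on.

    Cells are excluded where the local slope exceeds `max_slope` (rise per metre)
    or where an obstacle such as a step, tree or hedge was detected.
    """
    dz_y, dz_x = np.gradient(height, cell_size_m)
    slope = np.hypot(dz_x, dz_y)
    workable = (slope <= max_slope) & (~obstacles)
    return workable

if __name__ == "__main__":
    h = np.zeros((50, 50))
    h[:, 30:] = np.linspace(0.0, 3.0, 20)      # ground rising steeply to the right
    obs = np.zeros((50, 50), dtype=bool)
    obs[10:14, 10:14] = True                   # e.g. a tree
    mask = determine_boundary(h, obs)
    print("workable cells:", int(mask.sum()), "of", mask.size)
```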
It is further proposed that the map and/or the boundary of the at least one working region is released by the generating unit for an editing. Preferably, the map and/or the boundary of the at least one working region is released by the generating unit for an editing by an operator. Particularly preferably, the map and/or the boundary of the at least one working region is output by the generating unit to an operator for an editing. Particularly preferably, an editing of the map and/or of the boundary of the at least one working region is made possible for an operator via the generating unit directly on the generating unit and/or on a unit connected to the generating unit. In particular, errors in the map and/or peculiarities in a boundary can thereby be manually corrected or adapted by an operator.
Furthermore, it is proposed that the method has a further method step, in which the at least one media file of the at least one working region and/or the map of the at least one working region and/or the boundary of the at least one working region is transmitted via an interface of the autonomous service robot to the autonomous service robot. Preferably, in the further method step, the at least one media file of the at least one working region and/or the map of the at least one working region and/or the boundary of the at least one working region is transmitted via an interface of the autonomous service robot to the autonomous service robot on a first putting into operation of the autonomous service robot and/or after a change of the working region. An "interface" is to be understood in this context in particular as a unit which is provided for an exchange of data. In particular, the interface has at least one information input and at least one information output. Preferably, the interface has at least two information inputs and at least two information outputs, in each case at least one information input and at least one information output being provided for a connection to a physical system. Particularly preferably, it is to be understood as an interface between at least two physical systems, such as in particular between the autonomous service robot and at least one apparatus of the outside world. Various interfaces which are considered appropriate by a person skilled in the art are conceivable. In particular, however, it is to be understood as a wireless interface, such as for example Bluetooth, WLAN, UMTS or NFC, a wired interface, such as for example a USB connection, and/or a drive interface by means of a memory medium, such as for example a memory card, a memory stick or a CD. In particular an advantageous transmission of the at least one media file and/or of the map and/or of the boundary can thereby be realised.
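For illustration only, the following sketch shows how a map could be handed over to the robot via a wireless interface as length-prefixed JSON; the host address, port and file name are assumptions made for the example, and a wired or memory-card transfer would serve the same purpose.

```python
# Minimal sketch of transmitting an externally generated map to the robot over
# a network interface (e.g. WLAN); layout of the map data is illustrative.
import json
import socket
import struct

def transmit_map(map_data: dict, host: str = "192.168.0.50", port: int = 5005) -> None:
    """Sender side (generating unit): push the map to the robot."""
    payload = json.dumps(map_data).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(struct.pack("!I", len(payload)) + payload)

def receive_map(port: int = 5005, store_path: str = "memory_unit_map.json") -> dict:
    """Robot side: accept one transfer and persist the map in the memory unit."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            size = struct.unpack("!I", conn.recv(4))[0]
            data = b""
            while len(data) < size:
                chunk = conn.recv(4096)
                if not chunk:
                    break
                data += chunk
    map_data = json.loads(data.decode("utf-8"))
    with open(store_path, "w", encoding="utf-8") as f:
        json.dump(map_data, f)
    return map_data
```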
In addition, it is proposed that the method has a further method step, in which the autonomous service robot localises itself in a working region with the aid of the map of the at least one working region. Preferably, the autonomous service robot has at least one sensor unit, via which a localisation in the working region is effected. A "sensor unit" is to be understood in this context in particular as a unit which is provided for capturing at least one characteristic quantity and/or one physical property, it being possible for the capture to take place actively, such as in particular by the generating and emitting of an electrical measuring signal, and/or passively, such as in particular by an acquisition of property changes of a sensor component. Various sensor units which are considered appropriate by a person skilled in the art are conceivable, such as for example optical sensors, GPS receivers and/or acoustic sensors, such as in particular ultrasonic sensors. The autonomous service robot can thereby advantageously navigate with the aid of the map. Furthermore, a random setting down of the autonomous service robot within the working region can thereby be realised. In addition, in particular an operation independently of a position of a base station of the autonomous service robot can thereby be effected.
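For illustration only, the sketch below shows a deliberately simple grid-based self-localisation in which the robot correlates a small occupancy patch seen by its sensor unit with the stored map; a real system would more likely use scan matching or a particle filter, and the grid contents here are synthetic.

```python
# Sketch of grid-based self-localisation by exhaustive template matching.
import numpy as np

def localise(stored_map: np.ndarray, observation: np.ndarray) -> tuple[int, int]:
    """Return the (row, col) of the map cell where the observation fits best."""
    oh, ow = observation.shape
    best, best_pos = -np.inf, (0, 0)
    for r in range(stored_map.shape[0] - oh + 1):
        for c in range(stored_map.shape[1] - ow + 1):
            window = stored_map[r:r + oh, c:c + ow]
            score = np.sum(window == observation)   # simple agreement count
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    world = rng.integers(0, 2, size=(40, 40))   # stored map of the working region
    true_pos = (12, 25)
    obs = world[12:20, 25:33]                   # what the sensor unit currently sees
    print("estimated:", localise(world, obs), "true:", true_pos)
```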
It is furthermore proposed that the method has a further method step, in which, on detection of a change of the at least one working region, the map of the at least one working region is automatically adapted by the autonomous service robot. A continuous adaptation of the map to the working region can thereby advantageously be effected. Furthermore, it can be achieved that changes which have been produced for example by an operator are in particular automatically acquired and taken into account. In particular a continuous complete remapping of the working region can thereby be avoided.
It is further proposed that the method has a further method step, in which, on detection of a change of the at least one working region, a notice for a renewed generation of a map of the at least one working region is automatically output to an operator by the autonomous service robot. Changes of the working region can thereby be taken into account in particular advantageously in a simple and cost-effective manner. Furthermore, it can thereby in particular be achieved that a map is continuously kept up-to-date and is adapted to changes of the working region.
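For illustration only, the following sketch combines the two reactions described in the last two paragraphs: small deviations are patched into the stored map automatically, while large deviations trigger a notice to the operator suggesting a renewed map generation. The 5 % threshold is an arbitrary example value.

```python
# Sketch: automatic adaptation of the map versus a notice to the operator,
# depending on how much of the working region has changed.
import numpy as np

def handle_change(stored_map: np.ndarray, observed: np.ndarray,
                  notify, max_changed_fraction: float = 0.05) -> np.ndarray:
    changed = stored_map != observed
    if changed.mean() <= max_changed_fraction:
        stored_map = np.where(changed, observed, stored_map)   # adapt automatically
    else:
        notify("Working region deviates strongly from the map - "
               "please capture new media files and regenerate the map.")
    return stored_map

if __name__ == "__main__":
    m = np.zeros((20, 20), dtype=int)
    o = m.copy()
    o[3, 4] = 1                       # one new obstacle: adapted silently
    m = handle_change(m, o, notify=print)
    o[:, :10] = 1                     # half the garden changed: operator notice
    m = handle_change(m, o, notify=print)
```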
Additionally, an autonomous service robot, in particular an autonomous lawn mower, for carrying out a method according to the invention is proposed.
It is proposed that the autonomous service robot has at least one control and/or regulating unit which has at least one memory unit and at least one interface which is provided for transmitting at least one externally generated map of at least one working region to the memory unit. A "control and/or regulating unit" is to be understood in particular as a unit having at least control electronics. "Control electronics" is to be understood in particular as a unit having a processor unit and having a memory unit and also having an operating program stored in the memory unit.
Furthermore, a "memory unit" is to be understood in this context in particular as a unit which is provided for storing at least one piece of information, advantageously independently of a power supply. Advantageously the externally generated map of the working region can thereby be stored in the autonomous service robot and fetched as required.
Furthermore, it is proposed that the at least one control and/or regulating unit has at least one control system which is provided for a processing of the at least one map of at least one working region. A "control system" is to be understood in this context in particular as a system which forms at least a part of an operating system and/or software. Preferably, the control system is provided for being executed on at least one processor. Particularly preferably, the control system has at least one regulating routine, control routine and/or calculating routine. Advantageously a processing of the at least one map can thereby be effected directly on the autonomous service robot.
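For illustration only, the sketch below shows a control system that reads an externally generated map out of the memory unit and interprets it; the JSON layout and field names are assumptions, not a format defined by the patent.

```python
# Sketch of a control system reading out and interpreting the stored map.
import json
from pathlib import Path

class ControlSystem:
    REQUIRED_KEYS = {"resolution_m", "cells", "boundary"}   # assumed map fields

    def __init__(self, memory_unit_path: Path):
        self.memory_unit_path = memory_unit_path
        self.map_data = None

    def load_map(self) -> None:
        """Read out and validate the map stored in the memory unit."""
        raw = json.loads(self.memory_unit_path.read_text(encoding="utf-8"))
        missing = self.REQUIRED_KEYS - raw.keys()
        if missing:
            raise ValueError(f"map file is missing fields: {sorted(missing)}")
        self.map_data = raw

    def world_to_cell(self, x_m: float, y_m: float) -> tuple[int, int]:
        """Interpretation step used by navigation: convert metres to grid cells."""
        res = self.map_data["resolution_m"]
        return int(round(y_m / res)), int(round(x_m / res))
```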
In addition, it is proposed that the autonomous service robot has at least one sensor unit which is provided for detecting unforeseen obstacles. An "unforeseen obstacle" is to be understood in this context in particular as an obstacle which is not stored on the at least one map and/or which is not foreseeable by the autonomous service robot. Advantageously, a technically dangerous collision, and in particular any collision at all, can thereby be prevented. Furthermore, obstacles in the working region which are not mapped can thereby be acquired.
It is furthermore proposed that the sensor unit is provided, on a detection of an unforeseen obstacle, for outputting a signal to the control and/or regulating unit, which control and/or regulating unit is provided for storing the position of the unforeseen obstacle on the at least one map of the at least one working region. New, unforeseen obstacles can thereby be permanently stored and thus also taken into account. In particular, a renewed map generation can thereby be avoided. Furthermore, it can thereby be achieved that a map dynamically adapts to changes of the environment.
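For illustration only, the following sketch traces the signal path from the sensor unit to the control and/or regulating unit when an unforeseen obstacle is detected; the grid map, the 0.3 m trigger distance and the callback wiring are assumptions made for the example.

```python
# Sketch: sensor unit signals an unforeseen obstacle, control unit stores its
# position on the map so that it is taken into account later.
import numpy as np

class ControlAndRegulatingUnit:
    def __init__(self, grid_shape=(100, 100)):
        self.obstacle_map = np.zeros(grid_shape, dtype=bool)   # part of the stored map

    def on_obstacle_signal(self, cell: tuple[int, int]) -> None:
        """Store the position of the unforeseen obstacle on the map."""
        self.obstacle_map[cell] = True

class SensorUnit:
    def __init__(self, control_unit: ControlAndRegulatingUnit):
        self.control_unit = control_unit

    def process_measurement(self, distance_m: float, robot_cell: tuple[int, int]) -> None:
        # Assume anything closer than 0.3 m directly ahead is an unforeseen obstacle.
        if distance_m < 0.3:
            obstacle_cell = (robot_cell[0], robot_cell[1] + 1)  # one cell ahead (simplified)
            self.control_unit.on_obstacle_signal(obstacle_cell)

if __name__ == "__main__":
    cru = ControlAndRegulatingUnit()
    sensor = SensorUnit(cru)
    sensor.process_measurement(distance_m=0.2, robot_cell=(10, 10))
    print("obstacle cells stored:", int(cru.obstacle_map.sum()))
```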
The method according to the invention and also the autonomous service robot according to the invention are not intended to be limited here to the above-described application and embodiment. In particular, the method according to the invention and also the autonomous service robot according to the invention can have, for a fulfilling of a functioning described herein, a number of individual elements, components and units differing from a number mentioned herein.
Drawing
Further advantages emerge from the following description of the drawing. Three exemplary embodiments of the invention are represented in the drawing. The drawing, the description and the claims contain numerous features in combination. A person skilled in the art will consider the features expediently also individually and combine them to form useful further combinations.
In the drawing:
Fig. 1 shows a flow chart of a method according to the invention by means of a capture unit, by means of a generating unit and by means of an autonomous service robot in a schematic representation,
Fig. 2 shows the autonomous service robot for carrying out the method according to the invention in a schematic representation,
Fig. 3 shows a flow chart of an alternative method according to the invention by means of a capture and generating apparatus and by means of an autonomous service robot in a schematic representation,
Fig. 4 shows a flow chart of a further alternative method according to the invention by means of a capture unit and by means of an autonomous service robot in a schematic representation, and
Fig. 5 shows the autonomous service robot for carrying out the further alternative method according to the invention in a schematic representation.
Description of the exemplary embodiments
Figure 1 shows a flow chart of a method according to the invention by means of a capture unit 12a, by means of a generating unit 14a and by means of an autonomous service robot 10a.
The autonomous service robot 10a is formed by an autonomous lawn mower. The autonomous service robot 10a is provided for mowing a working region. The working region is formed by a meadow or a lawn of a garden. In principle, however, it would also be conceivable for the autonomous service robot 10a to be formed by a vacuum cleaner robot or another service robot which is considered appropriate by a person skilled in the art and for the working region accordingly to be formed, for example, by a room. Furthermore, the autonomous service robot 10a can also be programmed for a plurality of working regions. The different working regions can in this case be separately stored and separately selected according to the current location of the autonomous service robot 10a. The autonomous service robot 10a has a control and regulating unit 32a. The control and regulating unit 32a has a memory unit 34a and an interface 24a. Furthermore, the control and regulating unit 32a is arranged in a housing 40a of the autonomous service robot 10a. The interface 24a of the control and regulating unit 32a is provided for transmitting an externally generated map of a working region to the memory unit 34a. The interface 24a is formed by a wireless interface. The interface 24a is formed by a WLAN interface. In principle, however, other wireless interfaces which are considered appropriate by a person skilled in the art are also conceivable. Additionally or alternatively, it would, however, also be conceivable for the interface 24a to be formed by a wired interface or by a drive interface or a further additional interface. The control and regulating unit 32a furthermore has a control system 36a. The control system 36a is provided for processing the map of the working region. The control system 36a is provided for reading out and interpreting the map or a file of the map which is stored in the memory unit 34a. The control system 36a is executed on a processor of the control and regulating unit 32a. In addition, the control and regulating unit 32a has an output unit 42a, via which information can be output to an operator 48a. The output unit 42a is formed by a display (Figure 2).
In addition, the autonomous service robot 10a has a sensor unit 38a. The sensor unit 38a is provided for detecting unforeseen obstacles. The sensor unit 38a is provided for outputting a signal to the control and regulating unit 32a on a detection of an unforeseen obstacle, which control and regulating unit is provided for storing the position of the unforeseen obstacle on the map of the working region. The sensor unit 38a is arranged substantially inside the housing 40a of the autonomous service robot 10a. Viewed along a direction of travel 44a of the autonomous service robot 10a, the sensor unit 38a is arranged in a front region of the autonomous service robot 10a. In addition, the sensor unit 38a is aimed forwards approximately parallel to the direction of travel 44a. Furthermore, the sensor unit 38a is formed by an optical sensor. The sensor unit 38a is formed by a camera. In principle, however, another configuration of the sensor unit 38a which is considered appropriate by a person skilled in the art would also be conceivable. Furthermore, it would additionally, in principle, be conceivable for the autonomous service robot 10a to have at least one further sensor unit (Figure 2).
The method according to the invention is provided for a working region acquisition of a working region of the autonomous service robot 10a. The method is effected by means of a capture unit 12a, by means of the generating unit 14a and by means of the autonomous service robot 10a. The capture unit 12a is formed by a photo-camera. In principle, however, it would also be conceivable for the capture unit 12a to be formed by another apparatus which is considered appropriate by a person skilled in the art or a part of an apparatus, such as in particular by a mobile telephone, by a smartphone or by a video camera, or for the capture unit 12a to form a part of another apparatus. The capture unit 12a is formed independently of the autonomous service robot 10a. The capture unit 12a is completely separated from the autonomous service robot 10a and can be formed by a commercially available capture unit, such as in particular a commercially available photo-camera. The generating unit 14a is formed by a computer. In principle, however, it would also be conceivable for the generating unit 14a to be formed by another apparatus which is considered appropriate by a person skilled in the art or a part of an apparatus. In the method, in a first method step 16a, 16a' a plurality of visual media files of the working region are captured by the capture unit 12a. The first method step 16a, 16a' of the method is effected before a working operation of the autonomous service robot 10a. The first method step 16a, 16a' of the method is effected before a first putting into operation, i.e. before a first journey of the autonomous service robot 10a. The visual media files are formed by image files or by video files. In a first variant of the first method step 16a, image files of the working region are captured. In a second variant of the first method step 16a', a video file of the working region is captured. Subsequently, the visual media files are transmitted by the capture unit 12a to the generating unit 14a (Figure 1).
In a second method step 18a, a map of the working region is generated by the generating unit 14a from the visual media files. The generation is effected by means of software using the visual media files. The generation is effected by means of image processing algorithms. The generation is effected by means of Bundle Adjustment, Structure from Motion and Visual SLAM. In principle, however, other image processing algorithms which are considered appropriate by a person skilled in the art or only a part of the aforementioned image processing algorithms are also conceivable. The generation of the map can, alternatively or additionally to the generating unit 14a, also be transferred at least partly via an Internet connection to a server 46a and/or carried out by means of an Internet application. In principle, it would also be conceivable for the generating unit 14a to be formed by a server 46a or a part of a server 46a and for the capture unit 12a to transmit the media files directly to the server 46a. Subsequently, in a further method step 20a, a boundary of the working region is determined by the generating unit 14a in the map of the working region. The boundary is in this case determined from logical delimitations. In this case, obstacles and steep slopes in the ground serve as logical delimitations. A boundary is automatically set by the generating unit 14a as soon as the ground would become too steep for the autonomous service robot 10a. Furthermore, boundaries are automatically set around or at obstacles such as steps, trees, hedges etc. (Figure 1).
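For illustration only, the fragment below sketches a single two-view step of the kind of structure-from-motion processing named above, using OpenCV; the camera intrinsics, image file names and parameter values are assumptions, and a complete pipeline would chain many views and close with a global bundle adjustment.

```python
# Heavily simplified two-view structure-from-motion fragment (requires OpenCV
# and NumPy); it recovers relative pose and a sparse 3D point cloud from two
# overlapping photographs of the working region.
import cv2
import numpy as np

def sparse_points_from_two_images(path1: str, path2: str, K: np.ndarray) -> np.ndarray:
    img1 = cv2.imread(path1, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(path2, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)

    # Triangulate matched points into a sparse 3D cloud (up to scale).
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return (pts4d[:3] / pts4d[3]).T          # N x 3 points of the working region

# Example intrinsics for an assumed 1920x1080 camera (focal length guessed):
K = np.array([[1200.0, 0.0, 960.0],
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])
# cloud = sparse_points_from_two_images("garden_001.jpg", "garden_002.jpg", K)
```

The sparse point cloud obtained in this way could then be rasterised into the height and obstacle information of the map described above.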
After the boundary of the working region has been determined, the map of the working region with the boundary of the working region is released by the generating unit 14a for an editing by an operator 48a. The released map with the boundary can be edited by an operator 48a in a further method step 50a. An editing is effected in this case directly on the generating unit 14a. The map with the boundary is in this case output via a display screen to an operator 48a and can be changed as desired. In principle, however, it would also be conceivable for a distinction to be made between different authorisation levels of operators 48a upon an editing. In this case, it would in particular be conceivable for only authorised service personnel to be able to edit the map per se and for an end user to be merely permitted to edit the boundary (Figure 1).
After the editing of the map by the operator 48a, a partitioning of the map of the working region is additionally carried out by the generating unit 14a in a further method step 52a. In this case, the working region is subdivided into different partial working regions. The partial working regions are in this case assigned different priorities and different repetition frequencies by the generating unit 14a. In principle, the prioritising and the determination of the repetition frequencies can be carried out in particular also by the operator 48a. Furthermore, in the method step 52a, a preferred trajectory of the autonomous service robot 10a is determined by the generating unit 14a. The preferred trajectory in this case presets a movement path of the autonomous service robot 10a for an operation. In principle, however, it would also be conceivable for the generating unit 14a to generate merely a virtual auxiliary line in the map for the autonomous service robot 10a. The virtual auxiliary line can in this case assist the autonomous service robot 10a in a particularly efficient working. Furthermore, operator requests which cannot be directly input on the autonomous service robot 10a can be taken into account in the virtual auxiliary line (Figure 1).
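For illustration only, the sketch below shows one possible representation of such partial working regions with a priority and a repetition frequency, together with a trivial scheduling helper; the region names, polygons and values are invented for the example.

```python
# Sketch of partial working regions with priority and repetition frequency.
from dataclasses import dataclass

@dataclass
class PartialWorkingRegion:
    name: str
    polygon: list              # list of (x, y) vertices in map coordinates
    priority: int              # lower number = worked first
    repeat_every_days: int     # repetition frequency

def work_schedule(regions: list[PartialWorkingRegion], day: int) -> list[str]:
    """Return the partial regions due on a given day, highest priority first."""
    due = [r for r in regions if day % r.repeat_every_days == 0]
    return [r.name for r in sorted(due, key=lambda r: r.priority)]

if __name__ == "__main__":
    regions = [
        PartialWorkingRegion("front lawn", [(0, 0), (10, 0), (10, 8), (0, 8)], 1, 2),
        PartialWorkingRegion("back lawn", [(12, 0), (30, 0), (30, 15), (12, 15)], 2, 3),
        PartialWorkingRegion("strip by the hedge", [(0, 9), (10, 9), (10, 11), (0, 11)], 3, 7),
    ]
    for d in range(8):
        print(f"day {d}: {work_schedule(regions, d)}")
```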
Subsequently, in a further method step 22a, the map of the working region with the boundary of the working region is transmitted via the interface 24a of the autonomous service robot 10a to the autonomous service robot 10a. Via the interface 24a of the autonomous service robot 10a, the map with the boundary is directly stored in the memory unit 34a of the autonomous service robot 10a. In principle, it would additionally or alternatively be conceivable for the map with the boundary to be transmitted via the interface 24a to the server 46a or to a server and stored there. The map can thereby be made available in particular also to other autonomous service robots 10a. Furthermore, a memory capacity of the memory unit 34a can thereby be kept low.
The interface 24a of the autonomous service robot 10a can in this case be connected to the Internet via a WLAN network of the building to which the garden belongs. Through a continuous Internet connection, it would be additionally conceivable for the autonomous service robot 10a to continuously receive new information, such as weather information, via the server 46a, in order to work particularly efficiently. The server 46a can in this case also continuously revise and improve the map, the boundary or the preferred trajectory. In addition, the autonomous service robot 10a could via the Internet receive information from other devices, such as environment sensors, rain sensors or person-recognition systems, in the garden or an environment. After the map has been stored in the memory unit 34a, an operation of the autonomous service robot 10a is started. After the operation has been started, the autonomous service robot 10a localises itself, on a first putting into operation in a further method step 26a, in a working region with the aid of the map of the at least one working region. A localisation is effected in this case via the sensor unit 38a of the autonomous service robot 10a. The autonomous service robot 10a in this case compares current data of the sensor unit 38a with data of the map in the memory unit 34a, in order to thereby localise itself (Figure 1).
After a localisation, a regular operation 54a of the autonomous service robot 10a begins. During the regular operation 54a, navigation software ensures that the autonomous service robot 10a does not leave the working region defined by the map with the boundary. For this purpose, the autonomous service robot 10a continuously determines its position on the map. If a change of the working region is detected during the regular operation 54a, in the method a further method step 28a is initiated in which the map of the working region is automatically adapted by the autonomous service robot 10a. If a change of the working region is detected by the sensor unit 38a, such as for example a new obstacle, the map in the memory unit 34a of the autonomous service robot 10a is automatically adapted by the autonomous service robot 10a to the change. In principle, however, it would also be conceivable, on detection of a change of the working region, as indicated by the dashed lines, during a regular operation 54a in the further method step 30a, for a notice for a renewed generation of a map of the at least one working region to be automatically output to an operator 48a by the autonomous service robot 10a. The notice would in this case be output to an operator 48a via the output unit 42a of the autonomous service robot 10a. An operator 48a could in this case then decide in a further method step 56a whether he can simply remove the obstacle or would like to carry out a map generation once again. Particularly in the case of large deviations of the working region from the map, a renewed generation of a map may be appropriate or even necessary (Figure 1).
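For illustration only, the containment check used by such navigation software could look like the ray-casting point-in-polygon test below; the boundary coordinates are invented for the example.

```python
# Sketch: keep the robot inside the working region defined by the boundary.
def inside_boundary(x: float, y: float, boundary: list[tuple[float, float]]) -> bool:
    """Ray-casting point-in-polygon test against the stored boundary polygon."""
    inside = False
    n = len(boundary)
    for i in range(n):
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]
        crosses = (y1 > y) != (y2 > y)
        if crosses and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

if __name__ == "__main__":
    lawn = [(0.0, 0.0), (20.0, 0.0), (20.0, 12.0), (0.0, 12.0)]   # stored boundary
    print(inside_boundary(5.0, 5.0, lawn))    # True: keep mowing
    print(inside_boundary(25.0, 5.0, lawn))   # False: stop / turn back
```

Ray casting is used here only because it is compact; any robust geometric containment test would serve the same purpose.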
In Figures 3 to 5, two further exemplary embodiments of the invention are shown. The following descriptions and the drawings are limited essentially to the differences between the exemplary embodiments, and reference can be made, with regard to identically designated components, in particular with regard to components having identical reference symbols, in principle also to the drawings and/or the description of the other exemplary embodiments, in particular of Figures 1 and 2. In order to distinguish between the exemplary embodiments, the letter a is added after the reference symbols of the exemplary embodiment in Figures 1 and 2. The letter a is replaced by the letters b and c in the exemplary embodiments of Figures 3 to 5.
Figure 3 shows a flow chart of an alternative method according to the invention by means of a capture and generating apparatus 58b and by means of an autonomous service robot 10b. The capture and generating apparatus 58b has a capture unit 12b and a generating unit 14b. The capture and generating apparatus 58b is formed by a smartphone.
The method according to the invention is provided for a working region acquisition of a working region of the autonomous service robot 10b. The method is effected by means of a capture unit 12b, by means of the generating unit 14b and by means of the autonomous service robot 10b. The capture unit 12b and the generating unit 14b in this case form a part of the capture and generating apparatus 58b. The capture unit 12b is formed by a built-in camera of the capture and generating apparatus 58b. The generating unit 14b is formed by a part of a computing unit of the capture and generating apparatus 58b. The capture and generating apparatus 58b is formed independently of the autonomous service robot 10b. The capture and generating apparatus 58b is formed completely separately from the autonomous service robot 10b and can be formed by a commercially available smartphone, on which merely a special application is installed. In the method, in a first method step 16b, 16b' a plurality of visual media files of the working region are captured by the capture unit 12b. Since the capture and generating apparatus 58b is formed as a smartphone, it would in principle be conceivable to store, for each visual media file, in addition to the media file per se, for example an inclination angle of the capture unit 12b during the capture and/or a precise position of the capture unit 12b during the capture and/or a relative position of the capture unit 12b during the capture with respect to a last capture. In this case, in particular already existing sensors of the capture and generating apparatus 58b, such as in particular an inclination sensor, acceleration sensor, angular rate sensor and/or a GPS receiver, could be used. This additional data could be used in particular in a generation of the map, in order to achieve a particularly high accuracy. The first method step 16b, 16b' of the method is effected before a working operation of the autonomous service robot 10b. The first method step 16b, 16b' of the method is effected before a first putting into operation, i.e. before a first journey of the autonomous service robot 10b. In a second method step 18b, a map of the working region is generated by the generating unit 14b from the visual media files. The visual media files are in this case directly further processed by the generating unit 14b. The generation is effected by means of software installed on the capture and generating apparatus 58b, using the visual media files. The generation is effected by means of image processing algorithms. The generation is performed by means of Bundle Adjustment, Structure from Motion and Visual SLAM. Subsequently, in a further method step 20b, a boundary of the working region is determined by the generating unit 14b in the map of the working region.
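For illustration only, the sketch below shows one way of storing such additional sensor data alongside each captured image; the sensor read functions are placeholders for whatever API the smartphone actually offers, and the JSON side-car layout is an assumption.

```python
# Sketch: per captured image, record inclination, position and relative motion
# as a JSON side-car file next to the image.
import json
import time
from pathlib import Path

def read_inclination_deg() -> float:       # placeholder for the inclination sensor
    return 12.5

def read_gps() -> tuple[float, float]:     # placeholder for the GPS receiver
    return (48.137, 11.575)

def record_capture(image_path: Path, previous_gps=None) -> dict:
    lat, lon = read_gps()
    meta = {
        "image": image_path.name,
        "timestamp": time.time(),
        "inclination_deg": read_inclination_deg(),
        "position": {"lat": lat, "lon": lon},
        "moved_since_last": None if previous_gps is None
        else {"dlat": lat - previous_gps[0], "dlon": lon - previous_gps[1]},
    }
    image_path.with_suffix(".json").write_text(json.dumps(meta, indent=2))
    return meta

# record_capture(Path("garden_042.jpg"))  # would write garden_042.json next to the image
```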
After the boundary of the working region has been determined, the map of the working region with the boundary of the working region is released by the generating unit 14b for an editing by an operator 48b. After the editing of the map by the operator 48b, a partitioning of the map of the working region is additionally carried out by the generating unit 14b in a further method step 52b. Furthermore, in the method step 52b, a preferred trajectory of the autonomous service robot 10b is determined by the generating unit 14b.
In a further method step 22b, the map of the working region with the boundary of the working region is transmitted via the interface 24b of the autonomous service robot 10b to the autonomous service robot 10b. Via the interface 24b of the autonomous service robot 10b, the map with the boundary is directly stored in the memory unit 34b of the autonomous service robot 10b. After the map has been stored in the memory unit 34b, an operation of the autonomous service robot 10b is started. After the operation has been started, the autonomous service robot 10b localises itself, on a first putting into operation in a further method step 26b, in a working region with the aid of the map of the at least one working region. After a localisation, a regular operation 54b of the autonomous service robot 10b begins.
If a change of the working region is detected during the regular operation 54b, in the method a further method step 28b is initiated in which the map of the working region is automatically adapted by the autonomous service robot 10b.
Figure 4 shows a flow chart of a further alternative method according to the invention by means of a capture unit 12c and by means of an autonomous service robot 10c.
The autonomous service robot 10c is formed by an autonomous lawn mower. The autonomous service robot 10c has a control and regulating unit 32c. The control and regulating unit 32c has a memory unit 34c and an interface 24c. The interface 24c of the control and regulating unit 32c is provided for transmitting an externally generated visual media file of a working region to the memory unit 34c. Furthermore, the control and regulating unit 32c has a control system 36c. In addition, the autonomous service robot 10c has a generating unit 14c. The generating unit 14c is formed by a computing unit. The generating unit 14c is arranged in a housing 40c of the autonomous service robot 10c. In principle, it would be conceivable for the autonomous service robot 10c additionally to have the capture unit 12c. The capture unit 12c could be integrated into the housing 40c of the autonomous service robot 10c.
In particular a number of assemblies could thereby be kept low. In addition, the autonomous service robot 10c has a sensor unit 38c (Figure 5).
The method according to the invention is provided for a working region acquisition of a working region of the autonomous service robot 10c. The method is effected by means of a capture unit 12c, by means of the generating unit 14c and by means of the autonomous service robot 10c. The capture unit 12c is formed by a photo-camera. The capture unit 12c is formed independently of the autonomous service robot 10c. The capture unit 12c is completely separated from the autonomous service robot 10c and can be formed by a commercially available capture unit, such as in particular a commercially available photo-camera. The generating unit 14c forms a part of the autonomous service robot 10c. In the method, in a first method step 16c, 16c' a plurality of visual media files of the working region are captured by the capture unit 12c. The first method step 16c, 16c' of the method is effected before a working operation of the autonomous service robot 10c. The first method step 16c, 16c' of the method is effected before a first putting into operation, i.e. before a first journey of the autonomous service robot 10c.
Subsequently, in a further method step 22c, the visual media files of the working region are transmitted via the interface 24c of the autonomous service robot 10c to the autonomous service robot 10c. Via the interface 24c of the autonomous service robot 10c, the visual media files are stored in the memory unit 34c of the autonomous service robot 10c, where they can be fetched by the generating unit 14c.
In a second method step 18c, a map of the working region is generated by the generating unit 14c of the autonomous service robot 10c from the visual media files. The generation is effected by means of software using the visual media files. The generation is effected by means of image processing algorithms. The generation is effected by means of Bundle Adjustment, Structure from Motion and Visual SLAM. Subsequently, in a further method step 20c, a boundary of the working region is determined by the generating unit 14c in the map of the working region.
After the boundary of the working region has been determined, the map of the working region with the boundary of the working region is released by the generating unit 14c for an editing by an operator 48c. The released map with the boundary can be edited by an operator 48c in a further method step 50c. An editing is effected in this case directly on the generating unit 14c or directly on the autonomous service robot 10c. The map with the boundary is in this case output via an output unit 42c and can be edited there. After the editing of the map by the operator 48c, the map with the boundary is stored in the memory unit 34c. In principle, it would additionally or alternatively be conceivable for the map with the boundary to be transmitted via the interface 24c to a server 46c or to a further server and stored there. The map can thereby be made available in particular also to other autonomous service robots 10c. Furthermore, a memory capacity of the memory unit 34c can thereby be kept low. The interface 24c of the autonomous service robot 10c can in this case be connected to the Internet via a WLAN network of the building of the garden.
After the map has been stored in the memory unit 34c, an operation of the autonomous service robot 10c is started. After the operation has been started, the autonomous service robot 10c localises itself, on a first putting into operation in a further method step 26c, in a working region with the aid of the map of the at least one working region.
After a localisation, a regular operation 54c of the autonomous service robot 10c begins. If a change of the working region is detected during the regular operation 54c, in the method a further method step 28c is initiated in which the map of the working region is automatically adapted by the autonomous service robot 10c.
Aspects of the invention are further described below.

1. A first aspect of the invention relates to a method for a working region acquisition of at least one working region of an autonomous service robot (10a; 10b; 10c), in particular of an autonomous lawn mower, by means of at least one capture unit (12a; 12b; 12c) and a generating unit (14a; 14b; 14c), wherein in a first method step (16a, 16a'; 16b, 16b'; 16c, 16c'), before a working operation of the autonomous service robot (10a; 10b; 10c), at least one visual media file of the at least one working region is captured by the at least one capture unit (12a; 12b; 12c) and in a second method step (18a; 18b; 18c) a map of the at least one working region is generated by the generating unit (14a; 14b; 14c) from the at least one visual media file.

2. A method according to aspect 1, characterised in that in the first method step (16a, 16a'; 16b, 16b'; 16c, 16c'), before a first putting into operation of the autonomous service robot (10a; 10b; 10c), at least one visual media file of the at least one working region is captured by the capture unit (12a; 12b; 12c).

3. A method according to aspect 1 or 2, characterised in that in the first method step (16a, 16a'; 16b, 16b'; 16c, 16c'), before a working operation of the autonomous service robot (10a; 10b; 10c), at least one visual media file of the at least one working region is captured by the at least one capture unit (12a; 12b; 12c) independent of the autonomous service robot (10a; 10b; 10c).

4. A method according to any of the preceding aspects, characterised by a further method step (20a; 20b; 20c), in which a boundary of the at least one working region is determined in the map of the at least one working region by the generating unit (14a; 14b; 14c).

5. A method according to any of the preceding aspects, characterised in that the map of the at least one working region and/or the boundary of the at least one working region is released by the generating unit (14a; 14b; 14c) for an editing.

6. A method according to any of the preceding aspects, characterised by a further method step (22a; 22b; 22c), in which the at least one media file of the at least one working region and/or the map of the at least one working region and/or the boundary of the at least one working region is transmitted via an interface (24a; 24b; 24c) of the autonomous service robot (10a; 10b; 10c) to the autonomous service robot (10a; 10b; 10c).

7. A method according to any of the preceding aspects, characterised by a further method step (26a; 26b; 26c), in which the autonomous service robot (10a; 10b; 10c) localises itself in a working region with the aid of the map of the at least one working region.

8. A method according to any of the preceding aspects, characterised by a further method step (28a; 28b; 28c), in which, on detection of a change of the at least one working region, the map of the at least one working region is automatically adapted by the autonomous service robot (10a; 10b; 10c).

9. A method according to any of aspects 2 to 7, characterised by a further method step (30a; 30b; 30c), in which, on detection of a change of the at least one working region, a notice for a renewed generation of a map of the at least one working region is automatically output to an operator by the autonomous service robot (10a; 10b; 10c).

10. A second aspect relates to an autonomous service robot, in particular an autonomous lawn mower, for carrying out a method at least according to one of the preceding aspects.

11. An autonomous service robot according to aspect 10, characterised by at least one control and/or regulating unit (32a; 32b; 32c) which has at least one memory unit (34a; 34b; 34c) and at least one interface (24a; 24b; 24c) which is provided for transmitting at least one externally generated map of at least one working region to the memory unit (34a; 34b; 34c).

12. An autonomous service robot according to aspect 11, characterised in that the at least one control and/or regulating unit (32a; 32b; 32c) has at least one control system (36a; 36b; 36c) which is provided for a processing of the at least one map of at least one working region.

13. An autonomous service robot according to any of aspects 10 to 12, characterised by at least one sensor unit (38a; 38b; 38c) which is provided for detecting unforeseen obstacles.

14. An autonomous service robot according to aspect 13, characterised in that the sensor unit (38a; 38b; 38c) is provided, on a detection of an unforeseen obstacle, for outputting a signal to the control and/or regulating unit (32a; 32b; 32c), which control and/or regulating unit is provided for storing the position of the unforeseen obstacle on the at least one map of the at least one working region.

15. A third aspect relates to a system for carrying out a method according to any of aspects 1 to 9, having an autonomous service robot (10a; 10b; 10c), in particular according to one of aspects 10 to 14, having a capture unit (12a; 12b; 12c) and having a generating unit (14a; 14b; 14c).
Claims (14)
1. Method for a working region acquisition of at least one working region of an autonomous service robot (10a; 10b; 10c), by means of at least one capture unit (12a; 12b; 12c) and a generating unit (14a; 14b; 14c), wherein in a first method step (16a, 16a'; 16b, 16b'; 16c, 16c'), before a working operation of the autonomous service robot (10a; 10b; 10c), at least one visual media file of the at least one working region is captured by the at least one capture unit (12a; 12b; 12c) and in a second method step (18a; 18b; 18c) a map of the at least one working region is generated by the generating unit (14a; 14b; 14c) from the at least one visual media file, wherein in a further method step (30a; 30b; 30c), on detection of a change of the at least one working region, a notice for a renewed generation of a map of the at least one working region is automatically output to an operator by the autonomous service robot (10a; 10b; 10c).
2. Method according to Claim 1, characterised in that in the first method step (16a, 16a'; 16b, 16b'; 16c, 16c'), before a first putting into operation of the autonomous service robot (10a; 10b; 10c), at least one visual media file of the at least one working region is captured by the capture unit (12a; 12b; 12c).
3. Method according to Claim 1 or 2, characterised in that in the first method step (16a, 16a'; 16b, 16b'; 16c, 16c'), before a working operation of the autonomous service robot (10a; 10b; 10c), at least one visual media file of the at least one working region is captured by the at least one capture unit (12a; 12b; 12c) independent of the autonomous service robot (10a; 10b; 10c).
4. Method according to any of the preceding claims, characterised by a further method step (20a; 20b; 20c), in which a boundary of the at least one working region is determined in the map of the at least one working region by the generating unit (14a; 14b; 14c).
5. Method according to any of the preceding claims, characterised by a further method step (22a; 22b; 22c), in which the at least one media file of the at least one working region and/or the map of the at least one working region and/or the boundary of the at least one working region is transmitted via an interface (24a; 24b; 24c) of the autonomous service robot (10a; 10b; 10c) to the autonomous service robot (10a; 10b; 10c).
6. Method according to any of the preceding claims, characterised by a further method step (26a; 26b; 26c), in which the autonomous service robot (10a; 10b; 10c) localises itself in a working region with the aid of the map of the at least one working region.
7. Method according to any of the preceding claims, characterised by a further method step (28a; 28b; 28c), in which, on detection of a change of the at least one working region, the map of the at least one working region is automatically adapted by the autonomous service robot (10a; 10b; 10c).
8. Method according to any of the preceding claims wherein the autonomous service robot (10a; 10b; 10c) is an autonomous lawnmower.
9. An autonomous service robot (10a; 10b; 10c) comprising at least one capture unit (12a; 12b; 12c) and a generating unit (14a; 14b; 14c) adapted to carry out the method of claim 1.
10. An autonomous service robot according to Claim 9, characterised by at least one control and/or regulating unit (32a; 32b; 32c) which has at least one memory unit (34a; 34b; 34c) and at least one interface (24a; 24b; 24c) which is provided for transmitting at least one externally generated map of at least one working region to the memory unit (34a; 34b; 34c).
11. An autonomous service robot according to Claim 10, characterised in that the at least one control and/or regulating unit (32a; 32b; 32c) has at least one control system (36a; 36b; 36c) which is provided for processing the at least one map of at least one working region.
12. An autonomous service robot according to one of Claims 9 to 11, characterised by at least one sensor unit (38a; 38b; 38c) which is provided for detecting unforeseen obstacles.
13. An autonomous service robot according to Claim 12, characterised in that the sensor unit (38a; 38b; 38c) is provided, on detection of an unforeseen obstacle, for outputting a signal to the control and/or regulating unit (32a; 32b; 32c), which control and/or regulating unit is provided for storing the position of the unforeseen obstacle on the at least one map of the at least one working region.
14. An autonomous service robot according to any of Claims 9 to 13, wherein the autonomous service robot (10a; 10b; 10c) is an autonomous lawnmower.
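The method of Claims 1 to 8 can be pictured as a short pipeline: capture a visual media file of the working region before operation, generate a map and its boundary from it, and later either adapt the map automatically or output a notice to the operator when the working region has changed. The following Python sketch is purely illustrative and is not taken from the patent; all names (capture_media_file, generate_map, check_for_change, WorkingRegionMap), the grid-based map representation and the change threshold are assumptions chosen to keep the example self-contained.

```python
from dataclasses import dataclass, field


@dataclass
class WorkingRegionMap:
    """Grid map of the working region; True marks a workable cell."""
    cells: list                      # list of lists of bool
    boundary: set = field(default_factory=set)


def capture_media_file():
    """Stand-in for the capture unit: a toy 'visual media file' already
    reduced to a binary grid (a real system would process camera images)."""
    return [
        [False, False, False, False, False],
        [False, True,  True,  True,  False],
        [False, True,  True,  True,  False],
        [False, False, False, False, False],
    ]


def generate_map(media):
    """Generating unit: build the map and determine its boundary, i.e. the
    workable cells that touch a non-workable cell or the grid edge."""
    rows, cols = len(media), len(media[0])
    boundary = set()
    for r in range(rows):
        for c in range(cols):
            if not media[r][c]:
                continue
            neighbours = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(not (0 <= nr < rows and 0 <= nc < cols) or not media[nr][nc]
                   for nr, nc in neighbours):
                boundary.add((r, c))
    return WorkingRegionMap(cells=media, boundary=boundary)


def check_for_change(region_map, observation, threshold=0.05):
    """Change handling: a large deviation triggers a notice to the operator
    to re-capture the working region; a small one is adapted automatically."""
    total = sum(len(row) for row in region_map.cells)
    changed = sum(
        1
        for r, row in enumerate(region_map.cells)
        for c, value in enumerate(row)
        if observation[r][c] != value
    )
    if changed / total > threshold:
        return "notice: please capture the working region again"
    region_map.cells = observation
    return "map adapted automatically"


if __name__ == "__main__":
    media = capture_media_file()              # first method step, before operation
    region_map = generate_map(media)          # second method step
    print("boundary cells:", sorted(region_map.boundary))

    observation = [row[:] for row in media]
    observation[1][1] = False                 # simulate a change in the region
    observation[2][3] = False
    print(check_for_change(region_map, observation))
```

In this toy run the two changed cells exceed the assumed threshold, so the sketch returns the operator notice rather than adapting the map, mirroring the distinction drawn between Claims 1 and 7.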
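On the device side (Claims 9 to 13), the control and/or regulating unit holds the externally generated map in a memory unit, receives it over an interface, and records positions of unforeseen obstacles signalled by the sensor unit. The minimal sketch below assumes hypothetical class names (ControlUnit, SensorUnit) and a dictionary-based memory unit; it mirrors only the structure of the claims, not an actual implementation.

```python
class ControlUnit:
    """Control and/or regulating unit: memory unit for externally generated
    maps plus a hook for obstacle reports from the sensor unit."""

    def __init__(self):
        self.memory = {}        # memory unit: region name -> map data
        self.obstacles = []     # positions of unforeseen obstacles

    def receive_map(self, name, region_map):
        """Interface of the robot: store an externally generated map."""
        self.memory[name] = region_map

    def store_obstacle(self, position):
        """Called when the sensor unit signals an unforeseen obstacle."""
        self.obstacles.append(position)


class SensorUnit:
    """Detects unforeseen obstacles and signals the control unit."""

    def __init__(self, control_unit):
        self.control_unit = control_unit

    def report(self, position):
        self.control_unit.store_obstacle(position)


if __name__ == "__main__":
    control = ControlUnit()
    control.receive_map("garden", {"boundary": [(1, 1), (1, 2), (1, 3)]})
    SensorUnit(control).report((2, 2))   # obstacle detected at grid cell (2, 2)
    print(control.memory["garden"], control.obstacles)
```

The sketch keeps the obstacle position alongside the map rather than drawing it into the grid; Claim 13 leaves the exact storage format open.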
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102013212605.0A DE102013212605A1 (en) | 2013-06-28 | 2013-06-28 | Method for a work area detection of at least one work area of an autonomous service robot |
GB1411312.0A GB2517572B (en) | 2013-06-28 | 2014-06-25 | Method for a working region acquisition of at least one working region of an autonomous service robot |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201813548D0 GB201813548D0 (en) | 2018-10-03 |
GB2563347A true GB2563347A (en) | 2018-12-12 |
Family
ID=51410115
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1411312.0A Active GB2517572B (en) | 2013-06-28 | 2014-06-25 | Method for a working region acquisition of at least one working region of an autonomous service robot |
GB1813548.3A Withdrawn GB2563347A (en) | 2013-06-28 | 2014-06-25 | Method for a working region acquisition of at least one working region of an autonomous service robot |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1411312.0A Active GB2517572B (en) | 2013-06-28 | 2014-06-25 | Method for a working region acquisition of at least one working region of an autonomous service robot |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN104252176B (en) |
DE (1) | DE102013212605A1 (en) |
GB (2) | GB2517572B (en) |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102014226084A1 (en) | 2014-12-16 | 2016-06-16 | Robert Bosch Gmbh | Method for mapping a working surface for autonomous robotic vehicles |
SE1451645A1 (en) | 2014-12-23 | 2016-05-31 | Husqvarna Ab | Improved navigation for a robotic lawnmower |
JP5973608B1 (en) * | 2015-03-27 | 2016-08-23 | 本田技研工業株式会社 | Control equipment for unmanned work vehicles |
DE102015221658A1 (en) | 2015-11-04 | 2017-05-04 | Robert Bosch Gmbh | Garden sensor device |
DE102015222414A1 (en) * | 2015-11-13 | 2017-05-18 | Robert Bosch Gmbh | Autonomous working device |
CN106814733A (en) * | 2015-12-01 | 2017-06-09 | 江苏长虹智能装备集团有限公司 | A kind of automatical pilot transportation vehicle Flexibility Control Technique |
EP3508048B1 (en) * | 2016-08-31 | 2023-10-18 | Positec Power Tools (Suzhou) Co., Ltd | Autonomous lawn-mower and method of recognizing an obstacle by an autonomous lawn mower |
CN108073164B (en) * | 2016-11-11 | 2019-11-26 | 苏州宝时得电动工具有限公司 | Automatic mower and its traveling method |
CN110888437B (en) | 2016-11-11 | 2023-02-21 | 苏州宝时得电动工具有限公司 | Automatic working system and control method thereof |
DE102017203055A1 (en) * | 2017-02-24 | 2018-08-30 | Robert Bosch Gmbh | Method for detecting at least one working area of an autonomous implement |
EP3412128B1 (en) | 2017-06-09 | 2021-05-12 | Andreas Stihl AG & Co. KG | Green areas processing system and method for the detection of at least one section of a limiting edge of a surface to be processed |
EP3413155B1 (en) | 2017-06-09 | 2020-02-26 | Andreas Stihl AG & Co. KG | Method for the detection of at least one section of a limiting edge of a surface to be processed, method for operating an autonomous mobile green area processing robot, detection system and green area processing system |
CN107976194B (en) * | 2017-11-24 | 2021-12-21 | 北京奇虎科技有限公司 | Environment map adjusting method and device |
JP6877330B2 (en) | 2017-12-27 | 2021-05-26 | 株式会社クボタ | Work area determination system for autonomous travel type work equipment, autonomous travel type work equipment, and work area determination program |
CN108337987A (en) * | 2018-02-13 | 2018-07-31 | 杭州慧慧科技有限公司 | A kind of automatic mowing system and grass trimmer control method |
CN108919814A (en) * | 2018-08-15 | 2018-11-30 | 杭州慧慧科技有限公司 | Grass trimmer working region generation method, apparatus and system |
EP3560312B1 (en) * | 2018-04-06 | 2021-10-20 | LG Electronics Inc. | Lawn mower robot |
EP3549424B1 (en) | 2018-04-06 | 2022-01-05 | Lg Electronics Inc. | Lawn mower robot |
EP3549426B1 (en) | 2018-04-06 | 2021-09-08 | LG Electronics Inc. | Lawn mower robot |
US11166409B2 (en) | 2018-04-06 | 2021-11-09 | Lg Electronics Inc. | Lawn mower robot |
EP3549425B1 (en) | 2018-04-06 | 2021-08-04 | LG Electronics Inc. | Lawn mower robot |
EP3549423B1 (en) | 2018-04-06 | 2021-06-16 | Lg Electronics Inc. | Lawn mower robot |
US11140820B2 (en) | 2018-04-06 | 2021-10-12 | Lg Electronics Inc. | Lawn mower robot |
EP3549427B1 (en) | 2018-04-06 | 2021-09-01 | LG Electronics Inc. | Lawn mower robot |
AU2019275450B2 (en) | 2018-05-25 | 2023-12-14 | The Toro Company | Autonomous grounds maintenance machines with path planning for trap and obstacle avoidance |
EP3686704B1 (en) | 2019-01-22 | 2023-08-09 | Honda Research Institute Europe GmbH | Method for generating a representation and system for teaching an autonomous device operating based on such representation |
WO2020226085A1 (en) * | 2019-05-09 | 2020-11-12 | ソニー株式会社 | Information processing device, information processing method, and program |
US11457558B1 (en) | 2019-05-15 | 2022-10-04 | Hydro-Gear Limited Partnership | Autonomous vehicle navigation |
CN110466913A (en) * | 2019-07-29 | 2019-11-19 | 东莞弓叶互联科技有限公司 | Rubbish collection methods and device |
CN110498166A (en) * | 2019-08-07 | 2019-11-26 | 东莞弓叶互联科技有限公司 | Rubbish recovering method, device, computer equipment and storage medium |
CN110641872A (en) * | 2019-09-06 | 2020-01-03 | 东莞弓叶互联科技有限公司 | Garbage can transportation method and system |
SE544524C2 (en) * | 2019-12-06 | 2022-06-28 | Husqvarna Ab | Robotic work tool system and method for defining a working area perimeter |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100483548B1 (en) * | 2002-07-26 | 2005-04-15 | 삼성광주전자 주식회사 | Robot cleaner and system and method of controlling thereof |
US8855819B2 (en) * | 2008-10-09 | 2014-10-07 | Samsung Electronics Co., Ltd. | Method and apparatus for simultaneous localization and mapping of robot |
WO2011052826A1 (en) * | 2009-10-30 | 2011-05-05 | 주식회사 유진로봇 | Map generating and updating method for mobile robot position recognition |
DE102009052629A1 (en) * | 2009-11-10 | 2011-05-12 | Vorwerk & Co. Interholding Gmbh | Method for controlling a robot |
US8855930B2 (en) * | 2010-04-09 | 2014-10-07 | Tomtom International B.V. | Method of generating a route |
KR20110119118A (en) * | 2010-04-26 | 2011-11-02 | 엘지전자 주식회사 | Robot cleaner, and remote monitoring system using the same |
DE102010017689A1 (en) * | 2010-07-01 | 2012-01-05 | Vorwerk & Co. Interholding Gmbh | Automatically movable device and method for orientation of such a device |
KR101842460B1 (en) * | 2011-04-12 | 2018-03-27 | 엘지전자 주식회사 | Robot cleaner, and remote monitoring system and method of the same |
KR101566207B1 (en) * | 2011-06-28 | 2015-11-13 | 삼성전자 주식회사 | Robot cleaner and control method thereof |
TW201305761A (en) * | 2011-07-21 | 2013-02-01 | Ememe Robot Co Ltd | An autonomous robot and a positioning method thereof |
US9594380B2 (en) * | 2012-03-06 | 2017-03-14 | Travis Dorschel | Path recording and navigation |
CN102866706B (en) * | 2012-09-13 | 2015-03-25 | 深圳市银星智能科技股份有限公司 | Cleaning robot adopting smart phone navigation and navigation cleaning method thereof |
CN102929280B (en) * | 2012-11-13 | 2015-07-01 | 朱绍明 | Mobile robot separating visual positioning and navigation method and positioning and navigation system thereof |
DE102012221572A1 (en) * | 2012-11-26 | 2014-05-28 | Robert Bosch Gmbh | Autonomous locomotion device |
2013
- 2013-06-28 DE DE102013212605.0A patent/DE102013212605A1/en active Pending

2014
- 2014-06-25 GB GB1411312.0A patent/GB2517572B/en active Active
- 2014-06-25 GB GB1813548.3A patent/GB2563347A/en not_active Withdrawn
- 2014-06-27 CN CN201410301793.0A patent/CN104252176B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060213167A1 (en) * | 2003-12-12 | 2006-09-28 | Harvey Koselka | Agricultural robot system and method |
WO2014027945A1 (en) * | 2012-08-14 | 2014-02-20 | Husqvarna Ab | Mower with object detection system |
WO2015022672A2 (en) * | 2013-08-16 | 2015-02-19 | Husqvarna Ab | Intelligent grounds management system integrating robotic rover |
Also Published As
Publication number | Publication date |
---|---|
GB201411312D0 (en) | 2014-08-06 |
GB2517572B (en) | 2019-02-13 |
CN104252176B (en) | 2020-12-22 |
GB2517572A (en) | 2015-02-25 |
CN104252176A (en) | 2014-12-31 |
GB201813548D0 (en) | 2018-10-03 |
DE102013212605A1 (en) | 2014-12-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2563347A (en) | Method for a working region acquisition of at least one working region of an autonomous service robot | |
CN112584697B (en) | Autonomous machine navigation and training using vision system | |
JP6946524B2 (en) | A system for performing simultaneous position measurement mapping using a mechanical visual system | |
US11402850B2 (en) | Robotic cleaning device with operating speed variation based on environment | |
JP5550671B2 (en) | Autonomous traveling robot and traveling control method for autonomous traveling robot | |
CN105310604B (en) | The method of robot cleaner system and control machine people's cleaner | |
KR101573027B1 (en) | Intelligent unmaned robot for weeding | |
US20240086862A1 (en) | System, devices and methods for tele-operated robotics | |
CN113128747B (en) | Intelligent mowing system and autonomous image building method thereof | |
US20140324272A1 (en) | Operating system for and method of operating an automatic guidance system of an agricultural vehicle | |
EP3603371A2 (en) | Lawn mower robot, system of lawn mower robot and control method of lawn mower robot system | |
WO2017192666A1 (en) | Autonomous aerial vehicle | |
KR102272161B1 (en) | Lawn mover robot system and controlling method for the same | |
JP2022504615A (en) | Route planning | |
KR20210071383A (en) | Mapping method of Lawn Mower Robot. | |
JP7170065B2 (en) | Support device and support method | |
KR20200075140A (en) | Artificial intelligence lawn mover robot and controlling method for the same | |
CN113115621A (en) | Intelligent mowing system and autonomous mapping method thereof | |
CN114721385A (en) | Virtual boundary establishing method and device, intelligent terminal and computer storage medium | |
TW201825869A (en) | Method for the navigation and self-location of an autonomously moving processing device | |
US11797025B2 (en) | Autonomous work system, autonomous work setting method, and storage medium | |
EP4386504A1 (en) | Using adjustable vision component for on-demand vision data capture of areas along a predicted trajectory of a robot | |
EP4250041A1 (en) | Method for determining information, remote terminal, and mower | |
CN114766014A (en) | Autonomous machine navigation in various lighting environments | |
US20170325400A1 (en) | Method for navigation and joint coordination of automated devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) | |