CN104252176B - Method for work area detection of at least one work area of an autonomous service robot


Info

Publication number
CN104252176B
CN104252176B (application CN201410301793.0A)
Authority
CN
China
Prior art keywords
service robot
autonomous service
unit
map
work area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410301793.0A
Other languages
Chinese (zh)
Other versions
CN104252176A (en)
Inventor
S·格容卡
S·豪格
Current Assignee
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of CN104252176A publication Critical patent/CN104252176A/en
Application granted granted Critical
Publication of CN104252176B publication Critical patent/CN104252176B/en

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D - HARVESTING; MOWING
    • A01D34/00 - Mowers; Mowing apparatus of harvesters
    • A01D34/006 - Control or measuring arrangements
    • A01D34/008 - Control or measuring arrangements for automated or remotely controlled operation
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 - Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 - Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device

Abstract

The invention relates to a method for work area detection of at least one work area of an autonomous service robot (10a; 10b; 10c), in particular of an autonomous lawn mower, by means of at least one acquisition unit (12a; 12b; 12c) and a generation unit (14a; 14b; 14c). In a first method step (16a; 16a'; 16b; 16b'; 16c; 16c'), at least one visual media file of the at least one work area is captured by the at least one acquisition unit (12a; 12b; 12c) before the work operation of the autonomous service robot (10a; 10b; 10c), and in a second method step (18a; 18b; 18c) a map of the at least one work area is generated by the generation unit (14a; 14b; 14c) on the basis of the at least one visual media file.

Description

Method for work area detection of at least one work area of an autonomous service robot
Background
Methods for work area detection of at least one work area of an autonomous service robot have been proposed.
Disclosure of Invention
The invention relates to a method for work area detection of at least one work area of an autonomous service robot, in particular an autonomous lawn mower, by means of at least one acquisition unit and a generation unit, wherein in a first method step at least one visual media file of the at least one work area is captured by the at least one acquisition unit before the work operation of the autonomous service robot, and in a second method step a map of the at least one work area is generated by the generation unit on the basis of the at least one visual media file. "Work area detection" is to be understood in this context to mean, in particular, a process in which a work area, in particular of an autonomous service robot, is at least partially identified. Preferably, "work area detection" is to be understood in particular to mean a process in which a work area is detected, in particular virtually. Particularly preferably, "work area detection" is to be understood to mean, in particular, a process in which a work area is detected at least two-dimensionally and, in particular, the limits of the work area are determined. Furthermore, a "work area" is to be understood in this context to mean, in particular, an area which defines a surface which can be processed by the autonomous service robot. Furthermore, an "autonomous service robot" is to be understood in this context to mean, in particular, an apparatus which performs work at least partially automatically, for example, in particular, automatically starts and ends work and/or automatically selects at least one parameter, such as, in particular, a trajectory parameter and/or a turnaround point. Preferably, an "autonomous service robot" is to be understood as an apparatus which is at least partially provided for performing at least one service, typically for an operator and/or a person.
Preferably, an "autonomous service robot" is to be understood in particular to mean an apparatus which moves automatically at least in order to carry out a job or which advances autonomously in a predetermined work area. Particularly preferably, the apparatus is provided for working the work area, and in particular its surface, for example by sweeping, vacuuming or cleaning it and/or by trimming a lawn on the surface. Autonomous service robots considered useful by the person skilled in the art can be considered here, such as, in particular, autonomous floor sweepers, autonomous vacuum cleaners, autonomous pool cleaners, autonomous floor-wiping robots and/or, particularly preferably, autonomous lawn mowers. In principle, however, other autonomous service robots considered useful by the person skilled in the art are also conceivable. An "acquisition unit" is to be understood in this context to mean, in particular, a unit which is provided for visual detection. Preferably, an "acquisition unit" is to be understood in particular to mean a unit having at least one sensor, for example, in particular, at least one image sensor, which is provided for acquiring at least one piece of visual information. Particularly preferably, an "acquisition unit" is to be understood to mean, in particular, a unit which is provided for acquiring individual images and/or image sequences and/or films. Furthermore, a "generation unit" is to be understood in this context to mean, in particular, a unit which is provided for generating a map. A "generation unit" is preferably to be understood in particular as a unit which is provided for generating a map on the basis of at least one visual media file. Preferably, the generation unit has at least one arithmetic unit. Particularly preferably, the generation unit is configured separately from the autonomous service robot. Alternatively or additionally, the generation unit can also form part of the autonomous service robot.
An "arithmetic unit" is to be understood here to mean, in particular, a unit having an information input, an information processing capability and/or an information output. Advantageously, the arithmetic unit has at least one processor, a memory, input and output means, further electrical components, an operating program, regulating routines, control routines and/or computing routines. Preferably, the components of the arithmetic unit are arranged on a common circuit board and/or, advantageously, in a common housing. "Work operation of the autonomous service robot" is to be understood in this context in particular as an operating state of the autonomous service robot in which the autonomous service robot performs productive work. Preferably, "work operation of the autonomous service robot" is to be understood as an operating state in which the autonomous service robot carries out its core work. In this context, "core work" is to be understood to mean, in particular, the work that constitutes the service which can be performed by the autonomous service robot. A "visual media file" is to be understood in this context in particular as a file which comprises at least visual information, in particular in digital form. Preferably, a "visual media file" is to be understood in particular as a file which comprises, in particular, at least image information and/or video information. Particularly preferably, a "visual media file" is to be understood in particular as an image file and/or a video file. In principle, however, other visual media files considered useful by the person skilled in the art are also conceivable. Furthermore, a "map" is to be understood in this context in particular as a preferably digital representation of a region and/or of the surroundings. Preferably, a "map" is to be understood in particular as a topographical map, which in particular includes heights, dimensions, ground characteristics and/or possible obstacles.
"Provided" in this context should be understood to mean, in particular, specially programmed, designed and/or equipped. By "an object being provided for a determined function" it should be understood, in particular, that the object fulfills and/or implements the determined function in at least one application state and/or operating state.
The method according to the invention advantageously allows a simple detection of the work area. In particular, the work area can be detected even before the work operation of the autonomous service robot. Furthermore, the work area detection can be carried out on the basis of visual media files, whereby, in particular, the laying or setting up of physical boundaries can be avoided. This makes it possible, in particular, to keep the effort low. In addition, the work area detection can thereby simply be carried out again, for example after a change of the work area.
Furthermore, it is proposed that in the first method step, before the first start-up of the autonomous service robot, at least one visual media file of the at least one work area is captured by the acquisition unit. "First start-up" is to be understood in this context to mean, in particular, the first start-up of the autonomous service robot after purchase and/or after a reset and/or after a change of the work area. This enables particularly advantageous detection of the work area. Furthermore, the work area detection can thereby already be carried out before the first start-up.
Furthermore, it is proposed that in the first method step, before the work operation of the autonomous service robot, at least one visual media file of the at least one work area is captured by at least one acquisition unit that is independent of the autonomous service robot. "Independent" in this context should be understood in particular to mean that the acquisition unit is formed separately from the autonomous service robot. Preferably, "independent" is to be understood to mean that the acquisition unit operates independently of the autonomous service robot. Particularly preferably, "independent" should be understood in particular in such a way that the acquisition unit is formed by a separate, self-contained instrument. At least one visual media file of the at least one work area can thereby be acquired independently of the autonomous service robot. In particular, the visual media file can already be captured before the first start-up.
It is furthermore proposed that the method has a further method step in which the boundary of the at least one work area is determined in the map of the at least one work area by the generation unit. A "boundary" is to be understood in this context to mean, in particular, at least one physical and/or non-physical line which delimits a work area. Preferably, a "boundary" is to be understood in particular as at least one line which completely delimits the work area. Particularly preferably, the boundary is formed by one or more closed lines. The boundary of the at least one work area can thus advantageously be determined automatically. Preferably, in particular, an already present, in particular logical, limit can thus be determined as a boundary. In particular, obstacles, steep slopes in the area and/or other features considered significant by the person skilled in the art can be detected by the generation unit, identified as logical boundaries and subsequently determined as such.
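The automatic determination of logical boundaries described above can be illustrated with a minimal sketch. The grid representation, the slope threshold and all names are illustrative assumptions, not taken from the patent: a cell becomes part of the boundary if it holds an obstacle or if the terrain toward a neighbouring cell is too steep.

```python
# Hypothetical sketch of automatic boundary determination: cells whose
# slope exceeds a limit, or which contain an obstacle, are marked as
# logical boundaries of the work area. Thresholds are illustrative.

def determine_boundary(height, obstacles, max_slope=0.3, cell_size=1.0):
    """Mark grid cells as boundary if too steep or obstructed.

    height    -- 2D list of terrain heights per cell
    obstacles -- set of (row, col) cells containing obstacles
    Returns a set of (row, col) boundary cells.
    """
    rows, cols = len(height), len(height[0])
    boundary = set()
    for r in range(rows):
        for c in range(cols):
            if (r, c) in obstacles:
                boundary.add((r, c))
                continue
            # a slope to any 4-neighbour above the limit -> logical boundary
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    slope = abs(height[nr][nc] - height[r][c]) / cell_size
                    if slope > max_slope:
                        boundary.add((r, c))
                        break
    return boundary
```

A real generation unit would of course work on the reconstructed terrain model rather than a toy grid, but the decision rule — obstacle or steepness implies boundary — is the same.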
It is furthermore proposed that the map and/or the boundary of the at least one working area is released by the generation unit for processing. Preferably, said map and/or said borders of said at least one work area are released by said generating unit for processing by an operator. Particularly preferably, the map and/or the boundary of the at least one working area is output by the generation unit to an operator for processing. In particular, it is preferably achieved that the map and/or the boundary of the at least one working area is processed by the operator directly on the generation unit and/or on a unit connected to the generation unit. This enables, in particular, errors in the map and/or particular points in the boundary to be modified or adjusted manually by the operator.
It is furthermore proposed that the method has a further method step in which the at least one media file of the at least one work area and/or the map of the at least one work area and/or the boundary of the at least one work area is transferred to the autonomous service robot via an interface of the autonomous service robot. Preferably, in the further method step, the at least one media file of the at least one work area and/or the map of the at least one work area and/or the boundary of the at least one work area is transferred to the autonomous service robot via an interface of the autonomous service robot at a first start-up of the autonomous service robot and/or after a change of the work area. An "interface" is to be understood in this context to mean, in particular, a unit which is provided for exchanging data. In particular, the interface has at least one information input and at least one information output. Preferably, the interface has at least two information inputs and at least two information outputs, wherein at least one information input and at least one information output are each provided for connection to a physical system. Particularly preferably, an "interface" is to be understood as an interface between at least two physical systems, for example, in particular, between the autonomous service robot and at least one external instrument. Different interfaces considered useful by the person skilled in the art can be considered. In particular, an "interface" can be a wireless interface, for example Bluetooth, WLAN (wireless local area network), UMTS (universal mobile telecommunications system) or NFC (near field communication), a wired interface, for example a USB port, and/or a drive interface using a storage medium, for example a memory card, a memory stick or a CD. In particular, this enables an advantageous transfer of the at least one media file and/or of the map and/or of the boundary.
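The transfer of an externally generated map over one of the interface types named above (for example a storage medium) can be sketched as a simple export/import pair. The JSON map format and the function names are assumptions made for illustration; the patent does not specify a data format:

```python
# Illustrative sketch of handing an externally generated map to the robot
# over a file-based interface (e.g. a memory stick). The map format is an
# assumption, not from the patent.
import json

def export_map(path, boundary_cells, meta):
    """Write map data as JSON so the robot can import it from a storage medium."""
    data = {"meta": meta, "boundary": sorted(boundary_cells)}
    with open(path, "w") as fh:
        json.dump(data, fh)

def import_map(path):
    """Read the map back on the robot side and restore the boundary set."""
    with open(path) as fh:
        data = json.load(fh)
    # JSON stores the cell tuples as lists; convert them back
    data["boundary"] = {tuple(cell) for cell in data["boundary"]}
    return data
```

A wireless interface (WLAN, Bluetooth) would replace the file with a socket or characteristic write, but the serialization step is analogous.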
It is furthermore proposed that the method has a further method step in which the autonomous service robot is positioned in a work area according to the map of the at least one work area. Preferably, the autonomous service robot has at least one sensor unit by means of which the positioning in the work area is carried out. A "sensor unit" is to be understood in this context to mean, in particular, a unit which is provided for detecting at least one characteristic variable and/or physical property, wherein the detection can take place actively, for example, in particular, by generating and emitting an electrical measurement signal, and/or passively, for example, in particular, by detecting a change in the properties of a sensor component. Different sensor units considered useful by the person skilled in the art can be considered, such as optical sensors, GPS receivers and/or acoustic sensors, such as, in particular, ultrasonic sensors. The autonomous service robot can thereby advantageously navigate according to the map. Furthermore, this enables the autonomous service robot to interrupt and resume its work at any point in the work area. In addition, operation can thereby be carried out in particular independently of the position of the base station of the autonomous service robot.
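As a toy illustration of positioning "according to the map", the following sketch slides a locally sensed occupancy patch over the stored map and picks the best-matching position. Real systems would use probabilistic methods such as particle filters or scan matching; this brute-force correlation, and all names in it, are assumptions for illustration only.

```python
# Toy localization sketch: find where the robot's local sensor patch best
# matches the stored occupancy map. Illustrative only.

def localize(global_map, local_patch):
    """Return (row, col) of the top-left map cell best matching the patch."""
    gr, gc = len(global_map), len(global_map[0])
    pr, pc = len(local_patch), len(local_patch[0])
    best, best_pos = -1, None
    for r in range(gr - pr + 1):
        for c in range(gc - pc + 1):
            # count agreeing cells between map window and sensed patch
            score = sum(
                global_map[r + i][c + j] == local_patch[i][j]
                for i in range(pr) for j in range(pc)
            )
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos
```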
It is furthermore proposed that the method has a further method step in which the map of the at least one work area is automatically adjusted by the autonomous service robot when a change in the at least one work area is detected. The map can thereby advantageously be adapted continuously to the work area. Furthermore, changes produced, for example, by an operator can be detected and taken into account, in particular automatically. This makes it possible, in particular, to avoid a continuous, complete re-mapping of the work area.
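One simple way to realize such an automatic adjustment is to update a map cell only after repeated disagreeing observations, so that a single spurious reading does not change the map. The counter-based scheme below is an illustrative assumption, not taken from the patent:

```python
# Sketch of automatic map adjustment: when the sensed value for a cell
# disagrees with the stored map often enough, the cell is updated in place
# instead of rebuilding the whole map. Illustrative assumption.

def adjust_map(grid, observations, change_threshold=3):
    """Update grid cells whose observed value differed repeatedly.

    observations -- dict mapping (row, col) -> list of recent sensed values
    Returns the list of cells that were changed.
    """
    changed = []
    for (r, c), values in observations.items():
        disagreements = sum(1 for v in values if v != grid[r][c])
        if disagreements >= change_threshold:
            # adopt the sensed value once the change is confirmed
            grid[r][c] = values[-1]
            changed.append((r, c))
    return changed
```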
It is furthermore proposed that the method has a further method step in which, when a change in the at least one work area is detected, a signal is automatically sent by the autonomous service robot to the operator prompting a renewed mapping of the at least one work area. In this way, changes to the work area can be taken into account particularly simply and cost-effectively. In particular, it is thereby possible to keep the map constantly up to date and to adapt it to changes in the work area.
Furthermore, an autonomous service robot, in particular an autonomous lawn mower, is proposed for carrying out the method according to the invention.
It is proposed that the autonomous service robot has at least one control and/or regulation unit with at least one memory unit and at least one interface, which is provided for the transfer of at least one externally generated map of at least one work area to the memory unit. A "control and/or regulating unit" is to be understood in particular to mean a unit having at least one control electronics. The term "control electronics" is to be understood to mean, in particular, a unit having a processor unit and a memory unit, as well as a unit having an operating program stored in the memory unit. Furthermore, a "memory unit" is to be understood in this context to mean, in particular, a unit which is provided to advantageously store at least one piece of information independently of a supply of power. Thereby, the externally generated map of the work area can advantageously be stored on the autonomous service robot and recalled as required.
It is furthermore proposed that the at least one control and/or regulating unit has at least one control system which is provided for processing the at least one map of the at least one working area. A "control system" is to be understood in this context in particular as a system which forms at least part of an operating system and/or software. Preferably, the control system is arranged for implementation on at least one processor. Particularly preferably, the control system has at least one regulating routine, control routine and/or calculation routine. Thereby, the processing of the at least one map can advantageously be performed directly on the autonomous service robot.
It is also proposed that the autonomous service robot has at least one sensor unit which is provided to detect an unexpected obstacle. An "unexpected obstacle" is to be understood in this context to mean, in particular, an obstacle which is not stored on the at least one map and/or which cannot be foreseen by the autonomous service robot. Dangerous collisions, in particular collisions in general, can thereby advantageously be avoided. Furthermore, obstacles that are not mapped can thereby be detected in the work area.
It is furthermore proposed that the sensor unit is provided for outputting a signal to the control and/or regulating unit when an unexpected obstacle is detected, wherein the control and/or regulating unit is provided for storing the position of the unexpected obstacle on the at least one map of the at least one work area. This enables unexpected obstacles, including new ones, to be stored permanently. In particular, a renewed construction of the map can thereby be avoided. Furthermore, the map can thereby be adapted dynamically to changes in the surroundings.
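The obstacle-reporting chain described above — sensor unit signals, control unit records the position on the map — can be sketched minimally as follows. The class and method names are illustrative assumptions:

```python
# Minimal sketch of the obstacle-reporting chain: the sensor unit signals
# an unexpected obstacle, and the control unit records its position on the
# map so it is known on future passes. Names are illustrative.

class ControlUnit:
    def __init__(self, obstacle_map=None):
        # positions of obstacles already known from the map
        self.obstacle_map = set(obstacle_map or [])

    def on_obstacle_signal(self, position):
        """Store an unexpected obstacle; return True if it was new."""
        if position in self.obstacle_map:
            return False  # already mapped, nothing to do
        self.obstacle_map.add(position)
        return True
```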
The method according to the invention and the autonomous service robot according to the invention are not limited to the applications and embodiments described above. In particular, the method according to the invention and the autonomous service robot according to the invention can have a number of individual elements, components and units differing from the number mentioned here in order to achieve the mode of action described here.
Drawings
Further advantages result from the following description of the figures, in which three embodiments of the invention are shown. The figures, the description and the claims contain numerous features in combination. The person skilled in the art can also consider these features individually and combine them into meaningful further combinations in accordance with the intended purpose.
In the drawings:
FIG. 1: a flow diagram of the method according to the invention by means of the acquisition unit, the generation unit and the autonomous service robot;
FIG. 2: a schematic view of an autonomous service robot for performing the method according to the invention;
FIG. 3: a flow diagram of an alternative method according to the invention by means of a combined acquisition and generation instrument and an autonomous service robot;
FIG. 4: a flow diagram of a further alternative method according to the invention by means of an acquisition unit and an autonomous service robot;
FIG. 5: a schematic illustration of an autonomous service robot for performing the further alternative method according to the invention.
Detailed Description
Fig. 1 shows a schematic flow diagram of the method according to the invention by means of an acquisition unit 12a, a generation unit 14a and an autonomous service robot 10a.
The autonomous service robot 10a is formed by an autonomous lawn mower. The autonomous service robot 10a is configured to mow a work area. The work area is formed by a grass surface or a garden lawn. In principle, however, it is also conceivable for the autonomous service robot 10a to be formed by a vacuum cleaner or another service robot considered useful by a person skilled in the art, and for the work area to be formed, for example, by a room. Furthermore, the autonomous service robot 10a can also be programmed for multiple work areas. The different work areas can be stored separately and called up separately depending on the current position of the autonomous service robot 10a. The autonomous service robot 10a has a control and regulation unit 32a. The control and regulation unit 32a has a memory unit 34a and an interface 24a. Furthermore, the control and regulation unit 32a is arranged in the housing 40a of the autonomous service robot 10a. The interface 24a of the control and regulation unit 32a is provided for transferring an externally generated map of the work area to the memory unit 34a. The interface 24a is formed by a wireless interface, namely a wireless local area network interface (WLAN interface). In principle, however, other wireless interfaces considered expedient by the person skilled in the art are also conceivable. In addition or alternatively, it is also conceivable for the interface 24a to be formed by a wired interface or by a drive interface, or to have a further, additional interface. The control and regulation unit 32a furthermore has a control system 36a. The control system 36a is arranged to process the map of the work area. For this purpose, the control system 36a is provided for reading out and interpreting the map, or the data of the map, stored on the memory unit 34a. The control system 36a is implemented on the processor of the control and regulation unit 32a.
Furthermore, the control and regulation unit 32a has an output unit 42a, via which information can be transmitted to an operator 48a. The output unit 42a is constituted by a display (FIG. 2).
Furthermore, the autonomous service robot 10a has a sensor unit 38a. The sensor unit 38a is provided for detecting an unexpected obstacle. The sensor unit 38a is provided to emit a signal to the control and regulation unit 32a when an unexpected obstacle is detected, and the control and regulation unit 32a is provided to store the position of the unexpected obstacle on the map of the work area. The sensor unit 38a is disposed substantially inside the housing 40a of the autonomous service robot 10a. The sensor unit 38a is arranged in the front region of the autonomous service robot 10a, viewed in the direction of travel 44a of the autonomous service robot 10a. Furthermore, the sensor unit 38a is aligned toward the front, approximately parallel to the direction of travel 44a. Further, the sensor unit 38a is constituted by an optical sensor, namely a camera. In principle, however, other configurations of the sensor unit 38a considered expedient by those skilled in the art are also conceivable. In principle, it is also conceivable for the autonomous service robot 10a to have at least one further sensor unit (FIG. 2).
The method according to the invention provides for work area detection of the work area of the autonomous service robot 10a. The method is carried out by means of the acquisition unit 12a, the generation unit 14a and the autonomous service robot 10a. The acquisition unit 12a is constituted by a camera. In principle, however, it is also conceivable for the acquisition unit 12a to be formed by other instruments, or parts of instruments, considered useful by the person skilled in the art, for example, in particular, by a mobile telephone, a smartphone or a camera, or for the acquisition unit 12a to be part of another instrument. The acquisition unit 12a is configured to be independent of the autonomous service robot 10a. The acquisition unit 12a is completely separate from the autonomous service robot 10a and can be formed by a commercially available acquisition unit, for example, in particular, a commercially available camera. The generation unit 14a is constituted by a computer. In principle, however, it is also conceivable for the generation unit 14a to be formed by other devices, or parts of devices, considered appropriate by the person skilled in the art. In the method, a plurality of visual media files of the work area are captured by the acquisition unit 12a in a first method step 16a, 16a'. The first method step 16a, 16a' of the method is carried out before the work operation of the autonomous service robot 10a. The first method step 16a, 16a' of the method is carried out before the first start-up, i.e., before the first travel, of the autonomous service robot 10a. Each visual media file is an image file or a video file. In a first variant 16a of the first method step, image files of the work area are acquired. In a second variant 16a' of the first method step, a video file of the work area is captured.
Subsequently, the visual media files are transferred from the acquisition unit 12a to the generation unit 14a (FIG. 1).
In a second method step 18a, a map of the work area is generated by the generation unit 14a on the basis of the visual media files. The generation takes place by means of software using the visual media files. The generation is performed by means of image processing algorithms, namely bundle adjustment, structure from motion (SfM) and visual simultaneous localization and mapping (visual SLAM). In principle, however, other image processing algorithms considered expedient by a person skilled in the art, or only some of the image processing algorithms mentioned above, can also be considered. Alternatively or in addition to the generation unit 14a, the generation of the map can also be carried out, at least partially, on the server 46a via an internet connection and/or by means of an internet application. In principle, it is also conceivable for the generation unit 14a to be formed by the server 46a, or a part of the server 46a, and for the acquisition unit 12a to transfer the media files directly to the server 46a. Subsequently, in a further method step 20a, the boundary of the work area is determined in the map of the work area by the generation unit 14a. The boundary is determined on the basis of logical boundaries. Obstacles and steep drops within the area serve as logical boundaries. As soon as the terrain becomes too steep for the autonomous service robot 10a, a boundary is automatically drawn by the generation unit 14a. In addition, boundaries are automatically drawn around obstacles (e.g., stairs, trees, hedges, etc.) (FIG. 1).
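The named pipeline (structure from motion, bundle adjustment, visual SLAM) requires full reconstruction libraries and cannot be reproduced in a few lines. As an illustrative stand-in, the toy "generation unit" below fuses several per-image ground/obstacle masks — assumed to be already projected into a common grid — into one map by per-cell majority vote. Everything here is an assumption for illustration, not the patent's algorithm:

```python
# Toy stand-in for map generation: fuse binary masks (1 = obstacle,
# 0 = ground) from several images of the work area by majority vote.
# Assumes the masks are already registered to a common grid.

def generate_map(masks):
    """Fuse equally sized binary masks into one map by per-cell majority."""
    rows, cols = len(masks[0]), len(masks[0][0])
    fused = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            votes = sum(m[r][c] for m in masks)
            # strict majority of the masks must agree on "obstacle"
            fused[r][c] = 1 if votes * 2 > len(masks) else 0
    return fused
```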
After the boundary of the work area has been determined, the map of the work area with the boundary of the work area is released by the generation unit 14a for editing by the operator 48a. The released map with the boundary can be edited by the operator 48a in a further method step 50a. The editing takes place directly on the generation unit 14a. The map with the boundary is output to the operator 48a via the display screen and can be changed at will. In principle, however, it is also conceivable to differentiate between different authorization levels of the operator during editing. In particular, it is conceivable that only authorized service personnel may edit the map itself, while the end user is only allowed to edit the boundary (FIG. 1).
After the map has been edited by the operator 48a, the map of the work area is additionally segmented by the generation unit 14a in a further method step 52a. The work area is divided into different partial work areas. In this case, the generation unit 14a assigns different priorities and different repetition frequencies to the partial work areas. In principle, the determination of the priorities and of the repetition frequencies can, in particular, also be carried out by the operator 48a. Furthermore, in method step 52a, a preferred trajectory of the autonomous service robot 10a is determined by the generation unit 14a. The preferred trajectory predetermines the movement path traveled by the autonomous service robot 10a. In principle, however, it is also conceivable for the generation unit 14a to generate only virtual auxiliary lines in the map of the autonomous service robot 10a. The virtual auxiliary lines can support the autonomous service robot 10a in particularly efficient processing. Furthermore, the virtual auxiliary lines can be used to take into account operator wishes that cannot be entered directly at the autonomous service robot 10a (FIG. 1).
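The assignment of priorities and repetition frequencies to partial work areas suggests a simple scheduling rule: among all areas that are due again, pick the one with the highest priority. The rule and the data layout below are illustrative assumptions, not taken from the patent:

```python
# Sketch of scheduling the partial work areas from the segmentation step.
# Each area carries a priority and a repetition interval; the robot picks
# whichever due area has the highest priority. Illustrative assumption.

def next_area(areas, now):
    """areas: list of dicts with 'name', 'priority' (higher = more urgent),
    'interval' (days between visits) and 'last_done' (day number).
    Returns the name of the due area with highest priority, or None."""
    due = [a for a in areas if now - a["last_done"] >= a["interval"]]
    if not due:
        return None
    return max(due, key=lambda a: a["priority"])["name"]
```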
Subsequently, in a further method step 22a, the map of the work area with the boundary of the work area is transferred to the autonomous service robot 10a via the interface 24a of the autonomous service robot 10a. Via the interface 24a, the map with the boundary is saved directly on the memory unit 34a of the autonomous service robot 10a. In addition or as an alternative, it is also conceivable in principle for the map with the boundary to be transmitted via the interface 24a to the server 46a or another server and stored there. The map can thus in particular also be used by other autonomous service robots 10a, and the storage capacity of the memory unit 34a can be kept small. The interface 24a of the autonomous service robot 10a can be connected to the internet via a wireless local area network (WLAN) of the garden building. It is also conceivable that, via a permanent internet connection, the autonomous service robot 10a continuously obtains new information from the server 46a, for example weather information, in order to work particularly efficiently. The server 46a may also continuously modify and refine the map, the boundary or the preferred trajectory. Additionally, the autonomous service robot 10a can obtain information from other devices via the internet, for example from ambient sensors in the garden or the surroundings, from a rain sensor, or from a person-identification system. After the map has been stored on the memory unit 34a, the autonomous service robot 10a starts operating. After the start of operation, when the autonomous service robot 10a is first started up, it is positioned in a further method step 26a in a work area according to the map of the at least one work area. The positioning is performed by the sensor unit 38a of the autonomous service robot 10a.
To localize itself, the autonomous service robot 10a compares the current data of the sensor unit 38a with the map data on the memory unit 34a (fig. 1).
After positioning, the autonomous service robot 10a begins normal operation 54a. During normal operation 54a, the navigation software ensures that the autonomous service robot 10a does not leave the work area defined by the map with the boundary. To this end, the autonomous service robot 10a constantly determines its position on the map. If a change in the work area is detected during normal operation 54a, a further method step 28a is initiated, in which the map of the work area is automatically adjusted by the autonomous service robot 10a. If a change in the work area, for example a new obstacle, is found by means of the sensor unit 38a, the map on the memory unit 34a is automatically adapted to the change by the autonomous service robot 10a. In principle, however, it is also conceivable that, when a change in the work area is detected during normal operation 54a, as indicated by the dashed line, a signal for the operator 48a to re-create the map of the at least one work area is automatically output by the autonomous service robot 10a in a further method step 30a. The signal is output to the operator 48a by the output unit 42a of the autonomous service robot 10a. The operator 48a can then decide in a further method step 56a whether to simply remove the obstacle or to re-map. Especially when the work area deviates strongly from the map, re-creating the map is useful or even necessary (fig. 1).
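The "do not leave the work area" check during normal operation amounts to repeatedly testing whether the current position lies inside the boundary stored with the map. A standard ray-casting point-in-polygon test is one way to do this; the rectangular lawn below is an illustrative assumption, not from the patent.

```python
def inside_boundary(x, y, polygon):
    """Ray-casting test: True if (x, y) lies inside the closed polygon,
    given as a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # count edges crossed by a horizontal ray going right from (x, y)
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

lawn = [(0, 0), (10, 0), (10, 6), (0, 6)]   # boundary taken from the map
ok = inside_boundary(5, 3, lawn)            # robot is inside
out = inside_boundary(12, 3, lawn)          # robot would have left the area
```

When the test fails, the navigation software would steer the robot back rather than continue along its trajectory.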
Two further embodiments of the invention are shown in figures 3-5. The following description and the figures are essentially limited to the differences between the exemplary embodiments; with regard to identically labeled components, in particular components having the same reference numerals, reference can in principle also be made to the drawings and/or the description of the other exemplary embodiments, in particular figures 1 and 2. To distinguish the embodiments, the letter a follows the reference numerals of the embodiment in figures 1 and 2. In the embodiments of figures 3-5, the letter a is replaced by the letters b and c.
Fig. 3 shows a schematic flow diagram of an alternative method according to the invention by means of the acquisition and generation instrument 58b and by means of the autonomous service robot 10 b. The acquisition and generation instrument 58b has an acquisition unit 12b and a generation unit 14 b. The acquisition and generation instrument 58b is constituted by a smartphone.
The method according to the invention provides for work area detection of the work area of the autonomous service robot 10b. The method is carried out by means of the acquisition unit 12b, by means of the generation unit 14b and by means of the autonomous service robot 10b. The acquisition unit 12b and the generation unit 14b here form part of the acquisition and generation instrument 58b. The acquisition unit 12b is formed by an integrated camera of the acquisition and generation instrument 58b. The generation unit 14b is formed by a part of the arithmetic unit of the acquisition and generation instrument 58b. The acquisition and generation instrument 58b is configured independently of the autonomous service robot 10b. It is constructed completely separately from the autonomous service robot 10b and can be formed by a commercially available smartphone on which only a specific application is installed. In the method, a plurality of visual media files of the working area are acquired by the acquisition unit 12b in a first method step 16b, 16b'. Because the acquisition and generation instrument 58b is configured as a smartphone, it is conceivable in principle that, for each visual media file, in addition to the media file itself, the tilt angle of the acquisition unit 12b during acquisition and/or the exact position of the acquisition unit 12b during acquisition and/or the position of the acquisition unit 12b relative to the previous acquisition is also stored. In particular, already existing sensors of the acquisition and generation instrument 58b, such as a tilt sensor, an acceleration sensor, a rotation rate sensor and/or a GPS receiver, can be used for this purpose. These additional data can be used in particular when generating the map in order to achieve a particularly high accuracy.
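The per-image sensor metadata mentioned above can be modeled as a simple record attached to each capture. The field names and example values are assumptions for this sketch; the patent only names the sensor types (tilt sensor, rotation rate sensor, GPS receiver).

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical capture record; none of these names come from the patent.
@dataclass
class CaptureRecord:
    image_path: str
    tilt_deg: Optional[float] = None            # from the tilt sensor
    gps: Optional[Tuple[float, float]] = None   # (lat, lon) from GPS
    # pose relative to the previous capture: (dx_m, dy_m, dyaw_deg),
    # e.g. integrated from acceleration and rotation rate sensors
    rel_pose: Optional[Tuple[float, float, float]] = None

recording = [
    CaptureRecord("img_0001.jpg", tilt_deg=12.5, gps=(48.13, 11.58)),
    CaptureRecord("img_0002.jpg", tilt_deg=11.0, rel_pose=(0.4, 0.0, 5.0)),
]
```

Feeding such records into the map-generation step gives the reconstruction extra constraints beyond the pixels alone, which is how the "particularly high accuracy" mentioned above would be obtained.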
The first method step 16b, 16b' of the method is carried out before the working operation of the autonomous service robot 10b, i.e. before the first start and first travel of the autonomous service robot 10b. In a second method step 18b, a map of the working area is generated by the generation unit 14b on the basis of the visual media files. The visual media files are here further processed directly by the generation unit 14b. The generation takes place by means of software installed on the acquisition and generation instrument 58b using the visual media files. The generation is performed by means of an image processing algorithm, in particular by means of bundle adjustment, structure from motion and visual simultaneous localization and mapping (SLAM). Subsequently, in a further method step 20b, the boundary of the work area is determined in the map of the work area by the generation unit 14b.
After the boundary of the work area has been determined, the map of the work area with its boundary is released by the generation unit 14b for editing by an operator 48b. After the operator 48b has edited the map, the map of the work area is additionally segmented in a further method step 52b by the generation unit 14b. Furthermore, in method step 52b, a preferred trajectory of the autonomous service robot 10b is determined by the generation unit 14b.
In a further method step 22b, the map of the work area with the boundaries of the work area is transferred to the autonomous service robot 10b via the interface 24b of the autonomous service robot 10 b. The map with the boundaries is saved directly on the memory unit 34b of the autonomous service robot 10b through the interface 24b of the autonomous service robot 10 b. After the map is stored on the memory unit 34b, the autonomous service robot 10b starts operating. After the start of the operation, the autonomous service robot 10b is positioned in a further method step 26b in a work area according to the map of the at least one work area when it is first started. After positioning, the autonomous service robot 10b begins normal operation 54 b. If a change in the working area is detected during the normal operation 54b, a further method step 28b is initiated in the method, in which the map of the working area is automatically adjusted by the autonomous service robot 10 b.
Fig. 4 shows a schematic flow diagram of a further alternative method according to the invention by means of the acquisition unit 12c and by means of the autonomous service robot 10 c.
The autonomous service robot 10c is constituted by an autonomous lawn trimmer. The autonomous service robot 10c has a control and regulation unit 32c. The control and regulation unit 32c has a memory unit 34c and an interface 24c. The interface 24c of the control and regulation unit 32c is provided for transferring externally generated visual media files of a working area onto the memory unit 34c. Furthermore, the control and regulation unit 32c has a control system 36c. The autonomous service robot 10c also has a generation unit 14c, which is constituted by an arithmetic unit and is disposed in a housing 40c of the autonomous service robot 10c. In principle, it is conceivable for the autonomous service robot 10c additionally to have the acquisition unit 12c; the acquisition unit 12c can be integrated in the housing 40c of the autonomous service robot 10c, whereby in particular the number of components can be kept low. Furthermore, the autonomous service robot 10c has a sensor unit 38c (fig. 5).
The method according to the invention provides for work area detection of the work area of the autonomous service robot 10c. The method is carried out by means of the acquisition unit 12c, by means of the generation unit 14c and by means of the autonomous service robot 10c. The acquisition unit 12c is constituted by a camera and is configured independently of the autonomous service robot 10c. It is completely separate from the autonomous service robot 10c and can be formed by a commercially available acquisition unit, in particular a commercially available camera. The generation unit 14c forms part of the autonomous service robot 10c. In the method, a plurality of visual media files of the working area are acquired by the acquisition unit 12c in a first method step 16c, 16c'. The first method step 16c, 16c' of the method is carried out before the working operation of the autonomous service robot 10c, i.e. before the first start and first travel of the autonomous service robot 10c.
Subsequently, in a further method step 22c, the visual media files of the work area are transferred to the autonomous service robot 10c via the interface 24c of the autonomous service robot 10c. Via the interface 24c, the visual media files are saved on the memory unit 34c of the autonomous service robot 10c, where they can be called up by the generation unit 14c.
In a second method step 18c, a map of the work area is generated by the generation unit 14c of the autonomous service robot 10c on the basis of the visual media files. The generation takes place by means of software using the visual media files. The generation is performed by means of an image processing algorithm, in particular by means of bundle adjustment, structure from motion and visual simultaneous localization and mapping (SLAM). Subsequently, in a further method step 20c, the boundary of the working area is determined in the map of the working area by the generation unit 14c.
After the boundary of the work area has been determined, the map of the work area with its boundary is released by the generation unit 14c for editing by an operator 48c. The released map with the boundary can be edited by the operator 48c in a further method step 50c. In this case, the editing takes place directly on the generation unit 14c, i.e. directly on the autonomous service robot 10c. The map with the boundary is output by the output unit 42c and can be edited there. After the operator 48c has edited the map, the map with the boundary is stored on the memory unit 34c. In addition or as an alternative, it is conceivable in principle for the map with the boundary to be transferred via the interface 24c to a server 46c or to another server and stored there. In particular, the map can thus also be used by other autonomous service robots 10c, and the storage capacity of the memory unit 34c can be kept small. The interface 24c of the autonomous service robot 10c can be connected to the internet via a wireless local area network (WLAN) of the garden building.
After the map is stored on the memory unit 34c, the autonomous service robot 10c starts operating. After the start of the operation, the autonomous service robot 10c is positioned in a further method step 26c in a work area according to the map of the at least one work area when it is first started. After positioning, the autonomous service robot 10c begins normal operation 54 c. If a change in the working area is detected during the normal operation 54c, a further method step 28c is initiated in the method, in which the map of the working area is automatically adjusted by the autonomous service robot 10 c.

Claims (14)

1. A method for work area detection of at least one work area of an autonomous service robot (10a; 10b; 10c) by means of at least one acquisition unit (12a; 12b; 12c) and a generation unit (14a; 14b; 14c), wherein in a first method step (16a; 16a'; 16b; 16b'; 16c; 16c') at least one visual media file of the at least one work area is acquired by the at least one acquisition unit (12a; 12b; 12c), which is independent of the autonomous service robot (10a; 10b; 10c), before the work operation of the autonomous service robot (10a; 10b; 10c), and in a second method step (18a; 18b; 18c) a map of the at least one work area is generated by the generation unit (14a; 14b; 14c) on the basis of the at least one visual media file, wherein a boundary of the at least one work area is determined in the map of the at least one work area by the generation unit (14a; 14b; 14c) and, after the determination of the boundary of the work area, the map of the at least one work area and/or the boundary of the at least one work area is released by the generation unit (14a; 14b; 14c) for editing, or, in a further method step (30a; 30b; 30c), when a change in the at least one work area is detected, a signal for re-creating the map of the at least one work area is automatically output to the operator by the autonomous service robot (10a; 10b; 10c).
2. Method according to claim 1, characterized in that in the first method step (16 a; 16a '; 16 b; 16b '; 16 c; 16c ') at least one visual media file of the at least one working area is acquired by the acquisition unit (12 a; 12 b; 12c) before the first start of the autonomous service robot (10 a; 10 b; 10 c).
3. Method according to claim 1 or 2, characterized by a further method step (22 a; 22 b; 22c) in which the at least one media file of the at least one work area and/or the map of the at least one work area and/or the border of the at least one work area is transferred onto the autonomous service robot (10 a; 10 b; 10c) through an interface (24 a; 24 b; 24c) of the autonomous service robot (10 a; 10 b; 10 c).
4. Method according to claim 1 or 2, characterized by a further method step (26 a; 26 b; 26c) in which the autonomous service robot (10 a; 10 b; 10c) is positioned in a work area according to a map of the at least one work area.
5. The method according to claim 1 or 2, characterized by a further method step (28 a; 28 b; 28c) in which the map of the at least one work area is automatically adjusted by the autonomous service robot (10 a; 10 b; 10c) when a change in the at least one work area is found.
6. Method according to claim 1 or 2, characterized in that the autonomous service robot (10 a; 10 b; 10c) is an autonomous lawn trimmer.
7. An autonomous service robot for performing at least the method according to any of claims 1 to 6.
8. The autonomous service robot as claimed in claim 7, characterized by at least one control and/or regulating unit (32 a; 32 b; 32c) having at least one memory unit (34 a; 34 b; 34c) and at least one interface (24 a; 24 b; 24c) which is provided for transferring at least one externally generated map of at least one working area onto the memory unit (34 a; 34 b; 34 c).
9. The autonomous service robot according to claim 8, characterized in that the at least one control and/or regulating unit (32 a; 32 b; 32c) has at least one control system (36 a; 36 b; 36c) which is provided for processing the at least one map of the at least one work area.
10. Autonomous service robot according to any of claims 8-9, characterized by at least one sensor unit (38 a; 38 b; 38c) arranged to detect unexpected obstacles.
11. The autonomous service robot as claimed in claim 10, characterized in that the sensor unit (38 a; 38 b; 38c) is provided for outputting a signal to the control and/or regulating unit (32 a; 32 b; 32c) in the event of detection of an unexpected obstacle, which control and/or regulating unit is provided for storing the position of the unexpected obstacle on the at least one map of the at least one working area.
12. The autonomous service robot of claim 7, wherein the autonomous service robot is an autonomous lawn trimmer.
13. A system for carrying out the method according to any one of claims 1 to 6, the system having an autonomous service robot (10 a; 10 b; 10c) and having an acquisition unit (12 a; 12 b; 12c) and a generation unit (14 a; 14 b; 14 c).
14. System according to claim 13, characterized in that the autonomous service robot (10 a; 10 b; 10c) is configured as an autonomous service robot according to any of claims 7-12.
CN201410301793.0A 2013-06-28 2014-06-27 Method for detecting a working area of at least one working area of an autonomous service robot Active CN104252176B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102013212605.0A DE102013212605A1 (en) 2013-06-28 2013-06-28 Method for a work area detection of at least one work area of an autonomous service robot
DE102013212605.0 2013-06-28

Publications (2)

Publication Number Publication Date
CN104252176A CN104252176A (en) 2014-12-31
CN104252176B true CN104252176B (en) 2020-12-22

Family

ID=51410115

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410301793.0A Active CN104252176B (en) 2013-06-28 2014-06-27 Method for detecting a working area of at least one working area of an autonomous service robot

Country Status (3)

Country Link
CN (1) CN104252176B (en)
DE (1) DE102013212605A1 (en)
GB (2) GB2517572B (en)

Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014226084A1 (en) 2014-12-16 2016-06-16 Robert Bosch Gmbh Method for mapping a working surface for autonomous robotic vehicles
SE538373C2 (en) 2014-12-23 2016-05-31 Husqvarna Ab Improved navigation for a robotic lawnmower
JP5973608B1 (en) * 2015-03-27 2016-08-23 本田技研工業株式会社 Control equipment for unmanned work vehicles
DE102015221658A1 (en) 2015-11-04 2017-05-04 Robert Bosch Gmbh Garden sensor device
DE102015222414A1 (en) * 2015-11-13 2017-05-18 Robert Bosch Gmbh Autonomous working device
CN106814733A (en) * 2015-12-01 2017-06-09 江苏长虹智能装备集团有限公司 A kind of automatical pilot transportation vehicle Flexibility Control Technique
WO2018041146A1 (en) * 2016-08-31 2018-03-08 苏州宝时得电动工具有限公司 Intelligent lawn-mower, self-mobile device and obstacle identify method thereof
CN108073164B (en) * 2016-11-11 2019-11-26 苏州宝时得电动工具有限公司 Automatic mower and its traveling method
EP3540552B1 (en) 2016-11-11 2022-06-01 Positec Power Tools (Suzhou) Co., Ltd Automatic work system and control method therefor
DE102017203055A1 (en) * 2017-02-24 2018-08-30 Robert Bosch Gmbh Method for detecting at least one working area of an autonomous implement
EP3413155B1 (en) 2017-06-09 2020-02-26 Andreas Stihl AG & Co. KG Method for the detection of at least one section of a limiting edge of a surface to be processed, method for operating an autonomous mobile green area processing robot, detection system and green area processing system
EP3412128B1 (en) 2017-06-09 2021-05-12 Andreas Stihl AG & Co. KG Green areas processing system and method for the detection of at least one section of a limiting edge of a surface to be processed
CN107976194B (en) * 2017-11-24 2021-12-21 北京奇虎科技有限公司 Environment map adjusting method and device
JP6877330B2 (en) 2017-12-27 2021-05-26 株式会社クボタ Work area determination system for autonomous travel type work equipment, autonomous travel type work equipment, and work area determination program
CN108919814A (en) * 2018-08-15 2018-11-30 杭州慧慧科技有限公司 Grass trimmer working region generation method, apparatus and system
CN108337987A (en) * 2018-02-13 2018-07-31 杭州慧慧科技有限公司 A kind of automatic mowing system and grass trimmer control method
EP3560312B1 (en) * 2018-04-06 2021-10-20 LG Electronics Inc. Lawn mower robot
US11166409B2 (en) 2018-04-06 2021-11-09 Lg Electronics Inc. Lawn mower robot
EP3549429B1 (en) 2018-04-06 2021-10-27 LG Electronics Inc. Lawn mower robot
EP3549425B1 (en) 2018-04-06 2021-08-04 LG Electronics Inc. Lawn mower robot
EP3549426B1 (en) 2018-04-06 2021-09-08 LG Electronics Inc. Lawn mower robot
EP3549427B1 (en) 2018-04-06 2021-09-01 LG Electronics Inc. Lawn mower robot
EP3549424B1 (en) 2018-04-06 2022-01-05 Lg Electronics Inc. Lawn mower robot
EP3549423B1 (en) 2018-04-06 2021-06-16 Lg Electronics Inc. Lawn mower robot
CN112218518B (en) 2018-05-25 2022-12-13 托罗公司 Autonomous ground maintenance machine with dilemma avoidance and obstacle avoidance path planning
EP3686704B1 (en) 2019-01-22 2023-08-09 Honda Research Institute Europe GmbH Method for generating a representation and system for teaching an autonomous device operating based on such representation
US11457558B1 (en) 2019-05-15 2022-10-04 Hydro-Gear Limited Partnership Autonomous vehicle navigation
CN110466913A (en) * 2019-07-29 2019-11-19 东莞弓叶互联科技有限公司 Rubbish collection methods and device
CN110498166A (en) * 2019-08-07 2019-11-26 东莞弓叶互联科技有限公司 Rubbish recovering method, device, computer equipment and storage medium
CN110641872A (en) * 2019-09-06 2020-01-03 东莞弓叶互联科技有限公司 Garbage can transportation method and system
SE544524C2 (en) * 2019-12-06 2022-06-28 Husqvarna Ab Robotic work tool system and method for defining a working area perimeter

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1470368A (en) * 2002-07-26 Samsung Gwangju Electronics Co., Ltd. Robot cleaning device and robot cleaning system and control method thereof
CN102053623A (en) * 2009-11-10 2011-05-11 德国福维克控股公司 Method for controlling robot
EP2381328A2 (en) * 2010-04-26 2011-10-26 LG Electronics Robot cleaner and remote monitoring system using the same
CN102314176A (en) * 2010-07-01 2012-01-11 德国福维克控股公司 Self-propelled device and method for orienting such a device
CN102846273A (en) * 2011-06-28 2013-01-02 三星电子株式会社 Robot cleaner and control method thereof
CN102866706A (en) * 2012-09-13 2013-01-09 深圳市银星智能科技股份有限公司 Cleaning robot adopting smart phone navigation and navigation cleaning method thereof
CN102918359A (en) * 2010-04-09 2013-02-06 通腾科技股份有限公司 Navigation or mapping apparatus and method
CN102929280A (en) * 2012-11-13 2013-02-13 朱绍明 Mobile robot separating visual positioning and navigation method and positioning and navigation system thereof

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7854108B2 (en) * 2003-12-12 2010-12-21 Vision Robotics Corporation Agricultural robot system and method
US8855819B2 (en) * 2008-10-09 2014-10-07 Samsung Electronics Co., Ltd. Method and apparatus for simultaneous localization and mapping of robot
KR101083394B1 (en) * 2009-10-30 2011-11-14 주식회사 유진로봇 Apparatus and Method for Building and Updating a Map for Mobile Robot Localization
KR101842460B1 (en) * 2011-04-12 2018-03-27 엘지전자 주식회사 Robot cleaner, and remote monitoring system and method of the same
TW201305761A (en) * 2011-07-21 2013-02-01 Ememe Robot Co Ltd An autonomous robot and a positioning method thereof
US9594380B2 (en) * 2012-03-06 2017-03-14 Travis Dorschel Path recording and navigation
EP3373097A1 (en) * 2012-08-14 2018-09-12 Husqvarna AB Robotic mower with object detection system
DE102012221572A1 (en) * 2012-11-26 2014-05-28 Robert Bosch Gmbh Autonomous locomotion device
US9804604B2 (en) * 2013-08-16 2017-10-31 Husqvarna Ab Intelligent grounds management system integrating robotic rover


Also Published As

Publication number Publication date
DE102013212605A1 (en) 2014-12-31
GB2517572B (en) 2019-02-13
GB201411312D0 (en) 2014-08-06
GB2517572A (en) 2015-02-25
CN104252176A (en) 2014-12-31
GB201813548D0 (en) 2018-10-03
GB2563347A (en) 2018-12-12

Similar Documents

Publication Publication Date Title
CN104252176B (en) Method for detecting a working area of at least one working area of an autonomous service robot
US11272823B2 (en) Zone cleaning apparatus and method
JP6946524B2 (en) A system for performing simultaneous position measurement mapping using a mechanical visual system
US10394246B2 (en) Robot with automatic styles
US10583561B2 (en) Robotic virtual boundaries
CN114847803B (en) Positioning method and device of robot, electronic equipment and storage medium
CN111990929B (en) Obstacle detection method and device, self-walking robot and storage medium
US11269343B2 (en) Mobile robot and control method thereof
US20240118700A1 (en) Mobile robot and control method of mobile robot
CN111802060A (en) Work machine control using real-time models
WO2019144541A1 (en) Cleaning robot
TWI706763B (en) Moving robot and controlling method
EP3767420A1 (en) Robotic mower boundary detection system
CN110968083B (en) Method for constructing grid map, method, device and medium for avoiding obstacles
EP4137905A1 (en) Robot obstacle avoidance method, device, and storage medium
US11402850B2 (en) Robotic cleaning device with operating speed variation based on environment
CN110060207A (en) Method, the system of floor plan are provided
CN112075879A (en) Information processing method, device and storage medium
US20240028044A1 (en) Ranging method and apparatus, robot, and storage medium
KR102500529B1 (en) Moving Robot and controlling method
EP4010772A1 (en) Using adjustable vision component for on-demand vision data capture of areas along a predicted trajectory of a robot
KR102272161B1 (en) Lawn mover robot system and controlling method for the same
JP7354528B2 (en) Autonomous mobile device, method and program for detecting dirt on lenses of autonomous mobile device
US11796341B2 (en) Information processing device, information processing system, and information processing method
CN210525104U (en) Control device and cleaning robot applied to same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant