WO2023003249A1 - Robotic cleaner and method for controlling robotic cleaner - Google Patents
Robotic cleaner and method for controlling robotic cleaner
- Publication number: WO2023003249A1 (application PCT/KR2022/010141)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- open
- node
- robot cleaner
- driving
- open space
Classifications
- G05D1/242—Means based on the reflection of waves generated by the vehicle
- A47L9/28—Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
- A47L9/2805—Parameters or conditions being sensed
- A47L9/2852—Elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
- G05D1/2464—Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM], using an occupancy grid
- G05D1/2469—Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM], using a topologic or simplified map
- G06T11/206—Drawing of charts or graphs
- G06T7/70—Determining position or orientation of objects or cameras
- A47L2201/04—Automatic control of the travelling movement; Automatic obstacle detection
- G05D2105/10—Specific applications of the controlled vehicles for cleaning, vacuuming or polishing
- G05D2107/60—Open buildings, e.g. offices, hospitals, shopping areas or universities
- G05D2109/10—Land vehicles
- G05D2111/17—Coherent light, e.g. laser signals
Definitions
- the present invention relates to a mobile robot, and more particularly to a robot cleaner, a method for controlling the robot cleaner, and detection and control technology with which the robot cleaner generates a driving map.
- a typical example of such a robot is a robot vacuum cleaner.
- Various technologies are known for sensing an environment around the robot cleaner and a user through various sensors provided in the robot cleaner.
- technologies are known in which a robot cleaner learns and maps a cleaning area by itself and determines a current location on a map.
- a robot cleaner that cleans while driving in a cleaning area in a preset manner is known.
- in order to perform set tasks such as cleaning, a map of the driving area must be accurately created, and the current location of the robot cleaner on the map must be accurately identified in order to move to a given location within the driving area.
- a topology node is added to a route at a location where additional search is required while the robot is driving, so that the need for further search and driving is indicated.
- Patent Document 1 Korean Patent Publication No. 10-2010-0031878 A
- Patent Document 2 Korean Patent Publication No. 10-2021-0009011 A
- an object of the present invention is to provide a map generation method capable of minimizing unnecessary driving when searching for a space requiring driving.
- Korean Patent Publication No. 10-2021-0009011 generates a grid map and a topology map in real time based on real-time detection information, so that the driving direction is set from the current location toward an area that requires additional search driving.
- Another object of the present invention is to provide a control method capable of reducing avoidance driving by setting the driving to the center of the aisle during such additional search driving.
- another object of the present invention is to assign a parent-child relationship between one node and the other nodes connected to it and, when selecting a child node to move to from a parent node, to provide a control method that compares the open widths of the child nodes with each other and enters the search space having the widest open width first.
- SLAM performance can be improved through such priority driving, and a control method is provided that can provide as much information as possible in advance for the other nodes that are entered later.
- a robot cleaner for achieving the above object includes: a traveling unit for moving a main body within a driving area; a distance measuring sensor that obtains distance sensing information about a distance to an object outside the main body; and a control unit that, when generating a grid map of the driving area from the distance sensing information and dividing the driving area into a plurality of sub areas, performs ray casting at a plurality of driving nodes on the path of the grid map for each sub area to search for open spaces, sets an open node for each open space, and calculates a topology graph between the driving nodes and the open nodes.
- the distance measuring sensor may include a lidar sensor that emits light toward an object outside the main body and calculates the distance sensing information based on the reflected light.
- a sub area may be allocated each time the robot cleaner has traveled in the driving area for a predetermined time or over a predetermined distance.
- the controller may perform ray casting for each of a plurality of driving nodes in the grid map for each sub-region to search for an open space, and set the open node for each of the open spaces.
- the controller may set the open node in a central region of the width of the open space.
- the controller may set the open node in a central region of the separation distance between the two obstacles.
- the controller may set the open node at an intersection of a center line between the two obstacles and a vertical line of the traveling node.
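As an illustrative geometric sketch of the two placement rules above (the function names, the obstacle endpoints `a` and `b`, and the reading of the "vertical line" as a perpendicular projection onto the center line are assumptions of this sketch, not taken from the patent):

```python
def open_node_flat(a, b):
    """Open node for a gap between two obstacles without a step:
    the midpoint of the separation (center of the opening width)."""
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

def open_node_stepped(a, b, node):
    """One plausible reading of the stepped case: project the travel
    node onto the center line between the obstacles (the line through
    the gap midpoint, normal to the obstacle-to-obstacle vector)."""
    mx, my = (a[0] + b[0]) / 2, (a[1] + b[1]) / 2
    dx, dy = b[0] - a[0], b[1] - a[1]
    nx, ny = -dy, dx                      # direction of the center line
    t = ((node[0] - mx) * nx + (node[1] - my) * ny) / (nx * nx + ny * ny)
    return (mx + t * nx, my + t * ny)
```

In the stepped case, projecting the travel node onto the center line keeps the open node in the middle of the corridor while accounting for where the robot actually is.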
- the control unit may move to one of the open nodes of the last driving node, change the visited open node to a closed node, and then move to an open node remaining in the topology graph in order to partition another sub area.
- the control unit may move to an open node having the largest width among a plurality of open nodes of the last driving node.
- the control unit may determine, by ray casting at a travel node, that a space in which the robot cleaner can drive but has not previously driven is an open space.
- the control unit may perform 360-degree ray casting around the traveling node.
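A minimal sketch of 360-degree ray casting on an occupancy grid (the grid encoding, step size, and ray count are hypothetical choices of this sketch; the patent does not specify them):

```python
import math

def cast_rays(grid, node, num_rays=36, max_range=50):
    """360-degree ray casting from a driving node on an occupancy grid.

    grid[r][c]: 0 = free, 1 = occupied (hypothetical encoding).
    Returns, per ray index, the free distance (in cells) travelled
    before hitting an occupied cell or the map edge.
    """
    rows, cols = len(grid), len(grid[0])
    r0, c0 = node
    step = 0.5  # sub-cell step so thin walls are not skipped
    distances = {}
    for i in range(num_rays):
        theta = 2 * math.pi * i / num_rays
        dist = 0.0
        while dist < max_range:
            dist += step
            r = int(round(r0 + dist * math.sin(theta)))
            c = int(round(c0 + dist * math.cos(theta)))
            if not (0 <= r < rows and 0 <= c < cols) or grid[r][c] == 1:
                dist -= step  # back off to the last free position
                break
        distances[i] = dist
    return distances
```

Long free rays that reach cells the cleaner has not yet visited would then be the candidates for open spaces, per the preceding passages.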
- the control unit may set a first open node for an open space searched from a first travel node, set a second open node for another open space searched from a second travel node different from the first travel node, and, if it is determined that the first open node and the second open node overlap, delete one of the overlapping first and second open nodes.
- the controller may determine whether the first open node overlaps with the second open node when widths of the one open space and the other open space overlap by a predetermined range or more.
- the control unit may generate circles having the same diameter around the first open node and the second open node, and, when the circles overlap by a predetermined range or more, determine that the first open node and the second open node correspond to the same open space.
- of the two, the open node adjacent to the central area of the open area in which the first and second open nodes are located may be kept, and the other open node may be deleted.
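The equal-diameter circle test above can be sketched as follows; the lens-area formula for two equal circles is standard geometry, while the 0.5 overlap threshold and the tie-break toward the open area's center are assumptions of this sketch:

```python
import math

def circles_overlap_ratio(c1, c2, r):
    """Intersection area of two equal-radius circles, as a fraction of
    one circle's area (standard lens-area formula)."""
    d = math.dist(c1, c2)
    if d >= 2 * r:
        return 0.0
    lens = 2 * r * r * math.acos(d / (2 * r)) - (d / 2) * math.sqrt(4 * r * r - d * d)
    return lens / (math.pi * r * r)

def dedup_open_nodes(n1, n2, area_center, r, min_ratio=0.5):
    """If the equal-diameter circles around two open nodes overlap by at
    least min_ratio, treat them as one opening and keep the node closer
    to the open area's center; otherwise keep both."""
    if circles_overlap_ratio(n1, n2, r) < min_ratio:
        return [n1, n2]
    return [min((n1, n2), key=lambda n: math.dist(n, area_center))]
```

Using a fractional overlap rather than a raw distance keeps the criterion independent of the chosen circle diameter.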
- the control unit may calculate a final topology graph for the driving area by connecting the topology graphs of the plurality of sub-regions to each other.
- the control unit may process the grid map to calculate a final map.
- the embodiment includes: obtaining distance sensing information about a distance to an object outside the main body while moving in an unknown driving area; generating a grid map of the driving area from the distance sensing information; dividing the driving area into a plurality of sub areas and searching for open spaces by performing ray casting at a plurality of driving nodes on a path of the grid map for each sub area; setting an open node for each open space and generating a topology graph between the driving nodes and the open nodes for the sub area; and generating a final topology graph for the driving area by connecting the topology graphs of the plurality of sub areas to each other.
- the generating of the topology graph may include, when the open space is formed between two obstacles that are spaced apart without a step, setting the open node in a central region of the separation distance between the two obstacles, and, when the open space is formed between two obstacles that are spaced apart with a step, setting the open node at an intersection of a center line between the two obstacles and a perpendicular line from the traveling node.
- the generating of the topology graph may include: setting a first open node for an open space searched from a first traveling node and a second open node for another open space searched from a second traveling node different from the first traveling node; determining whether the first open node and the second open node overlap when the widths of the two open spaces overlap by a predetermined range or more; generating circles having the same diameter around the first open node and the second open node and, when the circles overlap by a predetermined range or more, determining that the first open node and the second open node correspond to the same open space; and keeping, of the first open node and the second open node, the open node adjacent to the central area of the open area in which they are located, and deleting the other open node.
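The final step of connecting the per-sub-area topology graphs into one graph can be sketched with adjacency sets; the dict-of-sets representation and the assumption that shared nodes carry the same id are choices of this sketch, not the patent's:

```python
def merge_topology_graphs(graphs):
    """Connect per-sub-area topology graphs (adjacency dicts mapping a
    node id to a set of neighbour ids) into one final topology graph.
    Nodes shared between sub areas are unified by id, so their
    neighbour sets are merged."""
    final = {}
    for g in graphs:
        for node, neighbours in g.items():
            final.setdefault(node, set()).update(neighbours)
    return final
```

An open node sitting on the boundary of two sub areas then ends up as a single vertex connected into both sub-graphs.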
- efficiency can be improved by minimizing unnecessary driving when searching for a space requiring additional driving.
- the present invention can reduce avoidance driving by setting the driving to the center of the aisle during additional search driving.
- FIG. 1 is a perspective view illustrating a robot cleaner and a charging stand for charging the robot cleaner according to an embodiment of the present invention.
- FIG. 2 is an elevational view of the robot cleaner of FIG. 1 viewed from above.
- FIG. 3 is an elevational view of the robot cleaner of FIG. 1 viewed from the front;
- FIG. 4 is an elevational view of the robot cleaner of FIG. 1 viewed from a lower side.
- FIG. 5 is a diagram showing an example of a robot cleaner according to another embodiment of the present invention.
- FIG. 6 is a block diagram illustrating a control relationship between main elements of the robot cleaner of FIG. 1 or 5 .
- FIG. 7 is a diagram illustrating an operation of a LiDAR sensor provided in a robot cleaner according to an embodiment of the present invention.
- FIG. 8 is a flowchart of a process of generating a map of a robot cleaner according to an embodiment of the present invention.
- FIGS. 9A to 9E are diagrams illustrating a process of generating a map of the robot cleaner according to the flowchart of FIG. 8 .
- FIG. 10 is a diagram showing a portion of a topology graph of a robot cleaner according to the flowchart of FIG. 8 .
- FIG. 11 is a diagram showing a flowchart for searching open nodes of FIG. 8 .
- FIGS. 12A and 12B are diagrams showing the non-stepped open space of FIG. 11 .
- FIGS. 13A and 13B are diagrams illustrating an embodiment of the stepped open space of FIG. 11 .
- FIGS. 14A and 14B are diagrams illustrating another embodiment of the stepped open space of FIG. 11 .
- FIG. 15 is a diagram illustrating another flowchart for searching open nodes of FIG. 8 .
- FIG. 16 is a diagram illustrating a search of a robot cleaner showing the operation of FIG. 15 .
- FIG. 17 is a diagram for explaining a process for deduplication of FIG. 15 .
- FIG. 18A shows an open space search according to a comparative example
- FIG. 18B shows an open space search according to the present invention.
- FIG. 19A shows a plurality of open space searches according to a comparative example
- FIG. 19B shows redundancy removal during an open space search according to the present invention.
- FIG. 20A shows an unsearched area according to a comparative example
- FIG. 20B shows an open space searched for according to the present invention.
- 'less than or equal to' and 'less than' are degrees that can easily be substituted for each other from the point of view of a person skilled in the art, as are 'greater than or equal to' and 'greater than'; it goes without saying that substituting one for the other in implementing the present invention causes no problem in exerting the described effects.
- the mobile robot 100 of the present invention refers to a robot capable of moving by itself using wheels or the like, and may be a home helper robot or a robot vacuum cleaner.
- FIG. 1 to 4 are views illustrating the appearance of a robot cleaner and a charging station for charging the robot cleaner according to an embodiment of the present invention.
- FIG. 1 is a perspective view showing a robot cleaner and a charging stand for charging the robot cleaner according to an embodiment of the present invention
- FIG. 2 is an elevational view of the robot cleaner of FIG. 1 viewed from above
- FIG. 3 is an elevational view of the robot cleaner of FIG. 1 viewed from the front
- FIG. 4 is an elevation view of the robot cleaner of FIG. 1 viewed from the lower side.
- the robot cleaner 100 includes a body 110.
- the part facing the ceiling in the driving area is defined as the upper part (see FIG. 2)
- the part facing the floor in the driving area is defined as the bottom part (see FIG. 4).
- the part facing the driving direction is defined as the front part (see FIG. 3).
- a portion facing the opposite direction to the front portion of the main body 110 may be defined as a rear portion.
- the main body 110 may include a case 111 forming a space in which various parts constituting the robot cleaner 100 are accommodated.
- the robot cleaner 100 may include, for example, at least one driving wheel 136 for moving the main body 110 .
- the driving wheel 136 may be driven and rotated by, for example, at least one motor (not shown) connected to the driving wheel 136 .
- the driving wheels 136 may be respectively provided on the left and right sides of the main body 110, for example, and are hereinafter referred to as a left wheel 136(L) and a right wheel 136(R), respectively.
- the left wheel 136(L) and the right wheel 136(R) may be driven by a single drive motor, but, as needed, a left wheel drive motor for driving the left wheel 136(L) and a right wheel drive motor for driving the right wheel 136(R) may each be provided.
- the running direction of the main body 110 can be switched to the left or right by making a difference between the rotation speeds of the left wheel 136 (L) and the right wheel 136 (R).
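The steering behavior described above follows from standard differential-drive kinematics; the sketch below is a simple Euler-integration step (the function name, units, and time step are assumptions of this illustration):

```python
import math

def drive_step(x, y, theta, v_left, v_right, wheel_base, dt):
    """One Euler step of differential-drive kinematics: equal wheel
    speeds move the body straight; a wheel speed difference turns it."""
    v = (v_left + v_right) / 2.0             # forward speed of the body
    omega = (v_right - v_left) / wheel_base  # yaw rate from the speed difference
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```

With the right wheel faster than the left, the yaw rate is positive and the body curves to the left; swapping the speeds curves it to the right.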
- the robot cleaner 100 may include, for example, a suction unit 330 for sucking in foreign substances, brushes 154 and 155 for sweeping, a dust bin for storing the collected foreign substances, and a mop unit for wiping.
- a suction port 150h through which air is sucked in may be formed on the bottom of the main body 110, and a suction device providing suction power so that air can be sucked in through the suction port 150h, and a dust bin collecting the dust sucked in together with the air through the suction port 150h, may be provided inside the main body 110.
- the robot cleaner 100 may include, for example, a case 111 forming a space in which various parts constituting the robot cleaner 100 are accommodated.
- An opening (not shown) for insertion and removal of the dust bin may be formed in the case 111 , and a dust box cover 112 that opens and closes the opening may be rotatably provided with respect to the case 111 .
- the robot cleaner 100 includes, for example, a roll-type main brush 154 having bristles exposed through the suction port 150h, and an auxiliary brush 155 located on the front side of the bottom surface of the main body 110 and having a brush formed of a plurality of radially extending blades. the rotation of the brushes 154 and 155 separates dust from the floor of the driving area, and the dust separated from the floor is sucked in through the suction port 150h and introduced into the dust bin through the suction unit 330.
- Air and dust may be separated from each other while passing through a filter or a cyclone in the dust bin; the separated dust is collected in the dust bin, while the air is discharged from the dust bin and finally discharged to the outside through an exhaust passage (not shown) inside the main body 110 and an exhaust port (not shown).
- the battery 138 may supply power not only to the drive motor but also for the overall operation of the robot cleaner 100.
- the robot cleaner 100 may return to the charging station 200 for charging, and during this return driving, the robot cleaner 100 can automatically detect the location of the charging station 200.
- the charging station 200 may include, for example, a signal transmission unit (not shown) that transmits a predetermined return signal.
- the return signal may be, for example, an ultrasonic signal or an infrared signal, but is not necessarily limited thereto.
- the robot cleaner 100 may include, for example, a signal detector (not shown) that receives a return signal.
- the signal detection unit may include an infrared sensor for detecting an infrared signal, and may receive an infrared signal transmitted from the signal transmission unit of the charging station 200 .
- the robot cleaner 100 may move to the position of the charging station 200 according to the infrared signal transmitted from the charging station 200 and dock with the charging station 200 . Due to this docking, the charging terminal 133 of the robot cleaner 100 and the charging terminal 210 of the charging stand 200 may come into contact, and the battery 138 may be charged.
- the robot cleaner 100 may have a configuration for sensing internal/external information of the robot cleaner 100 .
- the robot cleaner 100 may include, for example, a camera 120 that obtains image information about a driving area.
- the robot cleaner 100 may include a front camera 120a provided to acquire an image of the front of the main body 110 .
- the robot cleaner 100 may include an upper camera 120b provided on an upper surface of the main body 110 to obtain an image of a ceiling in a driving area.
- the robot cleaner 100 may further include a lower camera 179 provided on the bottom of the main body 110 to obtain an image of the floor.
- the number of cameras 120 provided in the robot cleaner 100, the locations where they are placed, and their shooting ranges are not necessarily limited thereto, and the cameras may be placed at various locations to obtain image information about the driving area.
- the robot cleaner 100 may include a camera (not shown) configured to photograph both the front and the upper directions by being disposed obliquely with respect to one surface of the main body 110 .
- the robot cleaner 100 may include a plurality of front cameras 120a and/or an upper camera 120b, or may include a plurality of cameras configured to photograph both the front and the top.
- a camera 120 is installed on some parts (eg, front, rear, bottom) of the robot cleaner 100, and images can be continuously obtained while driving or cleaning.
- several cameras 120 may be installed at each part for shooting efficiency, and the images captured by the cameras 120 may be used to recognize the type of material present in the space, such as dust, hair, or flooring, to check whether an area has been cleaned, or to confirm the cleaning time.
- the robot cleaner 100 may include a light detection and ranging (LiDAR) sensor 175 that obtains terrain information of the outside of the main body 110 using a laser.
- the lidar sensor 175 outputs a laser and receives the laser reflected from an object, and can thereby obtain information such as the distance, position, direction, and material of the object reflecting the laser, and obtain terrain information of the driving area.
- the robot cleaner 100 may acquire 360-degree geometry information based on information acquired through the lidar sensor 175 .
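A sketch of how 360-degree range returns might be turned into world-frame geometry, assuming one range sample per degree and a planar robot pose; none of these names or conventions come from the patent:

```python
import math

def scan_to_points(ranges, pose):
    """Convert one 360-degree scan (assumed: one range reading per
    degree, 0 meaning no return) into world-frame (x, y) points,
    given the robot pose (x, y, heading in radians)."""
    x, y, heading = pose
    points = []
    for deg, rng in enumerate(ranges):
        if rng <= 0:  # no return at this bearing
            continue
        a = heading + math.radians(deg)
        points.append((x + rng * math.cos(a), y + rng * math.sin(a)))
    return points
```

Accumulating such points over consecutive poses is what lets the occupancy grid of the driving area be built up.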
- the robot cleaner 100 may also include sensors 171 , 172 , and 179 for sensing various data related to the operation and state of the robot cleaner 100 .
- the robot cleaner 100 may include an obstacle detection sensor 171 for detecting an obstacle in front, a cliff detection sensor 172 for detecting the presence or absence of a cliff on the floor within a driving area, and the like.
- the robot cleaner 100 may include an operation unit 137 through which various commands, such as power on/off of the robot cleaner 100, can be input, and may receive through the operation unit 137 the various control commands necessary for its overall operation.
- the robot cleaner 100 may include an output unit (not shown), and may display reservation information, battery status, operation mode, operation status, error status, and the like.
- FIG. 5 is a diagram illustrating an example of a robot cleaner according to another embodiment of the present invention.
- the robot cleaner 100 shown in FIG. 5 has the same or similar configuration as the robot cleaner 100 disclosed in FIGS. 1 to 4 and functions as a robot cleaner applicable to a large-area space, so a detailed description thereof will be omitted.
- FIG. 6 is a block diagram illustrating a control relationship between major components of a robot cleaner according to an embodiment of the present invention.
- the robot cleaner 100 includes a storage unit 305, an image acquisition unit 320, an input unit 325, a suction unit 330, a control unit 350, a driving unit 360, a sensor unit 370, an output unit 380, and/or a communication unit 390.
- the storage unit 305 may store various types of information necessary for controlling the robot cleaner 100 .
- the storage unit 305 may include a volatile or non-volatile recording medium.
- the recording medium stores data that can be read by a microprocessor and is not limited in type or implementation method.
- the storage unit 305 may store a map for a driving area.
- the map stored in the storage unit 305 may be input from an external terminal or server capable of exchanging information with the robot cleaner 100 through wired or wireless communication, or may have been created by the robot cleaner 100 through its own learning.
- the storage unit 305 may store data for the sub area.
- the sub area may mean a divided region of the driving area covering a predetermined travel distance or a predetermined area.
- the data for the sub-area may include lidar sensing data obtained while driving in the corresponding sub-area, information on each node for the corresponding lidar sensing data, and information on a moving direction at each node.
- the storage unit 305 may store various map information.
- the location of rooms within the driving area may be indicated on the map. Also, the current location of the robot cleaner 100 may be displayed on a map, and the current location of the robot cleaner 100 on the map may be updated during driving.
- the storage unit 305 may store cleaning history information. Such cleaning history information may be generated whenever cleaning is performed.
- the map for the driving area stored in the storage unit 305 may be, for example, a navigation map used for driving during cleaning, a simultaneous localization and mapping (SLAM) map used for location recognition, a learning map in which information on encountered obstacles is stored and which is used for learning-based cleaning, a global topological map used for global location recognition, a grid map based on cell data, or an obstacle recognition map in which information about recognized obstacles is recorded.
- the maps may be stored and managed in the storage unit 305 separately by purpose, although the maps may not always be clearly classified by purpose.
- a plurality of pieces of information may be stored in one map so as to be used for at least two or more purposes.
- the image acquisition unit 320 may acquire an image of the robot cleaner 100's surroundings.
- the image acquisition unit 320 may include at least one camera (eg, the camera 120 of FIG. 1 ).
- the image acquisition unit 320 may include, for example, a digital camera.
- a digital camera may include an image sensor (e.g., a CMOS image sensor) including at least one optical lens and a plurality of photodiodes (e.g., pixels) forming an image by light passing through the optical lens, and a digital signal processor (DSP) that composes an image based on signals output from the photodiodes.
- the digital signal processor can generate not only still images, but also moving images composed of frames of still images.
- the image acquisition unit 320 may capture an obstacle present in front of the robot cleaner 100 or a situation of the cleaning area.
- the image acquisition unit 320 may obtain a plurality of images by continuously photographing the surroundings of the main body 110, and the obtained plurality of images may be stored in the storage unit 305.
- the robot cleaner 100 may increase the accuracy of obstacle recognition by using a plurality of images or by selecting one or more images from among the plurality of images and using effective data.
- the input unit 325 may include an input device (eg, a key, a touch panel, etc.) capable of receiving a user input.
- the input unit 325 may include an operation unit 137 capable of receiving various commands such as power on/off of the robot cleaner 100.
- the input unit 325 may receive a user input through the input device and transmit a command corresponding to the received user input to the control unit 350 .
- the suction unit 330 may suck air containing dust.
- the suction unit 330 may include, for example, a suction device (not shown) for sucking in foreign substances, brushes for sweeping (e.g., the brushes 154 and 155 of FIG. 3), a dust bin (not shown) for storing the foreign substances collected by the suction device or the brushes, and a suction port through which air is sucked (e.g., the suction port 150h of FIG. 4).
- the traveling unit 360 may move the robot cleaner 100.
- the driving unit 360 may include, for example, at least one driving wheel 136 for moving the robot cleaner 100 and at least one motor (not shown) for rotating the driving wheel.
- the sensor unit 370 may include a distance measuring sensor that measures a distance to an object outside the main body 110 .
- the distance measuring sensor may include the lidar sensor 175 described above.
- the robot cleaner 100 may generate a map by recognizing distances, positions, and directions of objects sensed by the lidar sensor 175.
- the robot cleaner 100 may obtain terrain information of a driving area by analyzing a laser reception pattern, such as a time difference or signal strength of a laser reflected from the outside and received.
- the robot cleaner 100 may generate a map using terrain information acquired through the lidar sensor 175 .
- the robot cleaner 100 may perform a lidar slam for determining a moving direction by analyzing surrounding terrain information acquired from a current location through the lidar sensor 175 .
- the robot cleaner 100 may effectively recognize obstacles through vision-based location recognition using a camera, lidar-based location recognition using a laser, and obstacle recognition using an ultrasonic sensor, and may generate a map by extracting an optimal movement direction with little change.
- the sensor unit 370 may include an obstacle detection sensor 171 for detecting an obstacle in front, a cliff detection sensor 172 for detecting the presence or absence of a cliff on the floor within a driving area, and the like.
- a plurality of obstacle detection sensors 171 may be disposed on the outer circumferential surface of the robot cleaner 100 at regular intervals.
- the obstacle detection sensor 171 may include an infrared sensor, an ultrasonic sensor, a radio frequency (RF) sensor, a geomagnetic sensor, a position sensitive device (PSD) sensor, and the like.
- the obstacle detection sensor 171 may be any sensor that detects a distance to an indoor wall or an obstacle; the present invention is not limited to a particular type, but an ultrasonic sensor is described below as an example.
- the obstacle detection sensor 171 may detect an object, particularly an obstacle, existing in the driving (moving) direction of the robot cleaner 100 and transmit obstacle information to the control unit 350. That is, the obstacle detection sensor 171 may detect protrusions, household items, furniture, walls, wall corners, and the like present on the moving path of the robot cleaner 100, in front of it, or to its side, and transmit the information to the control unit 350.
- the sensor unit 370 may further include a driving detection sensor (not shown) for detecting a driving motion of the robot cleaner 100 and outputting motion information.
- the driving detection sensor may include, for example, a gyro sensor, a wheel sensor, and an acceleration sensor.
- the gyro sensor may detect a rotation direction and a rotation angle when the robot cleaner 100 moves according to the driving mode.
- the gyro sensor may detect the angular velocity of the robot cleaner 100 and output a voltage value proportional to the angular velocity.
- the wheel sensor may be connected to the driving wheel 136, for example, the left wheel 136(L) and the right wheel 136(R) of FIG. 4 to detect the number of revolutions of the driving wheel 136.
- the acceleration sensor may detect a change in speed of the robot cleaner 100 .
- the acceleration sensor may be attached to a location adjacent to the driving wheel 136 or may be built into the control unit 350 .
- the output unit 380 may include an audio output unit 381 that outputs an audio signal.
- the audio output unit 381 may output, as sound, a warning sound and notification messages such as an operation mode, an operation state, an error state, information corresponding to a user's command input, and a processing result corresponding to a user's command input.
- the audio output unit 381 may convert an electrical signal from the control unit 150 into an audio signal and output the converted audio signal. To this end, a speaker or the like may be provided.
- the output unit 380 may include a display 382 that displays information corresponding to a user's command input, a processing result corresponding to the user's command input, an operation mode, an operation state, an error state, and the like as images.
- the display 382 may be configured as a touch screen by forming a mutual layer structure with the touch pad.
- the display 382 composed of a touch screen may be used as an input device capable of inputting information by a user's touch in addition to an output device.
- the communication unit 390 may include at least one communication module (not shown) and may transmit/receive data with an external device.
- the external terminal may include, for example, an application for controlling the robot cleaner 100; through execution of the application, it can display a map of the driving area to be cleaned by the robot cleaner 100 and designate a specific area on the map to be cleaned.
- the communication unit 390 may transmit and receive signals using a wireless communication method such as Wi-Fi, Bluetooth, beacon, zigbee, or radio frequency identification (RFID).
- the power supply unit may supply driving power and operating power to each component of the robot cleaner 100 .
- the robot cleaner 100 may further include a battery detecting unit (not shown) capable of detecting the remaining battery capacity and charging state of the battery 138 and transmitting the detection result to the controller 350 .
- the control unit 350 may be connected to each component provided in the robot cleaner 100.
- the control unit 350, for example, can transmit/receive signals to and from each component provided in the robot cleaner 100 and control the overall operation of each component.
- the control unit 350 may determine the state of the inside/outside of the robot cleaner 100 based on information obtained through the sensor unit 370 .
- the controller 350 may calculate the rotation direction and rotation angle using the voltage value output from the gyro sensor.
- the controller 350 may calculate the rotational speed of the driving wheel 136 based on the rotational speed output from the wheel sensor. In addition, the controller 350 may calculate the rotation angle based on the difference in rotation speeds between the left wheel 136 (L) and the right wheel 136 (R).
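The wheel-sensor computations above can be sketched as a differential-drive odometry step; this is an illustrative sketch only, and the wheel radius, wheel base, and encoder resolution below are hypothetical values not taken from this disclosure:

```python
import math

# Hypothetical parameters; actual values depend on the robot cleaner's hardware.
WHEEL_RADIUS = 0.035   # meters
WHEEL_BASE = 0.23      # distance between left wheel 136(L) and right wheel 136(R), meters
TICKS_PER_REV = 360    # wheel-sensor (encoder) ticks per wheel revolution

def odometry_step(left_ticks, right_ticks, dt):
    """Estimate linear speed and rotation angle from wheel-sensor ticks over interval dt."""
    # Convert tick counts to wheel travel distances.
    dist_l = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    dist_r = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    # Linear speed from the mean wheel travel; rotation angle from the difference.
    linear_speed = (dist_l + dist_r) / (2 * dt)
    rotation_angle = (dist_r - dist_l) / WHEEL_BASE  # radians, positive = counterclockwise
    return linear_speed, rotation_angle

speed, dtheta = odometry_step(left_ticks=100, right_ticks=120, dt=0.1)
```

When the right wheel turns faster than the left, the rotation angle comes out positive, matching the idea that the rotation angle follows from the difference in rotation speeds between the two wheels.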
- the controller 350 may determine a state change of the robot cleaner 100, such as starting, stopping, changing direction, or colliding with an object, based on a value output from an acceleration sensor. Meanwhile, the controller 350 may detect an amount of impact according to a change in speed based on a value output from the acceleration sensor, so the acceleration sensor may perform the function of an electronic bumper sensor.
- the controller 350 may detect the position of the obstacle based on at least two signals received through the ultrasonic sensor, and control the movement of the robot cleaner 100 according to the detected position of the obstacle.
- the obstacle detection sensor 131 provided on the outer surface of the robot cleaner 100 may include a transmitter and a receiver.
- the ultrasonic sensor may include at least one transmitter and at least two receivers arranged to cross each other. Accordingly, the transmitter may emit ultrasonic signals at various angles, and the two or more receivers may receive the ultrasonic signals reflected from obstacles at various angles.
- the signal received by the ultrasonic sensor may undergo a signal processing process such as amplification and filtering, and then the distance and direction to the obstacle may be calculated.
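As a sketch of how distance and direction might be computed from such processed ultrasonic signals, assuming a simple two-receiver geometry; the speed of sound, receiver baseline, and function names are illustrative assumptions, not details of the disclosure:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate at room temperature

def echo_distance(round_trip_s):
    """One-way distance to the obstacle from the echo round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def obstacle_position(d_left, d_right, baseline):
    """Locate an obstacle from ranges measured by two receivers separated by `baseline`.
    Intersects the two range circles; returns (x forward, y lateral) relative to
    the midpoint between the receivers, or None if the ranges are inconsistent."""
    b = baseline
    # Receivers at (0, -b/2) and (0, +b/2): subtracting the circle equations gives y.
    y = (d_left**2 - d_right**2) / (2 * b)
    x_sq = d_left**2 - (y + b / 2) ** 2
    if x_sq < 0:
        return None
    return math.sqrt(x_sq), y

d = echo_distance(0.002)  # a 2 ms round trip
x, y = obstacle_position(math.sqrt(1.0225), math.sqrt(1.0025), baseline=0.1)
```

With two or more receivers, the difference between the measured ranges is what encodes the obstacle's lateral direction, which is why at least two received signals are needed to determine position rather than distance alone.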
- the controller 350 may include a driving control module 351, a map generating module 352, a location recognition module 353, and/or an obstacle recognition module 354.
- the driving control module 351, the map generation module 352, the location recognition module 353, and/or the obstacle recognition module 354 are described separately, but the present invention is not limited thereto.
- the location recognition module 353 and the obstacle recognition module 354 may be integrated as one recognizer to form one recognition module 355.
- the recognizer is trained using a learning technique such as machine learning, and the learned recognizer can recognize properties of regions, objects, and the like by classifying input data thereafter.
- the map generation module 352, the location recognition module 353, and the obstacle recognition module 354 may be configured as one integrated module.
- the driving control module 351 may control driving of the robot cleaner 100 and may control driving of the driving unit 360 according to driving settings.
- the travel control module 351 may determine the travel path of the robot cleaner 100 based on the operation of the travel unit 360 .
- the driving control module 351 can determine the current or past moving speed, the distance traveled, and the like of the robot cleaner 100 based on the rotational speed of the driving wheel 136, and the location of the robot cleaner 100 on the map may be updated based on the driving information determined in this way.
- the map generation module 352 may generate a map for the driving area.
- the map generating module 352 may generate and/or update a map in real time based on acquired information while the robot cleaner 100 is driving.
- the map generation module 352 may set a plurality of movement directions. For example, when a function for generating a map for a driving area (hereinafter, a map generating function) is executed, the map generation module 352 may set the direction in which the front of the robot cleaner 100 faces at the time the function is executed as the first movement direction. In addition, at that time, the map generation module 352 may set the direction in which the left side of the robot cleaner 100 faces as the second movement direction, the direction in which the right side faces as the third movement direction, and the direction in which the rear surface faces, opposite to the first movement direction, as the fourth movement direction.
- the map generating module 352 may create a map based on information acquired through the lidar sensor 175 .
- the map generation module 352 may obtain topographical information of the driving area by analyzing a reception pattern such as a reception time difference and signal strength of a laser output through the lidar sensor 175 and received after being reflected from an external object.
- the terrain information of the driving area may include, for example, the location, distance, direction, etc. of objects existing around the robot cleaner 100 .
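A minimal sketch of how per-direction lidar ranges could be turned into the location/distance/direction terrain information described above, assuming one range reading per degree of the 360-degree scan; the function name and scan format are hypothetical:

```python
import math

def lidar_to_points(ranges, pose_x, pose_y, pose_theta):
    """Convert a 360-degree lidar scan (one range per degree) into world-frame points.
    Entries of None mark directions with no return (no object within sensing range)."""
    points = []
    for deg, r in enumerate(ranges):
        if r is None:
            continue
        # Direction of the return relative to the world frame.
        angle = pose_theta + math.radians(deg)
        points.append((pose_x + r * math.cos(angle), pose_y + r * math.sin(angle)))
    return points

scan = [None] * 360
scan[90] = 2.0  # a single return 2 m away at 90 degrees
points = lidar_to_points(scan, pose_x=0.0, pose_y=0.0, pose_theta=0.0)
```

Each returned point combines the distance and direction of a reflecting object with the robot's current pose, which is the raw material for the grid map described below.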
- the map generation module 352 may store information on a plurality of nodes while generating a grid map based on the terrain information of the driving area acquired through the lidar sensor 175, and such information may be used as first map data.
- the map generation module 352 may divide the driving area into a plurality of sub-areas and, while driving each sub-area, may create through the lidar sensor 175 a grid map in which the cell data differ for areas with and without obstacles.
- the location recognition module 353 may determine the location of the robot cleaner 100 .
- the location recognition module 353 may determine the location of the robot cleaner 100 while the robot cleaner 100 is driving.
- the location recognition module 353 may determine the location of the robot cleaner 100 based on the acquired image obtained through the image acquisition unit 320 .
- based on the map data generated by the map generation module 352, the location recognition module 353 may map the characteristics of each location of the driving area detected from the acquired image to each location on the map, and the data on the characteristics of each location mapped in this way may be stored in the storage unit 305 as location recognition data.
- the location recognition module 353 may compare the characteristics of the driving area detected from the acquired image with the characteristics of each location of the driving area included in the location recognition data stored in the storage unit 305, calculate a similarity (probability) for each location, and, based on the calculated similarity, determine the location with the greatest similarity as the location of the robot cleaner 100.
- the robot cleaner 100 may extract a feature from an image acquired through the image acquisition unit 320 and substitute the extracted feature into the mapped grid map, so that the location of the robot cleaner 100 can be determined.
- the robot cleaner 100 may also determine its current location by learning the map through the driving control module 351, the map generating module 352, and/or the obstacle recognition module 354, without the location recognition module 353.
- the obstacle recognition module 354 may detect obstacles around the robot cleaner 100 .
- the obstacle recognition module 354 may detect obstacles around the robot cleaner 100 based on an image acquired through the image acquisition unit 320 and/or sensing data acquired through the sensor unit 370.
- the obstacle recognition module 354 may detect obstacles around the robot cleaner 100 based on terrain information of a driving area acquired through the lidar sensor 175 .
- the obstacle recognition module 354 may determine whether or not an obstacle obstructing the driving of the robot cleaner 100 exists while the robot cleaner 100 is driving.
- the obstacle recognition module 354 may determine a driving pattern such as going straight or turning according to the property of the obstacle, and may transmit the determined driving pattern to the driving control module 351 .
- the robot cleaner 100 may perform machine learning-based human and object recognition and avoidance.
- machine learning may mean that a computer learns through data without a person directly instructing the computer with logic, and through this, the computer solves a problem on its own.
- Deep learning is based on artificial neural networks (ANNs) for constructing artificial intelligence and can mean an artificial intelligence technology that teaches computers to think in the way humans do.
- An artificial neural network (ANN) may be implemented in a software form or a hardware form such as a chip.
- the obstacle recognition module 354 may include an artificial neural network (ANN) in the form of software or hardware in which attributes of obstacles are learned.
- the obstacle recognition module 354 may include a deep neural network (DNN) such as a convolutional neural network (CNN), a recurrent neural network (RNN), and a deep belief network (DBN) learned through deep learning.
- the obstacle recognition module 354 may determine the attribute of an obstacle included in input image data based on, for example, weights between nodes included in a deep neural network (DNN).
- the driving control module 351 may control the driving of the driving unit 360 based on the attributes of the recognized obstacle.
- the storage unit 305 may store input data for obstacle attribute determination and data for learning the deep neural network (DNN).
- the storage unit 305 may store the original image obtained by the image acquisition unit 320 and the extracted images from which a predetermined region is extracted.
- the storage unit 305 may store weights and biases constituting a deep neural network (DNN) structure. For example, weights and biases constituting the deep neural network structure may be stored in an embedded memory of the obstacle recognition module 354 .
- the obstacle recognition module 354 may, for example, perform a learning process using the extracted image as training data whenever it extracts a partial region of an image acquired by the image acquisition unit 320, or may perform a learning process after a predetermined number of extracted images have been acquired.
- the obstacle recognition module 354 may update the deep neural network (DNN) structure, such as the weights, by adding a recognition result whenever an obstacle is recognized, or may update the structure by performing a learning process with the training data accumulated after a predetermined number of training data have been obtained.
- the robot cleaner 100 may transmit the original image or the extracted image obtained by the image acquisition unit 320 to a predetermined server through the communication unit 390 and receive data related to machine learning from the predetermined server. At this time, the robot cleaner 100 may update the obstacle recognition module 354 based on data related to machine learning received from a predetermined server.
- FIG. 7 is a diagram referenced for description of a LiDAR sensor provided in a robot cleaner according to an embodiment of the present invention.
- the lidar sensor 175 can output a laser in all directions over 360 degrees and, by receiving the laser reflected from objects, obtain information such as the distance, location, direction, and material of the object that reflected the laser, thereby acquiring topographical information of the driving area.
- the robot cleaner 100 may obtain terrain information within a certain distance according to the performance and settings of the lidar sensor 175 .
- the robot cleaner 100 may obtain terrain information within a circular area 610 whose radius is a predetermined distance based on the lidar sensor 175.
- the robot cleaner 100 obtains topographical information within the circular area 610, and at this time may obtain topographical information on a predetermined sub-region.
- the terrain information within the circular area 610 detected at each point where the robot cleaner 100 has traveled a predetermined distance or for a predetermined time may be stored, and at this time, the stored information may be matched with the driving nodes (n1, n2, ...) that are the current locations of the robot cleaner 100, i.e., the centers of the corresponding circular areas 610.
- the robot cleaner 100 may extract an open space from terrain information about a sub-region acquired through the lidar sensor 175 .
- the open space may refer to information about a space between objects reflecting a laser in which the robot cleaner 100 can run.
- after driving of the assigned sub-area is completed, the robot cleaner 100 continuously reads the sensing data of the lidar sensor 175 for each driving node (n1, n2, ...) in the sub-area (ray casting) to perform an open space detection algorithm that checks whether there is an open space.
- the robot cleaner 100 may extract an open space for the sub-area and may set an additional open node, that is, a point where the robot cleaner 100 may travel, with respect to the corresponding open space.
- the robot cleaner 100 may generate a topology map for the driving nodes n1, n2... and open nodes of the robot cleaner 100 in one sub-region.
- the information on the open space may include information about the driving nodes (n1, n2, ...) and driving direction of the robot cleaner 100 that found the corresponding open space, as well as the width of the open space; the open node may include information about the corresponding open space and about the driving nodes (n1, n2, ...) of the robot cleaner 100 that found it.
- the robot cleaner 100 may move to an open node of the topology map to set additional sub-regions, and complete the topology maps of a plurality of sub-regions for the entire driving region while driving until there are no more open nodes.
- the robot cleaner 100 may detect the center point of the open space, set it as an open node, and drive along the open node in order to move along the center of the moving passage.
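The center-point rule above can be sketched as follows; the gap endpoints, the robot width value, and the function name are illustrative assumptions, not values from the disclosure:

```python
import math

ROBOT_WIDTH = 0.35  # hypothetical width of the robot cleaner, in meters

def open_node_from_gap(p_left, p_right, robot_width=ROBOT_WIDTH):
    """Given the two lidar endpoints bounding a candidate gap, return its midpoint
    as an open node when the gap is wide enough for the robot to pass, else None."""
    gap_width = math.dist(p_left, p_right)
    if gap_width <= robot_width:
        return None  # too narrow: not an open space
    # Center of the passage, so the robot drives along the middle of the corridor.
    return ((p_left[0] + p_right[0]) / 2.0, (p_left[1] + p_right[1]) / 2.0)

node = open_node_from_gap((1.0, -0.5), (1.0, 0.5))
```

Placing the open node at the gap midpoint is what lets the robot cleaner drive along the center of the moving passage rather than hugging one of the bounding obstacles.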
- FIG. 8 is a flow chart of a map creation process of a robot cleaner according to an embodiment of the present invention
- FIGS. 9A to 9E are diagrams showing a map creation process of the robot cleaner according to the flowchart of FIG. 8
- FIG. 10 is It is a diagram showing a part of the final map of the robot cleaner according to the flowchart of FIG. 8 .
- the robot cleaner 100 may generate a grid map while moving.
- the robot cleaner 100 may divide the driving area into a plurality of sub areas.
- the divided sub-region may be divided based on the driving distance or driving area, and may be an arbitrary region for forming a topology map of the moved region after moving for a predetermined distance or for a predetermined time.
- when the robot cleaner 100 travels for a predetermined distance or for a predetermined time in an unknown driving area to be cleaned, rather than an area divided in advance, the area traveled over that distance or time can be defined as one sub-region.
- the sub-region may be defined from the current position of the robot cleaner 100 until an obstacle is found in front, and may be set in various ways according to user settings.
- while the robot cleaner 100 pre-drives to map an unknown driving area for cleaning (S100), the robot cleaner 100 collects lidar information of divided sub-regions through the lidar sensor 175 to generate a grid map, detects open space in the grid map, and then creates a topology graph for each sub-region.
- the robot cleaner 100 collects lidar information of the corresponding area through the lidar sensor 175 while traveling in the sub-area (S101), generates a grid map of the sub-area from the collected lidar information (S102), detects open space at a plurality of points by performing ray casting with the lidar information along the path on the grid map (S103), generates an open node for each detected open space to build a topology graph (S104), moves from the current position to the next open node and examines whether a new open node occurs (S105), and, depending on whether other open nodes remain in the topology graph, proceeds to a step of searching a different sub-region (S106).
- the control unit 350 of the robot cleaner 100 performs pre-driving for generating a cleaning map for a predetermined driving area according to an instruction received from the outside or a preset instruction (S100).
- a point where driving is completed for a predetermined distance or for a predetermined time may be defined as a sub-region, and driving continues in the same direction D1 until driving to the corresponding sub-region is completed.
- the robot cleaner 100 may obtain lidar information about the circular area 610 of 360 degrees from the lidar sensor 175 as shown in FIG. 9a (S101).
- the collected LiDAR information may be collected from each of a plurality of driving nodes n1, n2, etc. located on the route.
- the driving nodes (n1, n2, ...) may be set at equal intervals on the path, but are not limited thereto, and may be defined as points where the robot cleaner 100 is located at predetermined time intervals.
- a virtual line connecting the driving nodes (n1, n2, ...) may function as a driving path (DP), but is not limited thereto.
- the number of driving nodes (n1, n2, ...) set in one sub-region may vary.
- the robot cleaner 100 collects LIDAR information.
- Each piece of lidar information may be recorded in the storage unit 305 by being matched with the corresponding driving node (n1, n2, ...).
- Such lidar information may be information about distances to obstacles in the circular area 610, and may include information about the distance, location, direction, material, and the like of the objects that reflect the output laser.
- the grid map, as an occupancy grid map (OGM), is a map using a grid generated from the acquired local information.
- the grid map is a map for recognizing the surrounding environment of the robot cleaner 100, and may refer to a map in which the sub-area A is expressed as a grid of cells of the same size (hereinafter referred to as cells) and the presence or absence of objects is displayed in each cell.
- the presence or absence of an object may be displayed through color.
- white cells may indicate an area without an object
- gray cells may indicate an area with an object.
- a line connecting gray cells may represent a boundary line (wall, obstacle, etc.) of a certain space.
- the color of a cell can be changed through image processing.
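A minimal occupancy-grid sketch consistent with the cell description above, using free/occupied/unknown states in place of the white/gray coloring; the class name, resolution, and coordinate convention are assumptions for illustration:

```python
FREE, OCCUPIED, UNKNOWN = 0, 1, -1  # white cell, gray cell, unexplored cell

class GridMap:
    """Minimal occupancy grid: square cells of `resolution` meters covering a sub-area."""
    def __init__(self, width_cells, height_cells, resolution=0.05):
        self.resolution = resolution
        self.cells = [[UNKNOWN] * width_cells for _ in range(height_cells)]

    def _index(self, x, y):
        # World coordinates (meters, non-negative here) -> cell indices.
        return int(y / self.resolution), int(x / self.resolution)

    def mark(self, x, y, state):
        row, col = self._index(x, y)
        self.cells[row][col] = state

    def is_free(self, x, y):
        row, col = self._index(x, y)
        return self.cells[row][col] == FREE

grid_map = GridMap(100, 100)
grid_map.mark(1.0, 2.0, FREE)      # lidar saw through this point: no object
grid_map.mark(1.5, 2.0, OCCUPIED)  # lidar return: object boundary here
```

Cells start out unknown and are overwritten as lidar information arrives, mirroring how the grid map is sequentially updated as driving progresses.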
- the controller 350 may create a grid map while driving in the unknown sub-region A. Alternatively, the controller 350 may generate the grid map of the sub-area A after driving in the sub-area A is completed.
- the grid map generating step (S102) can be set to generate a grid map while driving in the sub area (A).
- the control unit 350 can create a grid map based on the lidar information of the lidar sensor 175 received at each predetermined point, that is, at each driving node (n1, n2, ...). That is, the control unit 350 may generate the grid map within the sensing range of the lidar sensor 175 while a node is created and the robot moves to the created node.
- the sensing range of the lidar sensor 175 may deviate from the sub-region A.
- Since the controller 350 generates the grid map based on lidar information received at each driving node (n1, n2, ...), the grid map can be sequentially updated as driving progresses. Accordingly, when driving of the corresponding sub-region A is finally completed, the grid map is updated for the last time and the final grid map may be stored in the storage unit.
- the final grid map stored in this way includes the grid map of the corresponding sub-area as shown in FIG. 9B, but grid maps of other areas may also be shown.
- the controller 350 detects the open space of the sub-region A by ray-casting the lidar information for the corresponding sub-region A (S103).
- ray casting (RS) is performed on the lidar information of the circular area 610 obtained at each driving node (n1, n2, ...) over the entire section, thereby determining whether there is an open space, that is, a space free of obstacles, at the corresponding driving node (n1, n2, ...).
- the presence of an open space may be defined as the case where the distance between the lidar data points obtained by reflection is greater than the width of the robot cleaner 100, but is not limited thereto; a reference width may be set, and any space wider than that reference width may be treated as an open space.
- by performing 360-degree ray casting on the lidar information received at each of the plurality of driving nodes (n1, n2, ...) on the route while driving in the sub-area A, the open space can be searched so that no direction is missed.
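The 360-degree ray casting of step S103 might be sketched over an occupancy grid as follows; the grid layout, step size, and the choice to treat unmapped cells as open (an unexplored frontier) are illustrative assumptions:

```python
import math

def cast_ray(cells, resolution, x0, y0, angle_deg, max_range):
    """March along one ray over an occupancy grid (rows of 0 = free, 1 = occupied);
    return the distance to the first occupied cell, or max_range if none is hit.
    Leaving the mapped area is treated as open (an unexplored frontier)."""
    dx = math.cos(math.radians(angle_deg))
    dy = math.sin(math.radians(angle_deg))
    d, step = 0.0, resolution / 2.0
    while d < max_range:
        col = math.floor((x0 + d * dx) / resolution)
        row = math.floor((y0 + d * dy) / resolution)
        if not (0 <= row < len(cells) and 0 <= col < len(cells[0])):
            break  # ray left the mapped sub-area
        if cells[row][col] == 1:
            return d  # hit an obstacle
        d += step
    return max_range

def open_directions(cells, resolution, x0, y0, max_range):
    """360-degree sweep, one ray per degree: directions whose free distance reaches
    max_range are candidate open-space directions (cf. step S103)."""
    return [a for a in range(360)
            if cast_ray(cells, resolution, x0, y0, a, max_range) >= max_range]

# A 1 m x 1 m sub-area at 0.05 m resolution with a wall at x = 0.5 m.
grid = [[0] * 20 for _ in range(20)]
for row in grid:
    row[10] = 1
dirs = open_directions(grid, 0.05, 0.1, 0.5, max_range=1.0)
```

Running the sweep at every driving node and intersecting the results is what ensures no open direction is missed across the sub-area.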
- the controller 350 determines whether each open space overlaps with the open spaces of other driving nodes (n1, n2, ...), and sets an open node for the overlapped final open space.
- Such an open node may be set as a point located in the center of the final open space.
- the controller 350 may determine the open space by applying various algorithms and set an open node accordingly, which will be described in detail later.
- controller 350 generates a topology graph for the sub-region A (S104).
- the topology graph can be created by connecting the open nodes set for the open spaces with the corresponding driving nodes (n1, n2, ...).
- the open nodes n21, n22, n23, and n24 set for the sub-region A are displayed as a graph following the driving node n2 where the open node is found.
- a plurality of open nodes n21, n22, n23, and n24 may in fact exist for the open spaces obtained by performing 360-degree ray casting at one driving node n2.
- one open node n21 may be connected to a plurality of driving nodes n1 and n2.
- the topology graph formed in this way may be illustrated as shown in FIG. 9E, for example, where the unnamed blocks are defined as obstacles.
- the driving nodes (n1, n2, ...) and the open nodes (n21, n22, n23, n24) may be matched in a parent-children relationship.
- while each open node is changed to a driving node, that is, to a closed node, the topology graph for the corresponding sub-region A can be completed (S105).
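The parent-children bookkeeping and the open-to-closed node transition described above can be sketched as a small graph structure; the class and node names are illustrative, not the disclosure's implementation:

```python
class TopologyGraph:
    """Sketch of the topology graph: driving nodes and open nodes linked in a
    parent-children relationship; node names below are illustrative only."""
    def __init__(self):
        self.children = {}     # node -> list of child nodes
        self.open_nodes = set()

    def add_driving_node(self, node, parent=None):
        self.children.setdefault(node, [])
        if parent is not None:
            self.children.setdefault(parent, []).append(node)

    def add_open_node(self, node, parent):
        # An open node hangs off the driving node at which it was found.
        self.add_driving_node(node, parent)
        self.open_nodes.add(node)

    def close(self, node):
        # Once driven to, an open node becomes a driving (closed) node.
        self.open_nodes.discard(node)

    def next_open_node(self):
        # Exploration continues until no open nodes remain.
        return next(iter(self.open_nodes), None)

graph = TopologyGraph()
graph.add_driving_node("n1")
graph.add_driving_node("n2", parent="n1")
for child in ("n21", "n22", "n23", "n24"):
    graph.add_open_node(child, parent="n2")
graph.close("n21")  # the robot drove to n21, so it is now a closed node
```

Driving continues by repeatedly taking `next_open_node` and closing it on arrival, which matches the loop of moving to an open node and checking whether further open nodes remain.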
- when creating a topology graph while setting a new sub-area at each open node, the topology graph can be created while determining whether it overlaps with the sub-area A.
- a cleaning map for the entire driving region may be created by collecting the topology graphs of the sub-areas.
- generating a topology graph for each sub-area and then collecting them increases the accuracy of the cleaning map for the entire cleaning area.
- a cleaning map for the entire area can be created without missing open space.
- the topology graph according to the present invention may have values as shown in FIG. 10 .
- the topology graph for each sub-region may connect each driving node (n1, n2, n3…) and at least one corresponding open node (n31…) with a straight line, and information about the width of the open space can be recorded for each open node.
- the topology graph may be stored in the storage unit 305 by matching the width of the open space.
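As the passage above describes, each open node carries the width of its open space and hangs off the driving node where it was found, in a parent-child relation. A minimal data-structure sketch, with names that are illustrative rather than taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class OpenNode:
    node_id: str
    position: tuple       # (x, y) centre of the open space
    width: float          # open-space width, stored together with the node
    closed: bool = False  # set once the node is visited and becomes a driving node

@dataclass
class TopologyGraph:
    # driving-node id -> child open nodes (parent-children relation)
    children: dict = field(default_factory=dict)

    def add(self, driving_node_id, open_node):
        self.children.setdefault(driving_node_id, []).append(open_node)

    def remaining_open_nodes(self):
        return [n for ns in self.children.values() for n in ns if not n.closed]

    def close(self, node_id):
        for ns in self.children.values():
            for n in ns:
                if n.node_id == node_id:
                    n.closed = True
```

One open node may also be attached under several driving nodes, matching the observation that a node such as n21 can be connected to both n1 and n2.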
- controller 350 of the robot cleaner 100 may perform various algorithms to search for an open space.
- FIG. 11 is a flow chart for searching for an open node in FIG. 8,
- FIGS. 12A and 12B are diagrams illustrating a non-stepped open space of FIG. 11, and
- FIGS. 13A and 13B are diagrams illustrating a stepped open space of FIG. 11, while FIGS. 14A and 14B are views showing another embodiment of the stepped open space of FIG. 11.
- the search for an open space starts by executing an open space detection algorithm with respect to the obtained grid map of the corresponding sub-region A (S201).
- ray casting is performed with LIDAR information received from each driving node (n1, n2…) along a given path (S202).
- FIGS. 12A and 12B are diagrams showing the determination of the control unit 350 in the case of a non-stepped open space.
- this means that, with respect to the open space between the obstacles 301 and 302, one obstacle 301 and another obstacle 302 are located at the same distance from the current position n1 of the robot cleaner 100. Accordingly, no step occurs between the obstacle 301 and the obstacle 302.
- in other words, the distances from the first driving node n1, which is the current position of the robot cleaner 100, to the obstacle 301 and to the obstacle 302 constituting the open space are equal to each other.
- the width w1 of the non-stepped open space is calculated as shown in FIG. 12A (S205).
- the non-stepped open space is defined as a movable potential open space.
- the central area of the width of the open space is set as an open node n11 for the open space (S206).
- an additional search may be performed at the open node n11 as shown in FIG. 12B (S207).
- the controller 350 stores information about the searched open space and the set open node n11, and matches and stores the width w1 of the open space and the travel node n1 where it is found.
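The distinction between non-stepped and stepped open spaces, and the midpoint placement of the open node in the non-stepped case, can be sketched as below. The function name, tolerance parameter, and edge-point representation are assumptions for illustration.

```python
import math

def classify_gap(robot_pos, edge_a, edge_b, reference_width, tol=0.05):
    """Classify the gap between two obstacle edge points seen from the
    current driving node.  Returns None when the gap is narrower than
    `reference_width`; otherwise ('non-stepped' | 'stepped', width, node),
    where node is the gap midpoint used as the open node in the
    non-stepped case."""
    width = math.dist(edge_a, edge_b)
    if width < reference_width:
        return None  # too narrow for the robot to pass
    d_a = math.dist(robot_pos, edge_a)
    d_b = math.dist(robot_pos, edge_b)
    # Equal distances to both edges means no step between the obstacles.
    kind = 'non-stepped' if abs(d_a - d_b) <= tol else 'stepped'
    mid = ((edge_a[0] + edge_b[0]) / 2.0, (edge_a[1] + edge_b[1]) / 2.0)
    return kind, width, mid
```

A real lidar scan would supply the edge points as the endpoints of a range discontinuity; here they are given directly.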
- when the open space is not a non-stepped open space, that is, when the distances from the robot cleaner 100 to one obstacle 301 and to another obstacle 303 constituting the open space differ from each other, the corresponding open space is defined as a stepped open space.
- one obstacle 301 may protrude from the current driving node n2 relative to the other obstacle 303 .
- the stepped open space can arise in two cases, as shown in FIGS. 13 and 14.
- FIGS. 13A and 13B show an open space that occurs rearward with respect to the traveling direction, caused by a sudden widening of the traveling passage, while FIGS. 14A and 14B show an open space that occurs forward with respect to the traveling direction, caused by a sudden narrowing of the traveling passage.
- the controller 350 compares the width w2 of the step with the width of the robot cleaner 100 or the reference width (S208).
- the corresponding open space may be defined as a potential open space.
- the controller 350 may set the open node n22 in the central area of the open space (S209).
- unlike the earlier open node for the non-stepped open space, the open node n21 is an extension node located at the intersection of the center line CL of the width of the open space and the vertical line VL drawn from the current driving node n2.
- at the set open node n21, the robot cleaner 100 may perform ray casting on the lidar information in the left direction, that is, in the direction from the open node n21 toward the open space, as shown in FIG. 13B (S210).
- the ray casting workload can be reduced by casting rays only over the half of the plane facing the open space.
- after registering the set open node n22 in the topology graph, the robot cleaner 100 may perform ray casting on the lidar information in the right direction, that is, in the direction from the open node n22 toward the open space.
- again, the ray casting workload can be reduced by casting rays only over the half of the plane facing the open space.
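One reading of the stepped-node geometry described above (the intersection of the center line CL with the vertical line VL from the driving node) is the foot of the perpendicular dropped from the driving node onto the line through the two gap edge points. This is an interpretation of FIG. 13A, not a verbatim transcription of the patent's construction:

```python
def stepped_open_node(driving_node, gap_a, gap_b):
    """Foot of the perpendicular from the current driving node onto the
    line through the two gap edge points; used here as the stepped open
    node.  Points are (x, y) tuples."""
    (x1, y1), (x2, y2) = gap_a, gap_b
    px, py = driving_node
    dx, dy = x2 - x1, y2 - y1
    # Parametric projection of the driving node onto the gap line.
    t = ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)
    return (x1 + t * dx, y1 + t * dy)
```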
- when searching for open nodes, the control unit 350 may perform duplicate removal on the open nodes found across a plurality of driving nodes (n1, n2…).
- FIG. 15 is a diagram showing another flow chart for searching for an open node of FIG. 8,
- FIG. 16 is a diagram showing the search operation of the robot cleaner 100 according to FIG. 15, and
- FIG. 17 is a drawing for explaining this process.
- 15 to 17 show another example of a method of setting open spaces and open nodes from a grid map.
- an open space detection algorithm is executed on the obtained grid map of the corresponding sub-region A (S301).
- ray casting is performed by searching the LIDAR information received from each traveling node (n1, n2…) along the given path (S302).
- for example, if an open space is not found by performing ray casting at the first travel node (n1), ray casting is performed again on the LIDAR information obtained at the second travel node (n2), reached after moving a predetermined distance.
- as the robot cleaner 100 continues to travel through the nodes (n0, n1, n2), open spaces that were not found at the 0th travel node (n0) of FIG. 16A may be found at the first and second travel nodes (n1, n2); conversely, such an open space may again pass undetected by the time the third travel node (n3) is reached.
- an open node may be set for an open space B parallel to the passage.
- when the open space (D), parallel to the passage (DP), behind the third open space (C) is searched, it is determined that the fourth open space (D) exists behind the third open space (C).
- the control unit 350 sets the central point of the fourth open space (D), behind the third open space (C), as the first open node (n11) for the first travel node (n1) (S303).
- the control unit 350 also performs ray casting at the second travel node (n2), which is different from the first travel node (n1) and lies further along the travel direction, determines whether the same third open space (C) is searched, and then determines whether the fourth open space (D) is also searched.
- the second open node (n21) is then set in the central area of the fourth open space (D) based on the second travel node (n2) (S304).
- the first open node (n11) and the second open node (n21) may have different absolute positions.
- the control unit 350 compares the width (W3) of the third open space (C) and the width (W4) of the fourth open space (D) with each other, and if they are within a similar range, a predetermined range is set around each of the first open node (n11) and the second open node (n21).
- the predetermined range of the first open node (n11) may be defined as a circle centered on the first open node (n11) whose diameter is the width (W3) of the third open space (C), and the predetermined range of the second open node (n21) may be defined as a circle of the same diameter centered on the second open node (n21).
- the area of the region (E) where the predetermined range of the first open node (n11) and the predetermined range of the second open node (n21) overlap is then calculated (S305).
- the controller 350 determines that the first open node n11 and the second open node n21 are the same when the area of the overlapping region E is greater than or equal to the threshold value.
- the threshold value may be set to 1/2 or 2/3 of the area of the predetermined range for each open node, but is not limited thereto and may be set in various ways.
- when the controller 350 determines that the first open node (n11) and the second open node (n21) are the same, the open node closer to the center of the width (W4) of the fourth open space (D) is maintained, and the other open node is deleted (S306).
- the first open node n11 remains and the second open node n21 is deleted.
- overlapping open nodes can thus be removed by comparing ray casting results at different driving nodes (n1, n2…), even for spaces of the grid map that do not correspond to the non-stepped or stepped open space.
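The equal-radius circle-overlap test described above can be sketched directly. The function names and the choice of comparing against one circle's area are illustrative assumptions; the 1/2 ratio follows the threshold mentioned in the text.

```python
import math

def circle_overlap_area(c1, c2, r):
    """Area of the intersection (lens) of two circles of equal radius r."""
    d = math.dist(c1, c2)
    if d >= 2 * r:
        return 0.0
    if d == 0.0:
        return math.pi * r * r
    # Standard lens-area formula for two equal circles.
    return 2 * r * r * math.acos(d / (2 * r)) - 0.5 * d * math.sqrt(4 * r * r - d * d)

def same_open_node(n1, n2, gap_width, ratio=0.5):
    """Treat two open nodes as duplicates when circles of diameter
    `gap_width` centred on each overlap by at least `ratio` of a single
    circle's area."""
    r = gap_width / 2.0
    return circle_overlap_area(n1, n2, r) >= ratio * math.pi * r * r
```

When `same_open_node` fires, the node nearer the gap's center would be kept and the other deleted, as in step S306.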
- the controller 350 may form topology graphs for each of the open spaces and open nodes, and may complete the topology graph for the entire driving area by connecting topology graphs for a plurality of sub-regions.
- Such a topology graph may function as a basic map forming a cleaning map.
- the controller 350 can check the relationship between nodes by referring to the grid map and the topology graph, and can set an optimal path.
- control unit 350 can fuse image data to a corresponding grid map and convert it into an image.
- the robot cleaner 100 may set an optimal path based on the grid map and the topology graph imaged at the current location of the robot cleaner 100 .
- the method for generating the topology graph formed in this way has the following effects.
- FIG. 18A shows an open space search according to a comparative example
- FIG. 18B shows an open space search according to the present invention
- FIG. 19A shows a plurality of open space searches according to a comparative example
- FIG. 19B shows an open space search according to the present invention,
- FIG. 20A shows an unsearched area according to a comparative example, and
- FIG. 20B shows an open space searched for according to the present invention.
- FIGS. 18A, 19A, and 20A illustrate a control method that discovers an open space and generates an open node based on a real-time sensing result, while FIGS. 18B, 19B, and 20B illustrate the present method, in which a grid map is acquired while driving in a predetermined sub-area and ray casting is performed on the sub-area to set open spaces and open nodes.
- unlike the discovery by real-time sensing in the comparative example of FIG. 19A, the search for open spaces through ray casting on the grid map according to the embodiment of the present invention of FIG. 19B determines whether a plurality of open nodes overlap, so that duplicate open nodes are removed.
- in the comparative example of FIG. 20A, an open space may be passed without being discovered, whereas in the present invention, as shown in FIG. 20B, an additional open space formed behind another open space is searched without omission and an open node is set for it, so that no open space is missed.
- the robot cleaner 100 according to the present invention is not limited to the configuration and method of the embodiments described above; all or part of each embodiment may be selectively combined so that various modifications can be made.
- control method of the robot cleaner 100 can be implemented as a processor-readable code on a processor-readable recording medium.
- the processor-readable recording medium includes all types of recording devices in which data readable by the processor is stored. Also, it includes those implemented in the form of a carrier wave such as transmission through the Internet.
- the processor-readable recording medium is distributed in computer systems connected through a network, so that the processor-readable code can be stored and executed in a distributed manner.
- Image acquisition unit 320
Claims (20)
- 1. A robot cleaner comprising: a traveling unit configured to move a main body within a driving area; a distance measuring sensor configured to obtain distance detection information about a distance to an object outside the main body; and a controller configured to generate a grid map of the driving area from the distance detection information, to divide the driving area into a plurality of sub-areas, to search for open spaces by performing ray casting at a plurality of driving nodes on a route of the grid map for each sub-area, to set open nodes for the open spaces, and to calculate a topology graph between the driving nodes and the open nodes.
- 2. The robot cleaner of claim 1, wherein the distance measuring sensor comprises a lidar sensor configured to irradiate light onto an object outside the main body and to calculate the distance detection information from the reflected light.
- 3. The robot cleaner of claim 2, wherein each sub-area corresponds to the area covered when the robot cleaner travels in the driving area for a predetermined time or over a predetermined distance.
- 4. The robot cleaner of claim 3, wherein the controller executes the ray casting at each of the plurality of driving nodes within the grid map for each sub-area to search for open spaces, and sets the open node for each of the open spaces.
- 5. The robot cleaner of claim 4, wherein the controller sets the open node in a central region of the width of the open space.
- 6. The robot cleaner of claim 5, wherein, when the open space is formed between two obstacles spaced apart without a step, the controller sets the open node in the central region of the separation distance between the two obstacles.
- 7. The robot cleaner of claim 5, wherein, when the open space is formed between two obstacles spaced apart with a step, the controller sets the open node at the intersection of the center line between the two obstacles and a vertical line from the driving node.
- 8. The robot cleaner of claim 5, wherein, when the topology graph for the sub-area has been generated, the controller moves to one of the open nodes of the last driving node, changes the moved-to open node into a closed node, and then moves to an open node remaining in the topology graph to partition another sub-area.
- 9. The robot cleaner of claim 8, wherein the controller moves to the open node having the largest width among the plurality of open nodes of the last driving node.
- 10. The robot cleaner of claim 9, wherein the controller determines, as the open space, a space through which the robot cleaner can travel according to the ray casting at the driving node and which the robot cleaner has not yet traveled.
- 11. The robot cleaner of claim 10, wherein the controller performs the ray casting through 360 degrees around the driving node.
- 12. The robot cleaner of claim 5, wherein a first open node is set for one open space found from a first driving node and a second open node is set for another open space found from a second driving node different from the first driving node, and wherein the controller deletes one of the first and second open nodes when the first open node and the second open node are determined to overlap.
- 13. The robot cleaner of claim 12, wherein the controller determines whether the first open node and the second open node overlap when the widths of the one open space and the other open space overlap by a predetermined range or more.
- 14. The robot cleaner of claim 13, wherein the controller generates circles of the same diameter centered on the first open node and the second open node, and determines that the first open node and the second open node correspond to the same open node when the circles overlap by a predetermined range or more.
- 15. The robot cleaner of claim 14, wherein, of the first open node and the second open node, the one closer to the central region of the open area in which the first and second open nodes are located is kept, and the other is deleted.
- 16. The robot cleaner of claim 13, wherein the controller calculates a final topology graph for the driving area by connecting the topology graphs for the plurality of sub-areas to one another.
- 17. The robot cleaner of claim 1, wherein the controller calculates a final map by image-processing the grid map.
- 18. A method for controlling a robot cleaner, comprising: obtaining distance detection information about a distance to an object outside a main body while moving in an unknown driving area; generating a grid map of the driving area from the distance detection information; dividing the driving area into a plurality of sub-areas and searching for open spaces by performing ray casting at a plurality of driving nodes on a route of the grid map for each sub-area; setting open nodes for the open spaces and generating a topology graph between the driving nodes and the open nodes for each sub-area; and generating a final topology graph for the driving area by connecting the topology graphs for the plurality of sub-areas to one another.
- 19. The method of claim 18, wherein generating the topology graph comprises: when the open space is formed between two obstacles spaced apart without a step, setting the open node in the central region of the separation distance between the two obstacles; and when the open space is formed between two obstacles spaced apart with a step, setting the open node at the intersection of the center line between the two obstacles and a vertical line from the driving node.
- 20. The method of claim 19, wherein generating the topology graph comprises: setting a first open node for one open space found from a first driving node and setting a second open node for another open space found from a second driving node different from the first driving node; determining whether the first open node and the second open node overlap when the widths of the one open space and the other open space overlap by a predetermined range or more; generating circles of the same diameter centered on the first open node and the second open node, and determining that the first open node and the second open node correspond to the same open node when the circles overlap by a predetermined range or more; and keeping, of the first open node and the second open node, the one closer to the central region of the open area in which the first and second open nodes are located, and deleting the other.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22846119.0A EP4374763A1 (en) | 2021-07-22 | 2022-07-12 | Robotic cleaner and method for controlling robotic cleaner |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020210096541A KR20230015148A (en) | 2021-07-22 | 2021-07-22 | A robot cleaner and control method thereof |
KR10-2021-0096541 | 2021-07-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023003249A1 true WO2023003249A1 (en) | 2023-01-26 |
Family
ID=84979378
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2022/010141 WO2023003249A1 (en) | 2021-07-22 | 2022-07-12 | Robotic cleaner and method for controlling robotic cleaner |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4374763A1 (en) |
KR (1) | KR20230015148A (en) |
WO (1) | WO2023003249A1 (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100031878A (en) | 2008-09-16 | 2010-03-25 | 삼성전자주식회사 | Apparatus and method for building map used in mobile robot |
KR20170053351A (en) * | 2015-11-06 | 2017-05-16 | 삼성전자주식회사 | Cleaning robot and controlling method thereof |
KR20190091508A (en) * | 2016-12-29 | 2019-08-06 | 아미크로 세미컨덕터 씨오., 엘티디. | Path planning method of intelligent robot |
KR20190098734A (en) * | 2019-07-12 | 2019-08-22 | 엘지전자 주식회사 | Artificial intelligence robot for determining cleaning route using sensor data and method for the same |
JP2019536601A (en) * | 2016-11-23 | 2019-12-19 | クリーンフィックス・ライニグングスシステーメ・アーゲーCleanfix Reinigungssysteme Ag | Floor treatment machine and method for treating floor surface |
JP2020004342A (en) * | 2018-07-02 | 2020-01-09 | 株式会社Soken | Mobile body controller |
KR20210009011A (en) | 2019-07-16 | 2021-01-26 | 엘지전자 주식회사 | Moving robot and control method thereof |
- 2021-07-22: KR application KR1020210096541 → publication KR20230015148A (status unknown)
- 2022-07-12: EP application EP22846119.0A → publication EP4374763A1 (active, pending)
- 2022-07-12: WO application PCT/KR2022/010141 → publication WO2023003249A1 (active, application filing)
Also Published As
Publication number | Publication date |
---|---|
KR20230015148A (en) | 2023-01-31 |
EP4374763A1 (en) | 2024-05-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021010757A1 (en) | Mobile robot and control method thereof | |
AU2020247141B2 (en) | Mobile robot and method of controlling the same | |
WO2018139796A1 (en) | Mobile robot and method for controlling same | |
WO2021006677A2 (en) | Mobile robot using artificial intelligence and controlling method thereof | |
AU2019334724B2 (en) | Plurality of autonomous mobile robots and controlling method for the same | |
WO2021006556A1 (en) | Moving robot and control method thereof | |
AU2019262468B2 (en) | A plurality of robot cleaner and a controlling method for the same | |
AU2019262467B2 (en) | A plurality of robot cleaner and a controlling method for the same | |
WO2021006542A1 (en) | Mobile robot using artificial intelligence and controlling method thereof | |
WO2019083291A1 (en) | Artificial intelligence moving robot which learns obstacles, and control method therefor | |
WO2018155999A2 (en) | Moving robot and control method thereof | |
WO2019212239A1 (en) | A plurality of robot cleaner and a controlling method for the same | |
WO2020050489A1 (en) | A robot cleaner and a controlling method for the same | |
WO2019212240A1 (en) | A plurality of robot cleaner and a controlling method for the same | |
WO2019017521A1 (en) | Cleaner and control method thereof | |
WO2016028021A1 (en) | Cleaning robot and control method therefor | |
WO2020050566A1 (en) | Plurality of autonomous mobile robots and controlling method for the same | |
WO2018101631A2 (en) | Robotic vacuum cleaner, cleaning function control apparatus equipped in robotic vacuum cleaner, and multi-channel lidar-based obstacle detection apparatus equipped in robotic vacuum cleaner | |
WO2018117616A1 (en) | Mobile robot | |
WO2020122540A1 (en) | Robot cleaner and method for operating same | |
WO2021006553A1 (en) | Moving robot and control method thereof | |
WO2019004773A1 (en) | Mobile terminal and robot system including same | |
WO2023003249A1 (en) | Robotic cleaner and method for controlling robotic cleaner | |
WO2021125717A1 (en) | Mobile robot and control method therefor | |
WO2021225234A1 (en) | Robot cleaner and method for controlling the same |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22846119; Country: EP; Kind code: A1
| ENP | Entry into the national phase | Ref document number: 2024503620; Country: JP; Kind code: A
| WWE | Wipo information: entry into national phase | Ref document number: 2022846119; Country: EP
| NENP | Non-entry into the national phase | Ref country code: DE
| ENP | Entry into the national phase | Ref document number: 2022846119; Country: EP; Effective date: 20240222