WO2022075610A1 - Mobile robot system (Système de robot mobile) - Google Patents

Mobile robot system

Info

Publication number
WO2022075610A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
charging
driving mode
capacity value
battery
Prior art date
Application number
PCT/KR2021/012300
Other languages
English (en)
Korean (ko)
Inventor
곽동훈
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Publication of WO2022075610A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1682Dual arm manipulator; Coordination of several manipulators
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2805Parameters or conditions being sensed
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2836Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means characterised by the parts which are controlled
    • A47L9/2852Elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2868Arrangements for power supply of vacuum cleaners or the accessories thereof
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2894Details related to signal transmission in suction cleaners
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/006Controls for manipulators by means of a wireless system for controlling one or several manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/005Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators using batteries, e.g. as a back-up power source
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04Automatic control of the travelling movement; Automatic obstacle detection
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/06Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Definitions

  • the present invention relates to a mobile robot system, and more particularly, to a mobile robot system in which a plurality of mobile robots perform cooperative driving, and a method for performing cooperative driving thereof.
  • a vacuum cleaner is a device that performs a cleaning function by sucking dust and foreign substances or by mopping.
  • a vacuum cleaner performs a cleaning function for the floor, and the cleaner includes a wheel for movement.
  • the wheel is rolled by an external force applied to the cleaner body to move the cleaner body with respect to the floor.
  • Prior document WO2017-036532 discloses a method in which a master robot cleaning device (hereinafter, master robot) controls at least one slave robot cleaning device (hereinafter, a slave robot).
  • the prior document discloses a configuration in which the master robot detects an obstacle around it using an obstacle detecting device, and determines the position of the master robot in relation to the slave robot by using the position data derived from the obstacle detecting device.
  • KR20170174493 discloses a general process in which two robot cleaners perform cleaning while communicating with each other.
  • An object of the present specification is to provide an embodiment of a mobile robot system that can solve the above problems.
  • Another object of the present invention is to provide an embodiment of a mobile robot system that can respond appropriately to various events occurring during cooperative driving, and a method for performing cooperative driving thereof.
  • Another object is to provide an embodiment of a mobile robot system capable of appropriately responding to a trap situation occurring during cooperative driving, and a method for performing cooperative driving thereof.
  • Another object of the present invention is to provide an embodiment of a mobile robot system capable of appropriately responding to obstacles detected during cooperative driving, and a method for performing cooperative driving thereof.
  • Another object of the present invention is to provide an embodiment of a mobile robot system, and a method for performing cooperative driving thereof, in which appropriate responses can be made according to changes in the battery charge levels of a plurality of mobile robots and to various states of the battery charge levels during cooperative driving.
  • A mobile robot system, and a method for performing cooperative driving thereof, for solving the above-described problems determine whether the driving state of a plurality of mobile robots corresponds to a preset reference condition, and perform a motion for cooperative driving according to the determination result.
  • That is, the driving state of the plurality of mobile robots corresponding to the performing condition of the cooperative driving is compared with a preset reference condition, and when the driving state corresponds to the reference condition, the motion for the cooperative driving is performed, so that the cooperative driving can be carried out accurately and stably.
  • In this way, the embodiment of the mobile robot system and its method for performing cooperative driving solves the problems described above by determining whether the driving state of the plurality of mobile robots corresponds to the preset reference condition and performing the motion for cooperative driving according to the determination result.
  • An embodiment of a mobile robot system using the above technical features as a problem-solving means includes a plurality of mobile robots that perform cleaning while traveling in an area to be cleaned, and a controller that communicates with the plurality of mobile robots and transmits control commands for remote control of the plurality of mobile robots. When the plurality of mobile robots receive from the controller a control command for a cooperative driving mode for collaboratively cleaning the cleaning target area, it is determined whether their driving state corresponds to a preset reference condition, and a motion for the cooperative driving mode is performed according to the determination result.
  • An embodiment of a method for performing cooperative driving of a mobile robot system using the above technical features as a problem-solving means is a method for performing cooperative driving of a first robot and a second robot, and includes: inputting a command for the first robot and the second robot to perform cooperative driving; the first robot comparing the driving states of the first robot and the second robot with a preset reference condition; and each of the first robot and the second robot performing a motion for cooperative driving according to the comparison result.
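The reference-condition comparison in the steps above can be sketched as follows. This is a hypothetical illustration only, not the claimed implementation: the concrete condition fields (battery level, network link) and the threshold value are assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class DrivingState:
    battery_pct: int   # remaining battery capacity, percent (assumed field)
    docked: bool       # whether the robot is on its charging station
    network_ok: bool   # whether the robot can reach its peer over the network

# Assumed reference condition: enough charge and a live network link.
MIN_BATTERY_PCT = 30

def meets_reference_condition(state: DrivingState) -> bool:
    return state.battery_pct >= MIN_BATTERY_PCT and state.network_ok

def start_cooperative_driving(first: DrivingState, second: DrivingState) -> str:
    """Compare both robots' driving states with the reference condition
    and choose a motion according to the comparison result."""
    if meets_reference_condition(first) and meets_reference_condition(second):
        return "cooperative_driving"
    return "independent_driving"
```

Only when both robots satisfy the reference condition does the pair enter cooperative driving; otherwise each falls back to driving on its own.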
  • In an embodiment of a mobile robot system that can properly respond to a trap situation occurring during cooperative driving, when a trap situation occurs in the first robot and/or the second robot while the first robot and the second robot are performing cooperative driving, the first robot and/or the second robot performs trap escape driving. The trap situation is a situation in which the first robot or the second robot cannot enter the part of the cleaning target area in which it has not yet traveled, and the trap escape driving is a driving method in which the first robot or the second robot travels along the boundary of the area to be cleaned.
  • An embodiment of a method for performing cooperative driving of a mobile robot system, in which an appropriate response to a trap situation occurring during cooperative driving can be made, includes: the first robot and the second robot performing cooperative driving with each other; determining whether a trap situation has occurred in the first robot and/or the second robot; and performing trap escape driving when the first robot and/or the second robot is in a trap situation. The trap situation is a situation in which it is impossible to enter the part of the cleaning target area in which the first robot or the second robot has not traveled, and the trap escape driving is a driving method in which the first robot or the second robot travels along the boundary of the cleaning target area.
  • An embodiment of a method for performing cooperative driving of a mobile robot system, which can appropriately respond to various error situations occurring during cooperative driving, is a method for performing cooperative driving of a mobile robot system that travels in an area to be cleaned. The mobile robot system includes a first robot that sucks in contaminants in the cleaning target area, a second robot that wipes the floor of the cleaning target area, a first charging station that charges the first robot, a second charging station that charges the second robot, and a network connecting the first robot and the second robot. The method for performing cooperative driving includes the steps of: entering, by the first robot and the second robot, a cooperative driving mode using the network; performing cooperative driving by the first robot and the second robot by identifying each other's location information; and determining, by the first robot or the second robot, whether to release the cooperative driving mode when an error occurs in at least one of the first robot and the second robot while performing the cooperative driving, when a kidnap occurs in at least one of the first robot and the second robot, or when the network connection is released.
  • An embodiment of a mobile robot system that can properly respond to an obstacle detected during cooperative driving includes a first robot that sucks contaminants in the area to be cleaned, a second robot that wipes the floor of the area to be cleaned, a first charging station for charging the first robot, a second charging station for charging the second robot, and a network connecting the first robot and the second robot. The first robot and the second robot enter a cooperative driving mode using the network, divide the cleaning target area into a plurality of unit areas, and perform cooperative driving for each unit area. When the first robot and/or the second robot detects an obstacle formed within a preset range, height, or depth during the cooperative driving in any one of the plurality of unit areas, it avoids the obstacle or climbs over the obstacle to continue the cooperative driving.
  • An embodiment of a method for performing cooperative driving of a mobile robot system, capable of making an appropriate response to an obstacle detected during cooperative driving, is a method for performing cooperative driving of a mobile robot system traveling in an area to be cleaned. The mobile robot system includes a first robot for sucking contaminants in the area to be cleaned, a second robot for wiping the floor of the area to be cleaned, a first charging station for charging the first robot, a second charging station for charging the second robot, and a network connecting the first robot and the second robot. The method for performing cooperative driving includes: entering, by the first robot and the second robot, a cooperative driving mode using the network; dividing, by the first robot and the second robot, the area to be cleaned into a plurality of unit areas and driving cooperatively for each unit area; and, when the first robot and/or the second robot detects an obstacle formed within a preset range, height, or depth during the cooperative driving in any one of the plurality of unit areas, avoiding the obstacle or climbing over the obstacle to continue the cooperative driving.
  • An embodiment of a mobile robot system, in which an appropriate response can be made according to changes in the battery charge levels of a plurality of mobile robots and to various states of the battery charge levels during cooperative driving, is a mobile robot system in which a plurality of mobile robots cooperatively travel. It includes a first robot that travels in the area to be cleaned, driven by the electric power charged at the first charging station, and a second robot that travels along the route traveled by the first robot, driven by the electric power charged at the second charging station. The first robot and the second robot detect the capacity charged in each battery while performing the cooperative driving mode, release the cooperative driving mode according to the charging capacity value of the battery, and each perform at least one of an independent driving mode and a battery charging mode in response to the charging capacity value.
  • Another embodiment of a mobile robot system, in which appropriate responses can be made according to changes in the battery charge levels of a plurality of mobile robots and to various states of the battery charge levels while performing cooperative driving, is a mobile robot system in which a plurality of mobile robots travel cooperatively. It includes a first robot that travels in the area to be cleaned, driven by the electric power charged at the first charging station, and a second robot that travels along the path traveled by the first robot, driven by the electric power charged at the second charging station. Each of the first robot and the second robot senses the capacity charged in its battery while performing the cooperative driving mode, and when the charging capacity value of the battery reaches a preset reference capacity value, moves to its charging station to charge the battery.
  • An embodiment of a method for performing cooperative driving of a mobile robot system, in which an appropriate response can be made according to changes in the battery charge levels of a plurality of mobile robots and to various states of the battery charge levels while performing cooperative driving, is a method for performing cooperative driving of a mobile robot system that includes a first robot that travels in the area to be cleaned, driven by the electric power charged at the first charging station, and a second robot that travels along the path traveled by the first robot, driven by the electric power charged at the second charging station. The method comprises: starting, by each of the first robot and the second robot, to perform a cooperative driving mode; detecting, by each of the first robot and the second robot, the capacity charged in its battery; comparing the charging capacity value of each of the first robot and the second robot with a preset reference capacity value; and, according to the comparison result, at least one of the first robot and the second robot performing an independent driving mode and moving to a charging station to charge the battery.
  • Another embodiment of a method for performing cooperative driving of a mobile robot system, in which an appropriate response can be made according to changes in the battery charge levels of a plurality of mobile robots and to various states of the battery charge levels while performing cooperative driving, is a method for performing cooperative driving of a mobile robot system that includes a first robot that travels in the area to be cleaned, driven by the electric power charged at the first charging station, and a second robot that travels along the route the first robot has traveled, driven by the electric power charged at the second charging station. The method comprises: starting, by each of the first robot and the second robot, to perform a cooperative driving mode; detecting, by each of the first robot and the second robot, the capacity charged in its battery; comparing, by each of the first robot and the second robot, its charging capacity value with a preset reference capacity value; and, according to the comparison result, at least one of the first robot and the second robot moving to a charging station to charge its battery.
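The battery-handling steps above (sense the charged capacity, compare it with a reference value, and release cooperative driving to return to the charging station) can be sketched as one monitoring step per robot. The percentage values and action names are assumptions for illustration, not the claimed behaviour.

```python
# Assumed preset reference capacity value, as a percentage of full charge.
REFERENCE_CAPACITY_PCT = 20

def battery_step(robot_name, charge_pct, cooperative):
    """Return the robot's next action for one monitoring step.

    While the cooperative driving mode runs, a charge value below the
    reference capacity makes the robot release the mode and move to its
    charging station; otherwise it keeps its current mode.
    """
    if cooperative and charge_pct < REFERENCE_CAPACITY_PCT:
        return (robot_name, "move_to_charging_station")
    mode = "cooperative_driving" if cooperative else "independent_driving"
    return (robot_name, mode)
```

Each robot would run this check periodically; when one robot leaves to charge, the other may continue in the independent driving mode as described above.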
  • Embodiments of the mobile robot system and the method for performing cooperative driving as described above may be applied to and implemented in a robot cleaner, a control system for controlling the robot cleaner, a robot cleaning system, a control method for controlling the robot cleaner, and the like. In particular, they can be effectively applied to a plurality of mobile robots, a mobile robot system including a plurality of mobile robots, a control method of a plurality of mobile robots, and the like, and can also be applied to all robot cleaners, robot cleaning systems, and robot cleaner control methods to which the technical idea of the above technology is applicable.
  • An embodiment of the mobile robot system and the method for performing cooperative driving provided herein determines whether the driving state of a plurality of mobile robots corresponds to a preset reference condition and performs a motion for cooperative driving according to the determination result, so that cooperative driving can be performed while satisfying the various conditions of cooperative driving.
  • In addition, each of the plurality of mobile robots detects the capacity charged in its battery while performing the cooperative driving mode and performs a corresponding operation according to the detection result, so that an appropriate response to changes in the charge level of the battery can be made.
  • FIGS. 1A and 1B are schematic diagrams of a mobile robot.
  • FIG. 2 is a detailed configuration diagram of a mobile robot.
  • FIG. 3 is an exemplary view of a mobile robot system.
  • FIG. 4 is a conceptual diagram illustrating network communication between a plurality of mobile robots of a mobile robot system.
  • FIG. 5 is a conceptual diagram of a plurality of mobile robots of the mobile robot system.
  • FIG. 6 is a detailed driving example of a plurality of mobile robots according to the conceptual diagram shown in FIG. 5.
  • FIG. 7 is a flowchart illustrating a sequence in which a plurality of mobile robots perform cooperative driving.
  • FIG. 8 is an exemplary diagram for explaining the concept of recognizing a position through image comparison between a plurality of mobile robots.
  • FIG. 9 is an exemplary diagram for explaining the concept of location recognition between a plurality of mobile robots.
  • FIG. 10 is an exemplary view in which a plurality of mobile robots perform cooperative driving.
  • FIG. 11 is a block diagram of a mobile robot system according to an embodiment.
  • FIG. 12 is a flowchart illustrating a process in which a cooperative driving mode is performed in a mobile robot system according to an embodiment.
  • FIG. 13 is a diagram illustrating an example in which a cooperative driving mode is performed in the mobile robot system according to the embodiment.
  • FIG. 14 is a flowchart of a method for performing cooperative driving of a mobile robot system according to an embodiment.
  • FIG. 15 is a flowchart according to a specific embodiment of the method for performing cooperative driving shown in FIG. 14.
  • FIG. 16 is a view showing cooperative driving of the mobile robot system according to <Embodiment 1>.
  • FIG. 17 is a diagram illustrating a mobile robot system that performs a preset scenario when the first robot according to <Embodiment 1> is in a trap situation.
  • FIG. 18 is a diagram illustrating a mobile robot system that performs a preset scenario when the first robot according to <Embodiment 1> is in a trap situation.
  • FIG. 19 is a diagram illustrating a mobile robot system that performs a preset scenario when the second robot according to <Embodiment 1> is in a trap situation.
  • FIG. 20 is a diagram illustrating a mobile robot system that performs a preset scenario when the second robot according to <Embodiment 1> is in a trap situation.
  • FIG. 21 is a diagram illustrating a mobile robot system that performs a preset scenario when the first robot and the second robot according to <Embodiment 1> are in a trap situation.
  • FIG. 22 is a flowchart of a method for performing cooperative driving of a mobile robot system when a trap situation according to <Embodiment 1> occurs.
  • FIG. 23 is a diagram illustrating cooperative driving of the mobile robot system according to <Embodiment 2>.
  • FIG. 24A is a diagram illustrating a mobile robot system that performs a preset scenario in response to an error occurring in the first robot according to <Embodiment 2>.
  • FIG. 24B is a diagram illustrating a mobile robot system that performs a preset scenario in response to an error occurring in the first robot according to <Embodiment 2>.
  • FIG. 24C is a diagram illustrating a mobile robot system that performs a preset scenario in response to an error occurring in the first robot according to <Embodiment 2>.
  • FIG. 25 is a diagram illustrating a mobile robot system that performs a preset scenario in response to errors occurring in the first robot and the second robot according to <Embodiment 2>.
  • FIG. 26A is a diagram illustrating a mobile robot system that performs a preset scenario in response to an error occurring in the second robot according to <Embodiment 2>.
  • FIG. 26B is a diagram illustrating a mobile robot system that performs a preset scenario in response to an error occurring in the second robot according to <Embodiment 2>.
  • FIG. 26C is a diagram illustrating a mobile robot system that performs a preset scenario in response to an error occurring in the second robot according to <Embodiment 2>.
  • FIG. 27A is a diagram illustrating a mobile robot system that performs a preset scenario in response to a kidnap occurring in the first robot according to <Embodiment 2>.
  • FIG. 27B is a diagram illustrating a mobile robot system that performs a preset scenario in response to a kidnap occurring in the first robot according to <Embodiment 2>.
  • FIG. 27C is a diagram illustrating a mobile robot system that performs a preset scenario in response to a kidnap occurring in the first robot according to <Embodiment 2>.
  • FIG. 28A is a diagram illustrating a mobile robot system that performs a preset scenario in response to a kidnap occurring in the second robot according to <Embodiment 2>.
  • FIG. 28B is a diagram illustrating a mobile robot system that performs a preset scenario in response to a kidnap occurring in the second robot according to <Embodiment 2>.
  • FIG. 28C is a diagram illustrating a mobile robot system that performs a preset scenario in response to a kidnap occurring in the second robot according to <Embodiment 2>.
  • FIG. 29 is a flowchart illustrating a method in which the mobile robot system according to <Embodiment 2> performs a preset scenario in response to an error, a kidnap, or a communication failure occurring while performing cooperative driving.
  • FIG. 30 is a diagram illustrating the mobile robot system according to <Embodiment 3> dividing a cleaning target area into a plurality of unit areas and driving cooperatively for each unit area.
  • FIG. 31 is a diagram illustrating a preset scenario performed when the first robot and the second robot detect a first obstacle, according to <Embodiment 3>.
  • FIG. 32 is a diagram illustrating a preset scenario performed when the first robot does not detect the first obstacle and the second robot detects the first obstacle, according to <Embodiment 3>.
  • FIG. 33 is a diagram illustrating a preset scenario performed when the first robot and the second robot do not detect a first obstacle, according to <Embodiment 3>.
  • FIG. 34 is a diagram illustrating a preset scenario performed when the first robot detects a second obstacle and the second robot does not detect the second obstacle, according to <Embodiment 3>.
  • FIG. 35 is a flowchart illustrating a method in which the mobile robot system according to <Embodiment 3> performs a preset scenario in response to an obstacle detected during cooperative driving.
  • FIG. 36A is a table showing an example of a response according to the state of charge capacity of a battery while the cooperative driving mode is performed in the mobile robot system according to <Embodiment 4>.
  • FIG. 36B is a table showing an example of a response according to the state of charge capacity of a battery while the cooperative driving mode is performed in the mobile robot system according to <Embodiment 4>.
  • FIG. 37 is an exemplary diagram illustrating the correspondence of a plurality of mobile robots in the mobile robot system according to <Embodiment 4>.
  • FIG. 38 is an exemplary diagram illustrating the correspondence of a plurality of mobile robots in the mobile robot system according to <Embodiment 4>.
  • FIG. 39 is an exemplary diagram illustrating the correspondence of a plurality of mobile robots in the mobile robot system according to <Embodiment 4>.
  • FIG. 40 is a flowchart of a method for performing cooperative driving of a mobile robot system according to <Embodiment 4>.
  • the robot may be a cleaning robot that performs cleaning while driving.
  • the robot may be a cleaning robot that performs driving and cleaning automatically or by a user's manipulation.
  • the robot may be an autonomous driving vacuum cleaner, that is, a vacuum cleaner that performs autonomous driving.
  • the robot may be a cleaning robot that recognizes a location while traveling in a predetermined area.
  • the robot may be a cleaning robot that recognizes a location while driving and creates a map of a predetermined area.
  • the robot may perform a function of cleaning the floor while traveling on its own in a certain area.
  • the cleaning of the floor referred to herein includes suctioning dust (including foreign substances) on the floor or mopping the floor.
  • the robot may have a plurality of configurations for running and cleaning.
  • the robot 100 may have a shape as shown in FIG. 1A or FIG. 1B .
  • the robot 100 may have the shape shown in FIG. 1A or FIG. 1B, a form modified from the shapes shown in FIGS. 1A and 1B, or a shape different from those shown in FIGS. 1A and 1B.
  • the robot 100 may include a main body 110 , a cleaning unit 120 , and a sensing unit 130 .
  • the main body 110 forms the exterior of the robot 100, and can perform traveling and cleaning.
  • the main body 110 may perform the overall operation of the robot 100.
  • the main body 110 may be formed in a shape suitable for traveling and cleaning while forming the exterior of the robot 100.
  • it may be formed in a circular shape, and may also be formed in a rectangular shape with rounded corners.
  • the main body 110 may have a configuration for running and cleaning the robot 100 .
  • the main body 110 may have, inside and outside, configurations for the traveling and cleaning of the robot 100.
  • configurations that perform a driving operation, a cleaning operation, or sensing may be provided outside, and a configuration that controls the robot 100 may be provided inside.
  • the main body 110 may be provided with a wheel unit 111 for driving the robot 100 .
  • the robot 100 may be moved or rotated forward, backward, left and right by the wheel unit 111 .
  • the main body 110 may be equipped with a battery (not shown) for supplying power to the robot 100 .
  • the battery is configured to be rechargeable, and may be configured to be detachably attached to the bottom of the main body 110 .
  • the cleaning unit 120 is disposed to protrude from one side of the main body 110, and may suction air containing dust or may mop the floor.
  • the one side may be the side on which the main body 110 travels in the forward direction F, that is, the front side of the main body 110.
  • the cleaning unit 120 may be detachably coupled to the main body 110 .
  • a mop unit (not shown) may be detachably coupled to the main body 110 to replace the separated cleaning unit 120 .
  • ordinarily, the cleaning unit 120 is mounted on the main body 110, and when the user wants to wipe the floor, the mop unit may be mounted on the main body 110 in its place.
  • the sensing unit 130 may be disposed on one side of the main body 110 where the cleaning unit 120 is located, that is, in front of the main body 110 .
  • the sensing unit 130 may be disposed to overlap the cleaning unit 120 in the vertical direction of the main body 110 .
  • the sensing unit 130 is disposed on the upper portion of the main body 110 and is configured to detect an obstacle or a feature in front so that the robot 100 does not collide with the obstacle.
  • the sensing unit 130 may be configured to additionally perform sensing functions other than obstacle detection.
  • the sensing unit 130 may include a camera 131 for acquiring an image of the surroundings.
  • the camera 131 may include a lens and an image sensor.
  • the camera 131 may convert an image around the main body 110 into an electrical signal that the controller can process and, for example, transmit an electrical signal corresponding to an upward image to the controller.
  • the electrical signal corresponding to the upward image may be used by the controller to detect the position of the main body 110.
  • the sensing unit 130 may detect obstacles such as walls, furniture, and cliffs on the traveling surface or traveling path of the robot 100 .
  • the sensing unit 130 may detect the presence of a docking device that charges the battery.
  • the sensing unit 130 may sense the ceiling information to map the traveling area or the cleaning area of the robot 100 .
  • the robot 100 may include at least one of, or a combination of, a communication unit 1100, an input unit 1200, a traveling unit 1300, a sensing unit 1400, an output unit 1500, a power supply unit 1600, a memory 1700, a control unit 1800, and a cleaning unit 1900.
  • the components shown in FIG. 2 are not essential, so it goes without saying that an autonomous driving cleaner having more or fewer components may be implemented.
  • the plurality of mobile robots described herein may equally include only some of the components described below. That is, each of the plurality of mobile robots may be composed of different components.
  • the power supply unit 1600 includes a battery that can be charged by an external commercial power supply, and supplies power to the robot 100.
  • the power supply unit 1600 may supply driving power to each component included in the robot 100 to supply operating power required for the robot 100 to travel or perform a specific function.
  • the control unit 1800 may detect the remaining power of the battery and, when the remaining power is insufficient, control the robot to move to a charging station connected to an external commercial power source, so that the battery is charged by a charging current received from the charging station.
  • the battery may be connected to the battery detection unit, so that the remaining battery level and charging state may be transmitted to the control unit 1800 .
  • the output unit 1500 may display the remaining amount of the battery by the control unit 1800 .
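As an illustration only (not part of the claimed embodiments), the battery-management behavior described above can be sketched as follows; the threshold value and all names are assumptions made for this sketch, not the patent's design.

```python
# Illustrative sketch of the battery-management logic described above:
# the controller reads the remaining capacity from the battery detection
# unit and, below a threshold, commands a return to the charging station.

LOW_BATTERY_THRESHOLD = 0.20  # assumed: return to dock below 20% capacity

class BatteryMonitor:
    def __init__(self, read_remaining):
        # read_remaining: callable returning remaining capacity in [0.0, 1.0]
        self.read_remaining = read_remaining

    def decide_action(self) -> str:
        """Return the action the controller should take for the current charge."""
        if self.read_remaining() < LOW_BATTERY_THRESHOLD:
            return "return_to_charging_station"
        return "continue_task"

# a robot at 15% capacity heads for the dock
monitor = BatteryMonitor(lambda: 0.15)
print(monitor.decide_action())
```

In a real system the `read_remaining` callable would query the battery detection unit; here it is stubbed with a lambda.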
  • the control unit 1800 serves to process information based on artificial intelligence technology, and may include one or more circuit modules that perform at least one of information learning, information inference, information perception, and natural language processing.
  • using machine learning technology, the control unit 1800 may perform at least one of learning, inference, and processing on a large amount of information (big data), such as information stored in the robot 100, environmental information around the robot, and information stored in a communicable external storage.
  • the control unit 1800 may predict (or infer) at least one executable operation of the robot 100 by using the information learned with the machine learning technology, and may control the robot 100 so that the most feasible of the predicted operations is executed.
  • Machine learning technology is a technology for collecting and learning large-scale information based on at least one algorithm, and judging and predicting information based on the learned information.
  • Information learning is an operation of quantifying the relationship between information and information by identifying characteristics, rules, and judgment criteria of information, and predicting new data using the quantified pattern.
  • algorithms used by machine learning technology may be statistics-based algorithms, for example, a decision tree that uses a tree structure as a predictive model, an artificial neural network that mimics the structure and function of a biological neural network, genetic programming based on biological evolutionary algorithms, clustering that distributes observed examples into subsets called clusters, and the Monte Carlo method that calculates function values probabilistically from randomly selected numbers.
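Of the algorithms listed above, the Monte Carlo method is the easiest to show concretely. The sketch below is purely illustrative (not from the patent): it estimates pi by sampling random points in a unit square and counting those inside the quarter circle.

```python
# The Monte Carlo method in its simplest form: estimate a value (here, the
# area ratio pi/4) by sampling random points and counting how many satisfy
# a condition. Purely illustrative of the technique named in the text.
import random

def monte_carlo_pi(samples: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    inside = sum(1 for _ in range(samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / samples

print(monte_carlo_pi(100_000))  # close to 3.14159
```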
  • deep learning technology is a technology that performs at least one of learning, judging, and processing information by using a deep neural network (DNN) algorithm.
  • a deep neural network (DNN) may have a structure that connects layers and transmits data between the layers.
  • such deep learning technology can learn a vast amount of information through a deep neural network (DNN) using a graphics processing unit (GPU) optimized for parallel operation.
  • the control unit 1800 may be equipped with a learning engine that detects characteristics for recognizing a predetermined object, using training data stored in the memory 1700 or on an external server.
  • the characteristics for recognizing the object may include the size, shape, and shadow of the object.
  • the controller 1800 may recognize at least one object or living organism included in the input image.
  • the controller 1800 can recognize whether obstacles that interfere with the running of the robot 100, such as chair legs, electric fans, and specific types of balcony gaps, exist in the vicinity, so that the efficiency and reliability of the driving of the robot 100 can be increased.
  • the above learning engine may be mounted on the control unit 1800 or may be mounted on an external server.
  • the controller 1800 may control the communication unit 1100 to transmit at least one image to be analyzed to the external server.
  • the external server may recognize at least one object or living organism included in the image by inputting the image received from the cleaner into the learning engine.
  • the external server may transmit information related to the recognition result back to the cleaner.
  • the information related to the recognition result may include information related to the number of objects included in the image to be analyzed and the name of each object.
  • the traveling unit 1300 includes a motor, and by driving the motor, rotates the left and right main wheels in both directions to rotate or move the main body. At this time, the left and right main wheels can move independently.
  • the traveling unit 1300 may move the main body 110 forward, backward, left and right, drive it along a curve, or rotate it in place.
  • the input unit 1200 may receive various control commands for the robot 100 from a user.
  • the input unit 1200 may include one or more buttons.
  • the input unit 1200 may include a confirmation button, a setting button, and the like.
  • the confirmation button is a button for receiving a command from the user to check detection information, obstacle information, location information, and map information.
  • the setting button is a button for receiving a command for setting such information from the user.
  • the input unit 1200 may include an input reset button for canceling a previous user input and receiving a user input again, a delete button for deleting a preset user input, a button for setting or changing an operation mode, and a button for receiving a command to return to the charging station.
  • the input unit 1200 may be installed on the upper part of the mobile robot as a hard key, soft key, touch pad, or the like. Also, the input unit 1200 may have the form of a touch screen together with the output unit 1500 .
  • the output unit 1500 may be installed above the robot 100 .
  • the installation location or installation type may vary.
  • the output unit 1500 may display a battery state or a driving method on the screen.
  • the output unit 1500 may output information on the internal state of the mobile robot detected by the sensing unit 1400 , for example, the current state of each component included in the mobile robot.
  • the output unit 1500 may display external state information, obstacle information, location information, map information, etc. detected by the sensing unit 1400 on the screen.
  • the output unit 1500 may be formed as any one of a light emitting diode (LED), a liquid crystal display (LCD), a plasma display panel, and an organic light emitting diode (OLED).
  • the output unit 1500 may further include a sound output means for aurally outputting the operation process or operation result of the robot 100 performed by the control unit 1800 .
  • the output unit 1500 may output a warning sound to the outside according to the warning signal generated by the control unit 1800 .
  • the sound output means may be a means for outputting sound, such as a beeper or a speaker.
  • the output unit 1500 may output audio data or message data having a predetermined pattern stored in the memory 1700 to the outside through the sound output means.
  • the robot 100 may output the environmental information about the driving area to the screen or output the sound through the output unit 1500 .
  • the robot may transmit map information or environment information to the terminal device through the communication unit 1100 so that the terminal device outputs a screen or sound to be output through the output unit 1500 .
  • the memory 1700 stores a control program for controlling or driving the robot 100 and data corresponding thereto.
  • the memory 1700 may store audio information, image information, obstacle information, location information, map information, and the like. Also, the memory 1700 may store information related to a driving pattern.
  • the memory 1700 mainly uses a non-volatile memory.
  • the non-volatile memory (NVM, NVRAM) is a storage device capable of continuously maintaining stored information even when power is not supplied, and may be, for example, a ROM, a flash memory, a magnetic computer storage device (e.g., a hard disk, a diskette drive, or magnetic tape), an optical disk drive, magnetic RAM, PRAM, or the like.
  • a map for a driving area may be stored in the memory 1700 .
  • the map may be input by an external terminal or server capable of exchanging information with the robot 100 through wired or wireless communication, or may be generated by the robot 100 while driving.
  • the map may indicate the locations of the rooms within the driving zone.
  • the current position of the robot 100 may be displayed on the map, and the current position of the robot 100 on the map may be updated during the driving process.
  • the memory 1700 may store cleaning history information. Such cleaning history information may be generated whenever cleaning is performed.
  • the map of the driving zone stored in the memory 1700 is data that stores predetermined information of the driving zone in a predetermined format, and may be a navigation map used for driving while cleaning, a SLAM (Simultaneous Localization and Mapping) map used for location recognition, a learning map used for learning and cleaning by storing information about collisions with obstacles, a global location map used for global location recognition, an obstacle recognition map in which information about recognized obstacles is recorded, and the like.
  • the map may mean a node map including a plurality of nodes.
  • the node means data indicating a certain location on the map corresponding to a point that is a certain location in the driving zone.
  • the sensing unit 1400 may include at least one of an external signal detection sensor, a front detection sensor, a cliff detection sensor, a 2D camera sensor, and a 3D camera sensor.
  • the external signal detection sensor may detect an external signal of the robot 100 .
  • the external signal detection sensor may be, for example, an infrared sensor, an ultra sonic sensor, a radio frequency sensor, or the like.
  • the robot 100 may receive a guide signal generated by the charging stand using an external signal detection sensor to confirm the position and direction of the charging stand.
  • the charging stand may transmit a guide signal indicating a direction and a distance so that the mobile robot can return. That is, the robot 100 may receive a signal transmitted from the charging station, determine its current location, set a moving direction, and return to the charging station.
  • the front detection sensors may be installed at regular intervals on the front of the robot 100, specifically along the outer peripheral surface of the side of the robot 100.
  • the front detection sensor is located on at least one side of the robot 100 to detect an obstacle in front, and the detected information may be transmitted to the control unit 1800. That is, the front detection sensor may detect protrusions, household appliances, furniture, walls, wall corners, etc. existing on the movement path of the robot 100, and transmit the information to the controller 1800.
  • the front detection sensor may be, for example, an infrared sensor, an ultrasonic sensor, an RF sensor, or a geomagnetic sensor, and the robot 100 may use one type of sensor as the front detection sensor or may use two or more types of sensors together as needed.
  • the ultrasonic sensor may be mainly used to generally detect a distant obstacle.
  • the ultrasonic sensor includes a transmitter and a receiver, and the controller 1800 determines the existence of an obstacle based on whether the ultrasonic wave emitted through the transmitter is reflected by the obstacle and received by the receiver, and the ultrasonic radiation time and ultrasonic reception time can be used to calculate the distance to the obstacle.
  • the controller 1800 may detect information related to the size of an obstacle by comparing the ultrasonic wave emitted from the transmitter and the ultrasonic wave received from the receiver. For example, the controller 1800 may determine that the size of the obstacle increases as more ultrasound waves are received by the receiver.
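The emission/reception timing computation described above reduces to the standard time-of-flight formula. The sketch below is illustrative only; the function name and the assumption of a constant speed of sound are not taken from the patent.

```python
# Distance from ultrasonic emission/reception times: sound travels to the
# obstacle and back, so the one-way distance is half the round trip.

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed constant)

def ultrasonic_distance(t_emit: float, t_receive: float) -> float:
    """Return obstacle distance in meters from emission/reception timestamps."""
    round_trip = t_receive - t_emit           # seconds
    return SPEED_OF_SOUND * round_trip / 2.0  # one-way distance

# an echo received 10 ms after emission corresponds to ~1.7 m
print(round(ultrasonic_distance(0.0, 0.010), 3))
```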
  • a plurality (eg, five) of ultrasonic sensors may be installed on the front side of the robot 100 along the outer circumferential surface.
  • the ultrasonic sensors may be installed on the front side of the robot 100 with transmitters and receivers arranged alternately.
  • the transmitters may be disposed to be spaced apart to the left and right from the center of the front of the main body, and one or more transmitters may be disposed between the receivers to form a reception area for the ultrasonic signal reflected from an obstacle or the like.
  • the reception area can be expanded while reducing the number of sensors.
  • the transmission angle of the ultrasonic waves may be maintained within a range in which different signals do not affect each other, to prevent a crosstalk phenomenon.
  • the reception sensitivities of the receivers may be set differently.
  • the ultrasonic sensor may be installed upward by a certain angle so that the ultrasonic waves transmitted from the ultrasonic sensor are output upward, and in this case, a predetermined blocking member may be further included to prevent the ultrasonic waves from being radiated downward.
  • as described above, the front detection sensor may use any one type of sensor, such as an infrared sensor, an ultrasonic sensor, or an RF sensor, or may use two or more types of sensors together.
  • for example, the front detection sensor may include an infrared sensor in addition to the ultrasonic sensor.
  • the infrared sensor may be installed on the outer peripheral surface of the robot 100 together with the ultrasonic sensor.
  • the infrared sensor may also detect obstacles present in front or on the side and transmit obstacle information to the controller 1800 . That is, the infrared sensor detects protrusions, household appliances, furniture, wall surfaces, wall corners, etc. existing on the movement path of the robot 100 , and transmits the information to the controller 1800 . Accordingly, the main body 110 can move within a specific area without colliding with an obstacle.
  • the cliff detection sensor (or cliff sensor) may detect an obstacle on the floor supporting the main body 110 by mainly using various types of optical sensors.
  • the cliff detection sensor is installed on the rear surface of the robot 100 , but may be installed at different positions depending on the type of the robot 100 .
  • the cliff detection sensor is located on the back of the robot 100 to detect obstacles on the floor, and the cliff detection sensor is an infrared sensor having a light emitting unit and a light receiving unit like the obstacle detection sensor, an ultrasonic sensor, an RF sensor, It may be a Position Sensitive Detector (PSD) sensor or the like.
  • any one of the cliff detection sensors may be installed in front of the robot 100 , and the other two cliff detection sensors may be installed relatively backward.
  • the cliff detection sensor may be a PSD sensor, but may also include a plurality of different types of sensors.
  • the PSD sensor detects the short and long-distance position of the incident light with one p-n junction using the semiconductor surface resistance.
  • the PSD sensor includes a one-dimensional PSD sensor that detects light in only one axial direction and a two-dimensional PSD sensor that detects a light position on a plane, both of which may have a pin photodiode structure.
  • the PSD sensor is a type of infrared sensor that measures distance by transmitting infrared rays and then measuring the angle of the infrared rays reflected back from an obstacle. That is, the PSD sensor calculates the distance to the obstacle using a triangulation method.
  • the PSD sensor includes a light emitting unit that emits infrared rays toward an obstacle and a light receiving unit that receives the infrared rays reflected back from the obstacle, and is generally configured in a module form.
  • with the PSD sensor, a stable measurement value can be obtained regardless of differences in the reflectance and color of the obstacle.
  • the control unit 1800 may measure the infrared angle between the infrared light emitting signal emitted by the cliff sensor toward the ground and the reflected signal reflected by the obstacle to detect the cliff and analyze its depth.
  • the controller 1800 may determine whether a cliff exists and its depth according to the ground state sensed using the cliff detection sensor, and may decide whether to pass the cliff according to the determination result. For example, the controller 1800 determines the existence of a cliff and the depth of the cliff through the cliff sensor, and then passes the cliff only when a reflection signal is detected through the cliff sensor.
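As a hedged illustration of the triangulation principle and the pass/no-pass decision described above: the geometry, the depth threshold, and all names below are assumptions of this sketch, not the patent's design.

```python
# Triangulation: with a known baseline between emitter and receiver and the
# measured angle of the reflected infrared ray, distance follows from simple
# right-triangle geometry (illustrative geometry only).
import math

def psd_distance(baseline_m: float, reflect_angle_rad: float) -> float:
    """Distance to the reflecting surface via simple triangulation."""
    return baseline_m / math.tan(reflect_angle_rad)

def can_pass_cliff(reflection_detected: bool, depth_m: float,
                   max_depth_m: float = 0.02) -> bool:
    """Pass only if a reflection was detected and the measured depth is
    within a passable limit (the 2 cm limit is an assumed value)."""
    return reflection_detected and depth_m <= max_depth_m
```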
  • control unit 1800 may determine the lift-up phenomenon of the robot 100 using a cliff detection sensor.
  • the two-dimensional camera sensor is provided on one surface of the robot 100 to acquire image information related to the periphery of the body while moving.
  • the optical flow sensor generates image data in a predetermined format by converting a downward image input from an image sensor provided in the sensor.
  • the generated image data may be stored in the memory 1700 .
  • one or more light sources may be installed adjacent to the optical flow sensor.
  • one or more light sources irradiate light onto a predetermined area of the floor surface photographed by the image sensor. That is, when the robot 100 moves in a specific area along the floor surface, a constant distance is maintained between the image sensor and the floor surface if the floor surface is flat. On the other hand, when the robot 100 moves on a non-uniform floor surface, the image sensor moves away from the floor by more than a certain distance due to irregularities and obstacles on the floor.
  • one or more light sources may be controlled by the controller 1800 to adjust the amount of light irradiated.
  • the light source may be a light emitting device capable of controlling the amount of light, for example, a Light Emitting Diode (LED).
  • the controller 1800 may detect the position of the robot 100 irrespective of the sliding of the robot 100 .
  • the control unit 1800 may calculate a moving distance and a moving direction by comparing and analyzing the image data captured by the optical flow sensor over time, and may calculate the position of the robot 100 based on this.
  • through this, the control unit 1800 can robustly correct for sliding in the position of the robot 100 calculated by other means.
  • the 3D camera sensor may be attached to one surface or a part of the body 110 to generate 3D coordinate information related to the circumference of the body 110 .
  • the three-dimensional camera sensor may be a three-dimensional depth camera (3D Depth Camera) that calculates the near or far distance between the robot 100 and the object to be photographed.
  • the 3D camera sensor may capture a 2D image related to the surroundings of the main body 110 and may generate a plurality of 3D coordinate information corresponding to the captured 2D image.
  • the three-dimensional camera sensor may include two or more cameras for acquiring conventional two-dimensional images, and may be formed to generate three-dimensional coordinate information by combining two or more images obtained from the two or more cameras.
  • the three-dimensional camera sensor may include a first pattern irradiator that irradiates light of a first pattern downward toward the front of the main body 110, a second pattern irradiator that irradiates light of a second pattern upward toward the front of the main body, and an image acquisition unit that acquires an image of the front of the main body. Accordingly, the image acquisition unit may acquire an image of a region where the light of the first pattern and the light of the second pattern are incident.
  • the three-dimensional camera sensor may include an infrared pattern emitter that irradiates an infrared pattern, together with a single camera, and may measure the distance between the three-dimensional camera sensor and the object to be photographed by capturing the shape in which the infrared pattern irradiated from the infrared pattern emitter is projected onto the object.
  • the 3D camera sensor may be an IR (Infra Red) type 3D camera sensor.
  • the three-dimensional camera sensor may include a light emitting unit that emits a laser, together with a single camera, and may measure the distance between the camera sensor and the object to be photographed by receiving the portion of the laser emitted from the light emitting unit that is reflected from the object and analyzing the received laser.
  • such a three-dimensional camera sensor may be a time-of-flight (TOF) type three-dimensional camera sensor.
  • the laser of the three-dimensional camera sensor as described above is configured to irradiate a laser having a form extending in at least one direction.
  • the 3D camera sensor may include first and second lasers, the first laser irradiating straight laser beams that intersect each other, and the second laser irradiating a single straight laser beam.
  • accordingly, the lowermost laser is used to detect an obstacle at the bottom, the uppermost laser is used to detect an upper obstacle, and the middle laser between the lowermost laser and the uppermost laser is used to detect an obstacle in the middle.
  • the sensing unit 1400 acquires images around the robot 100 .
  • an image acquired by the sensing unit 1400 is defined as an 'acquired image'.
  • the acquired image includes various features such as lights located on the ceiling, edges, corners, blobs, and ridges.
  • the controller 1800 detects a feature from each of the acquired images, and calculates a descriptor based on each feature point.
  • a descriptor means data in a predetermined format for representing feature points, and means mathematical data in a format in which a distance or similarity between descriptors can be calculated.
  • the descriptor may be an n-dimensional vector (n is a natural number) or data in a matrix format.
  • the control unit 1800 classifies at least one descriptor of each acquired image into a plurality of groups according to a predetermined sub-classification rule, based on the descriptor information obtained through the acquired image of each location, and may convert the descriptors included in the same group into sub-representative descriptors according to a predetermined sub-representative rule.
  • all descriptors collected from acquired images within a predetermined area, such as a room, are classified into a plurality of groups according to the predetermined sub-classification rule, and the descriptors included in the same group are each converted into sub-representative descriptors according to the predetermined sub-representative rule.
  • that is, the predetermined sub-classification rule determines which descriptors are assigned to the same group, and the predetermined sub-representative rule determines the sub-representative descriptor of each group.
  • the control unit 1800 may obtain the feature distribution of each location through this process.
  • Each location feature distribution may be expressed as a histogram or an n-dimensional vector.
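As an illustration of the sub-classification and sub-representative steps described above: the patent leaves the predetermined rules unspecified, so the grouping rule (grid quantization) and representative rule (component-wise mean) below are assumptions of this sketch.

```python
# Group n-dimensional descriptors by a toy sub-classification rule, then
# reduce each group to one sub-representative descriptor (its mean vector).

def group_key(descriptor, cell=1.0):
    # toy sub-classification rule: quantize each component to a grid cell
    return tuple(int(x // cell) for x in descriptor)

def sub_representatives(descriptors):
    """descriptors: list of equal-length numeric vectors.
    Returns {group key: sub-representative descriptor (component-wise mean)}."""
    groups = {}
    for d in descriptors:
        groups.setdefault(group_key(d), []).append(d)
    return {k: [sum(c) / len(v) for c in zip(*v)] for k, v in groups.items()}

descs = [[0.2, 0.4], [0.4, 0.2], [2.1, 2.3]]
reps = sub_representatives(descs)
# two groups: one near the origin, one near (2, 2)
```

Collecting the sub-representative descriptors of an area then gives the per-location feature distribution (histogram or n-dimensional vector) mentioned in the text.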
  • the controller 1800 may estimate an unknown current location based on descriptors calculated from each feature point without going through a predetermined sub-classification rule and a predetermined sub-representative rule.
  • when the current position of the robot 100 becomes unknown due to a position jump or the like, the current position may be estimated based on data such as pre-stored descriptors or sub-representative descriptors.
  • the robot 100 acquires an acquired image through the sensing unit 1400 at an unknown current position.
  • Various features such as lights located on the ceiling, edges, corners, blobs, and ridges are identified through the image.
  • the controller 1800 detects features from the acquired image and calculates a descriptor.
  • based on at least one piece of descriptor information obtained through the acquired image of the unknown current location, the control unit 1800 converts it, according to a predetermined sub-transformation rule, into information (a lower recognition feature distribution) comparable with the location information to be compared (e.g., the feature distribution of each location).
  • each location feature distribution may be compared with the recognition feature distribution to calculate a similarity for each.
  • a similarity (probability) may be calculated for each location, and the location for which the greatest probability is calculated may be determined as the current location.
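The similarity comparison described above can be sketched as follows. Cosine similarity is an assumed choice of measure (the patent does not fix one), and the room names and histograms are invented for illustration.

```python
# Global localization sketch: compare the recognition feature distribution
# computed at the unknown position against each stored per-location feature
# distribution, and take the most similar location as the current one.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def estimate_location(recognition_hist, location_hists):
    """location_hists: dict mapping location name -> feature histogram."""
    return max(location_hists,
               key=lambda loc: cosine_similarity(recognition_hist,
                                                 location_hists[loc]))

rooms = {"living_room": [4, 1, 0, 2], "kitchen": [0, 3, 3, 1]}
print(estimate_location([3, 1, 0, 2], rooms))  # -> living_room
```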
  • control unit 1800 may divide a driving zone and generate a map composed of a plurality of areas, or recognize the current location of the robot 100 based on a pre-stored map.
  • the communication unit 1100 is connected to a terminal device and/or another device located in a specific area (in this specification, used interchangeably with the term "home appliance") through one of wired, wireless, and satellite communication methods to transmit and receive signals and data.
  • the communication unit 1100 may transmit/receive data to/from another device located within a specific area.
  • the other device may be any device capable of transmitting and receiving data by connecting to a network, and may be, for example, an air conditioner, a heating device, an air purification device, a light fixture, a TV, or a device such as a car.
  • the other device may be a device for controlling a door, a window, a water valve, a gas valve, and the like.
  • the other device may be a sensor that detects temperature, humidity, atmospheric pressure, gas, or the like.
  • the communication unit 1100 may communicate with another cleaner located in a specific area or within a predetermined range.
  • the controller 1800 may transmit the generated map to an external terminal or server through the communication unit 1100 , and may store the map in its own memory 1100 . Also, as described above, when a map is received from an external terminal, a server, or the like, the controller 1800 may store the map in the memory 1100 .
  • a mobile robot system (hereinafter referred to as a system) in which a plurality of the robots 100 are configured to perform collaboration will be described.
  • the first robot 100a and the second robot 100b may exchange data with each other through the network 50 .
  • the first robot 100a and/or the second robot 100b performs a cleaning-related operation according to a control command received from the terminal 300 through the network 50 or other communication or corresponding action can be performed.
  • the plurality of robots 100a and 100b may communicate with the terminal 300 through the first network communication and communicate with each other through the second network communication. .
  • the network 50 may mean network communication, and may refer to short-range communication using at least one of wireless communication technologies such as Wireless LAN (WLAN), Wireless Personal Area Network (WPAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), Zigbee, Z-wave, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-wideband (UWB), and Wireless Universal Serial Bus (Wireless USB).
  • the illustrated network 50 may vary depending on the communication method of robots that want to communicate with each other.
  • the first robot 100a and/or the second robot 100b may provide information sensed through each sensing unit 130 to the terminal 300 through the network 50. Also, the terminal 300 may transmit a control command generated based on the received information to the first robot 100a and/or the second robot 100b through the network 50.
  • the communication unit 1100 of the first robot 100a and the communication unit 1100 of the second robot 100b may communicate wirelessly with each other, either directly or indirectly via another router (not shown), so that each robot can grasp the other's driving state and location information.
  • the second robot 100b may perform a driving operation and a cleaning operation according to a control command received from the first robot 100a.
  • the first robot 100a operates as a master cleaner and the second robot 100b operates as a slave cleaner.
  • the second robot 100b follows the first robot 100a.
  • the first robot 100a is equipped with the cleaning unit 120 and the second robot 100b is equipped with the mop unit, so that the first robot 100a precedes the second robot 100b and sucks up the dust on the floor, and the second robot 100b follows the first robot 100a and wipes the floor.
  • FIG. 4 shows an example of a system 1 in which collaboration is made including a plurality of robots 100a and 100b and a plurality of terminals 300a and 300b.
  • the system 1 may include the plurality of robots 100a and 100b , a network 50 , a server 500 , and the plurality of terminals 300a and 300b .
  • the plurality of robots 100a and 100b, the network 50, and at least one terminal 300a are disposed in the building 10, while the other terminal 300b and the server 500 may be located outside the building 10.
  • Each of the plurality of robots 100a and 100b may perform autonomous driving and autonomous cleaning.
  • the plurality of robots 100a and 100b may each include a communication unit 1100 therein, in addition to the traveling function and the cleaning function.
  • the plurality of robots 100a and 100b, the server 500, and the plurality of terminals 300a and 300b may be connected to each other through the network 50 to exchange data with each other.
  • a wireless router such as an access point (AP) device may be further included.
  • the terminal 300a located inside the building 10 can connect to at least one of the plurality of robots 100a and 100b through the AP device, and thereby monitor and remotely control the plurality of robots 100a and 100b.
  • the terminal 300b located outside the building 10 can likewise connect to at least one of the plurality of robots 100a and 100b through the AP device, and thereby monitor and remotely control the plurality of robots 100a and 100b.
  • the server 500 may be directly wirelessly connected through the mobile terminal 300b. Alternatively, the server 500 may be connected to at least one of the plurality of robots 100a and 100b without going through the mobile terminal 300b.
  • the server 500 may include a processor capable of processing a program, and may include various algorithms.
  • the server 500 may include an algorithm related to performing machine learning and/or data mining.
  • the server 500 may include a voice recognition algorithm. In this case, when receiving voice data, the received voice data may be converted into text data and output.
  • the server 500 may store firmware information and driving information (course information, etc.) for the plurality of robots 100a and 100b, and register product information for the plurality of robots 100a and 100b.
  • the server 500 may be a server operated by a robot manufacturer or a server operated by an open application store operator.
  • the server 500 may be a home server that is provided in the internal network of the building 10 and stores state information about home devices or content shared by the home devices.
  • when the server 500 is a home server, information related to foreign matter, for example a foreign-matter image, may be stored.
  • the plurality of robots 100a and 100b may be directly wirelessly connected through Zigbee, Z-wave, Bluetooth, Ultra-wideband, etc. In this case, the plurality of robots 100a and 100b may exchange position information and driving information with each other.
  • one of the plurality of robots 100a and 100b may be the master robot 100a, and the other may be the slave robot 100b.
  • the first robot 100a may control the driving and cleaning of the second robot 100b.
  • the second robot 100b may follow the first robot 100a and perform driving and cleaning.
  • the second robot 100b following the first robot 100a means that, as shown in FIG. 5, the second robot 100b performs driving and cleaning while following the first robot 100a and maintaining an appropriate distance from it.
  • the first robot 100a controls the second robot 100b so that the second robot 100b follows the first robot 100a.
  • the first robot 100a and the second robot 100b exist within a specific area where mutual communication is possible, and the second robot 100b grasps at least a relative position of the first robot 100a.
  • the communication unit 1100 of the first robot 100a and the communication unit 1100 of the second robot 100b mutually exchange IR signals, ultrasonic signals, carrier frequencies, impulse signals, and the like, and through triangulation or similar methods the relative positions of the first robot 100a and the second robot 100b can be grasped.
  • the present invention is not limited thereto, and the relative positions of the first robot 100a and the second robot 100b may be grasped through triangulation using one of the various wireless communication technologies described above.
  • the second robot 100b may be controlled based on map information stored in the first robot 100a, the server 500, the terminal 300, or the like. Also, the second robot 100b may share obstacle information sensed by the first robot 100a. In addition, the second robot 100b may perform an operation according to a control command received from the first robot 100a (eg, a control command related to traveling such as traveling direction, traveling speed, and stop).
  • the second robot 100b performs cleaning while driving along the travel path of the first robot 100a.
  • the traveling directions of the first robot 100a and the second robot 100b do not always coincide: the first robot 100a moves or rotates up/down/left/right, and the second robot 100b moves or rotates up/down/left/right after a predetermined time, so their current traveling directions may differ.
  • traveling speed Va of the first robot 100a and the traveling speed Vb of the second robot 100b may be different from each other.
  • the first robot 100a is controlled to vary the traveling speed Vb of the second robot 100b in consideration of the communicable distance between the first robot 100a and the second robot 100b.
  • when the distance between the first robot 100a and the second robot 100b increases, the first robot 100a may control the traveling speed Vb of the second robot 100b to be faster than before.
  • conversely, when the distance decreases, the traveling speed Vb of the second robot 100b may be controlled to be slower than before, or the second robot 100b may be stopped for a predetermined time. Through this, the second robot 100b can continue to follow the first robot 100a and perform cleaning.
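The speed-adjustment rule described above can be sketched as a simple controller. This is an illustrative sketch only: the thresholds, the communicable-distance limit, and the step size are hypothetical values, not taken from the patent.

```python
def follower_speed(distance, v_current, comm_limit=5.0, target=0.5, step=0.05):
    """Adjust the follower's speed Vb from the measured separation distance.

    Hypothetical thresholds (meters): if the leader pulls away toward the
    communicable limit, speed up strongly; if the follower gets too close,
    stop for a moment; otherwise nudge the speed toward the target gap.
    """
    if distance > comm_limit:
        return v_current + 2 * step      # leader nearly out of range: hurry
    if distance > target:
        return v_current + step          # falling behind: speed up
    if distance < target / 2:
        return 0.0                       # too close: stop briefly
    return v_current - step              # slightly ahead: ease off

# Example: gap opened to 0.6 m, so Vb is nudged upward.
print(follower_speed(0.6, 0.3))
```

The controller realizes both branches in the text: Vb increases while the gap grows and decreases (or goes to zero) while the gap shrinks.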
  • the first robot 100a and the second robot 100b may perform driving and cleaning while following each other or cooperating with each other without user intervention.
  • for this, it is necessary for the first robot 100a to grasp the position of the second robot 100b, or for the second robot 100b to grasp the position of the first robot 100a. This may mean identifying the relative positions of the first robot 100a and the second robot 100b.
  • using one of the aforementioned wireless communication technologies (eg, one of Zigbee, Z-wave, Blue-Tooth, and Ultra-wide Band), the relative positions of the first robot 100a and the second robot 100b can be grasped through triangulation or the like.
  • since the triangulation method for obtaining the relative positions of two devices is a general technique, detailed descriptions are omitted herein. Hereinafter, an example in which the first robot 100a and the second robot 100b determine (recognize) their relative positions in the system 1 using the UWB module will be described.
  • the UWB module (or UWB sensor) may be included in the communication unit 1100 of each of the first robot 100a and the second robot 100b.
  • alternatively, the UWB module may be included in the sensing unit 1400 of each of the first robot 100a and the second robot 100b.
  • the first robot 100a and the second robot 100b can measure the time of the signal transmitted and received between the UWB modules included in each, and thereby determine the distance (separation distance) between the first robot 100a and the second robot 100b.
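How a separation distance follows from signal timing can be illustrated with single-sided two-way ranging, a common UWB scheme. The ranging scheme and the reply-delay handling here are assumptions for illustration, not details taken from the patent.

```python
C = 299_792_458.0  # speed of light in m/s: UWB pulses travel at ~c

def twr_distance(t_round, t_reply):
    """Single-sided two-way ranging.

    t_round: time measured by robot A between sending its poll and
             receiving robot B's response (seconds).
    t_reply: B's known internal processing delay (seconds).
    The one-way time of flight is half of what remains.
    """
    tof = (t_round - t_reply) / 2.0
    return C * tof

# Example: a 2 m separation gives a one-way flight time of ~6.67 ns.
t_flight = 2.0 / C
print(round(twr_distance(2 * t_flight + 1e-6, 1e-6), 3))  # → 2.0
```

In practice, more elaborate schemes (eg double-sided ranging) are used to cancel clock drift between the two modules, but the distance-from-time principle is the same.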
  • the first robot 100a and the second robot 100b may be disposed in one cleaning space.
  • a house which is an entire space in which cleaning is typically performed, may be divided into several spaces such as a living room, a room, and a kitchen.
  • the first robot 100a has map information for the entire space in a state in which the space has been cleaned at least once.
  • the map information may be input by a user or based on a record obtained by the first robot 100a while cleaning.
  • even if the first robot 100a in FIG. 6 is located in the living room or kitchen, it can have map information for the entire space of the house.
  • each of the first robot 100a and the second robot 100b may be assigned a charging station. That is, the two robots 100a and 100b do not share a charging station, and the batteries may be charged at a charging station corresponding to each.
  • the first robot 100a may be docked to the first charging station to charge the battery
  • the second robot 100b may be docked to the second charging station to charge the battery.
  • each of the first robot 100a and the second robot 100b may store information on the location of the charging station between each other.
  • the position information of the second charging station is stored in the first robot 100a so that the position can be recognized when the second robot 100b is docked, and the position information of the first charging station is stored in the second robot 100b so that the position can be recognized when the first robot 100a is docked.
  • a process in which the first robot 100a and the second robot 100b collaborate in such a space may be as shown in FIG. 7 .
  • the map information of the first robot 100a may be transmitted to the second robot 100b (S1).
  • map information may be transmitted while the communication unit 1100 of the first robot 100a and the second robot 100b communicates directly.
  • the first robot 100a and the second robot 100b may transmit information through another network such as Wi-Fi or through a server as a medium.
  • the shared map information may be map information including the location where the first robot 100a is disposed.
  • since the first robot 100a and the second robot 100b can exist together in the entire space called a house, and furthermore in a more specific space such as a living room, it is desirable for the two robots 100a and 100b to share map information for the space where they are located.
  • the first robot 100a and the second robot 100b can move from their respective charging stations to start cleaning, but the user may also move each robot to the space that needs cleaning.
  • when the first robot 100a and the second robot 100b are respectively powered on and driven (S2), it is possible for the first robot 100a and the second robot 100b to move. In particular, it is possible for the second robot 100b to move in a direction in which the distance from the first robot 100a decreases.
  • the specific distance is 50 cm or less.
  • the specific distance may mean a distance for an initial arrangement set for cleaning while the first robot 100a and the second robot 100b travel together. That is, when the two robots 100a and 100b are disposed at a specific distance, thereafter, the two robots may perform cleaning together according to a predetermined algorithm.
  • since the first robot 100a and the second robot 100b can communicate directly, it can be confirmed that the distance from the first robot 100a decreases while the second robot 100b moves.
  • since the accuracy with which the second robot 100b estimates the position and direction of the first robot 100a is not high, a technique for increasing the accuracy may be added later.
  • while the first robot 100a and the second robot 100b move, the second robot 100b continuously moves so that the interval between the two robots stays within a certain distance. For example, if the second robot 100b moves while drawing a circular trajectory and the distance decreases when moving in a specific direction, it can check whether the distance keeps decreasing while continuing to move in that direction.
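The probe-and-follow idea above (move, measure the distance, keep the heading along which the distance shrinks) can be sketched as a small search loop. This is a toy sketch, not the patent's algorithm: the probe angles, the step size, and the fixed leader position are all hypothetical.

```python
import math

def approach_step(position, heading, measure_distance, step=0.1):
    """One approach step: try the current heading and a few rotated probes,
    keep whichever heading reduces the measured distance to the leader.

    measure_distance(pos) stands in for the UWB ranging result.
    """
    best = None
    for dtheta in (0.0, math.pi / 4, -math.pi / 4, math.pi / 2, -math.pi / 2):
        h = heading + dtheta
        candidate = (position[0] + step * math.cos(h),
                     position[1] + step * math.sin(h))
        d = measure_distance(candidate)
        if best is None or d < best[0]:
            best = (d, candidate, h)
    return best[1], best[2]

# Toy run: leader fixed at (1, 1); the follower walks toward it from (0, 0).
leader = (1.0, 1.0)
dist = lambda p: math.hypot(p[0] - leader[0], p[1] - leader[1])
pos, heading = (0.0, 0.0), 0.0
for _ in range(20):
    pos, heading = approach_step(pos, heading, dist)
print(dist(pos) < 0.2)  # the follower has closed most of the gap
```

Once within the specific distance, the robots would switch to the image-based alignment described next.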
  • the image captured by the first robot 100a is transmitted to the second robot 100b (S4).
  • the first robot 100a and the second robot 100b may directly communicate, or communicate through another network or server.
  • when the first robot 100a and the second robot 100b are located within a specific distance, images captured by the first robot 100a and the second robot 100b may be similar.
  • if the cameras provided in the first robot 100a and the second robot 100b are each arranged toward the front and upward, the images taken by the two robots 100a and 100b can be the same when their positions and directions are the same. Therefore, by comparing the images taken by the two robots 100a and 100b and adjusting their positions and directions, the initial position and direction from which the two robots 100a and 100b start cleaning can be aligned.
  • FIG. 8(a) is a diagram illustrating a state in which an image is captured by the first robot 100a
  • FIG. 8(b) is a diagram illustrating an image captured by the second robot 100b.
  • a camera is installed in the first robot 100a and the second robot 100b to photograph the upper side of the front, and the photographing is performed in the direction indicated by the arrow in each drawing.
  • in the image captured by the first robot 100a, the a2 feature point is disposed on the left and the a1 feature point on the right with respect to the direction of the arrow. That is, it is possible to select characteristic points from the image photographed by the first robot 100a, and to select different characteristic points on the left and right sides with respect to the front photographed by the camera. Accordingly, the left and right sides of the image captured by the camera can be distinguished.
  • in the second robot 100b, photographing is initially performed based on the dotted arrow. That is, the camera provided in the second robot 100b is arranged to face the upper front side, and with respect to the dotted arrow the a1 and a4 feature points are arranged on the left side of the photographed part and the a3 feature point on the right side. Therefore, when the control unit provided in the second robot 100b compares the feature points, it can be confirmed that there is a difference between the feature points of the images captured by the two robots 100a and 100b.
  • in this way, the heading angles of the two robots 100a and 100b may be similarly aligned. Furthermore, if the feature points are similarly arranged in the images provided by the two robots 100a and 100b, it can be confirmed that the positions from which the two robots 100a and 100b view the feature points are adjacent to each other within a specific distance, and the robots can accurately specify each other's positions.
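The left/right comparison described above can be sketched as follows. This is an illustrative sketch only: the feature labels match FIG. 8, but the set representation and the rotation-direction convention are assumptions, not the patent's method.

```python
def alignment_action(leader_view, follower_view):
    """Compare left/right feature arrangements from two robots' front cameras.

    Each view is (left_features, right_features), two sets of feature-point
    labels (eg "a1", "a2"). If the follower sees the leader's right-side
    features on its left (or vice versa), it should rotate until the
    arrangements match; the direction convention here is illustrative.
    """
    l_left, l_right = leader_view
    f_left, f_right = follower_view
    if l_left == f_left and l_right == f_right:
        return "aligned"                 # heading angles roughly coincide
    if l_left & f_right or l_right & f_left:
        # a shared feature appears on the wrong side: rotate to swap sides
        return "rotate_left" if l_left & f_right else "rotate_right"
    return "search"                      # no common features: keep moving

leader = ({"a2"}, {"a1"})                # from FIG. 8(a)
follower = ({"a1", "a4"}, {"a3"})        # from FIG. 8(b): a1 is on the left
print(alignment_action(leader, follower))
```

Repeating this check after each small rotation converges on the state where both robots see the same features on the same sides, ie the aligned initial pose.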
  • the characteristic point may be a large object easily distinguishable by a characteristic or a part of a large object easily distinguishable by a characteristic.
  • the feature point may be an object such as an air purifier, a door, a TV, or the like, or a part of an object, such as a corner of a wardrobe, bed, or the like.
  • when the feature points are arranged at similar positions in the two images, it can be judged that the second robot 100b is arranged at the initial position before starting to drive with the first robot 100a. If there is a difference between the image provided by the first robot 100a and the image currently captured by the second robot 100b, the image captured by the camera of the second robot 100b can be changed by moving or rotating the second robot 100b.
  • when the image captured by the camera of the second robot 100b and the image provided by the first robot 100a are compared with each other, if the position change of the feature points in the two images occurs in a similar direction, it can also be determined that the second robot 100b is arranged in the initial position before starting to drive with the first robot 100a.
  • each feature point is classified as being on the left or right of the front center of the first robot 100a or the second robot 100b because the cameras of the second robot 100b and the first robot 100a are each arranged to face forward, which makes it easy for the control unit 1800 of the second robot 100b to detect the position and direction of the other cleaner.
  • the second robot 100b moves or rotates so that the left and right arrangement of the feature points becomes the same as the left and right arrangement transmitted from the first robot 100a, so that the second robot 100b can be arranged in a line at the rear of the first robot 100a.
  • when the fronts of the second robot 100b and the first robot 100a coincide with each other, it can be easy to select an initial movement direction when cleaning together.
  • the location of the first robot 100a can be determined from the map information shared by the second robot 100b (S6).
  • the first robot 100a and the second robot 100b can exchange location information with each other while moving based on a navigation map and/or a SLAM map shared with each other.
  • the second robot 100b may acquire an image through the sensing unit 1400 while moving or after moving a predetermined distance, and extract region feature information from the acquired image.
  • the controller 1800 may extract region characteristic information based on the acquired image.
  • the extracted area feature information may include a set of probability values for the area and the object recognized based on the obtained image.
  • the controller 1800 may determine the current location based on SLAM-based current location node information and the extracted area characteristic information.
  • the SLAM-based current location node information may correspond to a node most similar to the feature information extracted from the acquired image among pre-stored node feature information. That is, the controller 1800 may select the current location node information by performing location recognition using feature information extracted from each node.
  • the controller 1800 may perform location recognition using both feature information and area feature information to increase the accuracy of location recognition. For example, the controller 1800 compares the extracted region characteristic information with pre-stored region characteristic information to select a plurality of candidate slam nodes, and from among the selected plurality of candidate slam nodes, a SLAM-based The current location may be determined based on candidate slam node information most similar to the current location node information of .
  • the controller 1800 may determine current location node information based on SLAM and correct the determined current location node information according to the extracted area feature information to determine the final current location.
  • the control unit 1800 may select, from among the pre-stored area characteristic information of nodes existing within a predetermined range of the SLAM-based current location node, the node most similar to the extracted area characteristic information, and determine it as the final current location.
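The node-correction step above can be sketched as follows. This is a minimal sketch under assumptions: the region features are {label: probability} dictionaries, the similarity score is a simple histogram intersection, and all node ids and values are hypothetical.

```python
def correct_location(slam_node, extracted_regions, node_regions, neighbors):
    """Correct a SLAM-based current-location node with region feature info.

    slam_node:         node id chosen by SLAM feature matching
    extracted_regions: {label: probability} recognized in the current image
    node_regions:      {node_id: {label: probability}} stored while mapping
    neighbors:         {node_id: [node ids within a predetermined range]}
    Picks, among the SLAM node and its nearby candidates, the one most
    similar to the extracted region feature information.
    """
    def score(node):
        stored = node_regions[node]
        shared = set(stored) & set(extracted_regions)
        # Histogram intersection as a stand-in similarity measure.
        return sum(min(stored[k], extracted_regions[k]) for k in shared)

    candidates = [slam_node] + neighbors.get(slam_node, [])
    return max(candidates, key=score)

node_regions = {
    "n1": {"living_room": 0.9, "sofa": 0.8},
    "n2": {"kitchen": 0.9, "sink": 0.7},
    "n3": {"living_room": 0.4, "tv": 0.6},
}
neighbors = {"n1": ["n2", "n3"]}
# Image strongly suggests a kitchen: n1 is corrected to the nearby kitchen node.
print(correct_location("n1", {"kitchen": 0.8, "sink": 0.6}, node_regions, neighbors))
```

When the extracted regions agree with the SLAM node's own stored regions, the correction leaves the node unchanged, which is the common case.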
  • when creating a map, the control unit 1800 can extract and store area feature information (ex. living room: sofa, table, TV / kitchen: dining table, sink / room: bed, desk), and can then estimate the positions of the first robot 100a and the second robot 100b using information on the features of various areas in the indoor environment.
  • according to the present invention, it is possible to estimate a location robustly against changes in lighting/illuminance by storing the features of objects and regions rather than using only specific points in the image when storing the environment.
  • when the sensing unit 1400 is obstructed by an object, an image containing enough feature points such as corners may not be acquired.
  • the accuracy of extracting feature points using the ceiling image may be lowered at a specific location.
  • in this case, the control unit 1800 can determine the current location from area feature information, such as a sofa or living room, in addition to feature points such as corners.
  • the second robot 100b may perform cleaning while driving along the first robot 100a.
  • the location may be determined through mutual communication. For example, as described above, by measuring the time of a signal transmitted/received between the UWB modules included in each of the first robot 100a and the second robot 100b, the distance (separation distance) between the two robots 100a and 100b can be found. In this case, the distance (separation distance) between the two robots 100a and 100b may be determined through a conversion equation using the coordinates of the positions where signals are transmitted and received between the UWB modules.
  • the conversion equation may mean an expression that converts first coordinates, which express the current position of the first robot 100a with reference to its previous position, into second coordinates, which express the current position of the first robot 100a with reference to the position of the main body of the second robot 100b.
  • the previous position of the first robot 100a is expressed by a dotted line, and the current position is expressed by a solid line.
  • the position of the second robot 100b is represented by a solid line.
  • the transformation equation is expressed with a 3X3 matrix H in Equation 1 below, using homogeneous coordinates:
  • <Equation 1> M = H·R, where R = (Xr, Yr, 1)ᵀ and M = (Xm, Ym, 1)ᵀ
  • here, Xr and Yr are the first coordinates, and Xm and Ym are the second coordinates
  • the first coordinates may be calculated based on information provided by the driving unit 1300 that moves the first robot 100a.
  • the information provided by the driving unit 1300 of the first robot 100a is information derived from an encoder that measures the rotation of the motor that rotates the wheels, and it can be calibrated by a gyro sensor that detects the rotation of the first robot 100a.
  • the driving unit 1300 provides the driving force that moves or rotates the first robot 100a, so the first coordinates can be calculated even when the second robot 100b cannot receive a signal provided by the first robot 100a. Therefore, a relatively accurate position can be determined compared to position information calculated by transmitting and receiving signals between the two robots 100a and 100b. In addition, since the driving unit 1300 holds information about the actual movement of the first robot 100a, a change in the position of the first robot 100a can be described accurately.
  • in particular, when using a gyro sensor, a change in the position of the first robot 100a may be accurately calculated. Even when the motor that rotates the wheels is rotated, the first robot 100a may only rotate in place without moving, so the rotation of the motor does not necessarily mean that the position has moved. Accordingly, when the gyro sensor is used, a case in which only rotation occurs without a change in position, a case in which both a change in position and rotation occur, and a case in which only a change in position occurs without rotation can be distinguished.
  • the first robot 100a can accurately calculate the first coordinates converted from the previous position to the current position.
  • this information may be transmitted to a network through the communication unit 1100 of the first robot 100a, and may be transmitted to the second robot 100b through the network.
  • the second coordinate is measured by a signal transmitted/received between the first robot 100a and the second robot 100b (eg, a signal may be transmitted/received using a UWB module).
  • the second coordinates may be calculated when the first robot 100a is present in the sensing area of the second robot 100b so that the signal can be transmitted and received.
  • data when the first robot 100a is disposed in the sensing area of the second robot 100b may be continuously accumulated.
  • such data is expressed as in the following <Equation 2>.
  • a lot of data is accumulated when the first robot 100a is located in the sensing area.
  • the data consists of a plurality of first coordinates and a plurality of second coordinates corresponding to each.
  • to obtain a reliable value of H from the accumulated data, the least squares method may be used as shown in Equation 3 below.
  • as more data is accumulated, H has a more reliable value.
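The least-squares estimation of the transform from accumulated coordinate pairs can be sketched as follows. This sketch assumes H is a 2D rigid transform (one rotation angle plus a translation), which admits a closed-form least-squares solution; the patent's H is a general 3X3 matrix, so this is an illustrative special case, not the claimed method.

```python
import math

def fit_transform(first_pts, second_pts):
    """Least-squares 2D rigid transform (rotation theta, translation t)
    mapping first coordinates R_i to second coordinates M_i, playing the
    role of H in Equation 1. Closed-form minimizer of
    sum_i ||M_i - (Rot(theta) R_i + t)||^2.
    """
    n = len(first_pts)
    cx1 = sum(p[0] for p in first_pts) / n
    cy1 = sum(p[1] for p in first_pts) / n
    cx2 = sum(p[0] for p in second_pts) / n
    cy2 = sum(p[1] for p in second_pts) / n
    # Accumulate the cross terms that determine the optimal rotation.
    s_cos = s_sin = 0.0
    for (x1, y1), (x2, y2) in zip(first_pts, second_pts):
        ax, ay = x1 - cx1, y1 - cy1
        bx, by = x2 - cx2, y2 - cy2
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)
    tx = cx2 - (math.cos(theta) * cx1 - math.sin(theta) * cy1)
    ty = cy2 - (math.sin(theta) * cx1 + math.cos(theta) * cy1)
    return theta, (tx, ty)

def apply_transform(theta, t, p):
    # M = H R: map a first coordinate into the second robot's frame.
    x, y = p
    return (math.cos(theta) * x - math.sin(theta) * y + t[0],
            math.sin(theta) * x + math.cos(theta) * y + t[1])

# Noiseless check: recover a known rotation and translation.
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 3.0)]
mapped = [apply_transform(0.5, (1.0, 2.0), p) for p in pts]
theta, t = fit_transform(pts, mapped)
print(round(theta, 6), round(t[0], 6), round(t[1], 6))
```

As the text notes, the more (R_i, M_i) pairs are accumulated while the leader is in the sensing area, the less the fitted transform is affected by measurement noise.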
  • using this relationship, the second robot 100b can follow the first robot 100a.
  • the first robot 100a temporarily leaves the sensing area of the second robot 100b, so that the second robot 100b can directly receive a signal regarding the position of the first robot 100a through the sensing unit. If not, the second robot 100b uses the driving information of the first robot 100a transmitted through the network to determine the position of the first robot 100a compared to the position of the second robot 100b. It can be calculated by the conversion formula.
  • for the second robot 100b to determine the position of the first robot 100a by the conversion equation, the first coordinates corresponding to R must be received through the communication unit 1100 of the second robot 100b. That is, knowing R and H, M can be calculated. M is the position of the first robot 100a with respect to the second robot 100b. Accordingly, the second robot 100b can know its relative position with respect to the first robot 100a, and the second robot 100b can move along the first robot 100a.
  • in this way, the position of the other robot is determined using the result of mutual communication without using map information, so the first robot 100a and the second robot 100b can locate each other through map-less position recognition even while driving.
  • the first robot 100a and the second robot 100b may perform cooperative driving as shown in FIG. 10 .
  • the cleaning target area in which the first robot 100a and the second robot 100b travel may be divided into one or more zones (Z1 to Z3), as shown in FIG. 10, and cleaning may be performed zone by zone.
  • the first robot 100a starts cleaning the first zone Z1, and the second robot 100b can wait near the starting position of the first robot 100a.
  • when the first robot 100a completes cleaning the first zone Z1 to a certain standard or more, the first robot 100a can transmit information on the cleaned area to the second robot 100b. For example, the first robot 100a transmits information about the first zone Z1, or information on the path traveled by the first robot 100a in the first zone Z1, to the second robot 100b, so that the second robot 100b can travel according to the travel path of the first robot 100a.
  • the first robot 100a transmits information about the cleanable area to the second robot 100b, and then cleans the remaining part of the first zone Z1 or moves to the second zone Z2 and cleans the second zone Z2, and the second robot 100b may clean the first zone Z1 based on the information received from the first robot 100a.
  • the second robot 100b may perform cleaning while driving along the path traveled by the first robot 100a based on the information received from the first robot 100a.
  • the first robot 100a cleans the second zone Z2 while the second robot 100b cleans the first zone Z1, and after completing the second zone Z2, the first robot 100a can move to the next uncleaned zone, the third zone Z3. At this time, just as the first robot 100a transmitted information on the cleanable area in the first zone Z1 to the second robot 100b, it can transmit the corresponding information for the second zone Z2 to the second robot 100b. Accordingly, after the second robot 100b completes the cleaning of the first zone Z1, it can move to the second zone Z2 and perform cleaning of the second zone Z2.
  • the first robot 100a cleans the third zone Z3, and while the first robot 100a cleans the third zone Z3, the second robot 100b may clean the second zone Z2.
• When the first robot 100a completes cleaning of the third zone Z3, the second robot 100b may similarly move to the third zone Z3 and clean the third zone Z3 while driving along the route traveled by the first robot 100a.
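The zone-by-zone relay described above can be sketched as a simple timeline in which the leading robot (100a) is always one zone ahead of the following robot (100b). This is a hypothetical illustration only; the zone names and the one-zone-ahead handoff rule come from the description, and everything else (function name, data shapes) is assumed.

```python
def relay_timeline(zones):
    """Per time-slot zone assignment for the leader/follower relay:
    the leading robot is one zone ahead of the following robot."""
    timeline = []
    for t in range(len(zones) + 1):
        leader = zones[t] if t < len(zones) else None      # robot 100a's current zone
        follower = zones[t - 1] if t >= 1 else None        # robot 100b re-travels it
        timeline.append({"robot_100a": leader, "robot_100b": follower})
    return timeline

# For zones Z1..Z3: robot 100b waits while 100a cleans Z1, then follows
# one zone behind until it finishes Z3 after 100a is done.
timeline = relay_timeline(["Z1", "Z2", "Z3"])
```

In the first slot the follower is idle (it waits near the leader's starting position), and in the last slot the leader is idle, matching the description above.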
• In the system 1 in which the first robot 100a and the second robot 100b cooperate, the cooperative driving may or may not be performed depending on the driving states of the first robot 100a and the second robot 100b.
• For example, the cooperative driving may not be performed when the battery charge capacity of one or more of the first robot 100a and the second robot 100b does not meet a certain standard, so that the cooperative driving cannot be completed, or when one or more of the first robot 100a and the second robot 100b is located in an area where mutual location recognition is impossible, so that the location of the other robot is not recognized and it is difficult to start the cooperative driving.
  • the cooperative driving in the system 1 may be performed when the driving states of the first robot 100a and the second robot 100b satisfy certain criteria.
• As shown in FIG. 11, the embodiment of the system 1 includes a plurality of mobile robots 100a and 100b that perform cleaning while traveling in an area to be cleaned, and a controller 600 that communicates with the plurality of mobile robots 100a and 100b and transmits a control command for remote control to the plurality of mobile robots 100a and 100b.
  • the plurality of mobile robots 100a and 100b may include two robots, preferably, the first robot 100a and the second robot 100b.
• The first robot 100a may be a robot that travels ahead in the target area of the cooperative driving and sucks dust, and the second robot 100b may be a robot that travels afterward in the area in which the first robot 100a traveled and wipes the dust.
• Hereinafter, the plurality of mobile robots 100a and 100b refers to both the first robot 100a and the second robot 100b.
  • the controller 600 may be one or more of the terminal 300 , a control device of the server 500 , and a remote controller of the first robot 100a and the second robot 100b .
• The first robot 100a and the second robot 100b may be driven by receiving the control command from one or more of the terminal 300, the control device of the server 500, and the remote controllers of the first robot 100a and the second robot 100b.
• The controller 600 may preferably be a mobile terminal.
  • the first robot 100a and the second robot 100b may perform the cooperative driving mode by the terminal 300 .
  • the cooperative driving in the system 1 may be performed by transmitting a control command for the cooperative driving from the controller 600 to the first robot 100a and the second robot 100b.
• When the plurality of mobile robots 100a and 100b receive, from the controller 600, a control command for a cooperative driving mode for collaboratively cleaning the cleaning target area, the plurality of mobile robots 100a and 100b determine whether the driving state corresponds to a preset reference condition, and perform a motion for the cooperative driving mode according to the determination result.
  • the plurality of mobile robots 100a and 100b may compare each driving state with the reference condition and perform a motion for the cooperative driving mode according to the comparison result.
  • the cooperative driving mode may mean an operation mode in which the plurality of mobile robots 100a and 100b perform the cooperative driving.
  • the cooperative driving mode may be a mode in which the plurality of mobile robots 100a and 100b run sequentially and clean.
  • it may be a mode in which the first robot 100a and the second robot 100b run sequentially in a predetermined area and clean.
• The cooperative driving mode may be a mode in which one of the plurality of mobile robots 100a and 100b cleans an area first, and the other robot travels the same area afterward and cleans it.
• For example, the first robot 100a may travel ahead, and the second robot 100b may travel afterward and clean.
  • a process in which the cooperative driving mode is performed in the system 1 may be as shown in FIG. 12 .
  • conditions under which the cooperative driving mode is performed in the system 1 according to the process shown in FIG. 12 may be as shown in FIG. 13 .
• When the control command is received, the plurality of mobile robots 100a and 100b stop the operation being performed at the current location, and the driving state may be determined (S20).
  • the plurality of mobile robots 100a and 100b may be performing different operation modes, or may be docked to each of the charging stations 400a and 400b.
• The plurality of mobile robots 100a and 100b receive the control command regardless of whether other operation modes are being performed or whether they are docked to the charging stations 400a and 400b, and may determine the driving state at the current position (S20).
  • the driving state may mean a state for performing cooperative driving of each of the plurality of mobile robots 100a and 100b.
• The driving state may include one or more pieces of state information of the plurality of mobile robots 100a and 100b that are compared with the reference condition.
• As shown in FIG. 13, the driving state may include one or more of a map sharing state (driving state 1), a battery charging state (driving state 2), and a charging station location information state (driving state 3) of each of the plurality of mobile robots 100a and 100b. That is, when determining the driving state (S20), one or more of the map sharing state (driving state 1), the battery charging state (driving state 2), and the charging station location information state (driving state 3) of each of the plurality of mobile robots 100a and 100b may be determined.
• The map sharing state may mean a state of whether the map information of each of the plurality of mobile robots 100a and 100b is shared with the other. That is, it may be a state of whether the map information of the second robot 100b is shared with the first robot 100a and whether the map information of the first robot 100a is shared with the second robot 100b.
  • the battery charge state may mean a battery charge capacity state of each of the plurality of mobile robots 100a and 100b. That is, it may be a state for each of the battery charging capacity of the first robot 100a and the battery charging capacity of the second robot 100b.
• The charging station location information state of the counterpart robot may refer to whether the charging station location information of the counterpart robot is stored in each of the plurality of mobile robots 100a and 100b. That is, it may be a state of whether the location information of the charging station 400b of the second robot 100b, which is the counterpart robot, is stored in the first robot 100a, and whether the location information of the charging station 400a of the first robot 100a, which is the counterpart robot, is stored in the second robot 100b.
  • the driving state may include all of a map sharing state of each of the plurality of mobile robots 100a and 100b, a battery charging state, and a charging station location information state of the other robot.
  • each of the plurality of mobile robots 100a and 100b may communicate with each other to share the determination result. Accordingly, each of the plurality of mobile robots 100a and 100b may grasp the driving state of all of the plurality of mobile robots 100a and 100b. Thereafter, one or more of the plurality of mobile robots 100a and 100b may compare the driving state with the reference condition to determine whether the driving state corresponds to the reference condition ( S30 to S50 ).
  • the reference condition may be a condition of the driving state in which the cooperative driving mode may be performed. That is, the reference condition may mean an initial state condition in which the cooperative driving mode can be performed. Accordingly, as the reference condition, conditions corresponding to the driving state may be preset.
• The reference condition may include one or more of a first condition in which each of the plurality of mobile robots 100a and 100b shares a map, a second condition in which the battery charge capacity of each of the plurality of mobile robots 100a and 100b is equal to or greater than a preset reference capacity, and a third condition in which information on the position of the charging station of the counterpart robot is stored in each of the plurality of mobile robots 100a and 100b.
• Preferably, the reference condition may include all of the first to third conditions. Accordingly, when comparing the driving state with the reference condition (S30 to S50), the plurality of mobile robots 100a and 100b may determine whether the map sharing state of each of the plurality of mobile robots 100a and 100b satisfies the first condition (S30), whether the battery charge state of each of the plurality of mobile robots 100a and 100b corresponds to the second condition (S40), and whether the charging station location information state of the counterpart robot in each of the plurality of mobile robots 100a and 100b corresponds to the third condition (S50).
• That is, the plurality of mobile robots 100a and 100b determine whether each of them shares a map (S30), whether the battery charging capacity of each of them is greater than or equal to the preset reference capacity (S40), and whether information on the location of the charging station of the counterpart robot is stored in each of them (S50), and a motion for the cooperative driving mode may be performed according to the results.
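The three-stage check (S30, S40, S50) and its outcomes (R1, R2, or mutual recognition S60) can be sketched as a single decision function. This is a hypothetical sketch: the branch labels come from the description, while the function name, the data shape, and the numeric reference capacity are assumed for illustration.

```python
REFERENCE_CAPACITY = 30  # assumed reference battery capacity (percent), not from the source

def check_reference_conditions(robots):
    """robots: list of dicts with keys 'map_shared', 'battery',
    'peer_station_known'. Returns the resulting branch label."""
    if not all(r["map_shared"] for r in robots):                     # first condition (S30)
        return "R2"                                                   # mode not performed
    if not all(r["battery"] >= REFERENCE_CAPACITY for r in robots):  # second condition (S40)
        return "R2"                                                   # mode not performed
    if all(r["peer_station_known"] for r in robots):                  # third condition (S50)
        return "R1"                                                   # start cooperative mode
    return "S60"                                                      # mutual position recognition
```

Note that failing the first or second condition ends the attempt (R2), whereas failing only the third condition falls through to the mutual recognition step rather than aborting.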
• As a result of determining whether the map sharing state of each of the plurality of mobile robots 100a and 100b corresponds to the first condition (S30), if the map sharing state corresponds to the first condition, the plurality of mobile robots 100a and 100b may determine whether the battery charge capacity state of each of them corresponds to the second condition (S40). If the map sharing state does not correspond to the first condition, the plurality of mobile robots 100a and 100b may not perform the cooperative driving mode (R2). That is, when each of the plurality of mobile robots 100a and 100b shares a map, it is determined that the cooperative driving mode can be performed through the shared map; when they do not share a map, collaborative cleaning of the same area is limited by the non-sharing of map information, so it is determined that the cooperative driving mode cannot be performed, and the cooperative driving mode is not performed (R2).
• As a result of determining whether the battery charge capacity of each of the plurality of mobile robots 100a and 100b corresponds to the second condition (S40), if the battery charging capacity of each of them corresponds to the second condition, it may be determined whether the charging station location information state of each of the plurality of mobile robots 100a and 100b corresponds to the third condition (S50).
• If the battery charge capacity of any of the plurality of mobile robots 100a and 100b does not correspond to the second condition, the cooperative driving mode may not be performed (R2).
  • at least one of the first robot 100a and the second robot 100b may output a notification regarding the shortage of the charging capacity of the battery. For example, a notification about the need for charging may be output from the robot whose charging capacity is less than the reference capacity.
• The plurality of mobile robots 100a and 100b determine (S50) whether the location information of the counterpart robot's charging stations 400a and 400b is stored in each of them. When the location information is stored, the cooperative driving mode may be performed (R1); when it is not stored, the mutual positions may be recognized (S60), and the motion for the cooperative driving mode may be performed according to the recognition result.
• As a result of determining whether the charging station 400a and 400b location information state of each of the plurality of mobile robots 100a and 100b corresponds to the third condition (S50), when it corresponds to the third condition, the first robot 100a may move to a location within a predetermined distance in front of the second robot 100b to perform the cooperative driving mode (R1). For example, by moving to a point 1 [m] in front of the second robot 100b, the first robot 100a may perform the cooperative driving mode ahead of the second robot 100b (R1).
• When the location information of the counterpart robot's charging stations 400a and 400b does not meet the third condition, the plurality of mobile robots 100a and 100b may each recognize the position of the other (S60).
• That is, when the location information of the counterpart robot's charging stations 400a and 400b is stored in each of the plurality of mobile robots 100a and 100b, it is determined that the cooperative driving mode can be performed with the stored location information, and the first robot 100a moves to a position within a predetermined distance in front of the second robot 100b to perform the cooperative driving mode (R1). When the location information is not stored, it is determined that the initial position and the end position of the counterpart robot cannot be identified because the position of the counterpart robot's charging stations 400a and 400b cannot be determined, and an operation for mutual location recognition (S60) may be performed.
• The plurality of mobile robots 100a and 100b recognize each other's positions (S60), and when the mutual positions are recognized (S70), the robot that is to travel first moves to a position within a certain distance from the other robot, and the cooperative driving mode may be performed (R1).
• When one robot does not recognize the position of the other robot, at least one of the plurality of mobile robots 100a and 100b outputs a notification informing that the unrecognized robot should be moved to the vicinity of the counterpart robot, and then the motion for the cooperative driving mode may be performed according to the movement result.
• When the unrecognized robot is moved to the vicinity of the counterpart robot, the unrecognized robot performs a position recognition operation for recognizing the position of the counterpart robot using a communication result with the counterpart robot, and then the cooperative driving mode may be performed according to a driving standard (R3).
• In this case, the first robot 100a may move to a location within a predetermined distance in the vicinity of the second robot 100b to perform a motion for performing the cooperative driving mode (R4).
• That is, the first robot 100a may move to a location within a predetermined distance in front of the second robot 100b to perform the cooperative driving mode (R1). For example, by moving to a point 1 [m] in front of the second robot 100b, the first robot 100a may perform the cooperative driving mode ahead of the second robot 100b.
• When any one of the plurality of mobile robots 100a and 100b does not recognize the position of the other robot (S80), at least one of the plurality of mobile robots 100a and 100b outputs a notification informing that the unrecognized robot should be moved to the vicinity of the counterpart robot. If the unrecognized robot is then moved to the vicinity of the counterpart robot by the user, the unrecognized robot recognizes the position of the counterpart robot according to the mapless position recognition method shown in FIG. 9, and the cooperative driving mode may be performed according to the driving standard (R3). For example, when the first robot 100a does not recognize the position of the second robot 100b, the first robot 100a may be moved to a position within a radius of 50 [cm] of the second robot 100b, after which the position of the second robot 100b may be recognized according to the mapless position recognition method shown in FIG. 9.
• Similarly, when the second robot 100b does not recognize the position of the first robot 100a, the second robot 100b may be moved to a position within a radius of 50 [cm] of the first robot 100a, after which the position of the first robot 100a may be recognized according to the mapless position recognition method shown in FIG. 9.
• Here, the vicinity of the counterpart robot may mean a distance at which the counterpart robot overlaps the angle of view of the camera 131, and may be within a radius of 50 [cm] of the counterpart robot.
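The "vicinity" criterion above can be modelled as a simple radius test. This is a hypothetical sketch: the 50 cm figure is the example radius from the description, and the function name and 2-D point representation are assumptions.

```python
import math

VICINITY_RADIUS_M = 0.5  # example radius from the description (50 cm)

def in_vicinity(p1, p2):
    """True when the two robots are close enough that the counterpart
    would fall inside the camera's angle of view, modelled here simply
    as being within the vicinity radius."""
    return math.dist(p1, p2) <= VICINITY_RADIUS_M
```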
  • the plurality of mobile robots 100a and 100b may perform the cooperative driving mode (R3) according to the driving standard.
  • the driving criterion may be a criterion for changing or limiting the setting of the cooperative driving mode.
• For example, the zone set in the cooperative driving mode may be divided into two or more small regions to be driven. Accordingly, the plurality of mobile robots 100a and 100b attempt to recognize each other's positions even while the cooperative driving mode is being performed, and the position recognition result may be corrected according to the attempt result. If none of the plurality of mobile robots 100a and 100b recognizes the position of the other (S80), the first robot 100a moves to a position within a certain distance in the vicinity of the second robot 100b; then, after each of the first robot 100a and the second robot 100b recognizes the position of the other robot according to the mapless position recognition method shown in FIG. 9, the motion for performing the cooperative driving mode may be performed (R4).
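The recognition outcomes described above (both recognize, one recognizes, neither recognizes) map onto the motions R1, R3, and R4. A minimal sketch of that dispatch, with the branch labels taken from the description and everything else assumed:

```python
def recognition_motion(r1_sees_r2, r2_sees_r1):
    """Map the mutual position recognition outcome (S70/S80) to the
    motion label used in the description."""
    if r1_sees_r2 and r2_sees_r1:
        return "R1"   # leader moves in front of the follower; mode starts
    if r1_sees_r2 or r2_sees_r1:
        return "R3"   # notify user to move the unrecognized robot nearby,
                      # then re-localize via the mapless method
    return "R4"       # leader approaches follower; both run mapless recognition
```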
  • the cooperative driving may be performed by the cooperative driving performing method as shown in FIG. 14 .
• The cooperative driving performing method (hereinafter referred to as the performing method) is a method in which the first robot 100a and the second robot 100b perform cooperative driving, and, as shown in FIG. 14, includes a step of inputting a command for the cooperative driving to the first robot 100a and the second robot 100b (S100), a step in which the first robot 100a compares the driving states of the first robot 100a and the second robot 100b with a preset reference condition (S200), and a step of performing a motion of each of the first robot 100a and the second robot 100b for the cooperative driving according to the comparison result (S300).
• The first robot 100a travels ahead in the target area of the cooperative driving and sucks dust, and the second robot 100b travels afterward in the area in which the first robot 100a traveled and wipes the dust.
• In the step of inputting a command for performing the cooperative driving (S100), the command may be input to each of the first robot 100a and the second robot 100b.
• In the step of inputting a command for performing the cooperative driving (S100), the operation of the first robot 100a and the second robot 100b at the current location may be stopped.
  • the first robot 100a may compare the driving state with a preset reference condition.
• As shown in FIG. 15, the step of comparing the driving state with a preset reference condition (S200) may include a step (S210) of comparing the map sharing state and the battery charge capacity state of each of the first robot 100a and the second robot 100b with the first condition and the second condition among the reference conditions, and a step (S220) of comparing, according to the result of the comparison with the first condition and the second condition, the storage state of the charging station location information of the counterpart robot in each of the first robot 100a and the second robot 100b with the third condition among the reference conditions.
• In the step (S300) of performing the motion for the cooperative driving, one or more of the motions of the first robot 100a and the second robot 100b for the cooperative driving may be performed according to the comparison result of the step (S200) of comparing the driving state with the preset reference condition. For example, the first robot 100a may move to a location within a certain distance in front of the second robot 100b; specifically, the first robot 100a may move to a position within x [m] in front of the second robot 100b to start the cooperative driving.
• In the step (S300) of performing the motion for the cooperative driving, each of the first robot 100a and the second robot 100b may recognize the other's position, and one or more of the first robot 100a and the second robot 100b may perform a motion for the cooperative driving according to the recognition result.
• In the step (S300) of performing the motion for the cooperative driving, when one robot does not recognize the position of the other robot as a result of the mutual position recognition, one or more of the first robot 100a and the second robot 100b output a notification informing that the unrecognized robot should be moved to the vicinity of the counterpart robot, and then the motion for the cooperative driving may be performed according to the movement result.
• For example, the unrecognized robot may be moved to within a radius of y [cm] of the counterpart robot to perform the motion for the cooperative driving.
• After the unrecognized robot recognizes the position of the counterpart robot using a communication result with the counterpart robot, the cooperative driving may be performed according to a preset driving standard.
• The performing method, comprising the step of inputting a command for performing the cooperative driving (S100), the step of comparing the driving state with a preset reference condition (S200), and the step of performing the motion for the cooperative driving (S300), may be implemented as computer-readable code on a medium in which a program is recorded.
  • the computer-readable medium includes all types of recording devices in which data readable by a computer system is stored. Examples of computer-readable media include Hard Disk Drive (HDD), Solid State Disk (SSD), Silicon Disk Drive (SDD), ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
  • HDD Hard Disk Drive
  • SSD Solid State Disk
  • SDD Silicon Disk Drive
  • ROM Read Only Memory
• RAM Random Access Memory
  • CD-ROM Compact Disk Read Only Memory
  • the computer may include the control unit 1800 .
• <Example 1> of the mobile robot system 1, which performs a preset scenario in response to a trap situation occurring while performing cooperative driving, will be described with reference to FIGS. 16 to 21.
• When the first robot 100a and the second robot 100b enter the cooperative driving mode and each other's position information is identified, the first robot 100a precedes the second robot 100b and may suck contaminants in the area to be cleaned.
  • cleaning may be performed in units of the divided zones.
  • the fourth zone Z4 refers to a cleaning zone in which the second robot 100b is scheduled to travel after the first robot 100a completes the driving.
  • the fifth zone Z5 refers to a cleaning zone in which the first robot 100a is scheduled to travel.
  • the sixth zone Z6 refers to a cleaning zone in which the first robot 100a is scheduled to travel after cleaning of the fifth zone Z5 is completed.
• The fourth zone Z4 to the sixth zone Z6 are divided by the outer wall and the entrances (D1, D2) as boundaries.
• However, the embodiment is not limited thereto, and the fourth zone Z4 to the sixth zone Z6 may be divided based on a certain size, or based on an outer wall, corners, furniture, and the like, in any manner in which the cooperative driving of the mobile robot system 1 can be performed efficiently.
  • the first robot 100a may travel along the first travel path L1
  • the second robot 100b may travel along the second travel path L2 .
• The first travel path L1 refers to all paths along which the first robot 100a cleans the area to be cleaned, including paths along which the first robot 100a bypasses obstacles.
• The second travel path L2 refers to all paths along which the second robot 100b cleans the area to be cleaned; it may be set to be the same as the travel path that the first robot 100a has already traveled, or it may be a modified path such as a detour.
• In the illustrated case, the first robot 100a has completed the cleaning of the fifth zone Z5, the second robot 100b has completed the cleaning of the fourth zone Z4, and the entrance D1 through which the robots can move from the fifth zone Z5 to the sixth zone Z6 is closed; accordingly, this represents a case in which the first robot 100a is in a trap situation.
• The trap situation means a situation in which a robot cannot enter a cleaning target area that it has not yet traveled. That is, it means a situation in which the first robot 100a and/or the second robot 100b cannot enter the uncleaned area. Accordingly, the trap situation of the first robot 100a in FIG. 15 means that the first robot 100a has completed the cleaning of the fifth zone Z5 but cannot enter the sixth zone Z6, which is the next scheduled cleaning zone.
• The trap situation also includes situations in which the first robot 100a or the second robot 100b cannot enter an untraveled cleaning target area due to various obstacles such as chairs, desks, and furniture, in addition to a closed door.
• The trap escape driving refers to a driving method in which the first robot 100a or the second robot 100b travels along the outer edge or boundary of the already-cleaned area. The third path L3 refers to all paths along which the first robot 100a or the second robot 100b travels along the outer edge or boundary of the cleaned area as the trap escape driving is performed.
• Escaping the trap situation means that the first robot 100a and/or the second robot 100b becomes able to enter the uncleaned area that it previously could not enter. Accordingly, when the first robot 100a is in a trap situation and the second robot 100b is not, and the first robot 100a escapes the trap situation by performing the trap escape driving, that is, when neither the first robot 100a nor the second robot 100b is in a trap situation any longer, the first robot 100a and the second robot 100b perform cooperative driving again.
• The second robot 100b may stand by in place for a preset first time. Also, after the second robot 100b ends the driving of the fourth zone Z4 being cleaned, it may wait for the preset first time at the ending point. If the first robot 100a escapes the trap situation while the second robot 100b is waiting, the second robot 100b releases the standby state and performs cooperative driving with the first robot 100a.
• If the first robot 100a does not escape the trap situation, the second robot 100b waits for the preset first time and then may re-clean the already cleaned fourth zone Z4 along the fourth path L4. The re-cleaning time may be set to a preset second time.
• The fourth path L4 refers to all paths for performing re-cleaning of the area to be cleaned, such as returning along the second path L2 that has already been traveled, or driving while avoiding obstacles. If the first robot 100a escapes the trap situation while the second robot 100b is re-cleaning the already cleaned fourth zone Z4, the second robot 100b and the first robot 100a perform cooperative driving again.
• For example, the first time may be set to 1 minute and the second time to 9 minutes, but the embodiment is not limited thereto. If the waiting time of the second robot 100b is long, water may accumulate on the floor, so the first time can be set to an appropriate time to prevent this. In addition, the second time may be set to a time appropriate for the second robot 100b to perform re-cleaning while waiting for the first robot 100a to escape the trap situation.
• In that case, the second robot 100b stops the re-cleaning and performs cooperative driving again.
• FIG. 18 is a diagram showing the case in which the first robot 100a is in a trap situation, the second robot 100b is not in a trap situation, and both the first time during which the second robot 100b waits and the second time for performing the re-cleaning have elapsed.
  • the second robot 100b cancels the cooperative driving mode and returns to the second charging station 400b.
• The return path refers to all paths along which the second robot 100b returns to the second charging station 400b after the first time in which the second robot 100b waits and the second time in which the re-cleaning is performed have elapsed.
• Alternatively, the second robot 100b may immediately release the cooperative driving mode and return to the second charging station 400b without waiting for the first time or performing the re-cleaning for the second time.
• In addition, when only the first robot 100a is in the trap situation, the second robot 100b may return to the second charging station 400b without releasing the cooperative driving mode, and then resume cleaning according to the cooperative driving mode once the first robot 100a escapes the trap situation. After the second robot 100b returns to the second charging station 400b, if the first robot 100a fails to escape the trap situation for a preset time, the second robot 100b may cancel the cooperative driving mode and terminate the cleaning.
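The non-trapped robot's behaviour over time (wait for the first time, re-clean for the second time, then release the mode and return to the charging station, resuming cooperation as soon as the counterpart escapes) can be sketched as a small state function. This is a hypothetical illustration: the 1-minute and 9-minute values are the examples given in the description, and the function and label names are assumed.

```python
FIRST_TIME_S = 60    # waiting time (example from the description: 1 minute)
SECOND_TIME_S = 540  # re-cleaning time (example from the description: 9 minutes)

def non_trapped_action(elapsed_s, counterpart_escaped):
    """Action of the robot that is NOT trapped, as a function of the
    time elapsed since its counterpart became trapped."""
    if counterpart_escaped:
        return "resume_cooperative_driving"          # release standby / stop re-cleaning
    if elapsed_s < FIRST_TIME_S:
        return "wait_in_place"                       # stand by at the ending point
    if elapsed_s < FIRST_TIME_S + SECOND_TIME_S:
        return "re_clean_finished_zone"              # travel the fourth path L4
    return "release_mode_and_return_to_station"      # cancel the cooperative mode
```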
• Meanwhile, the second robot 100b may perform cleaning of the fifth zone Z5, in which the first robot 100a has completed cleaning but the second robot 100b has not yet traveled.
• Next, consider the case in which the first robot 100a has completed cleaning of the fifth zone Z5, the second robot 100b has completed cleaning of the fourth zone Z4, and the entrance D2 through which the robots can move from the fourth zone Z4 to the fifth zone Z5 is closed; accordingly, this represents a case in which the second robot 100b is in a trap situation.
• When the first robot 100a is not in the trap situation and the second robot 100b is in the trap situation, the second robot 100b performs the trap escape driving along the third path L3.
• The trap escape driving along the third path L3 refers to a driving method in which the second robot 100b travels along the outer edge or boundary of the cleaned area.
• The first robot 100a may stand by in place for the first time. Also, the first robot 100a may wait for the first time at the end point after terminating the driving of the fifth zone Z5 being cleaned. If the second robot 100b escapes the trap situation while the first robot 100a is waiting, the first robot 100a releases the standby state and performs cooperative driving with the second robot 100b.
• If the second robot 100b does not escape the trap situation, the first robot 100a waits for the preset first time and then may re-clean the already cleaned fifth zone Z5 along the fourth path L4. The re-cleaning time may be set to the preset second time.
• Here, the fourth path L4 refers to all paths for performing re-cleaning of the already cleaned area, such as returning along the first path L1 that has already been traveled, or driving while avoiding obstacles. If the second robot 100b escapes the trap situation while the first robot 100a is re-cleaning the already cleaned fifth zone Z5, the first robot 100a and the second robot 100b perform cooperative driving again.
• FIG. 20 is a diagram showing the case in which the second robot 100b is in a trap situation, the first robot 100a is not, and both the first time for which the first robot 100a waits and the second time for performing re-cleaning have elapsed.
• When the first robot 100a is in the trap situation and fails to escape it during the first time for which the second robot 100b waits and the second time for performing re-cleaning, the second robot 100b releases the cooperative driving mode and returns to the second charging station 400b. Conversely, when the second robot 100b is in a trap situation and the first time and the second time elapse, the first robot 100a releases the cooperative driving mode and performs independent driving.
• When the first robot 100a is not in a trap situation, the second robot 100b is in a trap situation, and both the first time for which the first robot 100a waits and the second time for performing re-cleaning have elapsed, the first robot 100a releases the cooperative driving mode, enters the independent driving mode, and drives alone. Accordingly, the first robot 100a travels in the sixth zone Z6, which is the zone to be cleaned. In this case, the first path L1, along which the first robot 100a travels in the sixth zone Z6, means any path for cleaning the cleaning zone.
• Alternatively, when the first robot 100a is not in a trap situation and the second robot 100b is in a trap situation, the first robot 100a may release the cooperative driving mode as soon as the trap situation occurs in the second robot 100b, enter the independent driving mode, and drive in the sixth zone Z6, which is the zone to be cleaned.
• Alternatively, when the first robot 100a is not in a trap situation and the second robot 100b is in a trap situation, the first robot 100a may release the cooperative driving mode immediately, without waiting for the first time or re-cleaning for the second time, and return to the first charging station 400a. That is, when the first robot 100a is not in the trap situation, the first robot 100a may release the cooperative driving mode as soon as the second robot 100b enters the trap situation and return to the first charging station 400a.
• Alternatively, when only the second robot 100b is in the trap situation, the first robot 100a may return to the first charging station 400a without releasing the cooperative driving mode; once the second robot 100b escapes the trap situation, cooperative driving may resume. If the second robot 100b fails to escape the trap situation within a preset time after the first robot 100a returns to the first charging station 400a, the first robot 100a may release the cooperative driving mode and terminate cleaning.
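The standby, re-cleaning, and timeout behavior of the non-trapped robot described above can be sketched as a simple polling loop. This is a minimal illustration only; the function name, parameters, and return strings are hypothetical and not part of the patent:

```python
def wait_then_reclean(partner_escaped, first_time, second_time, step=1.0):
    """Behavior of the non-trapped robot while its partner attempts trap escape.

    partner_escaped: zero-argument callable polling whether the trapped
    partner has escaped. Returns the next action as a string.
    """
    elapsed = 0.0
    # Phase 1: stand by in place for the preset first time.
    while elapsed < first_time:
        if partner_escaped():
            return "resume_cooperative_driving"
        elapsed += step  # in a real robot this would track wall-clock time
    # Phase 2: re-clean the already cleaned zone for the preset second time.
    elapsed = 0.0
    while elapsed < second_time:
        if partner_escaped():
            return "resume_cooperative_driving"
        elapsed += step
    # Both timers elapsed without escape: release the cooperative driving mode.
    return "release_cooperative_mode"
```

If the partner escapes during either phase, the loop exits immediately and cooperative driving resumes, matching the behavior described above.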
• Both the first inlet D1 and the second inlet D2 are closed, so both the first robot 100a and the second robot 100b are in a trap situation.
• This division is for convenience of explanation, and the embodiment is not limited thereto; it refers to all cases in which the first robot 100a and the second robot 100b are in a trap situation, including when the trap situation occurs while driving in the same cleaning area.
• In this case, the first robot 100a and the second robot 100b each perform trap escape driving.
• If the first robot 100a escapes first, operation follows the above-described scenario in which the first robot 100a is not in a trap situation and the second robot 100b is in a trap situation.
• If the second robot 100b escapes first, operation follows the above-described scenario in which the first robot 100a is in the trap situation and the second robot 100b is not.
• FIG. 22 is a flowchart of a method by which the mobile robot system 1 performs cooperative driving when a trap situation occurs.
• In step S1100, the first robot 100a and the second robot 100b enter the cooperative driving mode and perform cooperative driving while grasping each other's location information.
  • the cleaning target area may be divided into one or more zones (Z4 to Z6), and cleaning may be performed in units of the divided zones.
• In step S1200, it is determined whether the first robot 100a and the second robot 100b are in a trap situation. The determination is divided into three cases, and a scenario corresponding to each trap situation is performed.
• Case A represents a case in which the first robot 100a is in a trap situation and the second robot 100b is not.
  • Case B represents a case in which the first robot 100a is not in a trap situation and the second robot 100b is in a trap situation.
  • Case C represents a case in which both the first robot 100a and the second robot 100b are in a trap situation.
• In step S1300, a trap scenario according to case A is performed.
• The trap scenario according to case A, as described with reference to FIGS. 17 and 18 above, means the driving of the first robot 100a and the second robot 100b when the first robot 100a is in a trap situation and the second robot 100b is not.
• When only the first robot 100a is in a trap situation, the first robot 100a performs trap escape driving. While the first robot 100a performs trap escape driving, the second robot 100b may stand by in place for a preset first time. Alternatively, the second robot 100b may finish driving the cleaning area being cleaned and wait for the first time at the point where cleaning ended. If the first robot 100a escapes the trap situation while the second robot 100b is waiting, the second robot 100b releases the standby state and resumes cooperative driving with the first robot 100a.
  • the second robot 100b may re-clean the already cleaned cleaning area after waiting for the first time.
• The re-cleaning time may be set to a preset second time. If the first robot 100a escapes the trap situation while the second robot 100b performs re-cleaning during the second time, the second robot 100b stops re-cleaning and resumes cooperative driving.
• If the first robot 100a fails to escape the trap situation during the first time for which the second robot 100b waits and the second time for performing re-cleaning, the second robot 100b releases the cooperative driving mode and returns to the second charging station 400b.
• Alternatively, the second robot 100b may release the cooperative driving mode immediately when the first robot 100a enters the trap situation, without waiting for the first time or performing re-cleaning for the second time, and return to the second charging station 400b.
• Alternatively, when only the first robot 100a is in a trap situation, the second robot 100b may return to the second charging station 400b without releasing the cooperative driving mode; when the first robot 100a escapes the trap situation, the robots may resume the cooperative driving mode and continue cleaning. If the first robot 100a fails to escape the trap situation within a preset time after the second robot 100b returns to the second charging station 400b, the second robot 100b may release the cooperative driving mode and finish cleaning.
• Alternatively, the second robot 100b may clean the cleaning area that the first robot 100a has finished cleaning but the second robot 100b has not yet traveled.
• In step S1310, as a result of the first robot 100a performing trap escape driving, it is determined whether the first robot 100a has escaped the trap situation.
  • the first robot 100a escapes the trap situation, neither the first robot 100a nor the second robot 100b is in the trap situation. Therefore, cooperative driving is performed (S1100).
• If the first robot 100a fails to escape the trap situation, the second robot 100b returns to the second charging station 400b (S1600).
• In step S1400, a trap scenario according to case B is performed.
• The trap scenario according to case B means the driving of the first robot 100a and the second robot 100b when the second robot 100b is in a trap situation and the first robot 100a is not.
• When only the second robot 100b is in a trap situation, the second robot 100b performs trap escape driving. While the second robot 100b performs trap escape driving, the first robot 100a may stand by in place for a preset first time. Alternatively, the first robot 100a may finish driving the cleaning area being cleaned and wait for the first time at the point where cleaning ended. If the second robot 100b escapes the trap situation while the first robot 100a is waiting, the first robot 100a releases the standby state and resumes cooperative driving with the second robot 100b.
  • the first robot 100a may re-clean the already cleaned cleaning area after waiting for the first time.
• The re-cleaning time may be set to a preset second time. If the second robot 100b escapes the trap situation while the first robot 100a performs re-cleaning during the second time, the first robot 100a stops re-cleaning and resumes cooperative driving with the second robot 100b.
• When the second robot 100b fails to escape the trap situation during the first time for which the first robot 100a waits and the second time for performing re-cleaning, the first robot 100a may release the cooperative driving mode and enter the independent driving mode to drive alone. That is, while the second robot 100b performs trap escape driving, the first robot 100a may wait for the first time, perform re-cleaning for the second time, and then perform independent driving according to the independent driving mode.
• Alternatively, when the second robot 100b enters the trap situation, the first robot 100a may immediately release the cooperative driving mode and return to the first charging station 400a, without waiting for the first time or performing re-cleaning for the second time.
• Alternatively, when only the second robot 100b is in the trap situation, the first robot 100a may return to the first charging station 400a without releasing the cooperative driving mode; when the second robot 100b escapes the trap situation, cooperative driving may resume. If the second robot 100b fails to escape the trap situation within a preset time after the first robot 100a returns to the first charging station 400a, the first robot 100a may release the cooperative driving mode and finish cleaning.
• In step S1410, as a result of the second robot 100b performing trap escape driving, it is determined whether the second robot 100b has escaped the trap situation.
  • the second robot 100b escapes the trap situation, neither the first robot 100a nor the second robot 100b is in the trap situation. Therefore, cooperative driving is performed (S1100).
• If the second robot 100b fails to escape the trap situation, the first robot 100a performs independent driving according to the independent driving mode (S1700).
• In step S1500, corresponding to case C in which both the first robot 100a and the second robot 100b are in a trap situation, the first robot 100a and the second robot 100b each perform trap escape driving.
• In step S1510, it is determined whether each of the first robot 100a and the second robot 100b has escaped the trap situation.
• If the first robot 100a remains in the trap situation while the second robot 100b escapes, the case A trap scenario is performed (S1300).
• If the second robot 100b remains in the trap situation while the first robot 100a escapes, the case B trap scenario is performed (S1400).
• If both robots escape the trap situation, cooperative driving may be performed (S1100).
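The flowchart of FIG. 22 reduces to a dispatch on which robots are trapped at step S1200. A minimal sketch of that dispatch follows; the function name and the use of step labels as return values are illustrative, not part of the patent:

```python
def trap_dispatch(first_trapped: bool, second_trapped: bool) -> str:
    """Select the scenario of FIG. 22 from the two robots' trap states
    (the determination made in step S1200)."""
    if first_trapped and second_trapped:
        return "S1500"  # case C: both robots perform trap escape driving
    if first_trapped:
        return "S1300"  # case A trap scenario
    if second_trapped:
        return "S1400"  # case B trap scenario
    return "S1100"      # no trap: cooperative driving continues
```

After S1500, the same dispatch is effectively re-run (S1510): whichever robot remains trapped selects the case A or case B scenario, and escape by both returns to S1100.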
  • the first robot 100a and the second robot 100b may enter the cooperative driving mode using the network 50 .
• The first robot 100a and the second robot 100b may enter the cooperative driving mode and grasp each other's position information.
• The first robot 100a may travel before the second robot 100b travels, sucking up contaminants in the cleaning target zone Z4.
  • the contaminants may be a concept including all inhalable substances such as dust, foreign substances, and garbage existing in the cleaning target zone Z4.
  • the second robot 100b may drive along the path L1 traveled by the first robot 100a to wipe the floor of the cleaning target area Z4 .
• Here, wiping the floor may mean that the second robot 100b wipes up, with water, a substance such as a liquid that the first robot 100a cannot suck up.
  • the first robot 100a and the second robot 100b may perform a preset scenario in response to an error occurring while performing cooperative driving.
• Table 1 shows the first to seventh embodiments of preset scenarios that the first robot 100a and the second robot 100b perform in response to an error occurring while performing cooperative driving.
• Here, an 'error' means a state in which the first robot 100a or the second robot 100b cannot continue cooperative driving, for example because the robot is caught on an obstacle, a wheel has come off, or the motor that rotates the wheel is broken.
• 'Normal' means a state in which no error has occurred in the first robot 100a or the second robot 100b and cooperative driving can continue.
  • the first robot 100a and the second robot 100b may include a button for receiving a resume command from the user.
• When the re-run command is input, the first robot 100a and the second robot 100b may perform cooperative driving again.
• The re-run command relates to the second, third, sixth, and seventh embodiments, which will be described later.
• Hereinafter, the first to seventh embodiments will be described in detail.
• [Table 1]
  Embodiment | State of the first robot 100a | State of the second robot 100b
  1 | error (a) | normal
  2 | error (b) | normal
  3 | error (c) | normal
  4 | error (d) | error (e)
  5 | normal | error (f)
  6 | normal | error (g)
  7 | normal | error (h)
• The first embodiment is a scenario in which an error (a) occurs in the first robot 100a while performing cooperative driving and a preset waiting time elapses.
  • the first robot 100a may turn off the power after a preset waiting time.
  • the preset waiting time may be 10 minutes.
• The second robot 100b releases the cooperative driving mode, travels (L2) to the point P1 that the first robot 100a reached, and then returns (L3) to the second charging station 400b.
• The point P1 is the position of the first robot 100a at the time the error (a) occurred.
  • the second robot 100b may drive (L2) to the point P1 where the first robot 100a has sucked the contaminants, wipe the floor, and then return (L3) to the second charging station 400b.
• Alternatively, the second robot 100b may release the cooperative driving mode and then perform independent driving without returning to the second charging station 400b.
• The second embodiment is a scenario in which an error (b) occurs in the first robot 100a while performing cooperative driving, but the error (b) is resolved and a re-run command is input to the first robot 100a within the preset waiting time.
  • the first robot 100a and the second robot 100b may perform cooperative driving again.
  • the preset waiting time may be 10 minutes.
• In this case, the second robot 100b may drive again (L4) over the area it has already cleaned, from the time the error (b) occurred until cooperative driving is performed again. That is, if the second robot 100b were left in place during the waiting time, water marks could form at the waiting point, so the second robot 100b may wipe again the floor of the area it has already cleaned.
• Alternatively, the first robot 100a and the second robot 100b may not perform cooperative driving; each may release the cooperative driving mode and then perform independent driving.
• The third embodiment is a scenario in which an error (c) occurs in the first robot 100a while performing cooperative driving, and the error (c) is resolved and a re-run command is input to the first robot 100a within the preset waiting time, but the first robot 100a and the second robot 100b fail to grasp each other's position information.
  • the preset waiting time may be 10 minutes.
  • the first robot 100a may release the cooperative driving mode and then perform independent driving (L5).
• The second robot 100b releases the cooperative driving mode, travels (L6) to the point P2 that the first robot 100a reached, and then returns (L7) to the second charging station 400b.
• The point P2 is the position of the first robot 100a at the time the error (c) occurred. That is, the second robot 100b may drive (L6) to the point P2 up to which the first robot 100a has sucked up the contaminants, wipe the floor, and then return (L7) to the second charging station 400b.
• Alternatively, the second robot 100b may release the cooperative driving mode and then perform independent driving without returning to the second charging station 400b.
• Alternatively, the first robot 100a and the second robot 100b may each release the cooperative driving mode and then return to their respective charging stations 400a and 400b.
• Alternatively, after the first robot 100a and the second robot 100b release the cooperative driving mode, the first robot 100a may return to the first charging station 400a while the second robot 100b performs independent driving.
• The fourth embodiment is a scenario in which errors occur in both the first robot 100a and the second robot 100b while performing cooperative driving, and the preset waiting time elapses. That is, in the fourth embodiment, an error (d) occurs in the first robot 100a and an error (e) occurs in the second robot 100b. Referring to FIG. 25, the first robot 100a and the second robot 100b may each turn off their power after the preset waiting time.
  • the preset waiting time may be 10 minutes.
  • the fifth embodiment is a scenario in which an error (f) occurs in the second robot 100b while performing cooperative driving and a preset waiting time elapses.
  • the second robot 100b may turn off the power after a preset waiting time.
  • the preset waiting time may be 10 minutes.
  • the first robot 100a may cancel the cooperative driving mode and then perform independent driving (L8).
  • the first robot 100a releases the cooperative driving mode, and then returns to the charging station 400a without performing independent driving.
• The sixth embodiment is a scenario in which an error (g) occurs in the second robot 100b while performing cooperative driving, but the error (g) is resolved and a re-run command is input to the second robot 100b within the preset waiting time, and the first robot 100a and the second robot 100b grasp each other's position information.
  • the first robot 100a and the second robot 100b may perform cooperative driving again.
  • the preset waiting time may be 10 minutes.
• In this case, the first robot 100a may drive again (L9) over the area it has already cleaned, from the time the error (g) occurred until cooperative driving is performed again.
• Alternatively, the first robot 100a and the second robot 100b may not perform cooperative driving; each may release the cooperative driving mode and then perform independent driving.
• The seventh embodiment is a scenario in which an error (h) occurs in the second robot 100b while performing cooperative driving, and the error (h) is resolved and a re-run command is input to the second robot 100b within the preset waiting time, but the first robot 100a and the second robot 100b fail to grasp each other's position information.
  • the preset waiting time may be 10 minutes.
• In this case, the first robot 100a may release the cooperative driving mode and then perform independent driving (L10).
  • the second robot 100b may release the cooperative driving mode, travel to the point P3 where the first robot 100a travels, and then return to the second charging station 400b.
• The point P3 is the position of the first robot 100a at the time the error (h) occurred. That is, the second robot 100b may drive (L11) to the point P3 up to which the first robot 100a has sucked up the contaminants, wipe the floor, and then return (L12) to the second charging station 400b.
  • the first robot 100a releases the cooperative driving mode and then returns to the first charging station 400a without performing independent driving.
• Alternatively, the second robot 100b may release the cooperative driving mode and then perform independent driving without returning to the second charging station 400b; in this case, the first robot 100a may, after releasing the cooperative driving mode, either perform independent driving or return to the first charging station 400a.
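Table 1 and the seven error embodiments above amount to a decision over four conditions. The sketch below encodes that mapping; it assumes, following the parallel descriptions above, that embodiments 2/3 and 6/7 are distinguished by whether the robots grasp each other's position information after the re-run command. The function and parameter names are ours, not the patent's:

```python
def error_embodiment(first_error: bool, second_error: bool,
                     rerun_input: bool, positions_shared: bool) -> int:
    """Map the robots' states after the preset waiting time to the
    Table 1 embodiment number (0 = no error, cooperative driving continues)."""
    if first_error and second_error:
        return 4          # both robots turn off their power
    if first_error:
        if not rerun_input:
            return 1      # 100a powers off; 100b wipes to P1, returns to 400b
        return 2 if positions_shared else 3
    if second_error:
        if not rerun_input:
            return 5      # 100b powers off; 100a drives independently
        return 6 if positions_shared else 7
    return 0
```

Embodiments 2 and 6 resume cooperative driving; embodiments 3 and 7 release the cooperative driving mode and split into independent driving and a wipe-then-return path.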
• Next, a mobile robot system 1 that performs a preset scenario in response to a kidnap occurring while performing cooperative driving will be described with reference to FIGS. 27A to 28C.
  • the first robot 100a and the second robot 100b may enter the cooperative driving mode using the network 50 .
• The first robot 100a may travel before the second robot 100b travels, sucking up contaminants in the cleaning target zone Z4.
  • the contaminants may be a concept including all inhalable substances such as dust, foreign substances, and garbage existing in the cleaning target zone Z4.
  • the second robot 100b may drive along the path L1 traveled by the first robot 100a to wipe the floor of the cleaning target area Z4 .
• Here, wiping the floor may mean that the second robot 100b wipes up, with water, a substance such as a liquid that the first robot 100a cannot suck up.
  • the first robot 100a and the second robot 100b may perform a preset scenario in response to a kidnap that occurs during collaborative driving.
• Table 2 shows the first to seventh embodiments of preset scenarios that the first robot 100a and the second robot 100b perform in response to a kidnap occurring while performing cooperative driving.
  • 'kidnap' means that the user picks up the first robot 100a or the second robot 100b while driving and places it in a different position.
• 'Normal' means a state in which no kidnap has occurred in the first robot 100a or the second robot 100b and cooperative driving can continue.
  • the first robot 100a and the second robot 100b may include a button for receiving a resume command from the user.
• When the re-run command is input, the first robot 100a and the second robot 100b may perform cooperative driving again.
• The re-run command relates to the second, third, fourth, sixth, and seventh embodiments, which will be described later.
• Hereinafter, the first to seventh embodiments will be described in detail.
• [Table 2]
  Embodiment | State of the first robot 100a | State of the second robot 100b
  1 | kidnap (i) | normal
  2 | kidnap (j) | normal
  3 | kidnap (k) | normal
  4 | kidnap (l) | kidnap (m)
  5 | normal | kidnap (n)
  6 | normal | kidnap (o)
  7 | normal | kidnap (p)
  • the first embodiment is a scenario in which a kidnap (i) occurs in the first robot 100a and a preset waiting time elapses while performing cooperative driving.
  • the first robot 100a may turn off the power after a preset waiting time.
  • the preset waiting time may be 10 minutes.
• The second robot 100b releases the cooperative driving mode, travels (L13) to the point Q1 that the first robot 100a reached, and then returns (L14) to the second charging station 400b.
• The point Q1 is the position of the first robot 100a at the time the kidnap (i) occurred.
  • the second robot 100b may drive (L13) to the point Q1 where the first robot 100a has sucked the contaminants, wipe the floor, and then return (L14) to the second charging station 400b.
  • the second robot 100b cancels the cooperative driving mode, and then performs independent driving without returning to the second charging station 400b.
• The second embodiment is a scenario in which a kidnap (j) occurs in the first robot 100a while performing cooperative driving, a re-run command is input to the first robot 100a within the preset waiting time, and the first robot 100a and the second robot 100b grasp each other's position information.
  • the first robot 100a and the second robot 100b may perform cooperative driving again.
  • the preset waiting time may be 10 minutes.
• In this case, the second robot 100b may drive again (L15) over the area it has already cleaned, from the time the kidnap (j) occurred until cooperative driving is performed again. That is, if the second robot 100b were left in place during the waiting time, water marks could form at the waiting point, so the second robot 100b may wipe again the floor of the area it has already cleaned.
• Alternatively, the first robot 100a and the second robot 100b may not perform cooperative driving; each may release the cooperative driving mode and then perform independent driving.
• The third embodiment is a scenario in which a kidnap (k) occurs in the first robot 100a during cooperative driving and a re-run command is input to the first robot 100a within the preset waiting time, but the first robot 100a and the second robot 100b fail to grasp each other's position information.
  • the preset waiting time may be 10 minutes.
  • the first robot 100a may release the cooperative driving mode and then perform independent driving (L16).
• The second robot 100b releases the cooperative driving mode, travels (L17) to the point Q2 that the first robot 100a reached, and then returns (L18) to the second charging station 400b.
• The point Q2 is the position of the first robot 100a at the time the kidnap (k) occurred. That is, the second robot 100b may drive (L17) to the point Q2 up to which the first robot 100a has sucked up the contaminants, wipe the floor, and then return (L18) to the second charging station 400b.
  • the second robot 100b cancels the cooperative driving mode, and then performs independent driving without returning to the second charging station 400b.
• Alternatively, the first robot 100a and the second robot 100b may each release the cooperative driving mode and then return to their respective charging stations 400a and 400b.
• Alternatively, after the first robot 100a and the second robot 100b release the cooperative driving mode, the first robot 100a may return to the first charging station 400a while the second robot 100b performs independent driving.
• The fourth embodiment is a scenario in which kidnaps occur in both the first robot 100a and the second robot 100b while performing cooperative driving, and the preset waiting time elapses. That is, in the fourth embodiment, a kidnap (l) occurs in the first robot 100a and a kidnap (m) occurs in the second robot 100b.
• In this case, when re-run commands are input, the first robot 100a and the second robot 100b may perform cooperative driving again.
  • the preset waiting time may be 10 minutes.
• In the fourth embodiment, when a re-run command is input to only one of the first robot 100a and the second robot 100b, the first robot 100a and the second robot 100b may, depending on the circumstances, follow any one of the above-described first to third embodiments or the fifth to seventh embodiments described later.
  • the fifth embodiment is a scenario in which a kidnap n occurs in the second robot 100b while performing cooperative driving and a preset waiting time elapses.
  • the second robot 100b may turn off the power after a preset waiting time.
  • the preset waiting time may be 10 minutes.
  • the first robot 100a may cancel the cooperative driving mode and then perform independent driving (L19).
  • it may be considered that the first robot 100a releases the cooperative driving mode, and then returns to the charging station 400a without performing independent driving.
• In the sixth embodiment, in which a kidnap (o) occurs in the second robot 100b and a re-run command is input within the preset waiting time, the first robot 100a and the second robot 100b may perform cooperative driving again.
  • the preset waiting time may be 10 minutes.
• In this case, the first robot 100a may drive again (L20) over the area it has already cleaned, from the time the kidnap (o) occurred until cooperative driving is performed again.
• The seventh embodiment is a scenario in which a kidnap (p) occurs in the second robot 100b during cooperative driving and a re-run command is input to the second robot 100b within the preset waiting time, but the first robot 100a and the second robot 100b fail to grasp each other's position information.
  • the preset waiting time may be 10 minutes.
• In this case, the first robot 100a may release the cooperative driving mode and then perform independent driving (L21).
  • the second robot 100b may release the cooperative driving mode, travel to the point Q3 where the first robot 100a travels, and then return to the second charging station 400b.
• The point Q3 is the position of the first robot 100a at the time the kidnap (p) occurred. That is, the second robot 100b may drive (L22) to the point Q3 up to which the first robot 100a has sucked up the contaminants, wipe the floor, and then return (L23) to the second charging station 400b.
  • the first robot 100a releases the cooperative driving mode and then returns to the first charging station 400a without performing independent driving.
• Alternatively, the second robot 100b may release the cooperative driving mode and then perform independent driving without returning to the second charging station 400b; in this case, the first robot 100a may, after releasing the cooperative driving mode, either perform independent driving or return to the first charging station 400a.
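Table 2 mirrors Table 1, with one difference: in the fourth embodiment, re-run commands to both kidnapped robots let cooperative driving resume. The sketch below summarizes the outcomes as tuples; the function name and action strings are illustrative only, and the 2/3 and 6/7 split again follows the position-information condition described above:

```python
def kidnap_outcome(first_kidnap, second_kidnap, rerun_input, positions_shared):
    """Return (embodiment, action of robot 100a, action of robot 100b)
    after the preset waiting time, following Table 2."""
    if first_kidnap and second_kidnap:
        # Embodiment 4: re-run commands to both robots resume cooperation.
        action = "cooperative driving" if rerun_input else "power off"
        return (4, action, action)
    if first_kidnap:
        if not rerun_input:
            return (1, "power off", "wipe to Q1, return to 400b")
        if positions_shared:
            return (2, "cooperative driving", "cooperative driving")
        return (3, "independent driving", "wipe to Q2, return to 400b")
    if second_kidnap:
        if not rerun_input:
            return (5, "independent driving", "power off")
        if positions_shared:
            return (6, "cooperative driving", "cooperative driving")
        return (7, "independent driving", "wipe to Q3, return to 400b")
    return (0, "cooperative driving", "cooperative driving")
```

The embodiments also allow the alternatives noted above (returning to a charging station instead of driving independently); the tuple here records only the primary path of each scenario.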
  • the mobile robot system 1 for performing a preset scenario in response to a communication failure occurring during collaborative driving will be described.
  • the first robot 100a and the second robot 100b may enter the cooperative driving mode using the network 50 .
• The first robot 100a may travel before the second robot 100b travels, sucking up contaminants in the cleaning target zone Z4.
  • the contaminants may be a concept including all inhalable substances such as dust, foreign substances, and garbage existing in the cleaning target zone Z4.
  • the second robot 100b may drive along the path L1 traveled by the first robot 100a to wipe the floor of the cleaning target area Z4 .
• Here, wiping the floor may mean that the second robot 100b wipes up, with water, a substance such as a liquid that the first robot 100a cannot suck up.
  • a communication failure may occur in at least one of the first robot 100a and the second robot 100b.
• A communication failure means any type of failure in which the first robot 100a or the second robot 100b cannot transmit or receive data to or from the other mobile robot over the network.
  • the first robot 100a and the second robot 100b may perform a preset scenario in response to a communication failure occurring while performing cooperative driving.
  • the network 50 connecting the first robot 100a and the second robot 100b may include a first network and a second network.
  • the first network may be a network for the first robot 100a and the second robot 100b to share map information of the cleaning target area Z4.
  • the first network may be Wi-Fi.
• The second network may be a network for determining the separation distance between the first robot 100a and the second robot 100b.
  • the second network may be UWB.
• The method by which the first robot 100a and the second robot 100b share map information using Wi-Fi, and the method of determining the separation distance between the first robot 100a and the second robot 100b using UWB, have been described above, so descriptions thereof are omitted.
  • a first embodiment and a second embodiment of a preset scenario performed by the first robot 100a and the second robot 100b in response to a communication failure occurring during collaborative driving will be described in detail.
  • the first embodiment is a scenario in which the first network or the second network is disconnected between the first robot 100a and the second robot 100b while cooperative driving is being performed.
• In this case, the first robot 100a and the second robot 100b may continue to perform cooperative driving. That is, disconnection of the first network or the second network between the first robot 100a and the second robot 100b means that one of the first network and the second network remains connected.
  • the second embodiment is a scenario in which both the first network and the second network are disconnected between the first robot 100a and the second robot 100b while cooperative driving is being performed.
  • the first robot 100a may release the cooperative driving mode and then perform independent driving.
  • the second robot 100b may return to the second charging station 400b after canceling the cooperative driving mode.
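The two communication-failure scenarios above reduce to a small decision rule. The following is only a sketch of the described behavior (function name and return strings are hypothetical):

```python
# Sketch of Embodiments 1 and 2 of the communication-failure scenario.
def failure_scenario(first_network_up: bool, second_network_up: bool) -> dict:
    """Map the Wi-Fi / UWB link state to each robot's preset action."""
    if first_network_up or second_network_up:
        # Embodiment 1: only one network dropped; the other remains connected,
        # so cooperative driving continues.
        return {"first_robot": "continue cooperative driving",
                "second_robot": "continue cooperative driving"}
    # Embodiment 2: both networks dropped.
    return {"first_robot": "release cooperative mode and drive independently",
            "second_robot": "release cooperative mode and return to charging station"}
```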
  • the first robot 100a and the second robot 100b may enter the cooperative driving mode using the network 50 . Since the process in which the first robot 100a and the second robot 100b enter the cooperative driving mode has been described above, a detailed description thereof will be omitted.
  • the first robot 100a and the second robot 100b may perform cooperative driving by identifying each other's positions.
  • the first robot 100a may travel ahead of the second robot 100b to suck up the contaminants in the cleaning target area Z4.
  • the contaminants may be a concept including all inhalable substances such as dust, foreign substances, and garbage existing in the cleaning target zone Z4.
  • the second robot 100b may drive along the path L1 traveled by the first robot 100a to wipe the floor of the cleaning target area Z4 .
  • Here, the second robot 100b wiping the floor may mean that the second robot 100b wipes up, with a wet mop, substances such as liquids that the first robot 100a cannot suck up.
  • The first robot 100a and the second robot 100b may determine whether to release the cooperative driving mode in response to an error, a kidnapping event (the robot being lifted or moved), or a communication failure occurring during cooperative driving. That is, an error, a kidnapping event, or a communication failure may occur in at least one of the first robot 100a and the second robot 100b while cooperative driving is performed.
  • The first robot 100a and the second robot 100b may perform a preset scenario in response to an error, a kidnapping event, or a communication failure occurring during cooperative driving.
  • Since the preset scenario corresponding to an error, a kidnapping event, or a communication failure occurring during cooperative driving has been described above, a detailed description thereof will be omitted.
  • the mobile robot system 1 may include a first robot 100a and a second robot 100b.
  • The first robot 100a and the second robot 100b may each include a main body 110 forming the exterior, and a communication unit 1100 provided in the main body 110 to transmit and receive data to and from the other mobile robot using the network 50.
  • the first robot 100a may include a cleaning unit 120 mounted on one side of the main body 110 to suck contaminants in the area to be cleaned.
  • the second robot 100b may include a mop part (not shown) that is mounted on one side of the main body 110 and wipes the floor of the area to be cleaned.
  • the network 50 connecting the first robot 100a and the second robot 100b may include a first network and a second network.
  • the first network may be a network for the first robot 100a and the second robot 100b to share map information of the area to be cleaned.
  • the first network may be Wi-Fi.
  • the second network may be a network for determining the separation distance between the first robot 100a and the second robot 100b.
  • the second network may be UWB.
  • Based on this configuration, the first robot 100a and the second robot 100b may perform independent driving or cooperative driving.
  • The first robot 100a and the second robot 100b may perform a preset scenario in response to an error, a kidnapping event, or a communication failure occurring during cooperative driving.
  • Since the preset scenario corresponding to an error, a kidnapping event, or a communication failure occurring during cooperative driving performed by the first robot 100a and the second robot 100b has been described above, a detailed description thereof will be omitted.
  • Hereinafter, Embodiment 3 of the mobile robot system 1, which performs a preset scenario in response to an obstacle detected during cooperative driving, will be described with reference to FIGS. 30 to 34.
  • the first robot 100a and the second robot 100b may enter the cooperative driving mode using the network 50 .
  • the first robot 100a and the second robot 100b may divide the cleaning target area X1 into a plurality of unit areas and perform cooperative driving for each unit area.
  • When the first robot 100a and the second robot 100b perform cooperative driving, the first robot 100a may travel ahead of the second robot 100b in any one of the plurality of unit areas (e.g., the first unit area A1) and suck up the contaminants in that unit area.
  • the contaminant may be a concept including all inhalable substances, such as dust, foreign substances, and garbage, which exist in each unit area.
  • The second robot 100b may travel along the path L1 traveled by the first robot 100a and wipe the floor of the unit area in which the first robot 100a sucked up contaminants (e.g., the first unit area A1).
  • Here, the second robot 100b wiping the floor may mean that the second robot 100b wipes up, with a wet mop, substances such as liquids that the first robot 100a cannot suck up.
  • a detailed method of cooperatively driving the first robot 100a and the second robot 100b for each divided unit area is omitted since it has been described above.
  • At least one of the first robot 100a and the second robot 100b may detect an obstacle during cooperative driving in any one of the plurality of unit areas. Specifically, at least one of the first robot 100a and the second robot 100b is disposed between the divided areas (eg, between the first unit area A1 and the second unit area A2), Alternatively, an obstacle present inside the divided area (eg, inside the first unit area A1) may be detected.
  • an obstacle existing between the divided regions is defined as a first obstacle OB1
  • an obstacle existing inside the divided regions is defined as a second obstacle OB2 .
  • the first obstacle OB1 or the second obstacle OB2 may be a threshold, a carpet, or a cliff.
  • The first obstacle OB1 or the second obstacle OB2 is an obstacle that the first robot 100a and the second robot 100b can climb, that is, an obstacle formed within a preset range of height or depth.
  • the first robot 100a may recognize an obstacle formed with a height of 5 [mm] or more as an obstacle capable of climbing.
  • the second robot 100b can recognize an obstacle having a height of 4 [mm] or more as an obstacle that can be climbed.
  • the first robot 100a can recognize an obstacle formed to a depth of 30 [mm] or more as an obstacle that can be climbed in the case of single travel.
  • the first robot 100a may recognize an obstacle formed to a depth of 10 [mm] or more as an obstacle capable of climbing.
  • the second robot 100b may recognize an obstacle formed to a depth of 10 [mm] or more as an obstacle capable of climbing.
  • the first robot 100a and the second robot 100b climbing an obstacle means crossing a threshold, crossing a carpet, passing a gap in a cliff, or going down and then going up a slope of a cliff.
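The height and depth figures above can be collected into a small lookup. The threshold values below come from the text; the function, key names, and the "at or beyond the threshold counts as climbable" reading of "formed with … or more" follow the wording here and are otherwise illustrative:

```python
# Climbability thresholds quoted in the text, in millimetres.
CLIMB_THRESHOLDS_MM = {
    ("first_robot", "height"): 5,
    ("second_robot", "height"): 4,
    ("first_robot", "depth_single_travel"): 30,
    ("first_robot", "depth"): 10,
    ("second_robot", "depth"): 10,
}

def recognized_as_climbable(robot: str, kind: str, measure_mm: float) -> bool:
    """An obstacle formed at or beyond the preset threshold is recognized as climbable."""
    return measure_mm >= CLIMB_THRESHOLDS_MM[(robot, kind)]
```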
  • reference numerals M1 to M17 denote traveling paths of the first robot 100a
  • reference numerals N1 to N13 denote traveling paths of the second robot 100b.
  • In the first embodiment, the first robot 100a and the second robot 100b detect the first obstacle OB1 during cooperative driving (M1, N1) in the first unit area A1.
  • In this case, the first robot 100a avoids the first obstacle OB1 (M2), completes the cooperative driving in the first unit area A1 (M3), and then moves to the second unit area A2.
  • The second robot 100b avoids the first obstacle OB1 (N2), completes the cooperative driving in the first unit area A1 (N3), and then may return (N4) to the second charging station 400b.
  • Alternatively, it may also be considered that the first robot 100a enters the second unit area A2 without avoiding the first obstacle OB1, and that the second robot 100b, after completing the floor cleaning in the first unit area A1, waits until the first robot 100a completes suction of the contaminants in the second unit area A2.
  • In the second embodiment, while the first robot 100a and the second robot 100b perform cooperative driving (M6, N5) in the first unit area A1, the first robot 100a fails to detect the first obstacle OB1 and enters (M7) the second unit area A2, whereas the second robot 100b detects the first obstacle OB1 and avoids it (N6).
  • In this case, the second robot 100b may send a notification that it cannot enter the second unit area A2.
  • After the first robot 100a completes (M8) suction of the contaminants in the second unit area A2, it may move (M9) to the position P1 from which the second robot 100b sent the notification.
  • Meanwhile, the second robot 100b may complete the floor cleaning of the first unit area A1.
  • the second robot 100b may not wipe the floor of the second unit area A2 in which the first robot 100a has completed suctioning the contaminants.
  • the driving of the first robot 100a and the second robot 100b in the above-described second embodiment should be understood as driving in a cooperative driving mode, not independent driving.
  • The second robot 100b may detect the first obstacle OB1 and transmit information about the first obstacle OB1 to the first robot 100a. Then, the first robot 100a receives the information about the first obstacle OB1 from the second robot 100b and may merge the first obstacle OB1 into the map stored in the memory 1700. That is, the second robot 100b may share information about the first obstacle OB1 with the first robot 100a.
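The sharing step above (detect, transmit, merge into the stored map) can be sketched as follows. The map and message structures are hypothetical; only the behavior comes from the text:

```python
# Hypothetical sketch of merging a shared obstacle into a robot's stored map.
def merge_obstacle(stored_map: dict, obstacle: dict) -> dict:
    """Return a new map with the received obstacle appended, leaving the input untouched."""
    merged = dict(stored_map)
    merged["obstacles"] = list(stored_map.get("obstacles", [])) + [obstacle]
    return merged

# E.g. the second robot detects OB1 and transmits it; the first robot merges it:
first_robot_map = {"area": "A1", "obstacles": []}
shared = {"id": "OB1", "location": "between A1 and A2"}
first_robot_map = merge_obstacle(first_robot_map, shared)
```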
  • In the third embodiment, the first robot 100a and the second robot 100b fail to detect the first obstacle OB1 during cooperative driving (M10, N8) in the first unit area A1 and enter (M11, N9) the second unit area A2.
  • the first robot 100a and the second robot 100b may perform cooperative driving M12 and N10 in the second unit area A2 .
  • In the fourth embodiment, while the first robot 100a and the second robot 100b perform cooperative driving (M13, N11) in the first unit area A1, the first robot 100a detects the second obstacle OB2, avoids it (M14), and then moves (M15) to the second unit area A2, whereas the second robot 100b fails to detect the second obstacle OB2 and does not avoid it (N12).
  • In this case, the second robot 100b may send, to the first robot 100a, a notification that it cannot enter the second unit area A2.
  • the first robot 100a may complete (M16) suction of the contaminants in the second unit area A2, and then move (M17) to a position where the second robot 100b transmits the notification.
  • Even when the first robot 100a moves to the second unit area A2 without completing the suction of the contaminants in the first unit area A1, the second robot 100b may complete the floor cleaning of the first unit area A1.
  • the second robot 100b may not wipe the floor of the second unit area A2 in which the first robot 100a has completed suctioning the contaminants.
  • the driving of the first robot 100a and the second robot 100b in the above-described fourth embodiment should be understood as driving in a cooperative driving mode, not independent driving.
  • The first robot 100a may detect the second obstacle OB2 and transmit information about the second obstacle OB2 to the second robot 100b. Then, the second robot 100b receives the information about the second obstacle OB2 from the first robot 100a and may merge the second obstacle OB2 into the map stored in the memory 1700. That is, the first robot 100a may share information about the second obstacle OB2 with the second robot 100b.
  • the first robot 100a and the second robot 100b may enter the cooperative driving mode using the network 50 . Since the process in which the first robot 100a and the second robot 100b enter the cooperative driving mode has been described above, a detailed description thereof will be omitted.
  • In step S3200, referring back to FIG. 30, the first robot 100a and the second robot 100b divide the cleaning target area X1 into a plurality of unit areas (e.g., divide the cleaning target area X1 into a first unit area A1 and a second unit area A2) and may perform cooperative driving for each unit area.
  • When the first robot 100a and the second robot 100b perform cooperative driving, the first robot 100a travels ahead of the second robot 100b in any one of the plurality of unit areas (e.g., the first unit area A1) to suck up the contaminants, and the second robot 100b travels along the path L1 traveled by the first robot 100a to wipe the floor of that unit area.
  • At least one of the first robot 100a and the second robot 100b may detect the first obstacle OB1 between the divided areas, or the second obstacle OB2 inside a divided area, during cooperative driving. The definitions of these obstacles and the heights and depths that the first robot 100a and the second robot 100b recognize as climbable have been described above, so a detailed description thereof is omitted.
  • In step S3300, Embodiments 1 to 4 of the preset scenario performed when at least one of the first robot 100a and the second robot 100b detects the first obstacle OB1 or the second obstacle OB2 will be described in detail.
  • Embodiments 1 to 4 of the preset scenario performed when the first obstacle OB1 or the second obstacle OB2 is detected, and the sharing of obstacle information between the first robot 100a and the second robot 100b, are the same as described above, so a detailed description thereof is omitted.
  • the mobile robot system 1 may include a first robot 100a and a second robot 100b.
  • The first robot 100a and the second robot 100b may each include a main body 110 forming the exterior, and a communication unit 1100 provided in the main body 110 to transmit and receive data to and from the other mobile robot using the network 50.
  • the first robot 100a may include a cleaning unit 120 mounted on one side of the main body 110 to suck contaminants in the area to be cleaned.
  • the second robot 100b may include a mop part (not shown) that is mounted on one side of the main body 110 and wipes the floor of the area to be cleaned.
  • the first robot 100a and the second robot 100b may travel alone in a single driving mode, or may enter a cooperative driving mode using the network 50 to perform cooperative driving.
  • When the first robot 100a and the second robot 100b enter the cooperative driving mode, they divide the area to be cleaned into a plurality of unit areas and may perform cooperative driving for each unit area.
  • at least one of the first robot 100a and the second robot 100b may detect an obstacle during cooperative driving in any one of the plurality of unit areas.
  • Specifically, at least one of the first robot 100a and the second robot 100b may detect an obstacle disposed between the divided areas (e.g., between the first unit area A1 and the second unit area A2) or present inside a divided area (e.g., inside the first unit area A1).
  • an obstacle existing between the divided regions is defined as a first obstacle OB1
  • an obstacle existing inside the divided regions is defined as a second obstacle OB2 .
  • the first obstacle OB1 or the second obstacle OB2 may be a threshold, a carpet, or a cliff.
  • The first obstacle OB1 or the second obstacle OB2 is an obstacle that the first robot 100a and the second robot 100b can climb, that is, an obstacle formed within a preset range of height or depth.
  • the first robot 100a and the second robot 100b climbing an obstacle means crossing a threshold, crossing a carpet, passing a gap in a cliff, or going down and then going up a slope of a cliff.
  • The first robot 100a and the second robot 100b may perform a preset scenario in response to the first obstacle OB1 or the second obstacle OB2 detected during cooperative driving. Since the preset scenario corresponding to the first obstacle OB1 or the second obstacle OB2 sensed during cooperative driving performed by the first robot 100a and the second robot 100b has been described above, a detailed description thereof will be omitted.
  • Meanwhile, the first robot 100a and the second robot 100b may be unable to perform cooperative driving due to insufficient charging capacity. For example, if the charging capacity of either the first robot 100a or the second robot 100b is insufficient, it becomes difficult to travel ahead or to follow, so cooperative driving must be stopped. If cooperative driving is stopped without a specific follow-up action, there is a risk of causing a problem in the subsequent driving of the first robot 100a and the second robot 100b or causing inconvenience to the user.
  • As shown in FIG. 11, the embodiment of the system 1 includes a plurality of mobile robots 100a and 100b that drive cooperatively: the first robot 100a, driven based on the power charged by the first charging station 400a, and the second robot 100b, driven based on the power charged by the second charging station 400b and traveling along the path traveled by the first robot 100a.
  • That is, the first robot 100a and the second robot 100b charge the power of their batteries at the first charging station 400a and the second charging station 400b, respectively.
  • Here, the first robot 100a may be a robot that travels ahead in the target area of the cooperative driving and sucks up dust, and the second robot 100b may be a robot that travels after the first robot 100a, along the area in which the first robot 100a has traveled, and cleans while driving.
  • The first robot 100a and the second robot 100b detect the capacity charged in each battery while performing the cooperative driving mode; according to the charging capacity value of the battery, the cooperative driving mode is released, and one or more of an independent driving mode and a battery charging mode are respectively performed in response to the charging capacity value.
  • Specifically, each of the first robot 100a and the second robot 100b releases the cooperative driving mode when the charging capacity value of its battery is less than or equal to the reference capacity value, and moves to its charging station 400a or 400b to charge the battery or performs the single driving mode.
  • That is, the first robot 100a moves to the first charging station 400a to charge the battery or performs the single driving mode, and the second robot 100b moves to the second charging station 400b to charge the battery or performs the single driving mode.
  • the first robot 100a and the second robot 100b cancel the cooperative driving mode being performed, and at least one of the first robot 100a and the second robot 100b is the corresponding charging station ( 400a and/or 400b) to charge the battery, or at least one of the first robot 100a and the second robot 100b may perform the independent driving mode.
  • the independent driving mode may be performed immediately after the cooperative driving mode is released, or may be performed after moving to the corresponding charging station 400a and/or 400b to charge the battery.
  • Each of the first robot 100a and the second robot 100b may sense the capacity charged in the battery while driving.
  • each of the first robot 100a and the second robot 100b may sense the capacity charged in the battery while the cooperative driving mode is performed.
  • each of the first robot 100a and the second robot 100b may sense the capacity charged in the battery while performing a mode other than the cooperative driving mode.
  • Each of the first robot 100a and the second robot 100b may sense the capacity charged in the battery in real time.
  • the first robot 100a senses the charging capacity of the battery built in the first robot 100a in real time while driving, and the second robot 100b, while driving, the second robot 100b ) can detect the charging capacity of the built-in battery in real time.
  • Each of the first robot 100a and the second robot 100b may sense the capacity charged in the battery while driving, and quantify the detection result as the charging capacity value. Accordingly, the detection result of the capacity charged in the battery may be compared with the reference capacity value.
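The quantified comparison described above amounts to a simple threshold test. A sketch (the reference value itself is not given in the text, so the numbers below are illustrative):

```python
# Sketch of the charge-capacity comparison driving the mode decision.
def should_release_cooperative_mode(charge_capacity_value: float,
                                    reference_capacity_value: float) -> bool:
    """Release the cooperative driving mode when the sensed charging capacity value
    is less than or equal to the reference capacity value."""
    return charge_capacity_value <= reference_capacity_value
```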
  • Each of the first robot 100a and the second robot 100b releases the cooperative driving mode when its charging capacity value is less than or equal to the reference capacity value, and then may move to the charging station 400a or 400b to charge the battery.
  • the release of the cooperative driving mode may mean stopping the cooperative driving mode being performed.
  • For example, the first robot 100a detects the charging capacity of its battery while performing the cooperative driving mode; when the charging capacity value of the first robot 100a is less than or equal to the reference capacity value, the first robot 100a stops performing the cooperative driving mode and moves to the first charging station 400a to charge the battery.
  • Likewise, the second robot 100b detects the charging capacity of its battery while performing the cooperative driving mode; when the charging capacity value of the second robot 100b is less than or equal to the reference capacity value, the second robot 100b stops performing the cooperative driving mode and moves to the second charging station 400b to charge the battery.
  • each of the first robot 100a and the second robot 100b may share information about the release of the cooperative driving mode with the other robot.
  • That is, when releasing the cooperative driving mode, each of the first robot 100a and the second robot 100b transmits information about the release of the cooperative driving mode to the other robot, thereby signaling the release of the cooperative driving mode.
  • For example, when the first robot 100a releases the cooperative driving mode because its charging capacity value is equal to or less than the reference capacity value, the first robot 100a transmits information about the release of the cooperative driving mode to the second robot 100b, so that the second robot 100b recognizes the release of the cooperative driving mode.
  • Likewise, when the second robot 100b releases the cooperative driving mode because its charging capacity value is less than or equal to the reference capacity value, the second robot 100b transmits information about the release of the cooperative driving mode to the first robot 100a, so that the first robot 100a recognizes the release of the cooperative driving mode.
  • the first robot 100a and the second robot 100b may stop performing the cooperative driving mode.
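The release-notification exchange above can be sketched as a tiny message protocol. The message fields are hypothetical; only the behavior (notify the peer, the peer recognizes the release and stops cooperative driving) comes from the text:

```python
import json

# Hypothetical message format for signalling release of the cooperative driving mode.
def make_release_message(sender: str) -> str:
    return json.dumps({"type": "cooperative_mode_released", "sender": sender})

def handle_message(msg: str) -> str:
    """Peer-side handling: recognizing the release stops cooperative driving."""
    data = json.loads(msg)
    if data.get("type") == "cooperative_mode_released":
        return "stop cooperative driving"
    return "ignore"
```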
  • After moving to the charging stations 400a and 400b, the first robot 100a and the second robot 100b may each charge the battery until the charging capacity of the battery is charged above a certain standard.
  • the predetermined reference may mean a level of the charging capacity of the battery.
  • the predetermined criterion may be set as a ratio [%] to the total capacity of the battery, or may be set as a capacity unit [Ah] of the battery.
  • Preferably, each of the first robot 100a and the second robot 100b, after moving to the charging station 400a or 400b, may charge the battery until the battery is completely charged.
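The "certain standard" may be set as a ratio [%] of the battery's total capacity or as an absolute capacity [Ah]. A sketch covering both forms (function and parameter names are hypothetical):

```python
from typing import Optional

def charged_enough(current_ah: float, total_ah: float,
                   ratio_criterion_pct: Optional[float] = None,
                   ah_criterion: Optional[float] = None) -> bool:
    """Check the charge against a ratio [%] criterion, an absolute [Ah] criterion,
    or, by default, a full charge."""
    if ratio_criterion_pct is not None:
        return current_ah / total_ah >= ratio_criterion_pct / 100.0
    if ah_criterion is not None:
        return current_ah >= ah_criterion
    return current_ah >= total_ah  # preferred case: completely charged
```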
  • When its charging capacity value is less than or equal to the reference capacity value, each of the first robot 100a and the second robot 100b may recognize its current position before moving to the charging station 400a or 400b and store the corresponding location information value; after the charging capacity of the battery is charged to a certain level or more at the charging station 400a or 400b, driving may be started using the location information value.
  • That is, each of the first robot 100a and the second robot 100b stores the location information value corresponding to its position before moving to the charging station 400a or 400b, and after completing charging at the charging station 400a or 400b, may move to the location according to the location information value and start driving, or may output a notification for moving to the location according to the location information value when driving starts.
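The save-then-resume behavior described above (store the pre-docking position, restart driving from it once charged) might look like the following sketch; the class and method names are hypothetical:

```python
from typing import Optional, Tuple

# Hypothetical sketch of storing the pre-docking position and resuming from it.
class ResumeMemory:
    def __init__(self) -> None:
        self._saved: Optional[Tuple[float, float]] = None

    def save_before_docking(self, x: float, y: float) -> None:
        """Store the location information value recognized before moving to the charger."""
        self._saved = (x, y)

    def resume_target(self) -> Tuple[float, float]:
        """Position to drive to (or announce) once the battery is charged above the standard."""
        if self._saved is None:
            raise RuntimeError("no stored position")
        return self._saved
```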
  • the driving corresponding to the charging capacity of each of the first robot 100a and the second robot 100b may be performed as shown in the diagram shown in FIG. 36A .
  • Specifically, when the charging capacity value of the first robot 100a is equal to or less than the reference capacity value and the charging capacity value of the second robot 100b is greater than the reference capacity value, the first robot 100a releases the cooperative driving mode, moves to the first charging station 400a to charge the battery, and, when the charging capacity of the battery is charged above a certain standard, may move to the position it occupied before moving to the first charging station 400a and perform the single driving mode. In this case, after releasing the cooperative driving mode, the second robot 100b may move to the second charging station 400b to charge the battery, depending on the remaining cleaning area.
  • For example, the second robot 100b may move to the second charging station 400b after completing the driving of the remaining cleaning area. Also, if the area of the remaining cleaning area does not correspond to a predetermined standard, the second robot 100b may move to the second charging station 400b immediately.
  • That is, during execution (P10) of the cooperative driving mode, the first robot 100a releases the cooperative driving mode and moves to the first charging station 400a. If the area of the remaining cleaning area corresponds to the predetermined standard, the second robot 100b completes (P11) the driving of the remaining cleaning area and then moves (P12) to the second charging station 400b; if the area of the remaining cleaning area does not correspond to the predetermined standard, the second robot 100b immediately moves (P12) to the second charging station 400b. After charging the charging capacity of the battery above a certain standard at the first charging station 400a, the first robot 100a moves to the position XX1 it occupied before moving to the first charging station 400a, and the single driving mode of the first robot 100a may be performed (P13).
  • When the charging capacity value of the first robot 100a is equal to or less than the reference capacity value and the charging capacity value of the second robot 100b is greater than the reference capacity value, each of the first robot 100a and the second robot 100b, after releasing the cooperative driving mode, may move to its respective charging station 400a, 400b to charge the battery; once the charging capacity of the battery is charged above a certain standard, each robot may return to its position before moving to its charging station 400a, 400b and perform the single driving mode.
  • After each of the first robot 100a and the second robot 100b charges the charging capacity of its battery above a certain standard at the first charging station 400a and the second charging station 400b, respectively, the first robot 100a moves to the position XX1 before moving to the first charging station 400a to perform the independent driving mode of the first robot 100a, and the second robot 100b may move to the position XX2 before moving to the second charging station 400b to perform the independent driving mode of the second robot 100b (P22).
  • After releasing the cooperative driving mode, the first robot 100a and the second robot 100b each move to their respective charging stations 400a and 400b to charge the battery; when the charging capacity of its battery is charged above a certain standard, the first robot 100a can return to its position before moving to the first charging station 400a and perform the independent driving mode.
  • The first robot 100a moves to the first charging station 400a and the second robot 100b moves to the second charging station 400b (P12); after charging the charging capacity of its battery above a predetermined standard at the first charging station 400a, the first robot 100a moves to the position XX1 before moving to the first charging station 400a and performs the independent driving mode of the first robot 100a (P13).
  • Each robot moves to its respective charging station 400a, 400b to charge the battery; when the charging capacity of the battery is charged above a certain standard, each robot may also return to its position before moving to its charging station 400a, 400b and perform the independent driving mode.
  • After each of the first robot 100a and the second robot 100b charges the charging capacity of its battery above a certain standard at the first charging station 400a and the second charging station 400b, respectively, the first robot 100a moves to the position XX1 before moving to the first charging station 400a to perform the independent driving mode of the first robot 100a, and the second robot 100b may move to the position XX2 before moving to the second charging station 400b to perform the independent driving mode of the second robot 100b (P22).
  • When both the charging capacity value of the first robot 100a and the charging capacity value of the second robot 100b are equal to or less than the reference capacity value, each of the first robot 100a and the second robot 100b, after releasing the cooperative driving mode, moves to its respective charging station 400a, 400b to charge the battery; when the charging capacity of its battery is charged above a certain standard, the first robot 100a can return to its position before moving to the first charging station 400a and perform the single driving mode.
  • The first robot 100a moves to the first charging station 400a and the second robot 100b moves to the second charging station 400b (P12); after charging the charging capacity of its battery above a predetermined standard at the first charging station 400a, the first robot 100a moves to the position XX1 before moving to the first charging station 400a and performs the independent driving mode of the first robot 100a (P13).
  • When both the charging capacity value of the first robot 100a and the charging capacity value of the second robot 100b are equal to or less than the reference capacity value, the first robot 100a and the second robot 100b, after each releasing the cooperative driving mode, move to their respective charging stations 400a, 400b to charge the battery; when the charging capacity of the battery is charged above a certain standard, each may also return to its position before moving to its charging station 400a, 400b and perform the independent driving mode.
  • After each of the first robot 100a and the second robot 100b charges the charging capacity of its battery above a certain standard at the first charging station 400a and the second charging station 400b, respectively, the first robot 100a moves to the position XX1 before moving to the first charging station 400a to perform the independent driving mode of the first robot 100a, and the second robot 100b may move to the position XX2 before moving to the second charging station 400b to perform the independent driving mode of the second robot 100b (P22).
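The FIG. 36A branching above can be summarized in a short sketch. All names here are illustrative assumptions, not identifiers from the patent: a robot whose charging capacity value is at or below the reference capacity value releases cooperative driving and charges before resuming, while a robot above it finishes the remaining cleaning area before charging.

```python
def next_action_fig36a(charge_value: float, reference_value: float) -> str:
    """Sketch of the FIG. 36A rule for a single robot after the
    cooperative driving mode is released."""
    if charge_value <= reference_value:
        # Move to its own charging station, charge above the standard,
        # then return to the previous position in independent mode.
        return "charge_then_resume"
    # Otherwise finish the remaining cleaning area before charging.
    return "finish_remaining_area_then_charge"

# Applying the rule to both robots yields the cases of FIG. 36A
# (the numeric charge and reference values are arbitrary examples):
plan = {
    "first_robot": next_action_fig36a(20.0, 30.0),
    "second_robot": next_action_fig36a(55.0, 30.0),
}
```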
  • Driving corresponding to the charging capacity of each of the first robot 100a and the second robot 100b may be performed as shown in FIG. 36B.
  • When the charging capacity value of the first robot 100a is equal to or less than the reference capacity value and the charging capacity value of the second robot 100b is greater than the reference capacity value, the first robot 100a releases the cooperative driving mode, switches to the independent driving mode, and drives while performing the independent driving mode, while the second robot 100b releases the cooperative driving mode and may move to the second charging station 400b to charge its battery. If the area of the remaining cleaning area corresponds to a predetermined (area) standard, the second robot 100b may move to the second charging station 400b after completing the driving of the remaining cleaning area; if the area of the remaining cleaning area does not correspond to the predetermined standard, the second robot 100b may move to the second charging station 400b immediately.
  • While performing the cooperative driving mode, the first robot 100a releases the cooperative driving mode and then switches to the independent driving mode.
  • If the area of the remaining cleaning area corresponds to the predetermined criterion, the second robot 100b moves to the second charging station 400b after completing the driving of the remaining cleaning area; if the area of the remaining cleaning area does not meet the predetermined criterion, it immediately moves to the second charging station 400b. After charging the charging capacity of its battery above a predetermined standard at the first charging station 400a, the first robot 100a may move to the position XX1 before moving to the first charging station 400a to perform the single driving mode of the first robot 100a.
  • The first robot 100a, after releasing the cooperative driving mode and switching to the independent driving mode, drives while performing the independent driving mode; the second robot 100b, after releasing the cooperative driving mode, moves to the second charging station 400b to charge its battery, and when the charging capacity of the battery is charged above a certain (capacity) standard, it may return to its position before moving to the second charging station 400b and perform the independent driving mode.
  • While performing the cooperative driving mode, each of the first robot 100a and the second robot 100b releases the cooperative driving mode. The first robot 100a switches to the independent driving mode and performs the independent driving mode, while the second robot 100b moves to the second charging station 400b; after charging the charging capacity of its battery above a certain standard at the second charging station 400b, it moves to the position XX2 before moving to the second charging station 400b to perform the independent driving mode of the second robot 100b.
  • When the charging capacity value of the first robot 100a is greater than the reference capacity value and the charging capacity value of the second robot 100b is equal to or less than the reference capacity value, the first robot 100a releases the cooperative driving mode, switches to the independent driving mode, and drives while performing the independent driving mode; the second robot 100b, after releasing the cooperative driving mode, may move to the second charging station 400b to charge its battery.
  • While performing the cooperative driving mode (P30), the second robot 100b releases the cooperative driving mode and moves to the second charging station 400b (P31), while the first robot 100a releases the cooperative driving mode, switches to the independent driving mode, and may perform the independent driving mode (P32).
  • When the charging capacity value of the first robot 100a is greater than the reference capacity value and the charging capacity value of the second robot 100b is equal to or less than the reference capacity value, the first robot 100a releases the cooperative driving mode, switches to the independent driving mode, and drives while performing the independent driving mode; the second robot 100b, after releasing the cooperative driving mode, moves to the second charging station 400b to charge its battery, and when the charging capacity of the battery is charged above a certain (capacity) standard, it may return to its position before moving to the second charging station 400b and perform the independent driving mode.
  • While performing the cooperative driving mode, each of the first robot 100a and the second robot 100b releases the cooperative driving mode. The first robot 100a switches to the independent driving mode and performs the independent driving mode, while the second robot 100b moves to the second charging station 400b; after charging the charging capacity of its battery above a certain standard at the second charging station 400b, it moves to the position XX2 before moving to the second charging station 400b to perform the independent driving mode of the second robot 100b.
  • When both the charging capacity value of the first robot 100a and the charging capacity value of the second robot 100b are equal to or less than the reference capacity value, the first robot 100a releases the cooperative driving mode, switches to the independent driving mode, and drives while performing the independent driving mode; the second robot 100b, after releasing the cooperative driving mode, may move to the second charging station 400b to charge its battery.
  • While performing the cooperative driving mode (P30), the second robot 100b releases the cooperative driving mode and moves to the second charging station 400b (P31), while the first robot 100a releases the cooperative driving mode, switches to the independent driving mode, and may perform the independent driving mode (P32).
  • The first robot 100a releases the cooperative driving mode, switches to the independent driving mode, and performs the independent driving mode; the second robot 100b, after releasing the cooperative driving mode, moves to the second charging station 400b to charge its battery, and when the charging capacity of the battery is charged above a certain (capacity) standard, it may return to its position before moving to the second charging station 400b and perform the independent driving mode.
  • While performing the cooperative driving mode, each of the first robot 100a and the second robot 100b releases the cooperative driving mode. The first robot 100a switches to the independent driving mode and performs the independent driving mode, while the second robot 100b moves to the second charging station 400b; after charging the charging capacity of its battery above a certain standard at the second charging station 400b, it moves to the position XX2 before moving to the second charging station 400b to perform the independent driving mode of the second robot 100b.
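By way of contrast with FIG. 36A, the FIG. 36B variant can be sketched as follows (a hypothetical illustration; the function name and return labels are assumptions): the robot whose charging capacity value exceeds the reference switches directly to the independent driving mode, while the other charges first and only then returns to its prior position.

```python
def next_action_fig36b(charge_value: float, reference_value: float) -> str:
    """Sketch of the FIG. 36B rule for a single robot after the
    cooperative driving mode is released."""
    if charge_value > reference_value:
        # Switch straight to the independent driving mode and keep driving.
        return "switch_to_independent_drive"
    # Move to the charging station, charge above the standard,
    # then return to the position held before moving to the station.
    return "charge_then_return_to_position"
```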
  • The cooperative driving may be performed by the cooperative driving performing method shown in FIG. 40.
  • The cooperative driving performing method (hereinafter, the performing method) may be a method for the first robot 100a, which is driven based on the electric power charged at the first charging station 400a and drives in an area to be cleaned, and the second robot 100b, which is driven based on the electric power charged at the second charging station 400b.
  • The performing method may include: starting, by each of the first robot 100a and the second robot 100b, performance of the cooperative driving mode (S100); detecting, by each of the first robot 100a and the second robot 100b, the capacity charged in its battery (S200); comparing, by each of the first robot 100a and the second robot 100b, the charging capacity value with a preset reference capacity value (S300); and, according to the comparison result, performing the independent driving mode or moving to the charging stations 400a and 400b to charge the battery, by at least one of the first robot 100a and the second robot 100b (S400).
  • The first robot 100a runs ahead in the target area of the cooperative driving and sucks up dust, and the second robot 100b follows behind in the area in which the first robot 100a has traveled and may wipe off the dust.
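The four steps S100-S400 above can be read as a simple control loop. The sketch below is a hypothetical illustration of that flow (the `Robot` class, function name, and return labels are assumptions, not the patent's implementation):

```python
from dataclasses import dataclass

@dataclass
class Robot:
    name: str
    charge_value: float  # digitized battery capacity from the detecting step

def perform_method(robot_a: Robot, robot_b: Robot, reference_value: float) -> dict:
    """S100: cooperative driving has started; S200: each robot has sensed
    its battery; S300: each compares with the reference capacity value;
    S400: each acts on the comparison result."""
    results = {}
    for robot in (robot_a, robot_b):
        if robot.charge_value <= reference_value:               # S300
            results[robot.name] = "move_to_station_and_charge"  # S400
        else:
            results[robot.name] = "perform_independent_drive"   # S400
    return results
```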
  • The starting step (S4100) may be a step in which the first robot 100a and the second robot 100b start driving according to the cooperative driving mode.
  • The detecting step (S4200) may be a step in which each of the first robot 100a and the second robot 100b detects in real time the capacity charged in its battery while driving according to the cooperative driving mode.
  • The first robot 100a detects the charging capacity of the battery built into the first robot 100a and digitizes the detection result as the charging capacity value, and the second robot 100b detects the charging capacity of the battery built into the second robot 100b and may digitize the detection result as the charging capacity value.
  • The comparing step (S4300) may be a step of comparing the charging capacity value, obtained by digitizing the result of sensing the charging capacity, with the reference capacity value.
  • In the comparing step, the first robot 100a compares the charging capacity value of the first robot 100a with the reference capacity value, and the second robot 100b may compare the charging capacity value of the second robot 100b with the reference capacity value.
  • Each of the first robot 100a and the second robot 100b may transmit and share the result of comparing its charging capacity value with the reference capacity value.
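Since each robot shares its comparison result with the other, both end up holding the same pair of results, from which the joint case can be selected. A minimal sketch of that exchange (the function name and dictionary keys are hypothetical):

```python
def share_comparison(charge_first: float, charge_second: float,
                     reference_value: float) -> dict:
    """Each robot compares its own charging capacity value with the
    reference (comparing step) and transmits the boolean result, so both
    robots hold the same shared view of the pair of outcomes."""
    return {
        "first_at_or_below_ref": charge_first <= reference_value,
        "second_at_or_below_ref": charge_second <= reference_value,
    }
```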
  • The charging step may be a step in which whichever of the first robot 100a and the second robot 100b has a charging capacity value equal to or less than the reference capacity value moves to its charging station 400a, 400b to charge the battery.
  • In the charging step (S4400), when the charging capacity value of the first robot 100a is equal to or less than the reference capacity value and the charging capacity value of the second robot 100b exceeds the reference capacity value, as in FIG. 36A, the first robot 100a releases the cooperative driving mode and moves to the first charging station 400a to charge its battery, and when the charging capacity of the battery is charged above a certain standard, it may return to its previous position; the second robot 100b may charge its battery by moving to the second charging station 400b.
  • If the area of the remaining cleaning area corresponds to the predetermined standard, the second robot 100b completes the driving of the remaining cleaning area (P11) and then moves (P12) to the second charging station 400b; if the area of the remaining cleaning area does not meet the predetermined standard, it immediately moves (P12) to the second charging station 400b. The first robot 100a, after charging the charging capacity of its battery above a certain standard at the first charging station 400a, moves to the position it occupied before moving to the first charging station 400a, and the single driving mode of the first robot 100a may be performed (P13).
  • In the charging step (S4400), when the charging capacity value of the first robot 100a is equal to or less than the reference capacity value and the charging capacity value of the second robot 100b exceeds the reference capacity value, as in FIG. 36A, each of the first robot 100a and the second robot 100b releases the cooperative driving mode and moves to its respective charging station 400a, 400b to charge the battery; the individual driving mode may then be performed by each robot moving to its position before moving to its charging station 400a, 400b.
  • In the charging step (S4400), as shown in FIG., while performing the cooperative driving mode, each of the first robot 100a and the second robot 100b releases the cooperative driving mode; the first robot 100a moves to the first charging station 400a and the second robot 100b moves (P21) to the second charging station 400b. After each of the first robot 100a and the second robot 100b charges the charging capacity of its battery above a certain standard at the first charging station 400a and the second charging station 400b, respectively, the first robot 100a moves to the position XX1 before moving to the first charging station 400a to perform the single driving mode of the first robot 100a, and the second robot 100b moves to the position XX2 before moving to the second charging station 400b to perform the independent driving mode of the second robot 100b (P22).
  • In the charging step (S400), when the charging capacity value of the first robot 100a exceeds the reference capacity value and the charging capacity value of the second robot 100b is equal to or less than the reference capacity value, as in the case of c, the first robot 100a releases the cooperative driving mode, switches to the independent driving mode, and then drives while performing the independent driving mode; the second robot 100b, after releasing the cooperative driving mode, may charge its battery by moving to the second charging station 400b.
  • In the charging step (S4400), as shown in FIG., the second robot 100b moves to the second charging station 400b (P31), while the first robot 100a releases the cooperative driving mode, switches to the independent driving mode, and may perform the independent driving mode (P32).
  • In the charging step (S4400), when both the charging capacity value of the first robot 100a and the charging capacity value of the second robot 100b are equal to or less than the reference capacity value, as in the case of e in FIG. 36A, each of the first robot 100a and the second robot 100b releases the cooperative driving mode and moves to its respective charging station 400a, 400b to charge the battery; the first robot 100a, when the charging capacity of its battery is charged above a certain level, may return to its position before moving to the first charging station 400a and perform the independent driving mode.
  • In the charging step (S4400), as shown in FIG. 37, during the cooperative driving the first robot 100a moves to the first charging station 400a and the second robot 100b moves to the second charging station 400b (P12); after the first robot 100a charges the charging capacity of its battery above a certain standard at the first charging station 400a, it may move to the position XX1 before moving to the first charging station 400a to perform the independent driving mode of the first robot 100a (P13).
  • In the charging step (S4400), when both the charging capacity value of the first robot 100a and the charging capacity value of the second robot 100b are equal to or less than the reference capacity value, as in the case of f of FIG. 36A, each of the first robot 100a and the second robot 100b releases the cooperative driving mode and moves to its respective charging station 400a, 400b to charge the battery; when the charging capacity of its battery is charged above a certain standard, each may return to its position before moving to its charging station 400a, 400b and perform the independent driving mode.
  • In the charging step (S4400), as shown in FIG. 38, while performing the cooperative driving mode (P20), each of the first robot 100a and the second robot 100b releases the cooperative driving mode; the first robot 100a moves to the first charging station 400a and the second robot 100b moves to the second charging station 400b. After each of the first robot 100a and the second robot 100b charges the charging capacity of its battery above a certain standard at the first charging station 400a and the second charging station 400b, respectively, the first robot 100a moves to the position XX1 before moving to the first charging station 400a and performs the single driving mode of the first robot 100a, and the second robot 100b moves to the position XX2 before moving to the second charging station 400b, and the independent driving mode of the second robot 100b may be performed (P22).
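Putting the cases of the charging step together, the shared comparison results select one of four joint behaviours. The dispatch below is a hedged sketch of the cases described above (labels are illustrative; the case where neither robot is at or below the reference is assumed to continue cooperative driving, which the text does not spell out):

```python
def charging_step(first_at_or_below: bool, second_at_or_below: bool) -> dict:
    """Sketch of the charging-step dispatch on the shared comparison."""
    if first_at_or_below and not second_at_or_below:
        # First robot charges and resumes; second finishes the area first.
        return {"first": "charge_then_resume",
                "second": "finish_area_then_charge"}
    if not first_at_or_below and second_at_or_below:
        # First robot drives independently; second charges and returns.
        return {"first": "independent_drive",
                "second": "charge_then_return"}
    if first_at_or_below and second_at_or_below:
        # Both charge, then each resumes from its previous position.
        return {"first": "charge_then_resume",
                "second": "charge_then_resume"}
    # Assumption: neither robot needs charging, cooperation continues.
    return {"first": "continue_cooperative",
            "second": "continue_cooperative"}
```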
  • The performing method, including the starting step (S4100), the detecting step (S4200), the comparing step (S4300), and the charging step (S4400), can be implemented as computer-readable code on a medium in which a program is recorded.
  • The computer-readable medium includes all types of recording devices in which data readable by a computer system is stored. Examples of computer-readable media include hard disk drives (HDD), solid state disks (SSD), silicon disk drives (SDD), ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices; the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).
  • the computer may include the control unit 1800 .
  • controller 1100 communication unit
  • control unit 1900 sweeper

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)
  • Automation & Control Theory (AREA)
  • Charge And Discharge Circuits For Batteries Or The Like (AREA)
  • Manipulator (AREA)

Abstract

According to one embodiment, the present invention relates to a mobile robot system comprising: a first robot driven on the basis of power charged at a first charging station and traveling along a region to be cleaned; and a second robot driven on the basis of power charged at a second charging station and traveling along a path along which the first robot travels, wherein each of the first robot and the second robot senses the capacity charged in its battery while performing a collaborative driving mode, releases the collaborative driving mode according to the charged capacity value of the battery, and performs one or more of an independent driving mode and a battery charging mode in response to the charged capacity value.
PCT/KR2021/012300 2020-10-08 2021-09-09 Système de robot mobile WO2022075610A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200130380A KR20220047453A (ko) 2020-10-08 2020-10-08 이동 로봇 시스템
KR10-2020-0130380 2020-10-08

Publications (1)

Publication Number Publication Date
WO2022075610A1 true WO2022075610A1 (fr) 2022-04-14

Family

ID=81126613

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/012300 WO2022075610A1 (fr) 2020-10-08 2021-09-09 Système de robot mobile

Country Status (3)

Country Link
KR (2) KR20220047453A (fr)
TW (1) TWI808480B (fr)
WO (1) WO2022075610A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114983299A (zh) * 2022-06-06 2022-09-02 深圳银星智能集团股份有限公司 多清洁机器人的维护方法、清洁机器人、基站及清洁系统
KR20240050884A (ko) * 2022-10-12 2024-04-19 삼성전자주식회사 맵을 공유하는 로봇 및 그의 맵 공유 방법

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080052127A (ko) * 2006-12-05 2008-06-11 한국전자통신연구원 청소 로봇의 충전 스테이션으로의 복귀 방법 및 장치
US20180242806A1 (en) * 2015-09-03 2018-08-30 Aktiebolaget Electrolux System of robotic cleaning devices
US20190217474A1 (en) * 2016-06-08 2019-07-18 Ecovacs Robotics Co., Ltd. Mother-child robot cooperative work system and work method thereof
KR20190088115A (ko) * 2018-01-03 2019-07-26 삼성전자주식회사 청소용 이동장치, 협업청소 시스템 및 그 제어방법
KR20200029972A (ko) * 2018-09-06 2020-03-19 엘지전자 주식회사 복수의 자율주행 이동 로봇

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8010229B2 (en) * 2006-12-05 2011-08-30 Electronics And Telecommunications Research Institute Method and apparatus for returning cleaning robot to charge station
EP3804599A4 (fr) * 2018-06-08 2022-08-10 Positec Power Tools (Suzhou) Co., Ltd Robot de nettoyage, procédé de commande de celui-ci et système de robot de nettoyage


Also Published As

Publication number Publication date
TW202228587A (zh) 2022-08-01
KR20220056166A (ko) 2022-05-04
KR20220047453A (ko) 2022-04-18
TWI808480B (zh) 2023-07-11

Similar Documents

Publication Publication Date Title
AU2019335976B2 (en) A robot cleaner and a controlling method for the same
AU2019262468B2 (en) A plurality of robot cleaner and a controlling method for the same
AU2019262467B2 (en) A plurality of robot cleaner and a controlling method for the same
WO2020050494A1 (fr) Robot nettoyeur et son procédé de commande
AU2019262482B2 (en) Plurality of autonomous mobile robots and controlling method for the same
AU2019430311B2 (en) Plurality of autonomous mobile robots and controlling method for the same
WO2020050489A1 (fr) Robot nettoyeur et son procédé de commande
WO2019212239A1 (fr) Pluralité de robots nettoyeurs et leur procédé de commande
WO2019212240A1 (fr) Pluralité de robots nettoyeurs et leur procédé de commande
WO2022075610A1 (fr) Système de robot mobile
WO2019212276A1 (fr) Pluralité de robots mobiles autonomes et procédé de commande de tels robots mobiles autonomes
AU2019262477B2 (en) Plurality of autonomous mobile robots and controlling method for the same
WO2020050566A1 (fr) Pluralité de robots mobiles autonomes et procédé de commande de tels robots mobiles autonomes
WO2021172932A1 (fr) Robot mobile et son procédé de commande
AU2020268667B2 (en) Mobile robot and control method of mobile robots
WO2022075616A1 (fr) Système de robot mobile
WO2021225234A1 (fr) Robot nettoyeur et son procédé de commande
WO2022075614A1 (fr) Système de robot mobile
WO2022075611A1 (fr) Système de robot mobile
WO2022075615A1 (fr) Système de robot mobile
WO2020050565A1 (fr) Pluralité de robots mobiles autonomes et procédé de commande de ces derniers
WO2020022622A1 (fr) Procédé de commande d'un robot mobile à intelligence artificielle
WO2023120918A1 (fr) Bâtiment "robot-friendly", et procédé et système de commande de charge pour robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21877862

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21877862

Country of ref document: EP

Kind code of ref document: A1