WO2021141396A1 - Robot cleaner using artificial intelligence and control method thereof


Info

Publication number: WO2021141396A1
Authority: WO (WIPO PCT)
Prior art keywords: robot cleaner, map, image data, traveling, cleaning
Application number: PCT/KR2021/000167
Other languages: French (fr)
Inventors: Hyungjin Jeon, Chulmo Sung
Original assignee: LG Electronics Inc.
Application filed by LG Electronics Inc.
Publication of WO2021141396A1 (English)

Classifications

    • A47L 9/2805: Controlling suction cleaners by electric means; parameters or conditions being sensed
    • A47L 9/2826: Parameters or conditions being sensed: the condition of the floor
    • A47L 9/009: Carrying-vehicles; arrangements of trollies or wheels; means for avoiding mechanical obstacles
    • A47L 11/4061: Steering means; means for avoiding obstacles; details related to the place where the driver is accommodated
    • A47L 9/2847: Parts which are controlled: surface treating elements
    • A47L 9/2852: Parts which are controlled: elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
    • A47L 9/2857: User input or output elements for control, e.g. buttons, switches or displays
    • A47L 9/2894: Details related to signal transmission in suction cleaners
    • B25J 11/0085: Manipulators for service tasks: cleaning
    • B25J 19/023: Optical sensing devices including video camera means
    • B25J 9/1664: Programme controls characterised by motion, path, trajectory planning
    • B25J 9/1674: Programme controls characterised by safety, monitoring, diagnostic
    • G05D 1/0246: Control of position or course in two dimensions specially adapted to land vehicles, using a video camera in combination with image processing means
    • G05D 1/0272: Internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
    • G05D 1/0274: Internal positioning means using mapping information stored in a memory device
    • H04M 1/72415: User interfaces for mobile telephones, interfacing with external accessories for remote control of appliances
    • A47L 2201/04: Robotic cleaning machines: automatic control of the travelling movement; automatic obstacle detection
    • A47L 2201/06: Control of the cleaning action for autonomous devices; automatic detection of the surface condition before, during or after cleaning
    • G05B 2219/40411: Robot assists human in non-industrial environment like home or office
    • G05B 2219/45098: Vacuum cleaning robot

Definitions

  • the present disclosure relates to a robot cleaner and a control method of a robot cleaner, and more particularly, to sensing by a robot cleaner using artificial intelligence and to a traveling technology based on that sensing.
  • Patent Document 1 Korean Patent Laid-Open Publication No. 10-2017-0003764 (published on July 18, 2018)
  • Patent Document 2 Korean Patent Laid-Open Publication No. 10-2012-0065153 (published on June 18, 2012)
  • the present disclosure provides a cleaning robot capable of determining the current status information of a robot cleaner by periodically comparing images through a three-dimensional sensor while providing a spatial map similar to an actual indoor space.
  • the present disclosure also provides a cleaning robot capable of providing an accurate spatial map by compensating for an occurrence of a wheel slip according to a restraint of a robot cleaner when creating a spatial map.
  • the present disclosure also provides a cleaning robot capable of alarming a user of a current state of the robot cleaner or performing a motion to escape the restraint by not only measuring traveling displacement, but also periodically determining whether traveling is actually performed by image sensing.
  • a robot cleaner includes: a traveler that moves a main body; a cleaner that performs a cleaning function; a traveling displacement measurer that detects traveling displacement; an image detector that obtains image data by periodically photographing a surrounding environment; and a controller that performs cleaning on a cleaning area, generates a map for the cleaning area based on information and the image data detected through the traveling displacement measurer and the image detector, and corrects and provides the map by reading a change in the image data to determine whether the robot cleaner travels abnormally.
  • the map may include physical shape information on the cleaning area and information on a current location of the robot cleaner.
  • when there is no difference between the periodically obtained image data, the controller may determine that the robot cleaner is in an abnormal state.
  • the controller may determine that the robot cleaner is in the abnormal state when the periodically obtained image data show no difference while a detection signal from the traveling displacement measurer indicates that the robot cleaner is traveling, as sketched below.
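A minimal sketch of the rule in the item above, assuming two periodically captured camera frames and one cycle of wheel-measured displacement; the function name, the thresholds, and the mean-absolute-difference test are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def is_restrained(prev_frame, curr_frame, wheel_displacement,
                  diff_threshold=2.0, min_displacement=1e-3):
    """True when the wheel encoders report motion while the periodically
    captured frames are effectively identical, i.e. the wheels are slipping."""
    # Mean absolute pixel difference between two consecutive periodic frames.
    frame_diff = np.mean(np.abs(curr_frame.astype(float) - prev_frame.astype(float)))
    scene_changed = frame_diff > diff_threshold
    odometry_moving = abs(wheel_displacement) > min_displacement
    return odometry_moving and not scene_changed
```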
  • the controller may compare the image data and perform a map correction to remove a traveling distance of the robot cleaner from a point in time when there is no change in the image data.
  • when the robot cleaner is in the abnormal state, the controller may induce escape by performing a restraint escape motion.
  • the controller may send an alarm to a user terminal when the restraint escape of the robot cleaner does not proceed.
  • the user terminal may have an application installed to control the robot cleaner, the map corrected through the application may be provided, and the alarm for the restraint escape may be transmitted from the robot cleaner.
  • the robot cleaner may collect the information and the image data on the cleaning area while performing cleaning in an edge mode or a zigzag mode.
  • a control method of a robot cleaner includes: obtaining a detection signal for detecting traveling displacement by performing cleaning while traveling in a cleaning area and obtaining image data by photographing a surrounding environment; generating a map for the cleaning area based on the detection signal and the image data; and correcting the map by reading a change in the image data to determine whether the robot cleaner travels abnormally.
  • the map may include physical shape information on the cleaning area and information on a current location of the robot cleaner.
  • when there is no difference between the periodically obtained image data, the controller may determine that the robot cleaner is in an abnormal state.
  • the controller may determine that the robot cleaner is in the abnormal state when the periodically obtained image data show no difference while the detection signal from the traveling displacement measurer indicates that the robot cleaner is traveling.
  • the controller may compare the image data and perform a map correction to remove a traveling distance of the robot cleaner from a point in time when there is no change in the image data.
  • the control method of the robot cleaner may further include inducing escape by a restraint escape motion when the robot cleaner is in an abnormal state.
  • an alarm may be issued to a user terminal when the restraint escape of the robot cleaner does not proceed.
  • the user terminal may have an application installed to control the robot cleaner, the map corrected through the application may be provided, and the alarm for the restraint escape may be transmitted from the robot cleaner.
  • the detection signal and the image data on the cleaning area may be collected while the cleaning is performed in an edge mode or a zigzag mode.
  • a robot cleaner system includes: a robot cleaner that performs cleaning on a cleaning area while moving a main body, the robot cleaner including a traveling displacement measurer that periodically detects traveling displacement; an image detector that obtains image data by periodically photographing a surrounding environment; and a controller that performs cleaning on a cleaning area, generates a map for the cleaning area based on information and the image data detected through the traveling displacement measurer and the image detector, and corrects and provides the map by reading a change in the image data to determine whether the robot cleaner travels abnormally; and a user terminal that has an application installed to control the robot cleaner to clean and travel and receives an alarm of whether the robot cleaner is in an abnormal state by receiving the map from the application.
  • the controller may determine that the robot cleaner is in the abnormal state when the periodically obtained image data show no difference while a detection signal from the traveling displacement measurer indicates that the robot cleaner is traveling.
  • by comparing the image data, the user terminal may receive the map for the cleaning area from which the traveling distance of the robot cleaner since the point in time when there was no change in the image data has been removed, and may receive information on the current location of the robot cleaner on the map.
  • the present disclosure may determine the current status information of the robot cleaner by periodically comparing images through the 3D sensor while providing a spatial map similar to the actual indoor space.
  • FIG. 1 is a configuration diagram of a smart home system including a robot cleaner according to an embodiment of the present disclosure.
  • FIG. 2 is a perspective view illustrating the robot cleaner and a charging stand for charging the robot cleaner according to the embodiment of the present disclosure.
  • FIG. 3 is an elevation view of the robot cleaner of FIG. 2 as viewed from the top.
  • FIG. 4 is an elevation view of the robot cleaner of FIG. 2 as viewed from the front.
  • FIG. 5 is an elevation view of the robot cleaner of FIG. 2 as viewed from the bottom.
  • FIG. 6 is a block diagram illustrating a control relationship between main components of the robot cleaner of FIG. 2.
  • FIG. 7 illustrates a control method of a robot cleaner according to an embodiment of the present disclosure.
  • FIG. 8 is a flowchart illustrating a method of determining a current state of a robot cleaner of FIG. 7.
  • FIGS. 9A to 13C are diagrams each illustrating a camera image, a state diagram of the robot cleaner, and a location of the robot cleaner in a spatial map for explaining the flow charts of FIGS. 7 and 8.
  • FIG. 14 is a diagram illustrating a display state of a user terminal according to FIG. 8.
  • FIGS. 15A and 15B are diagrams illustrating a correction of the spatial map according to FIGS. 7 and 8.
  • the front may mean a main traveling direction of a robot cleaner or a main traveling direction of pattern traveling of the robot cleaner.
  • the main traveling direction may mean a vector sum value of directions traveling within a predetermined time.
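As a worked illustration of this definition, the vector sum can be computed over sampled headings. The sketch below is hypothetical; it weights every sample equally, whereas an implementation might weight each direction by the distance traveled in that sample:

```python
import math

def main_traveling_direction(headings_rad):
    """Vector-sum the unit direction vectors traveled within a time window;
    the angle of the resultant is the main traveling direction."""
    x = sum(math.cos(h) for h in headings_rad)
    y = sum(math.sin(h) for h in headings_rad)
    return math.atan2(y, x)

# Headings sampled over a window: mostly along +x with small detours.
print(main_traveling_direction([0.0, 0.1, -0.1, 0.05]))  # close to 0.0 rad
```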
  • each component is exaggerated, omitted, or schematically illustrated for convenience and clarity of description. In addition, the size and area of each component do not fully reflect the actual size or area.
  • FIG. 1 is a configuration diagram of a robot system according to an embodiment of the present disclosure.
  • the robot system may include one or more robot cleaners 100 to provide a service at a designated place such as a home.
  • the robot system may include the robot cleaner 100 that provides a cleaning service for a designated place in a home or the like.
  • the robot cleaner 100 may provide a dry, wet, or dry/wet cleaning service according to functional blocks included.
  • the robot system includes a plurality of artificial intelligence robot cleaners 100 and a server 2 that may manage and control a plurality of artificial intelligence robot cleaners 100.
  • the server 2 may remotely monitor and control a state of the plurality of robot cleaners 100, and the robot system may provide more effective services by using the plurality of robot cleaners 100.
  • the plurality of robot cleaners 100 and the server 2 include communication means (not shown) that support one or more communication standards, and may communicate with each other.
  • the plurality of robot cleaners 100 and the server 2 may communicate with a PC, a mobile terminal, and other external servers 2.
  • the plurality of robot cleaners 100 and the server 2 may implement wireless communication with wireless communication technologies such as IEEE 802.11 WLAN, IEEE 802.15 WPAN, UWB, Wi-Fi, Zigbee, Z-wave, and Bluetooth.
  • the communication method of the robot cleaner 100 may vary depending on the communication method of the other device or server with which the robot cleaner 100 communicates.
  • the plurality of robot cleaners 100 may implement wireless communication with other robots 100 and/or the server 2 through a 5G network.
  • when the robot cleaner 100 performs wireless communication through the 5G network, real-time response and real-time control are possible.
  • a user may check information on the robots 100 in the robot system through a user terminal 3 such as a PC or a mobile terminal.
  • the server 2 is implemented as a cloud server 2, and the cloud server 2 may be linked to the robot 100 to monitor and control the robot cleaner 100 and provide various solutions and content remotely.
  • the server 2 may store and manage information received from the robot cleaner 100 and other devices.
  • the server 2 may be a server 2 that is provided by a manufacturer of the robot cleaner 100 or a company entrusted with services from the manufacturer.
  • the server 2 may be a control server 2 that manages and controls the robot cleaner 100.
  • the server 2 may identically control the robot cleaner 100 in a batch, or control the robot cleaner 100 individually. Meanwhile, the server 2 may be configured by distributing information and functions to a plurality of servers, or may be configured as a single integrated server.
  • the robot cleaner 100 and the server 2 include communication means (not illustrated) that support one or more communication standards, and may communicate with each other.
  • the robot cleaner 100 may transmit data related to space, object, and usage related data to the server 2.
  • the space and object related data may be recognition related data for a space or an object recognized by the robot cleaner 100, or image data for a space and an object acquired by an image acquirer.
  • the robot cleaner 100 and the server 2 may include artificial neural networks (ANN) in the form of software or hardware that are learned to recognize at least one of attributes such as a user, a voice, space attributes, and object attributes such as obstacles.
  • the robot cleaner 100 and the server 2 may include a deep neural network (DNN), such as a convolutional neural network (CNN), a recurrent neural network (RNN), and a deep belief network (DBN), that is learned by deep learning.
  • a structure of the deep neural network (DNN) such as a convolutional neural network (CNN) may be mounted on a controller 140 of the robot cleaner 100.
  • the server 2 may learn the deep neural network (DNN) based on data received from the robot cleaner 100, data input by a user, and the like, and then transmit the updated structure data of the deep neural network (DNN) to the robot cleaner 100. Accordingly, the structure of the deep neural network (DNN) of the artificial intelligence included in the robot cleaner 100 may be updated.
  • the usage related data is data obtained according to the use of the robot cleaner 100, and may correspond to usage history data, detection signals obtained by the sensor unit, and the like.
  • the learned structure of the deep neural network may receive input data for recognition, recognize attributes of people, objects, and spaces included in the input data, and output the result.
  • the learned structure of the deep neural network may receive input data for recognition, and analyze and learn the usage related data of the robot cleaner 100 to recognize usage patterns and usage environment.
  • the space, object, and usage related data may be transmitted to the server 2 through the communication unit.
  • the server 2 may learn the deep neural network (DNN) based on the received data, and then transmit the updated structure data of the deep neural network (DNN) to the robot cleaner 100 using the artificial intelligence for updating.
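A minimal sketch of this server-side update loop, assuming a PyTorch-style model on both ends; the patent does not name a framework, and all function names here are illustrative:

```python
import torch

def server_update(model, optimizer, loss_fn, uploaded_batches):
    """Server side: retrain the DNN on space/object/usage data received
    from robot cleaners and return the updated weights."""
    model.train()
    for inputs, labels in uploaded_batches:
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), labels)
        loss.backward()
        optimizer.step()
    return model.state_dict()  # updated structure data to transmit

def robot_apply_update(robot_model, state_dict):
    """Robot side: load the updated structure data received from the server."""
    robot_model.load_state_dict(state_dict)
```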
  • in this way, the robot cleaner 100 becomes smarter and may provide a user experience (UX) that evolves as it is used.
  • the server 2 may provide the information on the control and current state of the robot cleaner 100 to the user terminal, and may generate and distribute an application for controlling the robot cleaner 100.
  • Such an application may be an application for a PC applied as the user terminal 3 or may be an application for a smartphone.
  • such an application may be an application for controlling smart home appliances such as the SmartThinQ application, an application capable of simultaneously controlling and supervising various electronic products of the applicant.
  • the robot cleaner 100 includes a main body 110.
  • a portion facing a ceiling in a traveling area is defined as an upper surface portion (see FIG. 3)
  • a portion facing a bottom in the traveling area is defined as a lower surface portion (see FIG. 5)
  • a portion facing a traveling direction among portions forming a circumference of the main body 110 between the upper surface portion and the lower surface portion is defined as a front surface portion (see FIG. 4).
  • a portion of the main body 110 facing a direction opposite to the front surface portion may be defined as a rear surface portion.
  • the main body 110 may include a case 111 forming a space in which various components constituting the robot cleaner 100 are accommodated.
  • the robot cleaner 100 includes a sensing unit 130 that detects surrounding situations.
  • the sensing unit 130 may detect information outside the robot cleaner 100.
  • the sensing unit 130 detects users around the robot cleaner 100.
  • the sensing unit 130 may detect objects around the robot cleaner 100.
  • the sensing unit 130 may detect information on a cleaning area.
  • the sensing unit 130 may detect obstacles such as walls, furniture, and cliffs on a traveling surface.
  • the sensing unit 130 may detect information on the ceiling.
  • the sensing unit 130 may detect objects placed on the traveling surface and/or an external upper object.
  • the external upper object may include a ceiling or a lower surface of furniture disposed in an upper direction of the robot cleaner 100.
  • the robot cleaner 100 may map the cleaning area.
  • the sensing unit 130 may detect the information on the users around the robot cleaner 100.
  • the sensing unit 130 may detect the location information of the user.
  • the location information may include direction information on the robot cleaner 100.
  • the location information may include distance information between the robot cleaner 100 and the user.
  • the sensing unit 130 may detect the direction of the user with respect to the robot cleaner 100.
  • the sensing unit 130 may detect a distance between the user and the robot cleaner 100.
  • the location information may be immediately acquired by the detection of the sensing unit 130 or may be processed and acquired by the controller 140.
  • the sensing unit 130 may include an image detector 135 that detects the surrounding images.
  • the image detector 135 may detect an image in a specific direction for the robot cleaner 100.
  • the image detector 135 may detect an image in front of the robot cleaner 100.
  • the image detector 135 photographs the traveling area and may include a digital camera.
  • the digital camera may include at least one optical lens, an image sensor (for example, CMOS image sensor) configured to include a plurality of photodiodes (for example, pixels) formed by light passing through the optical lens, and a digital signal processor (DSP) constituting images based on signals output from the photodiodes.
  • the digital signal processor may generate not only still images but also moving images that are composed of frames composed of the still images.
  • the sensing unit 130 may include a distance detector 131 that detects a distance to the surrounding wall.
  • the distance between the robot cleaner 100 and the surrounding wall may be sensed by the distance detector 131.
  • the distance detector 131 detects a distance to a user in a specific direction of the robot cleaner 100.
  • the distance detector 131 may include a camera, an ultrasonic sensor, an infrared (IR) sensor, or the like.
  • the distance detector 131 may be disposed on the front surface portion of the main body 110 or may be disposed on a side surface portion thereof.
  • the distance detector 131 may detect surrounding obstacles.
  • the plurality of distance detectors 131 may be provided.
  • the sensing unit 130 may include a cliff detector 132 that detects whether cliffs exist on a surface in the traveling area.
  • the plurality of cliff detectors 132 may be provided.
  • the sensing unit 130 may further include a lower image sensor 137 for obtaining an image of a surface.
  • the robot cleaner 100 includes a traveler 160 that moves the main body 110.
  • the traveler 160 moves the main body 110 with respect to the surface.
  • the traveler 160 may include at least one driving wheel 166 that moves the main body 110.
  • the traveler 160 may include a drive motor.
  • the driving wheel 166 may be provided on left and right sides of the main body 110, respectively, and hereinafter, is referred to as a left wheel 166(L) and a right wheel 166(R), respectively.
  • the left wheel 166(L) and the right wheel 166(R) may be driven by a single drive motor, but if necessary, may be provided with a left wheel drive motor for driving the left wheel 166(L) and a right wheel drive motor for driving the right wheel 166(R), respectively.
  • the traveling direction of the main body 110 can be changed to the left or right by making a difference in a rotation speed of the left wheel 166(L) and the right wheel 166(R).
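The standard differential-drive relation behind this steering scheme can be sketched as follows; the wheel_base value and the sample speeds are illustrative, not taken from the patent:

```python
def body_motion(v_left, v_right, wheel_base):
    """Differential-drive kinematics: forward speed and yaw rate produced
    by a rotation-speed difference between the left and right wheels."""
    v = (v_left + v_right) / 2.0             # forward velocity of the main body
    omega = (v_right - v_left) / wheel_base  # yaw rate; a speed gap turns the body
    return v, omega

print(body_motion(0.20, 0.20, 0.23))  # (0.2, 0.0): equal speeds, straight line
print(body_motion(0.15, 0.25, 0.23))  # (0.2, ~0.43): faster right wheel turns left
```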
  • the robot cleaner 100 includes a cleaner 180 that performs a cleaning function.
  • the robot cleaner 100 may move the cleaning area and may clean the surface by the cleaner 180.
  • the cleaner 180 may include a suction device that sucks in foreign objects, brushes 184 and 185 that perform brushing, a dust bin (not illustrated) that stores foreign objects collected by the suction device or the brushes, and/or a mop unit (not illustrated) that performs mopping.
  • the lower surface portion of the main body 110 may be provided with a suction port 180h through which air is sucked.
  • the suction device (not illustrated) that provides suction power so that air may be sucked through the suction port 180h, and the dust bin (not illustrated) that collects dust sucked together with the air through the suction port 180h may be provided in the main body 110.
  • An opening for insertion and removal of the dust bin may be formed in the case 111, and a dust bin cover 112 for opening and closing the opening may be rotatably provided with respect to the case 111.
  • a roll-shaped main brush 184 that has brushes exposed through the suction port 180h and an auxiliary brush 185 that is located in front of the lower surface portion of the main body 110 and has a plurality of blades radially extending may be provided. Dust is removed from the surface in the traveling area by the rotation of the brushes 184 and 185, and the dust separated from the surface is sucked through the suction port 180h and collected in the dust bin.
  • a battery 138 may supply power necessary for the overall operation of the robot cleaner 100 as well as the drive motor.
  • the robot cleaner 100 may perform traveling to return to the charging stand 200 for charging, and the robot cleaner 100 may detect the location of the charging stand 200 on its own during the return traveling.
  • the charging stand 200 may include a signal transmission unit (not illustrated) that transmits a predetermined return signal.
  • the return signal may be an ultrasonic signal or an infrared signal, but is not necessarily limited thereto.
  • the image detector 135 is provided on the upper surface portion of the main body 110 to obtain an image of the ceiling in the cleaning area, but the location and the shooting range of the image detector 135 are not necessarily limited thereto.
  • the image detector 135 may be provided to acquire an image in front of the main body 110.
  • the robot cleaner 100 may further include an operator (not illustrated) capable of inputting on/off or various commands.
  • the robot cleaner 100 includes a storage unit 150 that stores various data. Various data required for control of the robot cleaner 100 may be recorded in the storage unit 150.
  • the storage unit 150 may include a volatile or nonvolatile recording medium.
  • the recording medium stores data that may be read by a micro processor, and may include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
  • a map for a cleaning area may be stored in the storage unit 150.
  • the map may be input by an external terminal capable of exchanging information with the robot cleaner 100 through wired or wireless communication, or may be generated by self-learning of the robot cleaner 100.
  • external terminals include a remote control, a PDA, a laptop, a smartphone, a tablet, and the like that are equipped with an application for setting a map.
  • the traveling displacement measurer 165 may measure the traveling displacement based on the image acquired by the image detector 135.
  • the traveling displacement is a concept including a moving direction and a moving distance of the robot cleaner 100.
  • the traveling displacement measurer 165 may measure the traveling displacement by comparing successive pixels of a surface image that varies according to the continuous movement of the robot cleaner 100.
  • the traveling displacement measurer 165 may measure the traveling displacement of the robot cleaner 100 based on the operation of the traveler 160.
  • the controller 140 may measure the current or past moving speed, traveling distance, and the like of the robot cleaner 100 based on the rotation speed of the driving wheel 166, and may measure the current or past direction change process according to the rotation directions of each driving wheel 166(L) and 166(R).
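A minimal dead-reckoning sketch of how such wheel measurements accumulate into a pose (standard differential-drive odometry, not the patent's exact formulation). Wheel slip corrupts exactly these estimates, which is what the periodic image comparison described later is meant to catch:

```python
import math

def integrate_odometry(x, y, theta, d_left, d_right, wheel_base):
    """One dead-reckoning update: d_left/d_right are the distances rolled
    by the left and right driving wheels since the previous cycle."""
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base
    # Advance along the midpoint heading of the cycle.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```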
  • the traveling displacement measurer 165 may measure the traveling displacement using at least one of the distance detector 131 and the image detector 135.
  • the controller 140 may recognize the location of the robot cleaner 100 on the map based on the measured traveling displacement.
  • a transmitter 170 may transmit the information on the robot cleaner to another robot cleaner or a central server.
  • a receiver 190 may receive the information from another robot cleaner or the central server.
  • the information transmitted by the transmitter 170 or the information received by the receiver 190 may include configuration information of the robot cleaner.
  • the robot cleaner 100 includes the controller 140 that processes and determines various types of information.
  • the controller 140 may perform information processing for learning the cleaning area.
  • the controller 140 may perform information processing for recognizing the current location on the map.
  • the controller 140 may control the overall operation of the robot cleaner 100 by controlling various components (for example, traveling displacement measurer 165, distance detector 131, image detector 135, traveler 160, transmitter 170, receiver 190, and the like) constituting the robot cleaner 100.
  • the control method according to the present embodiment may be performed by the controller 140.
  • the present disclosure may be a control method of the robot cleaner 100 or may be the robot cleaner 100 including the controller 140 performing the control method.
  • the present disclosure may be a computer program including each step of the control method, or may be a recording medium on which a program for implementing the control method with a computer is recorded.
  • the "recording medium” means a computer-readable recording medium.
  • the present disclosure may be a mobile robot control system including both hardware and software.
  • the controller 140 of the robot cleaner 100 processes and determines various types of information such as mapping and/or recognizing the current location.
  • the controller 140 may be provided to map the cleaning area through the image and learning and recognize the current location on the map. That is, the controller 140 may perform a simultaneous localization and mapping (SLAM) function.
  • the controller 140 may control driving of the traveler 160.
  • the controller 140 may control the operation of the cleaner 180.
  • the robot cleaner 100 includes the storage unit 150 that stores various types of data.
  • the storage unit 150 records various types of information required for the control of the robot cleaner 100 and may include a volatile or nonvolatile recording medium.
  • the real cleaning area may correspond to the cleaning area on the map.
  • the cleaning area may be defined as the sum of all areas on a plane in which the robot cleaner 100 has traveling experience and all areas on a plane in which the robot cleaner 100 is currently traveling.
  • the controller 140 may determine a movement path of the robot cleaner 100 based on the operation of the traveler 160. For example, the controller 140 may identify the current or past moving speed, traveling distance, and the like of the robot cleaner 100 based on the rotation speed of the driving wheel 166, and may also identify the current or past direction change process according to the rotation directions of each driving wheel 166(L) and 166(R). Based on the driving information of the robot cleaner 100 thus identified, the location of the robot cleaner 100 on the map may be updated. In addition, the location of the robot cleaner 100 on the map may be updated using the image information.
  • the controller 140 controls the traveling of the robot cleaner 100 and controls the driving of the traveler 160 according to the set traveling mode.
  • as the traveling mode of the traveler 160, a zigzag mode, an edge mode, a spiral mode, a hybrid mode, or the like may be selectively set.
  • the zigzag mode is defined as a mode for cleaning while traveling in a zigzag path by being separated from a wall or an obstacle by a predetermined distance or more.
  • the edge mode is defined as a mode that cleans while sticking to a wall and traveling in a zigzag path.
  • the spiral mode is defined as a mode for spirally cleaning within a certain area around any one place.
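As a purely illustrative aside, the zigzag mode can be pictured as a boustrophedon waypoint generator over a rectangular area; the function, the lane-spacing parameter, and the omission of wall clearance and obstacles are all assumptions of this sketch:

```python
def zigzag_waypoints(width, height, lane_spacing):
    """Boustrophedon (zigzag-mode) waypoints over a rectangular cleaning
    area, alternating the sweep direction every lane."""
    points, y, left_to_right = [], 0.0, True
    while y <= height:
        row = [(0.0, y), (width, y)]
        points.extend(row if left_to_right else row[::-1])
        left_to_right = not left_to_right
        y += lane_spacing
    return points

# A 4 m x 1 m strip swept in 0.5 m lanes.
print(zigzag_waypoints(4.0, 1.0, 0.5))
```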
  • the controller 140 generates the map for the cleaning area. That is, the controller 140 may form the spatial map for the cleaning area from the locations recognized through prior cleaning and the image acquired at each location. The controller 140 may also update a previously generated map, classify the type of each cleaning area of the generated spatial map according to conditions, and match a cleaning method to the classified type. In addition, the controller 140 may perform cleaning efficiently by comparing the efficiency of the matched cleaning method with that of cleaning in the basic mode.
  • the controller 140 generates a basic map according to the detection signal of the sensing unit 130, specifically, the detection signal from the traveling displacement measurer 165, the distance detector 131, and the cliff detector 132.
  • a basic map may be a general grid map, and may be generated based on a direction in which the robot cleaner 100 rotates while traveling, a straight travel distance, a distance from a wall, and the like.
  • the controller 140 may extract the spatial information from the image data from the image detector 135 and add the extracted spatial information to the basic map to generate the spatial map.
  • the robot cleaner 100 may further include a map generator for forming the spatial map, or this processing may be performed by the controller 140 itself.
  • the map generator generates a basic map through the detection signal obtained through the prior cleaning, and generates the spatial map by adding the spatial information from the image data to the basic map.
  • the map generator may extract spatial information, which is detailed information on the space, from the continuously photographed image data and add it to the spatial map.
  • the map generator may extract the straight line from the continuous image data and extract a vanishing point.
  • an edge can be extracted by matching the straight line, and an angle calculation is possible.
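A hedged sketch of this line-extraction step using OpenCV's Canny edge detector and probabilistic Hough transform; the patent does not specify a library, and the thresholds are illustrative. Segments matched across successive frames can then be intersected to estimate a vanishing point and to calculate angles:

```python
import cv2
import numpy as np

def extract_line_segments(gray_frame):
    """Extract straight line segments from one grayscale camera frame."""
    edges = cv2.Canny(gray_frame, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=40, maxLineGap=5)
    if lines is None:
        return []
    return [tuple(seg) for seg in lines[:, 0]]  # (x1, y1, x2, y2) per segment
```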
  • the map generator may determine the size of the obstacle not only in width and length but also in height through algorithms such as a convolutional neural network technique, and may extract information on which object the obstacle is. In this way, the name and size of the obstacle may be defined, and the information on the obstacle may also be reflected in the spatial map to form a final spatial map.
  • a final spatial map is constituted by walls partitioning the space, and is shown so that each space formed by the walls may be distinguished from the other spaces.
  • based on the final spatial map, the map generator may divide the cleaning area into a plurality of sub cleaning areas so that each may be cleaned in a single traversal.
  • the formation of the final spatial map and the partition of the cleaning area may be performed by the map generator, but as described above, may be performed in a batch by the controller 140.
  • the controller 140 may partition the cleaning area according to the area of the mapped cleaning area to match the optimal cleaning method according to the shapes of each of the partitioned sub cleaning areas.
  • the controller 140 may recognize the current location using at least one of the traveling displacement measurer 165, the distance detector 131, and the image detector 135, and may recognize the current location on the map.
  • the controller 140 may determine the current state of the robot cleaner 100 by comparing the detection signals of the traveling displacement measurer 165, the distance detector 131, and the image detector 135 over a plurality of cycles.
  • the final map may be calculated by correcting the basic map or the spatial map.
  • An input unit 171 may receive on/off or various commands.
  • the input unit 171 may include a button, a key, a touch type display, or the like.
  • the input unit 171 may include a microphone for voice recognition.
  • An output unit 173 may notify a user of various types of information.
  • the output unit 173 may include a speaker and/or a display.
  • FIG. 7 is a flow chart illustrating a method of controlling a robot cleaner according to an embodiment of the present disclosure
  • FIG. 8 is a flowchart illustrating a method of determining a current state of the robot cleaner of FIG. 7
  • FIGS. 9A to 13C are diagrams each illustrating a camera image, a state diagram of the robot cleaner, and a location of the robot cleaner in a spatial map for explaining the flow charts of FIGS. 7 and 8.
  • the control method may be performed by the controller 140.
  • Each step of the flowchart diagrams of the control method and combinations of the flowchart diagrams may be performed by computer program instructions.
  • the instructions may be mounted on a general purpose computer, a special purpose computer, or the like, and the instructions generate means for performing the functions described in the flowchart step(s).
  • cleaning start information is received through the server 2 or the user terminal 3 (S100).
  • the robot cleaner 100 cleans the cleaning area from the current location (S110).
  • the controller 140 performs cleaning while driving the cleaning area, and accumulates each detection signal by detecting the cleaning area.
  • the traveling during the cleaning may be performed in an edge mode or a zigzag mode.
  • the robot cleaner 100 periodically detects the moving displacement and the surrounding environment through the traveling displacement measurer 165 (by wheel detection from the traveler 160) and through the distance detector 131 (by the ultrasonic sensor or the like), thereby acquiring the basic map as illustrated in FIG. 9C.
  • the basic map as illustrated in FIG. 9C may be created while classifying a travelable area and a non-travelable area from the distance detector 131 through the obstacle detection sensor.
  • the controller 140 may receive image data from the image detector 135 in accordance with the detection cycle of the traveling displacement measurer 165 and the distance detector 131.
  • Such image data may be a still image photographing the front of the robot cleaner 100 as illustrated in FIG. 9A, but is not limited thereto.
  • as illustrated in FIG. 9A, it may be an image of a specific direction from the robot cleaner 100, and a distance from a specific obstacle or the like may be calculated by periodically photographing an image in the same direction.
  • the image data photographed in the previous cycle captures a front obstacle ob through the image detector 135, and the state of the robot cleaner 100 at this time is illustrated in FIG. 9B.
  • the location of the robot cleaner 100 in the basic map is 10 on the x-axis and 100 on the y-axis as illustrated in FIG. 9C, which also represents the path along which the robot cleaner 100 travels.
  • the location of the robot cleaner 100 may be recorded in the basic map, and such a basic map may be recorded in the storage unit 150 to represent a spatial map of the overall cleaning area.
  • the robot cleaner 100 performs the cleaning while moving along the driving direction, and as in the previous cycle, the detection signal is obtained from each functional block.
  • the controller 140 of the robot cleaner 100 may perform control to escape from the abnormal state accordingly (S150).
  • the traveling direction may be changed to perform traveling.
  • the user terminal 3 may be alerted to induce the location change.
  • the detection signals in the current cycle are obtained in the state in which the detection signals from each functional block in the previous cycle are stored, and the image data photographing the front of the robot cleaner 100 is stored (S131).
  • the controller 140 obtains the information on the moving distance and direction change through the wheel sensor from the traveling displacement measurer 165 in the current cycle in the same manner as in the previous cycle, obtains the detection signal on the moving distance and the like from the front obstacle ob from the distance detector 131, and obtains the front image data from the image detector 135.
  • based on the detection signals and image data of the various functional blocks obtained as described above, the current state of the robot cleaner 100 may be determined by comparing the values for the previous cycle with the values for the current cycle.
  • when the image data of the current cycle is obtained as illustrated in FIG. 10A, it is possible to compare whether there is a difference between FIG. 9A, the image data of the previous cycle, and FIG. 10A, the image data of the current cycle.
  • the comparison determination may be performed by comparing pixel samples and the like, and may be easily calculated through distance comparison between feature points.
  • the image data may be compared for a plurality of cycles.
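One way to realize the pixel-sample and feature-point comparison described above is ORB feature matching with OpenCV; the library choice, the median statistic, and the function name are assumptions of this sketch, not the patent's stated method:

```python
import cv2
import numpy as np

def median_feature_shift(prev_gray, curr_gray):
    """Median displacement of matched ORB feature points between two
    periodic frames; a near-zero shift over several cycles means the scene
    in front of the cleaner has not actually changed."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return 0.0
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    if not matches:
        return 0.0
    shifts = [np.hypot(kp2[m.trainIdx].pt[0] - kp1[m.queryIdx].pt[0],
                       kp2[m.trainIdx].pt[1] - kp1[m.queryIdx].pt[1])
              for m in matches]
    return float(np.median(shifts))
```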
  • Coordinates of the robot cleaner 100 in FIG. 9C are (10, 100), coordinates in FIG. 10C are (10, 120), and coordinates of the robot cleaner 100 in FIG. 11C are (10, 140), which may be regarded as traveling along a y axis.
  • the controller 140 may secure data of the previous cycle and search for when the state in which there is no change in the image data starts.
  • the controller 140 may perform the correction on the map by the difference between the coordinates in the cycle of FIG. 9C and the coordinates of FIG. 11C (S135).
  • the basic map correction is performed by deleting the moving distance of 40 in the y-axis direction from the basic map, as in the sketch below.
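The coordinate example above amounts to truncating the recorded trajectory back to the last cycle in which the image data actually changed; a minimal sketch, with the pose-list layout being an assumption:

```python
def rollback_trajectory(poses, last_change_idx):
    """Delete the odometry poses accumulated after the last cycle with a
    real image change; the pose at that cycle becomes the current location."""
    corrected = poses[:last_change_idx + 1]
    return corrected, corrected[-1]

# Odometry claimed (10, 100) -> (10, 120) -> (10, 140), but the images stopped
# changing after the first pose, so the spurious 40 units on the y axis go.
print(rollback_trajectory([(10, 100), (10, 120), (10, 140)], 0))
# ([(10, 100)], (10, 100))
```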
  • such calculation and data correction may be performed through a global kidnap recovery (GKR) algorithm, or through a simple program.
  • the controller 140 performs an escape motion from the restrained state as illustrated in FIG. 12B (S136).
  • the controller 140 may control the traveler 160 to perform a predetermined restraint escape motion, and the restraint escape motion is set to rotation, rapid backward movement, or the like according to the angle of the surrounding obstacle ob; a hypothetical selection policy is sketched below.
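A hypothetical policy illustrating how such a predetermined motion might be selected from the obstacle bearing; the angle band, the reverse distance, and the action names are invented for this sketch:

```python
def restraint_escape_motion(obstacle_angle_deg):
    """Choose a predetermined escape maneuver from the bearing of the
    surrounding obstacle (0 degrees = dead ahead)."""
    if -30.0 <= obstacle_angle_deg <= 30.0:   # obstacle roughly head-on
        return ("rapid_reverse", 0.3)          # back off 0.3 m
    turn_deg = -90.0 if obstacle_angle_deg > 0 else 90.0
    return ("rotate", turn_deg)                # spin away from the obstacle
```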
  • the image data of the next cycle received from the image detector 135 may vary as illustrated in FIG. 12A.
  • the coordinates of the robot cleaner 100 of the basic map may continue to maintain the corrected current position (10, 100), and only the direction may be changed.
  • traveling in a direction opposite to the previous traveling direction may be performed, and the mapping is performed again from the point (10, 100) where the last mapping was performed.
  • the traveling direction changes reversely as illustrated in FIG. 13B, and the image data as illustrated in FIG. 13A may be obtained.
  • the robot cleaner 100 may send an alarm to the user terminal 3 as illustrated in FIG. 14.
  • FIG. 14 illustrates a display state of the user terminal according to FIG. 8, and FIGS. 15A and 15B are diagrams illustrating the spatial map correction according to FIGS. 7 and 8.
  • the user terminal 3 in the smart home system including the robot cleaner 100 is installed with an application for controlling the robot cleaner 100.
  • Such an operation may be set as the rotation, the rapid backward movement, or the like, as described above.
  • Such an alarm may be provided as a sentence such as "Help escape", and may be provided visually and/or audibly.
  • the robot cleaner 100 may provide information on its current location by providing the spatial map, that is, the corrected basic map.
  • the spatial map may be the map as illustrated in FIG. 15B, in which an error area A, caused by wheel slip due to the restraint of the robot cleaner as illustrated in FIG. 15A, is removed and corrected.
  • the user may move to the corresponding location and directly release the robot cleaner 100 from the restrained state.
  • the robot cleaner 100 may perform an escape on its own according to the restrained state, or may be released from the restrained state through a user alarm.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)

Abstract

Disclosed is a robot cleaner using artificial intelligence. The robot cleaner includes: a traveler that moves a main body; a cleaner that performs a cleaning function; a traveling displacement measurer that detects traveling displacement; an image detector that obtains image data by periodically photographing a surrounding environment; and a controller that performs cleaning on a cleaning area, generates a map for the cleaning area based on information and the image data detected through the traveling displacement measurer and the image detector, and corrects and provides the map by reading a change in the image data to determine whether the robot cleaner travels abnormally. Therefore, it is possible to determine the current status information of the robot cleaner by periodically comparing images through the 3D sensor while providing a spatial map similar to the actual indoor space. In addition, when creating the spatial map, it is possible to provide an accurate spatial map by compensating for the occurrence of wheel slip due to the restraint of the robot cleaner. In addition, it is possible to induce escape from the current restrained state by performing a restraint escape motion or by alerting the user to the current state of the robot cleaner.

Description

ROBOT CLEANER USING ARTIFICIAL INTELLIGENCE AND CONTROL METHOD THEREOF
[1] The present disclosure relates to a robot cleaner and a control method of a robot cleaner, and more particularly, to sensing by a robot cleaner using artificial intelligence and to a traveling technology based on that sensing.
[2] Robots have been developed for industrial use and have been responsible for part of factory automation.
[3] In recent years, as the field of application of robots has been further expanded, medical robots, aerospace robots, etc. are being developed, and home robots that may be used in general homes are also being made. Among these robots, robots that can travel by their own force are called mobile robots. A typical example of a mobile robot used at home is a robot cleaner.
[4] Various technologies are known for detecting an environment around a robot cleaner and a user through various sensors provided in the robot cleaner. In addition, technologies are known in which the robot cleaner learns and maps a cleaning area by itself, and identifies a current location on a map. A robot cleaner that cleans a cleaning area while driving in a predetermined manner is known.
[5] In addition, the prior art (Korean Patent Laid-Open Publication No. 10-2017-0003764) discloses a method for processing a map (grid map) for a cleaning area into a form that is easy for a user to check (outline change, etc.), and cleaning a cleaning area according to a cleaning command input through the map.
[6] Meanwhile, the prior art (Korean Patent Laid-Open Publication No. 10-2012-0065153) discloses a system for detecting an obstacle during traveling by using a camera and a laser as a 3D sensor.
[7] However, such an obstacle detection system enables precise control to avoid upper/lower obstacles, walls, cliffs, and the like during traveling, but relies only on fragmentary obstacle information, so it cannot determine the current state of the robot cleaner.
[8] [Related Art Document]
[9] [Patent Document]
[10] (Patent Document 1) Korean Patent Laid-Open Publication No. 10-2017-0003764 (published on July 18, 2018)
[11] (Patent Document 2) Korean Patent Laid-Open Publication No. 10-2012-0065153 (published on June 18, 2012)
[12] The present disclosure provides a cleaning robot capable of determining status information of a current robot cleaner by periodically comparing images through a three-dimensional sensor while providing a spatial map similar to an actual indoor space.
[13] The present disclosure also provides a cleaning robot capable of providing an accurate spatial map by compensating for an occurrence of a wheel slip according to a restraint of a robot cleaner when creating a spatial map.
[14] The present disclosure also provides a cleaning robot capable of alarming a user about a current state of the robot cleaner, or performing a motion to escape the restraint, by not only measuring traveling displacement but also periodically determining through image sensing whether traveling is actually performed.
[15] A robot cleaner includes: a traveler that moves a main body; a cleaner that performs a cleaning function; a traveling displacement measurer that detects traveling displacement; an image detector that obtains image data by periodically photographing a surrounding environment; and a controller that performs cleaning on a cleaning area, generates a map for the cleaning area based on information and the image data detected through the traveling displacement measurer and the image detector, and corrects and provides the map by reading a change in the image data to determine whether the robot cleaner travels abnormally.
[16] The map may include physical shape information on the cleaning area and information on a current location of the robot cleaner.
[17] When there is no difference between the image data periodically obtained, the controller may determine that the robot cleaner is in an abnormal state.
[18] The controller may determine that the robot cleaner is in the abnormal state when there is no difference between the image data periodically obtained, and a detection signal from the traveling displacement measurer indicates that the robot cleaner is traveling.
[19] The controller may compare the image data and perform a map correction to remove a traveling distance of the robot cleaner from a point in time when there is no change in the image data.
[20] When the robot cleaner is in the abnormal state, the controller may induce escape by performing a restraint escape motion.
[21] The controller may send an alarm to a user terminal when the restraint escape of the robot cleaner does not proceed.
[22] The user terminal may have an application installed to control the robot cleaner, the map corrected through the application may be provided, and the alarm for the restraint escape may be transmitted from the robot cleaner.
[23] The robot cleaner may collect the information and the image data on the cleaning area while performing cleaning in an edge mode or a zigzag mode.
[24] A control method of a robot cleaner includes: obtaining a detection signal for detecting traveling displacement by performing cleaning while traveling in a cleaning area and obtaining image data by photographing a surrounding environment; generating a map for the cleaning area based on the detection signal and the image data; and correcting the map by reading a change in the image data to determine whether the robot cleaner travels abnormally.
[25] The map may include physical shape information on the cleaning area and information on a current location of the robot cleaner.
[26] In the correction of the map, when there is no difference between the image data periodically obtained, the controller may determine that the robot cleaner is in an abnormal state.
[27] In the correcting of the map, the controller may determine that the robot cleaner is in the abnormal state when there is no difference between the image data periodically obtained, and the detection signal from the traveling displacement measurer indicates that the robot cleaner is traveling.
[28] In the correcting of the map, the controller may compare the image data and perform a map correction to remove a traveling distance of the robot cleaner from a point in time when there is no change in the image data.
[29] The control method of the robot cleaner may further include inducing escape by a restraint escape motion when the robot cleaner is in an abnormal state.
[30] In the inducing of the escape, an alarm may be issued to a user terminal when the restraint escape of the robot cleaner does not proceed.
[31] The user terminal may have an application installed to control the robot cleaner, the map corrected through the application may be provided, and the alarm for the restraint escape may be transmitted from the robot cleaner.
[32] In the performing of the cleaning, the detection signal and the image data on the cleaning area may be collected while the cleaning is performed in an edge mode or a zigzag mode.
[33] A robot cleaner system includes: a robot cleaner that performs cleaning on a cleaning area while moving a main body, the robot cleaner including a traveling displacement measurer that periodically detects traveling displacement; an image detector that obtains image data by periodically photographing a surrounding environment; and a controller that performs cleaning on a cleaning area, generates a map for the cleaning area based on information and the image data detected through the traveling displacement measurer and the image detector, and corrects and provides the map by reading a change in the image data to determine whether the robot cleaner travels abnormally; and a user terminal that has an application installed to control the robot cleaner to clean and travel and receives an alarm of whether the robot cleaner is in an abnormal state by receiving the map from the application.
[34] The controller may determine that the robot cleaner is in the abnormal state when there is no difference between the image data periodically obtained, and a detection signal from the traveling displacement measurer indicates that the robot cleaner is traveling.
[35] The user terminal may receive the map for the cleaning area from which, by comparing the image data, the traveling distance of the robot cleaner from the point in time when there is no change in the image data has been removed, and may receive information on a current location of the robot cleaner on the map.
[36] By the solution means, the present disclosure may determine the status information of the current robot cleaner by periodically comparing images through the 3D sensor while providing the spatial map similar to the actual indoor space.
[37] In addition, when creating the spatial map, it is possible to provide the accurate spatial map by compensating for the occurrence of the wheel slip due to the restraint of the robot cleaner. In addition, it is possible to induce to escape the current restraint state or to perform the restraint escape motion by alarming the user of the current state of the robot cleaner.
[38] The accompanying drawings, which are included to provide a further understanding of the present disclosure and are incorporated on and constitute a part of this specification, illustrate embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[39] FIG. 1 is a configuration diagram of a smart home system including a robot cleaner according to an embodiment of the present disclosure.
[40] FIG. 2 is a perspective view illustrating the robot cleaner and a charging stand for charging the robot cleaner according to the embodiment of the present disclosure.
[41] FIG. 3 is an elevation view of the robot cleaner of FIG. 2 as viewed from the top.
[42] FIG. 4 is an elevation view of the robot cleaner of FIG. 2 as viewed from the front.
[43] FIG. 5 is an elevation view of the robot cleaner of FIG. 2 as viewed from the bottom.
[44] FIG. 6 is a block diagram illustrating a control relationship between main components of the robot cleaner of FIG. 2.
[45] FIG. 7 illustrates a control method of a robot cleaner according to an embodiment of the present disclosure.
[46] FIG. 8 is a flowchart illustrating a method of determining a current state of a robot cleaner of FIG. 7.
[47] FIGS. 9A to 13C are diagrams each illustrating a camera image, a state diagram of the robot cleaner, and a location of the robot cleaner in a spatial map for explaining the flow charts of FIGS. 7 and 8.
[48] FIG. 14 is a diagram illustrating a display state of a user terminal according to FIG. 8.
[49] FIGS. 15A and 15B are diagrams illustrating a correction of the spatial map according to FIGS. 7 and 8.
[50] In comparisons expressed linguistically or mathematically throughout this description, "smaller than or equal to" and "smaller than," and likewise "greater than or equal to" and "greater than," can be easily substituted for each other from the standpoint of those skilled in the art, and it goes without saying that even if they are substituted in realizing the present disclosure, there is no problem in exerting the effect.
[51] Expressions referring to directions such as "front (F)/rear (R)/left (Le)/right (Ri)/up (U)/down (D)" mentioned below are defined as indicated in the drawings. They are only intended to explain the present disclosure clearly, and it goes without saying that each direction may be defined differently depending on where the criterion is placed.
[52] For example, the front may mean a main traveling direction of a robot cleaner or a main traveling direction of pattern traveling of the robot cleaner. Here, the main traveling direction may mean a vector sum value of directions traveling within a predetermined time.
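For illustration only, the vector sum just described may be sketched in Python as follows; the function name and heading samples are assumptions, not part of the disclosure.

    import math

    def main_traveling_direction(headings_rad):
        # Vector sum of unit heading vectors sampled within the
        # predetermined time window; the resultant angle is taken
        # as the main traveling direction.
        x = sum(math.cos(h) for h in headings_rad)
        y = sum(math.sin(h) for h in headings_rad)
        return math.atan2(y, x)

    # Hypothetical heading samples (radians) collected over the window
    print(main_traveling_direction([0.00, 0.10, -0.05, 0.02]))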
[53] The use of terms such as "first" and "second" in front of components mentioned below is only to avoid confusion of components referred to, and is irrelevant to the order, importance, master-slave relationship, or the like between components. For example, the invention including only the second component without the first component may be implemented.
[54] In the drawings, the thickness or size of each component is exaggerated, omitted, or schematically illustrated for convenience and clarity of description. In addition, the size and area of each component do not fully reflect the actual size or area.
[55] In addition, angles and directions mentioned in the process of describing the structure of the present disclosure are based on those described in the drawings. In the description of the structure in the specification, when the reference point and the positional relationship with respect to the angle are not clearly mentioned, reference will be made to the related drawings.
[56] FIG. 1 is a configuration diagram of a robot system according to an embodiment of the present disclosure.
[57] Referring to FIG. 1, the robot system according to the embodiment of the present disclosure may include one or more robot cleaners 100 to provide a service at a designated place such as a home. For example, the robot system may include the robot cleaner 100 that provides a cleaning service for a designated place in a home or the like. In particular, the robot cleaner 100 may provide a dry, wet, or dry/wet cleaning service according to functional blocks included.
[58] Preferably, the robot system according to the embodiment of the present disclosure includes a plurality of artificial intelligence robot cleaners 100 and a server 2 that may manage and control a plurality of artificial intelligence robot cleaners 100.
[59] The server 2 may remotely monitor and control a state of the plurality of robot cleaners 100, and the robot system may provide more effective services by using the plurality of robot cleaners 100.
[60] The plurality of robot cleaners 100 and the server 2 include communication means (not shown) that support one or more communication standards, and may communicate with each other. In addition, the plurality of robot cleaners 100 and the server 2 may communicate with a PC, a mobile terminal, and other external servers 2.
[61] For example, the plurality of robot cleaners 100 and the server 2 may implement wireless communication with wireless communication technologies such as IEEE 802.11 WLAN, IEEE 802.15 WPAN, UWB, Wi-Fi, Zigbee, Z-wave, and Bluetooth. The wireless communication technology used by the robot cleaner 100 may vary depending on the communication method of the other device or server with which it wants to communicate.
[62] In particular, the plurality of robot cleaners 100 may implement wireless communication with other robots 100 and/or the server 2 through a 5G network. When the robot cleaner 100 performs wireless communication through the 5G network, real-time response and real-time control are possible.
[63] A user may check information on the robots 100 in the robot system through a user terminal 3 such as a PC or a mobile terminal.
[64] The server 2 may be implemented as a cloud server 2, and the cloud server 2 may be linked to the robot cleaner 100 to monitor and control the robot cleaner 100 and to provide various solutions and content remotely.
[65] The server 2 may store and manage information received from the robot cleaner 100 and other devices. The server 2 may be a server 2 that is provided by a manufacturer of the robot cleaner 100 or a company entrusted with services from the manufacturer. The server 2 may be a control server 2 that manages and controls the robot cleaner 100.
[66] The server 2 may identically control the robot cleaner 100 in a batch, or control the robot cleaner 100 individually. Meanwhile, the server 2 may be configured by distributing information and functions to a plurality of servers, or may be configured as a single integrated server.
[67] The robot cleaner 100 and the server 2 include communication means (not illustrated) that support one or more communication standards, and may communicate with each other.
[68] The robot cleaner 100 may transmit data related to space, object, and usage related data to the server 2.
[69] Here, the space and object related data may be recognition related data for a space or an object recognized by the robot cleaner 100, or image data for a space and an object acquired by an image acquirer.
[70] According to the embodiment, the robot cleaner 100 and the server 2 may include artificial neural networks (ANN) in the form of software or hardware that are learned to recognize at least one of attributes such as a user, a voice, space attributes, and object attributes such as obstacles.
[71] According to the embodiment of the present disclosure, the robot cleaner 100 and the server 2 may include a deep neural network (DNN), such as a convolutional neural network (CNN), a recurrent neural network (RNN), and a deep belief network (DBN), that is learned by deep learning. For example, a structure of the deep neural network (DNN) such as a convolutional neural network (CNN) may be mounted on a controller 140 of the robot cleaner 100.
[72] The server 2 may learn the deep neural network (DNN) based on data received from the robot cleaner 100, data input by a user, and the like, and then transmit the updated structure data of the deep neural network (DNN) to the robot cleaner 100. Accordingly, the structure of the deep neural network (DNN) of the artificial intelligence included in the robot cleaner 100 may be updated.
[73] In addition, the usage related data is data obtained according to the use of the robot cleaner 100, and may correspond to usage history data, detection signals obtained by the sensor unit, and the like.
[74] The learned structure of the deep neural network (DNN) may receive input data for recognition, recognize attributes of people, objects, and spaces included in the input data, and output the result.
[75] In addition, the learned structure of the deep neural network (DNN) may receive input data for recognition, and analyze and learn the usage related data of the robot cleaner 100 to recognize usage patterns and usage environment.
[76] Meanwhile, the space, object, and usage related data may be transmitted to the server 2 through the communication unit.
[77] The server 2 may learn the deep neural network (DNN) based on the received data, and then transmit the updated structure data of the deep neural network (DNN) to the robot cleaner 100 using the artificial intelligence for updating.
[78] Accordingly, the robot cleaner 100 becomes smarter, and it is possible to provide a user experience (UX) that evolves as it is used.
[79] On the other hand, the server 2 may provide the information on the control and current state of the robot cleaner 100 to the user terminal, and may generate and distribute an application for controlling the robot cleaner 100.
[80] Such an application may be an application for a PC applied as the user terminal 3 or may be an application for a smartphone.
[81] For example, such an application may be an application for controlling smart home appliances such as the SmartThinQ application, an application capable of simultaneously controlling and supervising various electronic products of the applicant.
[82] The robot cleaner 100 includes a main body 110. Hereinafter, in defining each part of the main body 110, a portion facing a ceiling in a traveling area is defined as an upper surface portion (see FIG. 3), a portion facing a bottom in the traveling area is defined as a lower surface portion (see FIG. 5), and a portion facing a traveling direction among portions forming a circumference of the main body 110 between the upper surface portion and the lower surface portion is defined as a front surface portion (see FIG. 4). In addition, a portion of the main body 110 facing a direction opposite to the front surface portion may be defined as a rear surface portion. The main body 110 may include a case 111 forming a space in which various components constituting the robot cleaner 100 are accommodated.
[83] The robot cleaner 100 includes a sensing unit 130 that detects surrounding situations. The sensing unit 130 may detect information outside the robot cleaner 100. The sensing unit 130 detects users around the robot cleaner 100. The sensing unit 130 may detect objects around the robot cleaner 100.
[84] The sensing unit 130 may detect information on a cleaning area. The sensing unit 130 may detect obstacles such as walls, furniture, and cliffs on a traveling surface. The sensing unit 130 may detect information on the ceiling. The sensing unit 130 may detect objects placed on the traveling surface and/or an external upper object. The external upper object may include a ceiling or a lower surface of furniture disposed in an upper direction of the robot cleaner 100. Through the information detected by the sensing unit 130, the robot cleaner 100 may map the cleaning area.
[85] The sensing unit 130 may detect the information on the users around the robot cleaner 100. The sensing unit 130 may detect the location information of the user. The location information may include direction information on the robot cleaner 100. The location information may include distance information between the robot cleaner 100 and the user. The sensing unit 130 may detect the direction of the user with respect to the robot cleaner 100. The sensing unit 130 may detect a distance between the user and the robot cleaner 100.
[86] The location information may be immediately acquired by the detection of the sensing unit 130 or may be processed and acquired by the controller 140.
[87] The sensing unit 130 may include an image detector 135 that detects the surrounding images. The image detector 135 may detect an image in a specific direction for the robot cleaner 100. For example, the image detector 135 may detect an image in front of the robot cleaner 100. The image detector 135 photographs the traveling area and may include a digital camera. The digital camera may include at least one optical lens, an image sensor (for example, CMOS image sensor) configured to include a plurality of photodiodes (for example, pixels) formed by light passing through the optical lens, and a digital signal processor (DSP) constituting images based on signals output from the photodiodes. The digital signal processor may generate not only still images but also moving images that are composed of frames composed of the still images.
[88] The sensing unit 130 may include a distance detector 131 that detects a distance to the surrounding wall. The distance between the robot cleaner 100 and the surrounding wall may be sensed by the distance detector 131. The distance detector 131 detects a distance to a user in a specific direction of the robot cleaner 100. The distance detector 131 may include a camera, an ultrasonic sensor, an infrared (IR) sensor, or the like.
[89] The distance detector 131 may be disposed on the front surface portion of the main body 110 or may be disposed on a side surface portion thereof.
[90] The distance detector 131 may detect surrounding obstacles. A plurality of distance detectors 131 may be provided.
[91] The sensing unit 130 may include a cliff detector 132 that detects whether cliffs exist on a surface in the traveling area. A plurality of cliff detectors 132 may be provided.
[92] The sensing unit 130 may further include a lower image sensor 137 for obtaining an image of a surface.
[93] The robot cleaner 100 includes a traveler 160 that moves the main body 110. The traveler 160 moves the main body 110 with respect to the surface. The traveler 160 may include at least one driving wheel 166 that moves the main body 110. The traveler 160 may include a drive motor. The driving wheel 166 may be provided on left and right sides of the main body 110, respectively, and hereinafter, is referred to as a left wheel 166(L) and a right wheel 166(R), respectively.
[94] The left wheel 166(L) and the right wheel 166(R) may be driven by a single drive motor, but if necessary, may be provided with a left wheel drive motor for driving the left wheel 166(L) and a right wheel drive motor for driving the right wheel 166(R), respectively. The traveling direction of the main body 110 can be changed to the left or right by making a difference in a rotation speed of the left wheel 166(L) and the right wheel 166(R).
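This direction change through a wheel-speed difference follows standard differential-drive kinematics. A minimal Python sketch is given below; the wheel base value is an assumed example, not a dimension of the disclosure.

    def body_twist(v_left, v_right, wheel_base=0.25):
        # Differential-drive kinematics: equal speeds move the main
        # body 110 straight; a speed difference between the left wheel
        # 166(L) and the right wheel 166(R) yields a yaw rate.
        v = (v_right + v_left) / 2.0             # forward speed (m/s)
        omega = (v_right - v_left) / wheel_base  # yaw rate (rad/s)
        return v, omega

    # Example: right wheel faster than left -> the body turns left
    print(body_twist(0.20, 0.30))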
[95] The robot cleaner 100 includes a cleaner 180 that performs a cleaning function.
[96] The robot cleaner 100 may move about the cleaning area and may clean the surface by the cleaner 180. The cleaner 180 may include a suction device that sucks in foreign objects, brushes 184 and 185 that perform sweeping, a dust bin (not illustrated) that stores the foreign objects collected by the suction device or the brushes, and/or a mop unit (not illustrated) that performs mopping.
[97] The lower surface portion of the main body 110 may be provided with a suction port 180h through which air is sucked. The suction device (not illustrated) that provides suction power so that air may be sucked through the suction port 180h, and the dust bin (not illustrated) that collects dust sucked together with the air through the suction port 180h may be provided in the main body 110.
[98] An opening for insertion and removal of the dust bin may be formed in the case 111, and a dust bin cover 112 for opening and closing the opening may be rotatably provided with respect to the case 111.
[99] A roll-shaped main brush 184 that has brushes exposed through the suction port 180h and an auxiliary brush 185 that is located in front of the lower surface portion of the main body 110 and has a plurality of blades radially extending may be provided. Dust is removed from the surface in the traveling area by the rotation of the brushes 184 and 185, and the dust separated from the surface is sucked through the suction port 180h and collected in the dust bin.
[100] A battery 138 may supply power necessary for the overall operation of the robot cleaner 100 as well as the drive motor. When the battery 138 is discharged, the robot cleaner 100 may perform traveling to return to the charging stand 200 for charging, and the robot cleaner 100 may detect the location of the charging stand 200 on its own during the return traveling.
[101] The charging stand 200 may include a signal transmission unit (not illustrated) that transmits a predetermined return signal. The return signal may be an ultrasonic signal or an infrared signal, but is not necessarily limited thereto.
[102] On the other hand, the image detector 135 is provided on the upper surface portion of the main body 110 to obtain an image of the ceiling in the cleaning area, but the location and the shooting range of the image detector 135 are not necessarily limited thereto. For example, the image detector 135 may be provided to acquire an image in front of the main body 110.
[103] In addition, the robot cleaner 100 may further include an operator (not illustrated) capable of inputting on/off or various commands.
[104] Referring to FIG. 6, the robot cleaner 100 includes a storage unit 150 that stores various data. Various data required for control of the robot cleaner 100 may be recorded in the storage unit 150. The storage unit 150 may include a volatile or nonvolatile recording medium. The recording medium stores data that may be read by a microprocessor, and may include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
[105] A map for a cleaning area may be stored in the storage unit 150. The map may be input by an external terminal capable of exchanging information with the robot cleaner 100 through wired or wireless communication, or may be generated by self-learning of the robot cleaner 100. In the former case, examples of external terminals include a remote control, a PDA, a laptop, a smartphone, a tablet, and the like that are equipped with an application for setting a map.
[106] Locations of a plurality of nodes corresponding one-to-one to a plurality of points in the cleaning area may be displayed on the map. Each area within the cleaning area may be displayed on the map. In addition, the current location of the robot cleaner 100 may be displayed on the map. The current location of the robot cleaner 100 on the map may be updated during the traveling.
[107] The traveling displacement measurer 165 may measure the traveling displacement based on the image acquired by the image detector 135. The traveling displacement is a concept including a moving direction and a moving distance of the robot cleaner 100. For example, the traveling displacement measurer 165 may measure the traveling displacement by comparing successive pixels of a surface image that varies according to the continuous movement of the robot cleaner 100.
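One conventional way to compare successive pixels of the surface image is phase correlation; the Python sketch below is an assumed illustration of this comparison, not the claimed implementation.

    import numpy as np

    def estimate_shift(prev_frame, curr_frame):
        # Phase correlation between two successive grayscale surface
        # images; the peak of the correlation surface gives the
        # (dy, dx) pixel shift between detection cycles.
        f1 = np.fft.fft2(prev_frame)
        f2 = np.fft.fft2(curr_frame)
        cross = f1 * np.conj(f2)
        corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-9)).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # Convert wrap-around peak positions into signed offsets
        h, w = prev_frame.shape
        if dy > h // 2:
            dy -= h
        if dx > w // 2:
            dx -= w
        return dy, dx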
[108] In addition, the traveling displacement measurer 165 may measure the traveling displacement of the robot cleaner 100 based on the operation of the traveler 160. For example, the controller 140 may measure the current or past moving speed, traveling distance, and the like of the robot cleaner 100 based on the rotation speed of the driving wheel 166, and may measure the current or past direction change process according to the rotation directions of each driving wheel 166(L) and 166(R).
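Dead reckoning from the wheel rotation speeds may be sketched as follows, reusing the body twist from the sketch after paragraph [94]; dt stands for one detection cycle, and the whole function is illustrative only.

    import math

    def integrate_pose(x, y, theta, v, omega, dt):
        # Integrate the body twist measured from the driving wheels
        # over one detection cycle to update the pose on the map.
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
        return x, y, theta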
[109] The traveling displacement measurer 165 may measure the traveling displacement using at least one of the distance detector 131 and the image detector 135.
[110] The controller 140 may recognize the location of the robot cleaner 100 on the map based on the measured traveling displacement.
[111] A transmitter 170 may transmit the information on the robot cleaner to another robot cleaner or a central server. A receiver 190 may receive the information from another robot cleaner or the central server. The information transmitted by the transmitter 170 or the information received by the receiver 190 may include configuration information of the robot cleaner.
[112] The robot cleaner 100 includes the controller 140 that processes and determines various types of information. The controller 140 may perform information processing for learning the cleaning area. The controller 140 may perform information processing for recognizing the current location on the map. The controller 140 may control the overall operation of the robot cleaner 100 by controlling various components (for example, traveling displacement measurer 165, distance detector 131, image detector 135, traveler 160, transmitter 170, receiver 190, and the like) constituting the robot cleaner 100.
[113] The control method according to the present embodiment may be performed by the controller 140. The present disclosure may be a control method of the robot cleaner 100 or may be the robot cleaner 100 including the controller 140 performing the control method. The present disclosure may be a computer program including each step of the control method, or may be a recording medium on which a program for implementing the control method with a computer is recorded. The "recording medium" means a computer-readable recording medium. The present disclosure may be a mobile robot control system including both hardware and software.
[114] The controller 140 of the robot cleaner 100 processes and determines various types of information such as mapping and/or recognizing the current location. The controller 140 may be provided to map the cleaning area through the image and learning and recognize the current location on the map. That is, the controller 140 may perform a simultaneous localization and mapping (SLAM) function.
[115] The controller 140 may control driving of the traveler 160. The controller 140 may also control the operation of the cleaner 180.
[116] The robot cleaner 100 includes the storage unit 150 that stores various types of data. The storage unit 150 records various types of information required for the control of the robot cleaner 100 and may include a volatile or nonvolatile recording medium.
[117] The real cleaning area may correspond to the cleaning area on the map. The cleaning area may be defined as the sum of all areas on a plane in which the robot cleaner 100 has traveling experience and all areas on a plane in which the robot cleaner 100 is currently traveling.
[118] The controller 140 may determine a movement path of the robot cleaner 100 based on the operation of the traveler 160. For example, the controller 140 may identify the current or past moving speed, traveling distance, and the like of the robot cleaner 100 based on the rotation speed of the driving wheel 166, and may also identify the current or past direction change process according to the rotation directions of each driving wheel 166(L) and 166(R). Based on the driving information of the robot cleaner 100 thus identified, the location of the robot cleaner 100 on the map may be updated. In addition, the location of the robot cleaner 100 on the map may be updated using the image information.
[119] Specifically, the controller 140 controls the traveling of the robot cleaner 100 and controls the driving of the traveler 160 according to the set traveling mode. As the traveling mode of the traveler 160, a zigzag mode, an edge mode, a spiral mode, a hybrid mode, or the like may be selectively set.
[120] The zigzag mode is defined as a mode for cleaning while traveling in a zigzag path by being separated from a wall or an obstacle by a predetermined distance or more. The edge mode is defined as a mode that cleans while traveling along a wall, sticking close to it. The spiral mode is defined as a mode for spirally cleaning within a certain area around any one place.
[121] The controller 140 generates the map for the cleaning area. That is, the controller 140 may form the spatial map for the cleaning area through a location recognized through prior cleaning and an image acquired at each location. The controller 140 may also update a previously generated map, classify a type of a cleaning area of a spatial map generated according to conditions, and match a cleaning method according to the classified type. In addition, the controller 140 may perform the cleaning of the robot cleaner 100 in a highly efficient manner by calculating the efficiency of the matched cleaning method relative to cleaning in the basic mode.
[122] The controller 140 generates a basic map according to the detection signal of the sensing unit 130, specifically, the detection signal from the traveling displacement measurer 165, the distance detector 131, and the cliff detector 132. Such a basic map may be a general grid map, and may be generated based on a direction in which the robot cleaner 100 rotates while traveling, a straight travel distance, a distance from a wall, and the like.
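Such a grid map may be sketched as a small occupancy grid, purely as an illustration; the cell size and array dimensions below are hypothetical values, not those of the disclosure.

    import numpy as np

    CELL = 0.05  # assumed cell size (m)
    grid = np.zeros((200, 200), dtype=np.int8)  # 0 unknown, 1 free, 2 wall

    def mark(pose_xy, wall_xy):
        # Mark the traveled cell as a travelable area and the detected
        # wall cell as a non-travelable area; coordinates are assumed
        # to fall inside the grid.
        px, py = (int(round(c / CELL)) for c in pose_xy)
        wx, wy = (int(round(c / CELL)) for c in wall_xy)
        grid[py, px] = 1
        grid[wy, wx] = 2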
[123] The controller 140 may extract the spatial information from the image data from the image detector 135 and add the extracted spatial information to the basic map to generate the spatial map.
[124] The robot cleaner 100 may further include a map generator for forming the spatial map, although this function may instead be processed within the controller 140.
[125] The map generator generates a basic map through the detection signal obtained through the prior cleaning, and generates the spatial map by adding the spatial information from the image data to the basic map. In this case, the map generator may extract spatial information, which is the detailed information on the space, from the continuously photographed image data, and add it to the spatial map.
[126] Specifically, the map generator may extract straight lines from the continuous image data and extract a vanishing point. In addition, edges can be extracted by matching the straight lines, and angles can be calculated. In addition, it is also possible to use triangulation to infer the depth of a straight line. Using such spatial information, it is possible to extract the width of the space, the height of the space, and the like. In addition, when an obstacle is detected from the image data, the map generator may determine the size of the obstacle not only in width and length but also in height, through algorithms such as a convolutional neural network technique, and may extract information on which object the obstacle is. In this way, the name and size of the obstacle may be defined, and the information on the obstacle may also be reflected in the spatial map to form a final spatial map.
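As one possible, assumed realization of the straight-line extraction step, Canny edge detection followed by a probabilistic Hough transform may be used; the thresholds below are illustrative tuning values.

    import math
    import cv2

    def extract_lines(image_bgr):
        # Extract straight-line segments from one frame of image data;
        # segments matched across frames can then yield edges, angles,
        # and a vanishing point as described above.
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)
        lines = cv2.HoughLinesP(edges, 1, math.pi / 180, threshold=80,
                                minLineLength=40, maxLineGap=5)
        return [] if lines is None else [tuple(l[0]) for l in lines]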
[127] Unlike a two-dimensional basic map with a general grid structure, such a final spatial map is constituted by walls that partition the space, and is shown so that each space formed by the walls may be distinguished from other spaces.
[128] The map generator may divide the final spatial map into a plurality of cleaning areas so that each may be cleaned in a single traveling pass.
[129] The formation of the final spatial map and the partition of the cleaning area may be performed by the map generator, but as described above, may be performed in a batch by the controller 140.
[130] The controller 140 may partition the cleaning area according to the area of the mapped cleaning area to match the optimal cleaning method according to the shapes of each of the partitioned sub cleaning areas.
[131] The controller 140 may recognize the current location using at least one of the traveling displacement measurer 165, the distance detector 131, and the image detector 135, and may recognize the current location on the map.
[132] Meanwhile, the controller 140 may determine the current state of the robot cleaner 100 by comparing the detection signals of the traveling displacement measurer 165, the distance detector 131, and the image detector 135 over a plurality of cycles.
[133] That is, by determining whether there is a discrepancy between the detection signals from the traveling displacement measurer 165, the distance detector 131, and the image detector 135, it is determined whether the robot cleaner 100 is in a restrained state, and as a result, the basic map or the spatial map may be corrected to calculate the final map.
[134] An input unit 171 may receive on/off or various commands. The input unit 171 may include a button, a key, a touch type display, or the like. The input unit 171 may include a microphone for voice recognition.
[135] An output unit 173 may notify a user of various types of information. The output unit 173 may include a speaker and/or a display.
[136] Hereinafter, a control method according to an embodiment of the present disclosure will be described with reference to FIGS. 7 to 13C.
[137] FIG. 7 is a flow chart illustrating a method of controlling a robot cleaner according to an embodiment of the present disclosure, FIG. 8 is a flowchart illustrating a method of determining a current state of the robot cleaner of FIG. 7, and FIGS. 9A to 13C are diagrams each illustrating a camera image, a state diagram of the robot cleaner, and a location of the robot cleaner in a spatial map for explaining the flow charts of FIGS. 7 and 8.
[138] In each flowchart, overlapping contents are indicated by the same reference numerals, and overlapping descriptions will be omitted.
[139] The control method may be performed by the controller 140. Each step of the flowchart diagrams of the control method and combinations of the flowchart diagrams may be performed by computer program instructions. The instructions may be mounted on a general purpose computer, a special purpose computer, or the like, and the instructions generate means for performing the functions described in the flowchart step(s).
[140] In addition, in some embodiments, it is possible for the functions mentioned in the steps to occur out of order. For example, two steps that are continuously shown may be simultaneously performed in fact or be performed in a reverse sequence depending on corresponding functions.
[141] Referring to FIG. 7, in the control method according to the embodiment, cleaning start information is received through the server 2 or the user terminal 3 (S100).
[142] At this time, the robot cleaner 100 cleans the cleaning area from the current location (S110).
[143] The controller 140 performs cleaning while traveling in the cleaning area, and accumulates each detection signal by detecting the cleaning area. In this case, the traveling during the cleaning may be performed in an edge mode or a zigzag mode.
[144] Specifically, the robot cleaner 100 periodically detects the traveling displacement through the traveling displacement measurer 165, based on wheel detection from the traveler 160, and detects the surrounding environment through the distance detector 131, such as an ultrasonic sensor, thereby acquiring the basic map as illustrated in FIG. 9C.
[145] That is, while proceeding in the edge mode or the zigzag mode, the basic map as illustrated in FIG. 9C may be created while classifying a travelable area and a non-travelable area using the distance detector 131, that is, the obstacle detection sensor.
[146] Further, the controller 140 may receive image data from the image detector 135 in accordance with the detection cycle of the traveling displacement measurer 165 and the distance detector 131.
[147] Such image data may be a still image photographing the front of the robot cleaner 100 as illustrated in FIG. 9A, but is not limited thereto.
[148] However, as illustrated in FIG. 9A, it may be an image of a specific direction from the robot cleaner 100, and a distance from a specific obstacle or the like may be calculated by periodically photographing an image in the same direction.
[149] Referring to FIGS. 9A and 9B, the image data photographed in the previous cycle captures a front obstacle ob through the image detector 135, and the state of the robot cleaner 100 at this time is illustrated in FIG. 9B.
[150] That is, for the front obstacle ob with respect to the traveling direction of the robot cleaner 100, an obstacle is located within a wide angle of the camera of the image detector 135, and the image data therefor is obtained.
[151] At this time, the location of the robot cleaner 100 in the basic map is 10 on the x-axis and 100 on the y-axis as illustrated in FIG. 9C, which also represents the path along which the robot cleaner 100 travels.
[152] For each cycle, the location of the robot cleaner 100 may be recorded in the basic map, and such a basic map may be recorded in the storage unit 150 to represent a spatial map of the overall cleaning area.
[153] For the next cycle, the robot cleaner 100 performs the cleaning while moving along the traveling direction, and as in the previous cycle, the detection signal is obtained from each functional block.
[154] At this time, it is possible to determine whether there is a difference by comparing the current image data from the image detector 135 with the image data of the previous cycle (S130).
[155] If there is no such change in the image data, it is determined that an abnormality has occurred in the state of the robot cleaner 100, and the basic map created accordingly is corrected (S140).
[156] Such abnormal state determination and basic map correction will be described in detail later with reference to FIGS. 8 to 13C.
[157] When the map is corrected for the abnormal state, the controller 140 of the robot cleaner 100 may perform control to deviate from the abnormal state accordingly (S150).
[158] Accordingly, the direction of the robot cleaner 100 is changed so that traveling may be performed in the changed direction. When such a direction change is difficult, an alarm may be sent to the user terminal 3 to induce a location change.
[159] Next, when the robot cleaner 100 deviates from the abnormal state, the normal traveling and the cleaning may be performed, and the mapping may be continuously performed on the corrected basic map (S160).
[160] On the other hand, the abnormal state determination and basic map correction of the robot cleaner 100 will be described in detail.
[161] First, as illustrated in FIG. 8, the detection signals in the current cycle are obtained in the state in which the detection signals from each functional block in the previous cycle are stored, and the image data photographing the front of the robot cleaner 100 is stored (S131).
[162] The controller 140 obtains the information on the moving distance and direction change through the wheel sensor from the traveling displacement measurer 165 in the current cycle in the same manner as in the previous cycle, obtains the detection signal on the distance from the front obstacle ob and the like from the distance detector 131, and obtains the front image data from the image detector 135.
[163] Using the detection signals and image data of the various functional blocks obtained as described above, the current state of the robot cleaner 100 may be determined by comparing the values of the previous cycle with those of the current cycle.
[164] That is, as illustrated in FIG. 8, it is determined whether the image data of the previous cycle and the image data of the current cycle are the same (S132).
[165] When the image data of the current cycle is obtained as illustrated in FIG. 10A, it is possible to compare whether there is a difference between FIG. 9A, which is the image data of the previous cycle, and FIG. 10A, which is the image data of the current cycle. The comparison determination may be performed by comparing pixel samples and the like, and may be easily calculated through distance comparison between feature points.
[166] In this case, as illustrated in FIGS. 9 to 13, when the cycle is very short and it is difficult to clearly determine the difference between the image data, the image data may be compared for a plurality of cycles.
[167] That is, as illustrated in FIG. 11A, the image data up to the third round may all be obtained and compared with one another.
[168] At this time, if the image data are different from each other, it is determined that the traveling is normal, and the traveling is continued until the detection signal of the next cycle is obtained.
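The comparison over a plurality of cycles may be sketched, for example, as a mean-absolute-difference test on the frames; the tolerance value is an assumed threshold, not one given in the disclosure.

    import numpy as np

    def frames_unchanged(frames, tol=2.0):
        # True when consecutive frames show no meaningful difference,
        # i.e., the mean absolute pixel change stays below tol.
        return all(
            np.mean(np.abs(frames[i + 1].astype(np.int16)
                           - frames[i].astype(np.int16))) < tol
            for i in range(len(frames) - 1))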
[169] On the other hand, when it is determined that the image data for the plurality of rounds are the same, it is determined whether the current robot cleaner 100 is in a traveling state by reading the location information of the robot cleaner 100 in the basic map (S133).
[170] That is, referring to FIGS. 9C, 10C, and 11C, it is determined whether the detection signal from the traveling displacement measurer 165 indicates that the wheel of the robot cleaner 100 rotates and travels in each cycle.
[171] Coordinates of the robot cleaner 100 in FIG. 9C are (10, 100), coordinates in FIG. 10C are (10, 120), and coordinates in FIG. 11C are (10, 140), which may be regarded as traveling along the y-axis.
[172] Therefore, from the detection signal of the traveling displacement measurer 165, it is determined that the robot cleaner 100 travels along the y-axis, but when there is no change in the image data, it is finally determined that an abnormality has occurred in the current robot cleaner 100 (S134).
[173] That is, it is determined that the wheel is slipping because the robot cleaner 100 is in the restrained state.
[174] However, even in such a situation, when the mapping is continued only with the recording of the traveling displacement measurer 165, an error may occur in the mapping, so the correction of the basic map is required.
[175] Accordingly, the controller 140 may review the data of previous cycles and search for the point at which the state with no change in the image data started.
[176] When the image data of FIGS. 9A to 11A are identical and the image data of the cycle preceding FIG. 9A differs, the controller 140 may correct the map by the difference between the coordinates of the cycle of FIG. 9C and the coordinates of FIG. 11C (S135).
[177] Therefore, the basic map correction is performed by deleting the moving distance of 40 in the y-axis direction from the basic map.
[178] The location of the robot cleaner 100 in the current cycle is thus corrected back to (10, 100).
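A minimal sketch of this correction (S135), reusing the frames_unchanged helper above and treating path as the per-cycle (x, y) poses, is given below; it is an illustration under stated assumptions, not the GKR algorithm itself.

    def correct_basic_map(path, frames):
        # Walk back to the last cycle whose image still changed; every
        # pose recorded after that point is wheel slip and is deleted,
        # e.g., removing the 40 traveled on the y-axis in FIG. 11C.
        i = len(frames) - 1
        while i > 0 and frames_unchanged(frames[i - 1:i + 1]):
            i -= 1
        del path[i + 1:]
        return path[-1]  # corrected current location, e.g., (10, 100)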
[179] Such calculation and data correction may be performed through a global kidnap recovery (GKR) algorithm, but may also be performed through a simple program.
[180] Next, the controller 140 performs an escape motion from the restrained state as illustrated in FIG. 12B (S136).
[181] That is, the controller 140 may control the traveler 160 to perform a predetermined restraint escape motion, and the restraint escape motion is set to rotation, rapid back-moving, etc. according to the angle of the surrounding obstacle ob.
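A simple selection rule for such a motion might look as follows; the angle split and back-moving speed are hypothetical tuning values assumed for illustration.

    def escape_motion(obstacle_angle_deg):
        # Restraint escape motion (S136): back up rapidly when the
        # obstacle ob is nearly dead ahead, otherwise rotate away.
        if abs(obstacle_angle_deg) < 30:
            return ("backward", 0.3)  # rapid back-moving at 0.3 m/s
        return ("rotate", -obstacle_angle_deg)  # turn away (degrees)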
[182] When the rotation is performed as illustrated in FIG. 12B, the image data of the next cycle received from the image detector 135 may vary as illustrated in FIG. 12A.
[183] At this time, as illustrated in FIG. 12C, the coordinates of the robot cleaner 100 of the basic map may continue to maintain the corrected current position (10, 100), and only the direction may be changed.
[184] Next, as illustrated in FIG. 13B, traveling in a direction opposite to the previous traveling direction may be performed, and the mapping is resumed at the point (10, 100) where the last mapping was performed.
[185] Accordingly, the traveling direction changes reversely as illustrated in FIG. 13B, and the image data as illustrated in FIG. 13A may be obtained.
[186] As illustrated in FIGS. 12A and 13A, it is confirmed that the robot cleaner 100 is in a normal traveling state because the image data photographed according to the traveling of the robot cleaner 100, that is, according to the change in its location, changes significantly.
[187] In this way, while the robot cleaner 100 is traveling, when the data obtained from the traveling displacement measurer 165 and the image data from the image detector 135 indicate different states, it is determined that the robot cleaner 100 is in an abnormal state and the mapping of the robot cleaner 100 may be corrected to provide an accurate map.
[188] At this time, when the robot cleaner 100 is restrained at a specific position in the abnormal state and it is impossible for the robot cleaner 100 to escape by a direction change or back-moving, the robot cleaner 100 may send an alarm to the user terminal 3 as illustrated in FIG. 14.
[189] Hereinafter, a screen provided to the user terminal 3 will be described with reference to FIGS. 14 and 15.
[190] FIG. 14 illustrates a display state of the user terminal according to FIG. 8, and FIGS. 15A and 15B are diagrams illustrating the spatial map correction according to FIGS. 7 and 8.
[191] Referring to FIG. 14, the user terminal 3 in the smart home system including the robot cleaner 100 is installed with an application for controlling the robot cleaner 100.
[192] When the robot cleaner 100 is selected through such an application, an icon for controlling the robot cleaner 100 is displayed as illustrated in FIG. 14.
[193] At this time, when the robot cleaner 100 is in an abnormal state as illustrated in FIGS. 7 to 13C while the robot cleaner 100 is performing cleaning under the control of the user terminal 3, the robot cleaner 100 performs an operation to escape from the abnormal state by itself.
[194] Such an operation may be set by the rotation, the rapid back-moving, or the like as described above.
[195] However, in a situation where it is difficult for the robot cleaner to escape by itself, such as when the robot cleaner 100 is placed in a stepped area or an area where a retreat is blocked, it is possible to alarm the user through the application of the user terminal 3.
[196] Such an alarm may be provided as a sentence such as "Help escape", and may be provided visually and/or audibly.
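The alarm payload pushed to the application may, purely as an assumed illustration, carry the message together with the corrected map location; all field names below are hypothetical, not a defined API.

    import json

    def restraint_alarm(position, map_id):
        # Alarm sent to the user terminal 3 when self-escape fails.
        return json.dumps({
            "event": "restraint",
            "message": "Help escape",
            "position": position,  # e.g., (10, 100)
            "map": map_id,
        })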
[197] In this case, the robot cleaner 100 may provide the information on the current location of the robot cleaner 100 by providing the information on the spatial map that is the corrected basic map.
[198] The spatial map may be a map, as illustrated in FIG. 15B, in which an error area A caused by the wheel slip due to the restraint of the robot cleaner 100, as illustrated in FIG. 15A, is removed and corrected.
[199] Accordingly, by the alarm from the user terminal 3, the user moves to the corresponding location and may directly release the robot cleaner 100 from the restrained state.
[200] That is, the robot cleaner 100 may perform an escape on its own according to the restrained state, or may be released from the restrained state through a user alarm.
[201] In this way, it is determined that the robot cleaner 100 is in the restrained state and the basic map is corrected accordingly to generate the accurate basic map, thereby increasing the success rate of homing.

Claims (20)

  1. A robot cleaner comprising:
    a traveler that moves a main body;
    a cleaner that performs a cleaning function;
    a traveling displacement measurer that detects traveling displacement;
    an image detector that obtains image data by periodically photographing a surrounding environment; and
    a controller that performs cleaning on a cleaning area, generates a map for the cleaning area based on information and the image data detected through the traveling displacement measurer and the image detector, and corrects and provides the map by reading a change in the image data to determine whether the robot cleaner travels abnormally.
  2. The robot cleaner of claim 1, wherein the map includes physical shape information on the cleaning area and information on a current location of the robot cleaner.
  3. The robot cleaner of claim 2, wherein when there is no difference between the image data periodically obtained, the controller determines that the robot cleaner is in an abnormal state.
  4. The robot cleaner of claim 3, wherein the controller determines that the robot cleaner is in the abnormal state when there is no difference between the image data periodically obtained, and a detection signal from the traveling displacement measurer indicates that the robot cleaner is traveling.
  5. The robot cleaner of claim 4, wherein the controller compares the image data and performs a map correction to remove a traveling distance of the robot cleaner from a point in time when there is no change in the image data.
  6. The robot cleaner of claim 5, wherein when the robot cleaner is in the abnormal state, the controller induces escape by performing a restraint escape motion.
  7. The robot cleaner of claim 6, wherein the controller sends an alarm to a user terminal when the restraint escape of the robot cleaner does not proceed.
  8. The robot cleaner of claim 7, wherein the user terminal has an application installed to control the robot cleaner, the map corrected through the application is provided, and the alarm for the restraint escape is transmitted through the application from the robot cleaner.
  9. The robot cleaner of claim 1, wherein the robot cleaner collects the information and the image data on the cleaning area while performing cleaning in an edge mode or a zigzag mode.
  10. A control method of a robot cleaner, comprising:
    obtaining a detection signal for detecting traveling displacement by performing cleaning while traveling in a cleaning area and obtaining image data by photographing a surrounding environment;
    generating a map for the cleaning area based on the detection signal and the image data; and
    correcting the map by reading a change in the image data to determine whether the robot cleaner travels abnormally.
  11. The control method of claim 10, wherein the map includes physical shape information on the cleaning area and information on a current location of the robot cleaner.
  12. The control method of claim 11, wherein in the correction of the map, when there is no difference between the image data periodically obtained, the controller determines that the robot cleaner is in an abnormal state.
  13. The control method of claim 12, wherein in the correcting of the map, the controller determines that the robot cleaner is in the abnormal state when there is no difference between the image data periodically obtained, and the detection signal from the traveling displacement measurer indicates that the robot cleaner is traveling.
  14. The control method of claim 13, wherein in the correcting of the map, the controller compares the image data and performs a map correction to remove a traveling distance of the robot cleaner from a point in time when there is no change in the image data.
  15. The control method of claim 14, wherein the control method of the robot cleaner further includes inducing escape by a restraint escape motion when the robot cleaner is in an abnormal state.
  16. The control method of claim 15, wherein in the inducing of the escape, an alarm is issued to a user terminal when the restraint escape of the robot cleaner does not proceed.
  17. The control method of claim 16, wherein the user terminal has an application installed to control the robot cleaner, the map corrected through the application is provided, and the alarm for the restraint escape is transmitted from the robot cleaner.
  18. The control method of claim 11, wherein in the performing of the cleaning, the detection signal and the image data on the cleaning area are collected while the cleaning is performed in an edge mode or a zigzag mode.
  19. A robot cleaner system comprising:
    a robot cleaner that performs cleaning on a cleaning area while moving a main body, the robot cleaner including a traveling displacement measurer that periodically detects traveling displacement;
    an image detector that obtains image data by periodically photographing a surrounding environment;
    a controller that controls the cleaning of the cleaning area, generates a map for the cleaning area based on the traveling displacement and the image data detected through the traveling displacement measurer and the image detector, and corrects and provides the map by reading a change in the image data to determine whether the robot cleaner travels abnormally; and
    a user terminal that has an application installed for controlling the cleaning and traveling of the robot cleaner, receives the map through the application, and receives an alarm indicating whether the robot cleaner is in an abnormal state.
  20. The robot cleaner system of claim 19, wherein the controller determines that the robot cleaner is in the abnormal state when there is no difference between the periodically obtained image data, and a detection signal from the traveling displacement measurer indicates that the robot cleaner is traveling.
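Finally, the escape-and-alarm path shared by claims 6 to 8 and 15 to 17 might be read as the following hypothetical control flow; the `robot` and `notifier` interfaces, the retry count, and the alarm payload are all invented for illustration:

```python
import time

def handle_restraint(robot, notifier, max_attempts: int = 3) -> None:
    """Try the restraint escape motion a few times; if the robot cleaner
    still cannot free itself, transmit an alarm, together with the
    corrected map, to the companion application on the user terminal.
    `robot` and `notifier` are hypothetical interfaces, not part of the
    disclosure."""
    for _ in range(max_attempts):
        robot.perform_restraint_escape_motion()
        time.sleep(1.0)  # assumed settling time before re-checking the state
        if not robot.is_abnormal():
            return  # escape succeeded; no alarm is needed
    notifier.send(
        message="Robot cleaner is restrained and could not escape.",
        attachment=robot.corrected_map(),
    )
```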
PCT/KR2021/000167 2020-01-08 2021-01-07 Robot cleaner using artificial intelligence and control method thereof WO2021141396A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200002651A KR102423573B1 (en) 2020-01-08 2020-01-08 A robot cleaner using artificial intelligence and control method thereof
KR10-2020-0002651 2020-01-08

Publications (1)

Publication Number Publication Date
WO2021141396A1 (en)

Family

ID=76788118

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/000167 WO2021141396A1 (en) 2020-01-08 2021-01-07 Robot cleaner using artificial intelligence and control method thereof

Country Status (2)

Country Link
KR (1) KR102423573B1 (en)
WO (1) WO2021141396A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230158668A (en) * 2022-05-11 2023-11-21 삼성전자주식회사 Robot and controlling method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100406636B1 (en) * 2001-04-18 2003-11-21 삼성광주전자 주식회사 Robot cleaner and system and method of controling thereof
KR101641244B1 (en) * 2010-02-02 2016-07-20 엘지전자 주식회사 Robot cleaner and controlling method thereof
KR20180121244A (en) * 2017-04-28 2018-11-07 엘지전자 주식회사 Moving robot and controlling method thereof
WO2019083291A1 (en) * 2017-10-25 2019-05-02 엘지전자 주식회사 Artificial intelligence moving robot which learns obstacles, and control method therefor
KR102021834B1 (en) * 2017-07-12 2019-09-17 엘지전자 주식회사 Moving Robot and controlling method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120065153A (en) 2010-12-10 2012-06-20 엘지전자 주식회사 System and method for paying deposit
KR20170003764A (en) 2015-06-30 2017-01-10 넥서스환경디자인연구원(주) Narrow-mouth frog habitat restoration facilities
JP6814118B2 (en) * 2017-09-15 2021-01-13 株式会社日立製作所 Robot controls, systems, and methods

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220342421A1 (en) * 2021-04-23 2022-10-27 Irobot Corporation Navigational control of autonomous cleaning robots
US11940800B2 (en) * 2021-04-23 2024-03-26 Irobot Corporation Navigational control of autonomous cleaning robots
CN113729578A (en) * 2021-08-04 2021-12-03 深圳创动科技有限公司 Cleaning robot, motion state monitoring method thereof, server and storage medium
WO2023222751A1 (en) * 2022-05-18 2023-11-23 Nilfisk A/S A method of estimating a position of a cleaning machine

Also Published As

Publication number Publication date
KR20210089462A (en) 2021-07-16
KR102423573B1 (en) 2022-07-20

Similar Documents

Publication Publication Date Title
WO2021141396A1 (en) Robot cleaner using artificial intelligence and control method thereof
WO2019124913A1 (en) Robot cleaners and controlling method thereof
WO2018139865A1 (en) Mobile robot
WO2021006556A1 (en) Moving robot and control method thereof
WO2018124682A2 (en) Mobile robot and control method therefor
WO2017091008A1 (en) Mobile robot and control method therefor
WO2016200098A1 (en) Mobile robot and method of controlling same
WO2021006677A2 (en) Mobile robot using artificial intelligence and controlling method thereof
WO2020139064A1 (en) Cleaning robot and method of performing task thereof
AU2020209330B2 (en) Mobile robot and method of controlling plurality of mobile robots
EP3525992A1 (en) Mobile robot system and mobile robot
WO2021002499A1 (en) Method for tracking user location by using swarm robots, tag device, and robot implementing same
WO2019117576A1 (en) Mobile robot and mobile robot control method
AU2020231781B2 (en) Moving robot and controlling method for the moving robot
AU2020362530B2 (en) Robot cleaner and method for controlling the same
WO2021006674A2 (en) Mobile robot and control method therefor
WO2020230931A1 (en) Robot generating map on basis of multi-sensor and artificial intelligence, configuring correlation between nodes and running by means of map, and method for generating map
AU2020253014B2 (en) Robot cleaner using artificial intelligence and control method thereof
WO2021020911A1 (en) Mobile robot
WO2020251274A1 (en) Robot cleaner using artificial intelligence and control method thereof
AU2020208074B2 (en) Mobile robot and method of controlling mobile robot
AU2018257677B2 (en) Moving robot and control method thereof
WO2021177724A1 (en) Mobile robot and control method therefor
WO2020138954A1 (en) Mobile robot and method for controlling mobile robot
WO2021006550A1 (en) Robot cleaner using artificial intelligence and controlling method thereof

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 21738841

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 21738841

Country of ref document: EP

Kind code of ref document: A1