EP2908204A1 - Robot cleaner and control method thereof

Robot cleaner and control method thereof

Info

Publication number
EP2908204A1
Authority
EP
European Patent Office
Prior art keywords
cleaning
room
location
door
robot cleaner
Prior art date
Legal status
Granted
Application number
EP15154611.6A
Other languages
German (de)
English (en)
Other versions
EP2908204B1 (fr)
Inventor
Kyungmin Han
Taebum Kwon
Donghoon Yi
Current Assignee
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date
Filing date
Publication date
Application filed by LG Electronics Inc
Publication of EP2908204A1
Application granted
Publication of EP2908204B1
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2857 User input or output elements for control, e.g. buttons, switches or displays
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/06 Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Definitions

  • The present invention relates to a robot cleaner and, more particularly, to a robot cleaner and a method of controlling the same.
  • Although the present invention is suitable for a wide scope of applications, it is particularly suitable for partitioning a cleaning area with reference to a passage or wall opening, such as a door, by recognizing the passage (door) and then cleaning the partitioned cleaning areas sequentially.
  • Generally, a vacuum cleaner is a device for cleaning a room floor, a carpet, and the like.
  • The vacuum cleaner sucks in particle-laden air from outside by activating an air suction device, configured with a motor, a fan, and the like within the cleaner body, to generate a suction force; it collects dust and mist by separating the particles and then discharges the cleaned, particle-free air out of the cleaner.
  • Vacuum cleaners may be classified into manual vacuum cleaners directly manipulated by a user and robot cleaners configured to clean by themselves without the user's manipulation.
  • The robot cleaner sucks up particles, including dust and the like, from a floor while running by itself within the area to be cleaned.
  • The robot cleaner composes an obstacle map or a cleaning map containing obstacle information, using an obstacle sensor and/or other sensor(s) provided to it, and is thus able to clean a whole cleaning area autonomously.
  • A residential space such as a house is generally partitioned into a plurality of rooms through doors.
  • Hence, a whole cleaning area can be partitioned into a plurality of zones or rooms through doors.
  • When a person cleans, cleaning is normally done room by room. For instance, cleaning is performed in the order of a bedroom, a living room, a kitchen, and a small room; it rarely occurs that cleaning is done in the order bedroom → living room → bedroom. The reason is that a user intuitively or unconsciously recognizes that room-unit cleaning, i.e., cleaning a plurality of rooms one after another, is an efficient cleaning method.
  • In contrast, a robot cleaner generally cleans a whole cleaning area randomly.
  • Alternatively, a robot cleaner may clean by partitioning the whole cleaning area into a plurality of cleaning zones. Yet such partitioning is not room-unit partitioning, because the cleaning area is partitioned arbitrarily, based only on coordinate information about the cleaning area.
  • Hence, a prescribed cleaning zone may be set across two rooms, and the cleaning of one of the two rooms may be attempted before the cleaning of the other is finished.
  • In this case, the robot cleaner may clean while moving between the two rooms frequently and unnecessarily. Eventually, cleaning efficiency is lowered and the user's confidence in the robot cleaner decreases as well.
  • Moreover, if the robot cleaner cleans while moving between two rooms frequently, this is contrary to the intuitive cleaning method. In particular, a user observing the cleaning done by the robot cleaner may think, 'This robot cleaner is not smart.'
  • In the related art, a separate artificial device, such as a signal generator or a sensor, may be installed at a door location, and it is attempted to distinguish rooms by having the robot cleaner indirectly recognize the door location through the installed device.
  • However, since the separate device needs to be installed apart from the robot cleaner, product cost rises and inconvenience is caused to the user.
  • The separate device may also spoil the interior appearance and may be damaged if left alone for a long time.
  • Likewise, a related-art door sill detection sensor consists of a light emitting unit and a light receiving unit, which limits the achievable recognition accuracy: because the size, shape, surface roughness, surface profile, and color of door sills are not uniform, light is not reflected by the door sill reliably.
  • Furthermore, door sills tend to be removed from residential spaces nowadays.
  • Even where rooms are partitioned through a door, if there is no door sill the floors of the rooms are continuously connected to each other. Hence, it is meaningless to try to distinguish rooms in a cleaning area without door sills using a door sill detection sensor.
  • Accordingly, the present invention is directed to a robot cleaner and controlling method thereof that substantially obviate one or more problems due to limitations and disadvantages of the related art.
  • One object of the present invention is to provide a robot cleaner and controlling method thereof by which cleaning can be done by room units, the cleaning area being recognized room by room through doors.
  • Another object of the present invention is to provide a robot cleaner and controlling method thereof by which product cost can be lowered by using an existing camera, without a separate configuration for recognizing a door.
  • Another object of the present invention is to provide a robot cleaner and controlling method thereof by which, after a specific location is designated, the room including that location can be cleaned exclusively.
  • Another object of the present invention is to provide a robot cleaner and controlling method thereof by which a whole cleaning area can be cleaned by room units: the whole cleaning area is partitioned into a plurality of rooms, and the rooms are then cleaned to completion sequentially (i.e., one by one).
  • Another object of the present invention is to provide a robot cleaner and controlling method thereof by which various tastes of a user can be satisfied by executing other cleaning modes as well as a room-unit cleaning mode.
  • Herein, 'door' may refer generally to a passage or wall opening between two rooms, which need not be closable. This accommodates the recent trend of openings or passages between rooms without a closable door leaf.
  • To achieve these objects, a method of controlling a robot cleaner according to the present invention may include the steps of: deriving a door location in a cleaning area through image information; composing a cleaning map by detecting obstacles in the cleaning area; creating room information for distinguishing from each other a plurality of rooms partitioned with reference to the door, by having the derived door location reflected in the cleaning map; and performing cleaning by the room units distinguished from each other through the room information.
  • Room information may refer to an identifier associated with a room, and may be unique for each room.
  • A camera may be provided to the robot cleaner.
  • The camera may photograph a front image or a top image (e.g., an image of a ceiling).
  • The door location deriving step may include the step of creating the image information while the robot cleaner runs in the cleaning area.
  • Feature lines corresponding to a door shape may be extracted from the image information in the door location deriving step, and a combination of the feature lines may be recognized as a door.
  • The running of the robot cleaner may be performed for the creation of the image information only.
  • Alternatively, the running may serve the cleaning itself or the creation of an obstacle map.
  • The running of the robot cleaner may also execute a plurality of functions simultaneously. For instance, the robot cleaner can create the obstacle map and the image information while running for the cleaning.
  • The timing of the image information creation, and whether it is executed simultaneously with another function, may vary depending on whether the robot cleaner is cleaning the cleaning area for the first time or has accumulated experience from previous cleanings of the same area.
  • It is thus possible to provide a robot cleaner capable of general random cleaning, or cleaning of an arbitrary cleaning zone, as well as room-unit cleaning; in other words, a robot cleaner capable of selecting one of a plurality of cleaning modes.
  • Accordingly, whether image information for the door location derivation is created, and when it is created, can be varied.
  • The door location deriving step may be started during the cleaning map composing step and completed after the cleaning map composing step has been completed. This prevents the robot cleaner from having to run separately for the cleaning map composition and for the door location derivation.
  • Alternatively, the door location deriving step may be started and completed only after the cleaning map composing step.
  • In this case, the cleaning map composition and the door location derivation are performed separately, which may raise the efficiency and accuracy of each.
  • The door location derivation may also be performed only if a user's selection is made, for example.
  • In that case, the cleaning map composing step may already have been completed before the user's selection.
  • Then only the door location derivation is performed, and the cleaning map composition is skipped.
  • The feature lines may be sorted into vertical lines and horizontal lines, and the door may be recognized through a combination of a vertical line and a horizontal line, as illustrated by the sketch below.
  • The reason is that a door in a normal residential space has a rectangular shape composed of vertical lines and horizontal lines.
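  • The patent does not specify an implementation, but as an illustration only, the extraction and orientation sorting of feature lines could look like the following Python sketch. It assumes OpenCV is available; the Canny limits, Hough parameters, and angle tolerance are invented values, not values from the patent.

    import math
    import cv2
    import numpy as np

    def extract_feature_lines(image_bgr, angle_tol_deg=15.0):
        # Detect edges, extract line segments, and sort them into
        # near-vertical and near-horizontal candidates by image angle.
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)
        segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                                   threshold=60, minLineLength=40, maxLineGap=5)
        vertical, horizontal = [], []
        if segments is None:
            return vertical, horizontal
        for x1, y1, x2, y2 in segments[:, 0]:
            angle = math.degrees(math.atan2(y2 - y1, x2 - x1)) % 180.0
            if abs(angle - 90.0) <= angle_tol_deg:
                vertical.append(((x1, y1), (x2, y2)))    # door-jamb candidates
            elif angle <= angle_tol_deg or angle >= 180.0 - angle_tol_deg:
                horizontal.append(((x1, y1), (x2, y2)))  # lintel/ceiling candidates
        return vertical, horizontal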
  • Note that the door may be closed or open.
  • That is, recognition of the door location may be achieved not through the door leaf itself but through the door frame or, in the case of a passage without a door, through the passage itself, i.e., the wall opening of the passage.
  • The door location deriving step may further include the step of grouping similar feature lines according to the angle and location information of the feature lines recognized as a door.
  • Here, the angle refers to an angle relative to the ceiling.
  • The door or passage may include a pair of parallel, substantially vertical lines relative to the ceiling and a substantially horizontal line relative to the ceiling. The horizontal line of the door or passage may be located between the pair of vertical lines and adjacent to the ceiling.
  • Image information on a single door can be created from various viewpoints. For instance, if the distance between the door and the robot cleaner varies, or if the robot cleaner is located to the front, rear, left, or right of the door, the feature lines are obtained differently. Likewise, as mentioned above, when photographing not a real door but a door frame or passage, various feature lines may be obtained from a single door frame or wall opening.
  • Elsewhere in the scene, feature lines resembling a door may also be obtained; however, such lines are difficult to group, because there are not many feature lines with similar angles and locations. Hence, these spurious feature lines are not recognized as a door in the grouping step.
  • The door location deriving step may further include the step of calculating an average angle and an average location of the grouped feature lines.
  • That is, the average angle and average location are calculated only for the door candidates recognized as a door, excluding the candidates that fail to be recognized as such.
  • The door location may then be derived from the calculated average angle and average location.
  • A door or passage may be recognized based on a predetermined height of the pair of vertical feature lines, e.g., to differentiate a door or passage from a table or the like.
  • Likewise, a door or passage may be recognized based on a predetermined separation distance between the two vertical feature lines. Note that the terms "horizontal" and "vertical" refer to the orientation, in the room, of the object contour lines corresponding to the feature lines in the image. A geometric check of this kind is sketched below.
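  • As an illustration only, a candidate check combining these criteria might look as follows; the thresholds (minimum line height, allowed separation range, ceiling proximity band) are invented values, not values given in the patent.

    def is_door_candidate(v1, v2, h, min_height_px=80,
                          sep_range_px=(40, 220), ceiling_band_px=60):
        # v1, v2: near-vertical segments; h: near-horizontal segment.
        # Segments are ((x1, y1), (x2, y2)) in image coordinates,
        # with y increasing downward (small y = ceiling side).
        def height(seg):
            (_, y1), (_, y2) = seg
            return abs(y2 - y1)

        def mid_x(seg):
            (x1, _), (x2, _) = seg
            return (x1 + x2) / 2.0

        if height(v1) < min_height_px or height(v2) < min_height_px:
            return False  # too short: likely a table leg, not a door jamb
        left, right = sorted((mid_x(v1), mid_x(v2)))
        if not (sep_range_px[0] <= right - left <= sep_range_px[1]):
            return False  # implausible door width
        (hx1, hy1), (hx2, hy2) = h
        if not (left <= (hx1 + hx2) / 2.0 <= right):
            return False  # horizontal line not between the verticals
        top_of_verticals = min(v1[0][1], v1[1][1], v2[0][1], v2[1][1])
        # The horizontal line must sit near the top (ceiling side).
        return min(hy1, hy2) <= top_of_verticals + ceiling_band_px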
  • Meanwhile, the robot cleaner may provide a plurality of cleaning modes.
  • For instance, these may include a random mode of cleaning a cleaning area randomly, a zigzag mode of cleaning a cleaning area in a zigzag pattern, and a partitioning mode of cleaning a cleaning area by partitioning it into neighboring zones.
  • In addition, the robot cleaner according to an embodiment of the present invention may perform a room cleaning mode of cleaning by room units. Hence, the robot cleaner cleans with a different pattern in accordance with the inputted cleaning mode.
  • The method may further include the step of receiving an input of a room-unit cleaning mode. If this mode is inputted, cleaning can be performed by room units. To this end, the method may further include the step of determining whether the room information was previously created. If it was, the room-unit cleaning may be performed directly through the previously saved room information; in other words, the separate room information creation steps mentioned above may be skipped.
  • Otherwise, the separate room information creation steps mentioned above are performed for the room-unit cleaning. This case may be further divided according to whether the cleaning map was previously composed; hence, a step of determining whether the cleaning map was previously composed may be performed.
  • If the cleaning map was previously composed, the room-unit cleaning may be performed after performing the door location deriving step and the room information creating step. If it was not, the room-unit cleaning may be performed after performing the door location deriving step, the cleaning map composing step, and the room information creating step.
  • In another aspect of the present invention, a method of controlling a robot cleaner may include the steps of: deriving a passage (door) location in a cleaning area through image information; composing a cleaning map by detecting obstacles in the cleaning area; creating room information for distinguishing from each other a plurality of rooms partitioned with reference to the passage (door), by having the derived passage (door) location reflected in the cleaning map; and performing cleaning by the room units distinguished from each other through the room information.
  • In a further aspect of the present invention, a method of controlling a robot cleaner may include the steps of: deriving a door location in a cleaning area through image information; composing a cleaning map by detecting obstacles in the cleaning area and assigning the area to be cleaned within the whole cleaning area as a plurality of mutually distinguished cells; giving room information to each of the cells by having the derived door location reflected in the cleaning map and sorting the cells by mutually distinguished room units; and performing cleaning by the room units distinguished from each other through the room information.
  • A room may comprise at least one cell, and each cell may be associated with exactly one room, as in the sketch below.
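  • A minimal sketch of such a cell map, assuming a simple grid representation (the cell types and the 5 x 7 layout are invented for illustration), might be:

    from enum import Enum

    class Cell(Enum):
        OBSTACLE = 0  # wall or other obstacle
        FLOOR = 1     # cleaning executable area
        DOOR = 2      # derived door/passage location

    O, F, D = Cell.OBSTACLE, Cell.FLOOR, Cell.DOOR

    # Two rooms separated by a wall with a one-cell door opening (D).
    grid = [
        [O, O, O, O, O, O, O],
        [O, F, F, O, F, F, O],
        [O, F, F, D, F, F, O],
        [O, F, F, O, F, F, O],
        [O, O, O, O, O, O, O],
    ]

    # Each floor cell is later associated with exactly one room id.
    room_of = {}  # (row, col) -> room id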
  • The method may further include the steps of: receiving an input of cell information; moving the robot cleaner to at least one of the inputted cell location, the inside of the room including the inputted cell, and a door location for entering that room; and finishing the cleaning of the room including the inputted cell.
  • Hence, the cleaning of a specific room may be performed selectively.
  • Alternatively, a plurality of rooms can be cleaned in consecutive order.
  • Moreover, a user can designate the cleaning order for a plurality of rooms.
  • In another further aspect of the present invention, a method of controlling a robot cleaner may include the steps of: deriving a door location by creating image information of a cleaning area through a camera provided to the robot cleaner and then extracting feature lines corresponding to a door shape from the image information; composing a cleaning map by detecting obstacles in the cleaning area; creating space information on the spaces distinguished from each other with reference to the door, by having the derived door location reflected in the cleaning map; and performing cleaning by the space units distinguished from each other through the space information.
  • A cleaning order for the plurality of mutually distinguished spaces may be set, and the cleaning may then be performed sequentially by space units in the determined cleaning order.
  • For instance, if there are 4 rooms, the cleaning order is determined and the cleaning of the 4 rooms is performed sequentially.
  • After finishing one room, the robot cleaner moves to the next room and then performs the cleaning there.
  • In a still further aspect of the present invention, a method of controlling a robot cleaner may include the step of cleaning a whole cleaning area by: deriving a door location in the cleaning area by extracting feature lines corresponding to a door shape from image information created by searching the cleaning area; creating a plurality of mutually distinguished room informations with reference to the doors by reflecting the derived door locations; and then finishing the cleaning of each room sequentially through the created room information.
  • After the cleaning of a specific room is finished, the robot cleaner may be controlled to move through the door from that room to a different room for the next cleaning.
  • The robot cleaner may assign the area to be cleaned within the whole cleaning area as a plurality of mutually distinguished cells, and may then save the cells sorted by room units that are distinguished from each other by reflecting the door locations.
  • Which room each cell belongs to is indicated by the corresponding label.
  • After finishing the cleaning of one room, the robot cleaner may be controlled to move on and clean the cells sorted into a different room.
  • For instance, the cleaning of the cells labeled 'room #2' may then be performed.
  • As mentioned above, the method may further include the steps of: receiving an input of cell information; moving the robot cleaner to at least one of the inputted cell location, the inside of the room including the inputted cell, and a door location for entering that room; and finishing the cleaning of the room including the inputted cell.
  • Hence, a room-unit cleaning of a specific room can be performed.
  • For instance, the robot cleaner can perform and finish the cleaning of room #1 only.
  • In other words, the room-unit cleaning can be performed not only on a plurality of rooms sequentially but also on a specific room alone.
  • Alternatively, the room corresponding to the cell inputted through the cell information input may be cleaned with top priority, and the cleaning of the next rooms may then be performed sequentially.
  • In this manner, a very convenient 'smart cleaner' can be implemented.
  • For instance, a user can designate room #1 to be cleaned.
  • A related-art robot cleaner is unable to capture the user's intention precisely.
  • The related-art robot cleaner can only recognize a cleaning zone including the user-designated location, e.g., a zone lying across room #1 and room #2.
  • Consequently, it may finish the cleaning of only a portion of room #1 while moving back and forth between room #1 and room #2.
  • In contrast, the robot cleaner according to the present invention can clean the whole of room #1 effectively by capturing the user's intention precisely.
  • In particular, the robot cleaner can start and then finish the cleaning of room #1 without moving between room #1 and another room.
  • The cell information may be inputted through an external terminal communicatively connected to the robot cleaner, which facilitates control of the robot cleaner.
  • Accordingly, the present invention provides the following effects and/or features.
  • A robot cleaner can efficiently clean by room units by recognizing the cleaning area room by room through doors.
  • The product cost of a robot cleaner can be lowered by using an existing camera, without a separate configuration for recognizing a door.
  • A robot cleaner can clean the room including a specific location exclusively if that location is designated.
  • A robot cleaner can clean a whole cleaning area by room units, partitioning the whole cleaning area into a plurality of rooms and then cleaning the rooms to completion sequentially (i.e., one by one).
  • A robot cleaner can satisfy various tastes of a user by executing other cleaning modes as well as a room-unit cleaning mode.
  • An efficient robot cleaner can be provided by flexibly determining the door location derivation timing, or whether to derive a door location at all.
  • A configuration of a robot cleaner according to one embodiment of the present invention is described in detail with reference to FIGs. 1 to 4 as follows.
  • FIG. 1 is a perspective diagram of a robot cleaner according to a related art or one embodiment of the present invention.
  • FIG. 2 is a perspective diagram for an internal configuration of a robot cleaner according to one embodiment of the present invention.
  • FIG. 3 is a bottom perspective view of a robot cleaner according to one embodiment of the present invention.
  • FIG. 4 is a block diagram of a robot cleaner configuring a robot cleaner system according to one embodiment of the present invention.
  • Referring to the drawings, a robot cleaner 100 may include a cleaner body 110 forming an exterior, a suction device 120 provided within the cleaner body 110, a suction nozzle 130 configured to suck dust from a floor when the suction device 120 is activated, and a dust collection device 140 configured to collect particles from the air sucked in through the suction nozzle 130.
  • The cleaner body 110 of the robot cleaner 100 may have a cylindrical shape whose height is relatively small compared to its diameter, i.e., the shape of a flat cylinder.
  • Alternatively, the cleaner body 110 of the robot cleaner 100 may have a rectangular shape with rounded corners.
  • Within the cleaner body 110, the suction device 120, the suction nozzle 130, and the dust collection device 140 communicating with the suction nozzle 130 may be provided.
  • A sensor configured to detect the distance to a room wall or an obstacle, i.e., an obstacle sensor 175, and a bumper (not shown in the drawing) configured to buffer collision impacts may be provided on an outer circumferential surface of the cleaner body 110.
  • A running unit 150 for moving the robot cleaner 100 may also be provided.
  • The running unit 150 may project from the inside of the cleaner body 110 toward the outside, and more particularly, toward the bottom surface.
  • The running unit 150 may include a left running wheel 152 and a right running wheel 154 provided on both sides of the bottom part of the cleaner body 110, respectively.
  • The left running wheel 152 and the right running wheel 154 are configured to be rotated by a left wheel motor 152a and a right wheel motor 154a, respectively.
  • Hence, the robot cleaner 100 can clean a room while changing its running direction by itself.
  • At least one auxiliary wheel 156 is provided on the bottom of the cleaner body 110 so as to guide the motion of the robot cleaner 100 as well as to minimize friction between the robot cleaner 100 and the floor.
  • FIG. 4 is a block diagram centered on the control unit 160 of the robot cleaner 100.
  • A cleaner control unit 160, connected to various parts of the robot cleaner 100 to control its operations, may be provided.
  • A battery 170 for supplying power to the suction device 120 and the like may be provided.
  • The suction device 120, configured to generate an air sucking force, may be provided behind the battery 170.
  • The dust collection device 140 may be detachably installed, from the rear, on a dust collection device installation part 140a provided behind the suction device 120.
  • The suction nozzle 130 is provided under the dust collection device 140 to suck up particles from a floor together with air.
  • The suction device 120 is installed at an incline between the battery 170 and the dust collection device 140.
  • The suction device 120 includes a motor (not shown in the drawing) electrically connected to the battery 170 and a fan (not shown in the drawing) connected to a rotational shaft of the motor to force air to flow.
  • The suction nozzle 130 is exposed toward the bottom side of the cleaner body 110 through an opening (not shown in the drawing) formed in the bottom of the body 110, thereby coming into contact with the floor of a room.
  • The robot cleaner 100 includes a first wireless communication unit 190 capable of wireless communication with an external device.
  • The first wireless communication unit 190 may include a Wi-Fi module.
  • The first wireless communication unit 190 may be configured to communicate over Wi-Fi with an external device, and more particularly, with an external terminal.
  • The external terminal may be a smartphone with a Wi-Fi module installed.
  • A camera module 195 may be provided to the cleaner body 110.
  • The camera module 195 may include a top camera 197 configured to create ceiling information from a ceiling image viewed from the robot cleaner 100, i.e., upward image information.
  • The camera module 195 may also include a front camera 196 configured to create front image information.
  • In other words, the camera module 195 may be configured to create image information by photographing the cleaning area.
  • A single camera may be provided.
  • In this case, the single camera may be configured to photograph images at various angles.
  • Alternatively, a plurality of cameras may be provided.
  • Through the image information, the robot cleaner is able to compose a cleaning map by detecting obstacles in the cleaning area.
  • One example of a cleaning map is schematically shown in FIG. 7.
  • The image information created by the cameras 196 and 197 can be transmitted to an external terminal. For instance, a user may control the robot cleaner while watching this image information on the external terminal.
  • A separate control unit may be provided in addition to the aforementioned control unit 160, which activates and deactivates the suction device 120 and the running unit 150 (e.g., the wheels).
  • The former control unit 160 may be called a main control unit 160.
  • The main control unit 160 can control the various sensors, the power source device, and the like.
  • The latter control unit may be a control unit configured to create location information of the robot cleaner.
  • The latter control unit may be named a vision control unit 165.
  • The main control unit 160 and the vision control unit 165 can exchange signals with each other over a serial communication link.
  • The vision control unit 165 can derive the location of the robot cleaner 100 through the image information of the camera module 195.
  • The vision control unit 165 partitions the whole cleaning area into a plurality of cells and is also able to create location information for each of the cells.
  • The Wi-Fi module 190 can be installed on the vision control unit 165.
  • A memory 198 may be connected to the vision control unit 165 or the camera module 195. Of course, the memory 198 may instead be connected to the main control unit 160. Various information, including the location information of the robot cleaner 100, information on the cleaning area, information on the cleaning map, and the like, can be saved in the memory 198.
  • The robot cleaner 100 may include a second wireless communication unit 180 separate from the aforementioned Wi-Fi module 190.
  • The second wireless communication unit 180 may be provided for short-range wireless communication.
  • The second wireless communication unit 180 may include a module that employs a short-range communication technology such as Bluetooth, RFID (radio frequency identification), IrDA (Infrared Data Association), UWB (ultra wideband), ZigBee, and/or the like.
  • For instance, the second wireless communication unit 180 may be provided for short-range communication with a charging holder (not shown in the drawing) of the robot cleaner 100.
  • The hardware configuration of the robot cleaner 100 according to one embodiment of the present invention may be similar or identical to that of a related-art robot cleaner. Yet the method of controlling a robot cleaner, or of cleaning with a robot cleaner, according to one embodiment of the present invention differs from the related art.
  • According to the present invention, a method of controlling a robot cleaner configured to clean by room units can be provided.
  • Cleaning by room units means the following process: after the cleaning of a specific room has been finished, the cleaning of the next room is done. In other words, in room-unit cleaning, the cleaning area is first partitioned into a plurality of rooms, and the cleaning of each room is then started and finished in consecutive order.
  • Room-unit cleaning may also include cleaning a specific room only, since the specific room is distinguished as a separate cleaning area.
  • Referring to FIG. 5, the controlling method may include a door location deriving step S30.
  • In this step, a door location in the cleaning area is derived through image information [S30].
  • The image information can be created through the cameras 196 and 197, and the door location can be derived from it. The details of this step S30 are described later.
  • The controlling method may include a cleaning map composing step S20.
  • In this step, a cleaning map is composed by detecting obstacles in the cleaning area [S20]. Through the cleaning map, an obstacle area and an area on which cleaning can be performed (or which should be cleaned) are distinguished from each other within the whole cleaning area.
  • The composition of the cleaning map may be performed using the information created through the aforementioned obstacle sensor 175 or the cameras 196 and 197.
  • The order of the door location deriving step S30 and the cleaning map composing step S20 can be changed.
  • For instance, the cleaning map composition may be performed first and the door location derivation afterward.
  • Moreover, the door location deriving step S30 and the cleaning map composing step S20 need not be performed in consecutive order.
  • However, the door location deriving step S30 and the cleaning map composing step S20 are prerequisites of the room information creating step S40.
  • The room information creating step S40 creates room information for distinguishing from each other the rooms of the cleaning area partitioned with reference to the doors, by having the derived door locations reflected in the cleaning map.
  • In other words, it presupposes both the cleaning map of the cleaning area and the derived door locations.
  • Subsequently, a room-unit cleaning S50 may be performed through the room information.
  • The cleaning map composing step S20 can be performed by cell units. In other words, a cleaning map of the whole cleaning area can be composed by partitioning the cleaning area into a plurality of cells and then giving absolute or relative location coordinates to each of the cells.
  • Hence, the whole cleaning area may be assigned as a plurality of cells, in which obstacle areas and cleaning executable areas are distinguished from each other.
  • The cleaning executable area includes the area on which cleaning should be performed.
  • A door location derived in the door location deriving step S30 may then be reflected in the composed cleaning map.
  • The door location can also be assigned as cells, distinguished from the obstacle area.
  • Moreover, the door location may be distinguished from the cleaning executable area.
  • Yet, since the door location itself corresponds to an area on which cleaning should be performed, it need not be distinguished from the cleaning executable area for the purpose of cleaning; it is enough that the rooms are distinguished from each other through the cells assigned to door locations.
  • Subsequently, room information may be given to each of the cells.
  • In particular, individual room information may be given to each cell corresponding to a cleaning executable area.
  • The room information assignment can be performed with reference to the door locations.
  • In particular, each room may be recognized as an area forming a closed loop through walls and door locations.
  • For instance, a living room may be connected to a room #1 through a door #1.
  • Room #1 then forms a single closed loop bounded by its walls and the door #1 toward the living room. Hence, the label 'room #1' can be given to all cells within room #1.
  • Likewise, an individual room label can be given to each of the rooms, including the living room. Through this, the cells of the whole cleaning area are substantially sorted by rooms, e.g., by the flood fill sketched below.
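  • The patent leaves the labeling procedure open; one way to realize the closed-loop idea is a flood fill in which door cells act as boundaries, as in this sketch over a small integer grid (0 = obstacle, 1 = floor, 2 = door; the demo layout is invented):

    from collections import deque

    OBSTACLE, FLOOR, DOOR = 0, 1, 2

    def label_rooms(grid):
        # Assign a room id to every FLOOR cell.  DOOR cells stop the
        # fill, so each room is the connected floor region enclosed by
        # walls and door locations.  Returns {(row, col): room_id}.
        rows, cols = len(grid), len(grid[0])
        labels, next_room = {}, 1
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] != FLOOR or (r, c) in labels:
                    continue
                labels[(r, c)] = next_room
                queue = deque([(r, c)])
                while queue:
                    cr, cc = queue.popleft()
                    for nr, nc in ((cr - 1, cc), (cr + 1, cc),
                                   (cr, cc - 1), (cr, cc + 1)):
                        if (0 <= nr < rows and 0 <= nc < cols
                                and grid[nr][nc] == FLOOR
                                and (nr, nc) not in labels):
                            labels[(nr, nc)] = next_room
                            queue.append((nr, nc))
                next_room += 1
        return labels

    # Two floor regions joined only by a door cell get different labels.
    demo = [
        [0, 0, 0, 0, 0, 0, 0],
        [0, 1, 1, 0, 1, 1, 0],
        [0, 1, 1, 2, 1, 1, 0],
        [0, 1, 1, 0, 1, 1, 0],
        [0, 0, 0, 0, 0, 0, 0],
    ]
    assert len(set(label_rooms(demo).values())) == 2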
  • The room-unit cleaning step S50 may include the step of, after completing the cleaning of the cells sorted as one room, moving on to clean the cells sorted as the next room. The cleaning of the whole cleaning area is completed by repeating the operations of starting and finishing the cleaning of one room, moving to the next room, and starting and finishing the cleaning of that room. Therefore, a plurality of rooms can be cleaned one after another, as in the sketch below.
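  • This repetition could be orchestrated along the following lines; move_to and clean_cell are hypothetical callbacks standing in for the robot's actual motion and suction control, and the fixed room order is an assumption (the order could equally be user-designated):

    def clean_by_room(labels, move_to, clean_cell):
        # labels: {(row, col): room_id}, e.g. from label_rooms() above.
        rooms = {}
        for cell, room in labels.items():
            rooms.setdefault(room, []).append(cell)
        for room in sorted(rooms):       # one room after another
            move_to(rooms[room][0])      # enter the room, e.g. via its door
            for cell in rooms[room]:
                clean_cell(cell)         # finish this room completely first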
  • As mentioned above, a robot cleaner can execute various cleaning modes.
  • A room-unit cleaning mode of cleaning by room units, e.g., a 'smart mode', may be executed if a user makes a selection or a predetermined condition is met.
  • An input of this mode may be applied directly to the robot cleaner by a user, for instance through an input unit (not shown in the drawing) provided to the robot cleaner.
  • Alternatively, the mode input may be applied through an external terminal communicatively connected to the robot cleaner.
  • The aforementioned controlling method may include a step S10 of receiving an input of a cleaning mode. If the 'smart mode' is inputted in this step S10, the robot cleaner performs the room-unit cleaning S50.
  • The room-unit cleaning may be performed by the robot cleaner for the first time, or a number of execution experiences may have accumulated. Hence, the process for performing the room-unit cleaning may change depending on whether such experience exists.
  • In particular, the cleaning map composing step S20, the door location deriving step S30, and the room information creating step S40 shown in FIG. 5 can each be performed or skipped as necessary.
  • The reason is that the information created in these steps may have been saved previously; since the previously saved information is available, it is unnecessary to create it anew. Yet at least some of the above steps may still be performed before executing the room-unit cleaning S50, e.g., in consideration of the cumulative count or frequency of 'smart mode' executions.
  • If the 'smart mode' is inputted, a step S11 of determining whether the room information was previously created can be performed. If it is determined in step S11 that the room information was previously created, the room-unit cleaning S50 can be performed, skipping the room information creating step S40 and/or the door location deriving step S30.
  • Otherwise, the door location deriving step S30 is performed.
  • The reason is that a door location is necessary to create the room information.
  • Meanwhile, a cleaning map may have been previously composed and saved, since a cleaning map may also be composed to execute a cleaning mode different from the 'smart mode'.
  • If the cleaning map was not previously composed, the cleaning map composing step S20 may be performed, followed by the door location deriving step S30. Yet, if the cleaning map was previously composed, the cleaning map composing step S20 may be skipped and only the door location deriving step S30 performed.
  • The room information can then be created [S40] from the newly created information or from the previously saved information. Hence, the room-unit cleaning S50 can be performed through the previously created or newly created room information. This decision flow is sketched below.
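  • The S10/S11 decision flow might be summarized as follows; the step callables are placeholders for the operations named in the text, not an actual API:

    def run_smart_mode(state, steps):
        # state: previously saved results ('cleaning_map', 'room_info').
        # steps: placeholder callables for S20/S30/S40/S50.
        if state.get("room_info") is None:                      # S11
            if state.get("cleaning_map") is None:
                state["cleaning_map"] = steps["compose_map"]()  # S20
            doors = steps["derive_doors"]()                     # S30
            state["room_info"] = steps["create_room_info"](
                state["cleaning_map"], doors)                   # S40
        steps["clean_by_room"](state["room_info"])              # S50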
  • Of course, a door location can be derived irrespective of the inputted cleaning mode.
  • Likewise, a cleaning map can be composed for the robot cleaner to clean, irrespective of the cleaning mode.
  • Hence, room information can be created in advance, before the 'smart mode' is executed.
  • As mentioned above, the start and end timing of the door location deriving step S30 can be varied in relation to the cleaning map composing step S20.
  • While the cleaning map composing step S20 is performed, the door location deriving step S30 can be started; after step S20 has been completed, step S30 can be completed. This makes a separate run solely for deriving door locations unnecessary, so a door location is derived more efficiently, and the room information can then be created.
  • Alternatively, the door location deriving step S30 can start and end only after the cleaning map composing step S20 has been completed. Through this, a more accurate door location can be derived, and a door location can be derived only when necessary.
  • Meanwhile, the aforementioned room-unit cleaning does not only mean that a plurality of rooms is cleaned individually and sequentially.
  • That is, room-unit cleaning does not presuppose that the cleaning of the whole cleaning area is executed and finished. For instance, it includes the case of exclusively cleaning a specific room irrespective of the other rooms: the cleaning of room #1 is performed, while the cleaning of a different room may not be.
  • If a user orders the cleaning of room #1, it may mean that the user intends the cleaning of room #1 to be executed and finished; the user does not intend to clean a different room together with only a specific part of room #1.
  • Hence, the present embodiment includes the case of cleaning a specified room exclusively.
  • As mentioned above, an individual room label may be given to each of the cells of the cleaning area. Hence, if a specific cell is selected, the room to which that cell belongs can be specified.
  • The cleaning mode inputting step S10 shown in FIG. 5 may correspond to an input ordering a specific room to be cleaned. For instance, if an input ordering room #1 to be cleaned is applied, the step of cleaning room #1, i.e., the room-unit cleaning step S50, can be performed.
  • As mentioned above, room information is a prerequisite for the room-unit cleaning. Hence, if the room information was previously created, the robot cleaner moves to room #1 and can then immediately start to clean it. If the room information was not previously created, the room-unit cleaning is performed after the aforementioned steps.
  • The input ordering a specific room (e.g., room #1) to be cleaned can be applied in various ways. For instance, it may be applied through a step of receiving an input of cell information. A specific cell selected from among the cells already carries the label of a specific room; hence, the cleaning of the room including the selected cell can be performed.
  • Specifically, the robot cleaner can move to at least one of the location of the inputted cell, the inside of the room including the inputted cell, and a door location for entering that room.
  • In other words, the robot cleaner moves to a cleaning start location determined through the inputted cell information.
  • Subsequently, the robot cleaner performs and finishes the cleaning of the room including the inputted cell, as in the sketch below.
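  • Under the same assumptions as the earlier sketches (a labels map from the flood fill, plus hypothetical move_to and clean_cell callbacks), the cell-selection flow might look like:

    def clean_room_of_cell(selected_cell, labels, move_to, clean_cell):
        # Clean exactly the room containing the user-selected cell.
        room = labels.get(selected_cell)
        if room is None:
            raise ValueError("selected cell is not a cleanable floor cell")
        targets = [cell for cell, r in labels.items() if r == room]
        move_to(selected_cell)  # or a door location for entering the room
        for cell in targets:
            clean_cell(cell)    # finish the whole room, and nothing else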
  • The above-mentioned cell information may be inputted through the aforementioned cleaning map.
  • For instance, a display (not shown in the drawing) configured to display the cleaning map may be provided on the robot cleaner.
  • Alternatively, the cleaning map may be displayed on an external terminal, since the robot cleaner can transmit the cleaning map information to the external terminal by communication.
  • An external terminal such as a smartphone basically includes a display.
  • Hence, a specific room, or a portion of a specific room, can be selected on the displayed cleaning map.
  • For instance, the selection can be made through a touch input.
  • The selected information is then transmitted to the robot cleaner.
  • Subsequently, the robot cleaner can clean the selected room using the corresponding information.
  • In the following, the door location deriving method is described in detail with reference to FIG. 5 and FIG. 6.
  • FIG. 5 is a flowchart of a controlling method according to one embodiment of the present invention.
  • FIG. 6 is a schematic diagram of image information created through the top camera 197 of the camera module 195.
  • FIG. 6 shows one example of image information including a door (particularly, a door frame). Of course, a door location may also be derived through the front camera 196, since the front camera can create an image including a ceiling view by setting its photographing angle upward.
  • First, the robot cleaner creates image information at various locations while running in the cleaning area [S31].
  • Door candidates can then be extracted from the image information.
  • In particular, the door candidates can be extracted through feature lines capable of representing door shapes or door frame shapes [S32].
  • FIG. 6 shows such feature lines representing a door shape in an image including a ceiling 1, a left sidewall 2, a right sidewall 3, a door frame 7, and a front wall 8 in which the door frame 7 is formed.
  • Of course, this image information changes as the location of the robot cleaner varies.
  • First, various straight-line components can be extracted from the image shown in FIG. 6.
  • In particular, various horizontal lines can be extracted, including the horizontal lines formed at the boundaries between the ceiling 1 and the left and right sidewalls 2 and 3, the horizontal line formed at the boundary between the ceiling 1 and the door frame 7, the horizontal line formed at the boundary between the ceiling 1 and the front wall 8, and the horizontal line formed by the door frame 7 itself.
  • Likewise, various vertical lines can be extracted, including the vertical lines formed at the boundaries between the ceiling 1, the front wall 8, and the left and right sidewalls 2 and 3, and the vertical lines formed by the door frame 7 itself.
  • Beyond these, horizontal and vertical lines formed by various other structures may be extracted as well.
  • It is then possible to extract a door candidate through a combination of the feature lines, and more particularly, through a combination of a horizontal line and vertical lines.
  • If a single horizontal line 6 and a pair of vertical lines 4 and 5, respectively connected to the two ends of the horizontal line, are combined, the combination can be extracted as a door candidate, i.e., recognized as a door [S33].
  • The robot cleaner can obtain its own current location, and the locations of the feature lines appearing in the image information, with respect to the cleaning map. Hence, the robot cleaner creates image information at various angles and locations and then extracts feature lines from the created images.
  • In particular, the robot cleaner can create images including the same object, e.g., the same door frame 7, at various locations. For instance, after moving slightly from the location at which the image in FIG. 6 was photographed, the robot cleaner can photograph an image in which the door frame 7 appears shifted. Hence, various feature lines can be extracted through the relative location change of the robot cleaner and the resulting location change of the door frame 7 in the image information.
  • Since the actual location of the door frame 7 is fixed, the location of a feature line (e.g., the horizontal line 6 of the door frame 7) can be derived with respect to the cleaning map.
  • In other words, the location of the feature line can be derived from the change of the door frame 7's position in the image information that corresponds to the relative location change of the robot cleaner.
  • For example, such a location can be extracted through a 3D reconstruction of the feature lines, as sketched below.
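  • The patent does not detail this reconstruction; as a generic stand-in, the map location of a vertical feature can be triangulated by intersecting the bearing rays of two observations, as in this sketch (the poses and bearings are assumed known from the cleaning map and the images):

    import math

    def triangulate_feature(pose1, bearing1, pose2, bearing2):
        # pose: (x, y, heading_rad) on the cleaning map; bearing: the
        # feature's direction in the image relative to the heading.
        (x1, y1, h1), (x2, y2, h2) = pose1, pose2
        d1 = (math.cos(h1 + bearing1), math.sin(h1 + bearing1))
        d2 = (math.cos(h2 + bearing2), math.sin(h2 + bearing2))
        # Solve p1 + t1*d1 = p2 + t2*d2 for t1 (2x2 linear system).
        det = d1[0] * (-d2[1]) + d2[0] * d1[1]
        if abs(det) < 1e-9:
            raise ValueError("rays nearly parallel; move and re-observe")
        bx, by = x2 - x1, y2 - y1
        t1 = (bx * (-d2[1]) + d2[0] * by) / det
        return (x1 + t1 * d1[0], y1 + t1 * d1[1])

    # A feature at (2, 3) seen from two poses is recovered exactly.
    p = triangulate_feature((0.0, 0.0, 0.0), math.atan2(3, 2),
                            (4.0, 0.0, math.pi / 2), math.atan2(2, 3))
    assert abs(p[0] - 2.0) < 1e-6 and abs(p[1] - 3.0) < 1e-6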
  • The door candidates recognized as a door among the extracted feature lines can then be grouped [S34], e.g., based on the position or location at which the image information was created.
  • In particular, feature lines having similar angles and similar locations can be grouped together.
  • If enough similar feature lines are grouped together, the corresponding feature lines can be derived as a door.
  • Otherwise, the feature lines are not derived as a door. Hence, the grouping of feature lines improves the door recognition accuracy. A grouping and averaging sketch follows.
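  • With invented similarity thresholds and a minimum group size, the grouping and averaging described above might be sketched as:

    import math

    def group_and_average(observations, angle_tol=0.1, dist_tol=0.3, min_group=3):
        # observations: (angle_rad, x, y) per door sighting on the map.
        groups = []
        for angle, x, y in observations:
            for g in groups:
                ga, gx, gy = (sum(v) / len(g) for v in zip(*g))
                # Naive angle mean; wrap-around is ignored for brevity.
                if (abs(angle - ga) <= angle_tol
                        and math.hypot(x - gx, y - gy) <= dist_tol):
                    g.append((angle, x, y))
                    break
            else:
                groups.append([(angle, x, y)])
        doors = []
        for g in groups:
            if len(g) >= min_group:  # lone look-alikes never grow this large
                ga, gx, gy = (sum(v) / len(g) for v in zip(*g))
                doors.append((ga, gx, gy))
        return doors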
  • FIG. 7 shows one example of a cleaning map 10.
  • In the cleaning map, the locations of obstacles such as a wall 11 and the cleaning executable area 12 are represented so as to be distinguished from each other.
  • The cleaning map 10 may be visualized, or kept purely as data and rendered when necessary. After the whole cleaning area has been partitioned into a plurality of cells 13, each cell is distinguished as either an obstacle (such as a wall) or a cleaning executable area.
  • FIG. 7 is a very schematic diagram of a cleaning map; obstacles within the space, such as structures or tables, are therefore omitted.
  • Note that no door location is reflected in the cleaning map shown in FIG. 7.
  • In other words, door locations and the cleaning executable area are not distinguished from each other. Since a door location cannot be obtained from this cleaning map, it is impossible to distinguish rooms with reference to door locations.
  • FIG. 8 shows a cleaning map 20 in which passage locations, i.e., door locations, are reflected.
  • Here, a door location 14 is represented with slashes so as to be distinguished from the obstacle areas, such as the wall 11, and from the cleaning executable area 12.
  • In other words, each cell can be distinguished as one of an obstacle area, a cleaning executable area (i.e., a normal area), and a door location area.
  • For the purpose of partitioning, a door location area may be treated as an obstacle area.
  • Then each room forms a closed loop through the door areas and the wall areas.
  • The area within a single closed loop can be distinguished as a specific room. If the door locations are reflected in the cleaning map, the whole cleaning area can be partitioned into, for example, 7 mutually independent rooms 31 to 37.
  • Subsequently, a label corresponding to the room number can be given to every cell in the corresponding room. Through this, room-unit cleaning can be performed.
  • The cleaning map 20 shown in FIG. 8 may be displayed on the robot cleaner or on an external terminal. Moreover, the location of the robot cleaner (not shown in the drawing) may be indicated on the cleaning map 20. Therefore, a user can see the location of the robot cleaner within the whole cleaning area.
  • For instance, a user may order a room #1 31 to be cleaned by touching an arbitrary point within the room #1 31. The cell location corresponding to the touched point, and the room to which that cell belongs, can then be specified. Therefore, the robot cleaner moves to the room #1 and can then perform the cleaning of the room #1.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Robotics (AREA)
  • Human Computer Interaction (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)
  • Manipulator (AREA)
EP15154611.6A 2014-02-12 2015-02-11 Robot cleaner and control method thereof Active EP2908204B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140016235A KR102158695B1 (ko) 2014-02-12 2014-02-12 Robot cleaner and method for controlling the same

Publications (2)

Publication Number Publication Date
EP2908204A1 2015-08-19
EP2908204B1 2018-12-26

Family

ID=52462863

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15154611.6A 2014-02-12 2015-02-11 Robot cleaner and control method thereof Active EP2908204B1 (fr)

Country Status (4)

Country Link
US (1) US20150223659A1 (fr)
EP (1) EP2908204B1 (fr)
KR (1) KR102158695B1 (fr)
CN (1) CN104825101B (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105534408A (zh) * 2016-01-27 2016-05-04 昆山硅步机器人技术有限公司 Fully automatic robot vacuum cleaner
CN110174888A (zh) * 2018-08-09 2019-08-27 深圳瑞科时尚电子有限公司 Self-moving robot control method, apparatus, device, and storage medium
EP3907575A4 (fr) * 2019-01-03 2022-05-04 Ecovacs Robotics Co., Ltd. Dynamic region division and region channel identification method, and cleaning robot
EP4163819A4 (fr) * 2020-07-13 2023-12-06 Dreame Innovation Technology (Suzhou) Co., Ltd. Control method and apparatus for a self-propelled device, storage medium, and self-propelled device

Families Citing this family (77)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7706917B1 (en) * 2004-07-07 2010-04-27 Irobot Corporation Celestial navigation system for an autonomous robot
US10198008B2 (en) * 2013-11-15 2019-02-05 Hitachi, Ltd. Mobile robot system
KR20160048492A (ko) * 2014-10-24 2016-05-04 엘지전자 주식회사 Robot cleaner and control method thereof
DE102015109775B3 (de) 2015-06-18 2016-09-22 RobArt GmbH Optical triangulation sensor for distance measurement
EP3344104B1 (fr) * 2015-09-03 2020-12-30 Aktiebolaget Electrolux System of robotic cleaning devices
DE102015114883A1 (de) 2015-09-04 2017-03-09 RobArt GmbH Identification and localization of a base station of an autonomous mobile robot
CN105182980A (zh) * 2015-09-23 2015-12-23 上海物景智能科技有限公司 Control system and control method for an automatic cleaning device
CN105411491A (zh) * 2015-11-02 2016-03-23 中山大学 Home intelligent cleaning system and method based on environmental monitoring
DE102015119501A1 (de) * 2015-11-11 2017-05-11 RobArt GmbH Subdivision of maps for robot navigation
DE102015119865B4 (de) 2015-11-17 2023-12-21 RobArt GmbH Robot-assisted processing of a surface by means of a robot
DE102015121666B3 (de) 2015-12-11 2017-05-24 RobArt GmbH Remote control of a mobile, autonomous robot
DE102016102644A1 (de) 2016-02-15 2017-08-17 RobArt GmbH Method for controlling an autonomous mobile robot
WO2017140089A1 (fr) * 2016-02-16 2017-08-24 江苏美的清洁电器股份有限公司 Robotic cleaning system and cleaning robot
CN105511477A (zh) * 2016-02-16 2016-04-20 江苏美的清洁电器股份有限公司 Cleaning robot system and cleaning robot
WO2018024897A1 (fr) * 2016-08-05 2018-02-08 RobArt GmbH Method for controlling an autonomous mobile robot
JP7055127B2 (ja) * 2016-08-25 2022-04-15 エルジー エレクトロニクス インコーポレイティド Mobile robot and control method thereof
KR20180024467A (ko) * 2016-08-30 2018-03-08 삼성전자주식회사 Robot cleaner, terminal device and control method thereof
WO2018066052A1 (fr) * 2016-10-04 2018-04-12 三菱電機株式会社 Autonomous mobile body
KR20180082264A (ko) * 2017-01-10 2018-07-18 엘지전자 주식회사 Mobile robot and control method thereof
EP3590014B1 (fr) 2017-03-02 2021-11-17 Robart GmbH Method for controlling an autonomous mobile robot
CN108733037B (zh) * 2017-04-17 2021-03-16 哈工大机器人集团股份有限公司 Avoidance-capable cleaning method for a sweeping robot
CN108803590A (zh) * 2017-04-28 2018-11-13 深圳乐动机器人有限公司 Robot cleaning mode control system
JP7007108B2 (ja) * 2017-05-23 2022-01-24 東芝ライフスタイル株式会社 Electric vacuum cleaner
CN109079772A (zh) * 2017-06-14 2018-12-25 深圳乐动机器人有限公司 Robot and robot system
CN107252287A (zh) * 2017-08-02 2017-10-17 深圳星鸿云科技有限公司 Mopping machine cleaning method and system
CN107329476A (zh) * 2017-08-02 2017-11-07 珊口(上海)智能科技有限公司 Room topological map construction method, system and apparatus, and sweeping robot
CN107479551B (zh) * 2017-08-22 2020-11-10 北京小米移动软件有限公司 Method and device for controlling movement
CN111328386A (zh) * 2017-09-12 2020-06-23 罗博艾特有限责任公司 Exploration of an unknown environment by an autonomous mobile robot
CN107378953A (zh) * 2017-09-20 2017-11-24 深圳市杉川机器人有限公司 Cleaning control method and apparatus, cleaning robot and readable storage medium
US10612929B2 (en) * 2017-10-17 2020-04-07 AI Incorporated Discovering and plotting the boundary of an enclosure
CN107913039B (zh) * 2017-11-17 2020-11-13 北京奇虎科技有限公司 Block selection method and apparatus for a cleaning robot, and robot
KR102489806B1 (ko) * 2018-01-03 2023-01-19 삼성전자주식회사 Mobile cleaning device, cooperative cleaning system and control method thereof
US11614746B2 (en) * 2018-01-05 2023-03-28 Irobot Corporation Mobile cleaning robot teaming and persistent mapping
CN108319268A (zh) * 2018-02-08 2018-07-24 衢州职业技术学院 Vision-based method for robot navigation through a doorway
US11457788B2 (en) * 2018-05-11 2022-10-04 Samsung Electronics Co., Ltd. Method and apparatus for executing cleaning operation
CN110477810B (zh) * 2018-05-14 2021-06-29 杭州萤石软件有限公司 Control method and apparatus for a sweeping robot, and sweeping robot
JP7281707B2 (ja) * 2018-07-06 2023-05-26 パナソニックIpマネジメント株式会社 Mobile robot and control method
CN110806746A (zh) * 2018-07-18 2020-02-18 杭州萤石软件有限公司 Functional-area division method applied to a mobile robot, and mobile robot
CN108968825B (zh) * 2018-08-17 2020-12-11 深圳领贝智能科技有限公司 Sweeping robot and robot sweeping method
WO2020044394A1 (fr) * 2018-08-27 2020-03-05 三菱電機株式会社 Control system, air conditioner and control method
US10835096B2 (en) 2018-08-30 2020-11-17 Irobot Corporation Map based training and interface for mobile robots
US11278176B2 (en) * 2018-09-06 2022-03-22 Irobot Corporation Scheduling system for autonomous robots
CN210704858U (zh) * 2018-09-28 2020-06-09 成都家有为力机器人技术有限公司 Cleaning robot with a binocular camera
CN111166240A (zh) * 2018-11-09 2020-05-19 北京奇虎科技有限公司 Method, apparatus, device and storage medium for setting a no-cleaning zone
CN110897567A (zh) * 2018-12-13 2020-03-24 成都家有为力机器人技术有限公司 Cleaning method based on target object recognition, and cleaning robot
CN111459153B (zh) * 2019-01-03 2022-09-06 科沃斯机器人股份有限公司 Dynamic region division and region passage identification method, and cleaning robot
KR102255273B1 (ko) 2019-01-04 2021-05-24 삼성전자주식회사 Apparatus and method for generating map data of a cleaning space
CN114947652A (zh) * 2019-03-21 2022-08-30 深圳阿科伯特机器人有限公司 Navigation and cleaning-area division method and system, and mobile and cleaning robot
CN110081885B (zh) * 2019-04-02 2021-10-26 北京云迹科技有限公司 Operating-area division method and apparatus
CN114942638A (zh) * 2019-04-02 2022-08-26 北京石头创新科技有限公司 Method and apparatus for constructing a robot working-area map
CN111862133B (zh) * 2019-04-26 2023-07-21 速感科技(北京)有限公司 Region segmentation method and apparatus for an enclosed space, and movable device
CN110251000A (zh) * 2019-05-20 2019-09-20 广东宝乐机器人股份有限公司 Method for improving the cleaning efficiency of a sweeping robot
CN109984687A (zh) * 2019-06-03 2019-07-09 常州工程职业技术学院 Automatic cleaning control method for a sweeping robot
GB2584839B (en) * 2019-06-12 2022-12-21 Dyson Technology Ltd Mapping of an environment
CN110269550B (zh) * 2019-06-13 2021-06-08 深圳市银星智能科技股份有限公司 Door position recognition method and mobile robot
CN112075879A (zh) * 2019-06-14 2020-12-15 江苏美的清洁电器股份有限公司 Information processing method, apparatus and storage medium
KR102314537B1 (ko) * 2019-06-18 2021-10-18 엘지전자 주식회사 Mobile robot and control method thereof
CN112493924B (zh) * 2019-08-26 2023-03-10 苏州宝时得电动工具有限公司 Cleaning robot and control method thereof
CN111110122B (zh) * 2019-12-03 2020-10-23 尚科宁家(中国)科技有限公司 Sweeping robot
CN111419118A (zh) * 2020-02-20 2020-07-17 珠海格力电器股份有限公司 Method, apparatus, terminal and computer-readable medium for dividing regions
CN110974091B (zh) * 2020-02-27 2020-07-17 深圳飞科机器人有限公司 Cleaning robot, control method thereof and storage medium
US11561102B1 (en) 2020-04-17 2023-01-24 AI Incorporated Discovering and plotting the boundary of an enclosure
CN111920353A (zh) * 2020-07-17 2020-11-13 江苏美的清洁电器股份有限公司 Cleaning control method, cleaning-area division method, apparatus, device and storage medium
KR20220012001A (ko) * 2020-07-22 2022-02-03 엘지전자 주식회사 Robot cleaner and control method thereof
CN111627063B (zh) * 2020-07-28 2020-10-16 北京云迹科技有限公司 Method and apparatus for identifying the position of a room doorway on an electronic map
CN112155476B (zh) * 2020-09-16 2021-07-20 珠海格力电器股份有限公司 Robot control method and apparatus, electronic device and storage medium
CN114246512A (zh) * 2020-09-25 2022-03-29 苏州三六零机器人科技有限公司 Sweeper cleaning method and apparatus, sweeper and computer storage medium
CN112369982B (zh) * 2020-10-14 2022-03-15 深圳拓邦股份有限公司 Threshold recognition method and apparatus, sweeping robot and storage medium
CN114557635B (zh) * 2020-11-27 2023-11-03 尚科宁家(中国)科技有限公司 Cleaning robot and partition recognition method thereof
CN112462780B (zh) * 2020-11-30 2024-05-21 深圳市杉川致行科技有限公司 Sweeping control method and apparatus, sweeping robot and computer-readable storage medium
CN112674655B (zh) * 2021-01-14 2022-06-10 深圳市云鼠科技开发有限公司 Wall-following-based recharging method and apparatus, computer device and memory
CN113156447A (zh) * 2021-03-10 2021-07-23 深圳市杉川机器人有限公司 Method for determining a door position, sweeper and computer-readable storage medium
CN113397444B (zh) * 2021-07-02 2023-01-24 珠海格力电器股份有限公司 Target obstacle recognition method, cleaning machine control method and processor
CN114027746B (zh) * 2021-10-29 2022-11-18 珠海格力电器股份有限公司 Control method and apparatus, storage medium, electronic device and cleaning robot
CN114098534B (zh) * 2021-11-30 2023-02-17 深圳Tcl新技术有限公司 Cleaning-area recognition method and apparatus for a sweeper, storage medium and electronic device
CN114365974B (zh) * 2022-01-26 2023-01-10 微思机器人(深圳)有限公司 Indoor cleaning partition method and apparatus, and sweeping robot
CN114947655A (zh) * 2022-05-17 2022-08-30 安克创新科技股份有限公司 Robot control method and apparatus, robot and computer-readable storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100661339B1 (ko) * 2005-02-24 2006-12-27 삼성광주전자 주식회사 Robot cleaner
US7578020B2 (en) * 2005-06-28 2009-08-25 S.C. Johnson & Son, Inc. Surface treating device with top load cartridge-based cleaning system
KR101506738B1 (ко) * 2008-07-28 2015-03-27 엘지전자 주식회사 Robot cleaner and driving method thereof
CN101941012B (zh) * 2009-07-03 2012-04-25 泰怡凯电器(苏州)有限公司 Cleaning robot, dirt recognition device thereof and cleaning method of the robot
KR20110054472A (ko) * 2009-11-17 2011-05-25 엘지전자 주식회사 Robot cleaner and control method thereof
KR20110054480A (ko) * 2009-11-17 2011-05-25 엘지전자 주식회사 Robot cleaner and control method thereof
KR20120021064A (ko) * 2010-08-31 2012-03-08 엘지전자 주식회사 Mobile robot and control method thereof
KR20130089554A (ko) * 2012-02-02 2013-08-12 엘지전자 주식회사 Robot cleaner and control method thereof
CN103576681B (zh) * 2012-07-26 2017-04-12 苏州宝时得电动工具有限公司 Automatic walking device and control method thereof
CN103271699B (zh) * 2013-05-29 2016-05-18 东北师范大学 Intelligent household cleaning robot

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2078996A2 (fr) * 2008-01-11 2009-07-15 Samsung Electronics Co., Ltd. Method and apparatus for planning the route of a mobile robot
EP2407847A2 (fr) * 2010-07-01 2012-01-18 Vorwerk & Co. Interholding GmbH Self-propelled device and method for orienting such a device
EP2450762A2 (fr) * 2010-11-03 2012-05-09 Lg Electronics Inc. Robot cleaner and control method thereof
EP2592518A2 (fr) * 2011-11-14 2013-05-15 Samsung Electronics Co., Ltd Robot cleaner and control method thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANGUELOV D ET AL: "Detecting and modeling doors with mobile robots", ROBOTICS AND AUTOMATION, 2004. PROCEEDINGS. ICRA '04. 2004 IEEE INTERNATIONAL CONFERENCE ON, NEW ORLEANS, LA, USA, APRIL 26-MAY 1, 2004, PISCATAWAY, NJ, USA, IEEE, US, 26 April 2004 (2004-04-26), pages 3777, XP010769126, ISBN: 978-0-7803-8232-9, DOI: 10.1109/ROBOT.2004.1308857 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105534408A (zh) * 2016-01-27 2016-05-04 昆山硅步机器人技术有限公司 Fully automatic robot vacuum cleaner
CN110174888A (zh) * 2018-08-09 2019-08-27 深圳瑞科时尚电子有限公司 Self-moving robot control method, apparatus, device and storage medium
EP3907575A4 (fr) * 2019-01-03 2022-05-04 Ecovacs Robotics Co., Ltd. Dynamic region division and region passage identification method, and cleaning robot
US11618168B2 (en) 2019-01-03 2023-04-04 Ecovacs Robotics Co., Ltd. Dynamic region division and region passage identification methods and cleaning robot
EP4163819A4 (fr) * 2020-07-13 2023-12-06 Dreame Innovation Technology (Suzhou) Co., Ltd. Self-moving device control method and apparatus, storage medium and self-moving device

Also Published As

Publication number Publication date
CN104825101A (zh) 2015-08-12
KR102158695B1 (ko) 2020-10-23
KR20150095121A (ko) 2015-08-20
CN104825101B (zh) 2018-04-17
US20150223659A1 (en) 2015-08-13
EP2908204B1 (fr) 2018-12-26

Similar Documents

Publication Publication Date Title
EP2908204B1 (fr) Robot cleaner and control method thereof
EP3199083B1 (fr) Cleaning robot and control method thereof
US9977954B2 (en) Robot cleaner and method for controlling a robot cleaner
US9820625B2 (en) Robot cleaner
US20190332121A1 (en) Moving robot and control method thereof
CN111328386A (zh) Exploration of an unknown environment by an autonomous mobile robot
EP2413772B1 (fr) Mobile robot with a single camera and method for recognizing its three-dimensional environment
JP6054136B2 (ja) Device control apparatus and self-propelled electronic device
KR101893152B1 (ko) Robot cleaner system and control method thereof
KR20170077756A (ko) Cleaning robot and control method of cleaning robot
CN109381122A (зh) Method for operating an automatically advancing cleaning appliance
KR101976462B1 (ко) Robot cleaner and control method thereof
CN109613913A (zh) Autonomous mobile robot working method, autonomous mobile robot and system
AU2014278987A1 (en) Cleaning robot and method for controlling the same
CN111356393B (zh) Mobile device for cleaning and control method thereof
CN106796418A (zh) Device for simplifying surface cleaning and method for recording cleaning work performed
CN110636789A (zh) Electric vacuum cleaner
CN111265151B (zh) Robot control method, apparatus and storage medium
US20200033878A1 (en) Vacuum cleaner
KR20150009048A (ко) Cleaning robot and control method thereof
CN113693501A (zh) Cleaning device, and cleaning path and cleaning map generation method and generation system
KR102314537B1 (ко) Mobile robot and control method thereof
CN106137058B (zh) Cleaning robot system and virtual wall detection method
KR20210007360A (ко) Mobile robot and control method thereof
KR102227427B1 (ко) Cleaning robot, home monitoring apparatus and control method thereof

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

17P Request for examination filed

Effective date: 20160219

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G05D 1/02 20060101AFI20180629BHEP

INTG Intention to grant announced

Effective date: 20180726

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: LG ELECTRONICS INC.

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1082270

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190115

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602015022102

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190326

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190326

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20181226

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1082270

Country of ref document: AT

Kind code of ref document: T

Effective date: 20181226

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190426

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190426

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602015022102

Country of ref document: DE

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190211

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20190228

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

26N No opposition filed

Effective date: 20190927

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190228

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190211

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190228

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190226

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20200108

Year of fee payment: 6

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190211

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20150211

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20210211

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210211

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20230105

Year of fee payment: 9

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602015022102

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G05D0001020000

Ipc: G05D0001430000