EP3995065A1 - Appareil de nettoyage autonome - Google Patents

Appareil de nettoyage autonome

Info

Publication number
EP3995065A1
EP3995065A1 (application EP21203056.3A)
Authority
EP
European Patent Office
Prior art keywords
cleaning device
obstacle
area
cleaning
message
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP21203056.3A
Other languages
German (de)
English (en)
Other versions
EP3995065B1 (fr)
Inventor
Harald Windorfer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vorwerk & Co. Interholding GmbH
Original Assignee
Vorwerk & Co. Interholding GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vorwerk & Co. Interholding GmbH
Publication of EP3995065A1
Application granted
Publication of EP3995065B1
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2805 Parameters or conditions being sensed
    • A47L9/281 Parameters or conditions being sensed the amount or condition of incoming dirt or dust
    • A47L9/2815 Parameters or conditions being sensed the amount or condition of incoming dirt or dust using optical detectors
    • A47L9/2894 Details related to signal transmission in suction cleaners
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4008 Arrangements of switches, indicators or the like
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection
    • A47L2201/06 Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Definitions

  • The invention relates to an automatically moving cleaning device comprising a drive device for moving the cleaning device within an environment, a communication interface for outputting a message to a user of the cleaning device, an obstacle detection device for detecting obstacles within the environment, and a computing device which is set up to create a map of the environment based on the obstacles detected by the obstacle detection device and to transmit control commands to the communication interface, the computing device further being set up to determine, on the one hand, a first partial area of the environment in which the position of a movable obstacle changes over time and, on the other hand, a second partial area of the environment which could only be cleaned incompletely.
  • Cleaning devices of the aforementioned type are well known in the prior art. They are usually also referred to as cleaning robots and can be embodied, for example, as mobile vacuum cleaners, mopping devices or combined vacuuming-mopping devices.
  • Such automatically moving cleaning devices usually have a computing device which is designed to navigate the cleaning device within an environment.
  • The computing device is assigned an obstacle detection device which, for example, measures distances to obstacles in the environment and transmits them to the computing device so that a map of the environment can be created.
  • The environment map created in this way can then be used by the computing device to localize and navigate the cleaning device within the environment.
  • Route planning can also be carried out in order to guide the cleaning device through the environment along a planned, predefined route, for example to carry out defined, consecutive cleaning activities as part of a cleaning plan.
  • During this, the computing device controls the movement of the cleaning device through the environment and preferably logs those partial areas of the environment that have already been cleaned by the cleaning device. It is also detected and documented which partial areas of the environment have not yet been cleaned or have only been cleaned incompletely. This documentation takes place, for example, directly in the environment map, in which cleaned and non-cleaned partial areas are marked. In addition, the environment map contains entries about the positions of detected obstacles. Obstacles are, on the one hand, room boundaries and, on the other hand, objects such as pieces of furniture, decorative objects or the like.
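By way of illustration only, the following Python sketch shows one possible way such coverage logging could be represented in a grid-based environment map. The cell states, cell size and class names are assumptions and are not taken from the patent.

```python
# Hypothetical sketch: a grid-based environment map in which cleaned cells,
# uncleaned free cells and detected obstacles are logged.
from enum import Enum

class Cell(Enum):
    UNKNOWN = 0
    FREE = 1       # traversable, not yet cleaned in this run
    CLEANED = 2    # traversed and cleaned in this run
    OBSTACLE = 3   # room boundary or detected object

class EnvironmentMap:
    def __init__(self, width_cells: int, height_cells: int, cell_size_m: float = 0.05):
        self.cell_size_m = cell_size_m
        self.grid = [[Cell.UNKNOWN] * width_cells for _ in range(height_cells)]

    def _cell(self, x_m: float, y_m: float):
        # Convert metric coordinates into a (row, column) grid index.
        return int(y_m / self.cell_size_m), int(x_m / self.cell_size_m)

    def mark_free(self, x_m: float, y_m: float):
        r, c = self._cell(x_m, y_m)
        if self.grid[r][c] == Cell.UNKNOWN:
            self.grid[r][c] = Cell.FREE

    def mark_obstacle(self, x_m: float, y_m: float):
        r, c = self._cell(x_m, y_m)
        self.grid[r][c] = Cell.OBSTACLE

    def mark_cleaned(self, x_m: float, y_m: float):
        r, c = self._cell(x_m, y_m)
        if self.grid[r][c] != Cell.OBSTACLE:
            self.grid[r][c] = Cell.CLEANED

    def uncleaned_cells(self):
        """Known free cells that were not covered during the current run."""
        return [(r, c) for r, row in enumerate(self.grid)
                for c, cell in enumerate(row) if cell == Cell.FREE]
```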
  • The computing device identifies such movable obstacles, for example, by comparing maps of the environment created in succession over time and/or on the basis of the speed at which an obstacle moves.
  • The disadvantage here is that partial areas of the environment containing movable obstacles are not regularly accessible, so that an environment with several such partial areas is often cleaned incompletely or at least unevenly, and partial areas remain in which dust and dirt gradually accumulate and have to be removed manually by the user.
  • It is therefore proposed that the computing device of the cleaning device be set up to compare the position of the determined first partial area of the environment with the position of the determined second partial area of the environment and, if the positions match, to output a message relating to the movable obstacle to a user of the cleaning device via the communication interface, the computing device being set up to determine a change in the position of the movable obstacle that recurs over time by comparing a plurality of maps of the environment created in chronological succession, and the message containing a request to remove the obstacle manually from the defined partial area of the environment, namely to relocate it to a specific location defined in the message which lies outside the partial area of the environment in which the position of the movable obstacle changes repeatedly over time.
  • The computing device thus determines partial areas of the environment which, on the one hand, have been cleaned incompletely or unevenly and which, on the other hand, contain movable obstacles. If both of the aforementioned conditions apply to the same partial area of the environment, it is concluded that the obstacle present there prevents an optimal, in particular uniform, cleaning of that partial area. Since the computing device also knows that the detected obstacle is a movable one, a message is output to the user of the cleaning device so that he can take measures to allow an optimal cleaning of the partial area.
  • The result is a cleaning robot which, in order to achieve the best possible cleaning result, outputs a message to a user that a movable obstacle, such as a chair, is present in a specific partial area of the environment and prevents an optimal cleaning of that partial area.
  • For this purpose, the cleaning device first determines locations where movable obstacles are present. Furthermore, it determines partial areas of the environment which could not be cleaned at all or only with little success. The determined locations of soiling and of the movable obstacles are compared with one another, and a message is output to the user if they match. The system therefore does not rely on the cleaning device happening, during a subsequent cleaning run, to reach a position that was previously blocked by a movable obstacle.
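A minimal sketch of this comparison, assuming the map has already been divided into named sub-areas and that the obstacle type is known, could look as follows; the data structures and names are hypothetical, not taken from the patent.

```python
# Intersect sub-areas containing movable obstacles with sub-areas flagged as
# poorly cleaned, and produce one user message per match.
def find_blocked_dirty_areas(moving_obstacle_areas: dict, poorly_cleaned_areas: set) -> list:
    """moving_obstacle_areas maps an area id to the obstacle found there;
    poorly_cleaned_areas is the set of area ids flagged by the dirt sensor."""
    messages = []
    for area_id, obstacle in moving_obstacle_areas.items():
        if area_id in poorly_cleaned_areas:
            messages.append(
                f"Please move the {obstacle} out of area '{area_id}' "
                f"so it can be cleaned completely."
            )
    return messages

# Example: the area around the dining table both contains a movable chair
# and was cleaned incompletely, so exactly one message is generated.
print(find_blocked_dirty_areas({"dining_table": "chair"}, {"dining_table", "hallway"}))
```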
  • According to the invention, the computing device is set up to determine a change in the position of the movable obstacle by comparing a plurality of maps of the environment created in chronological succession.
  • For example, the cleaning device can compare a map of the environment created during a current cleaning run with the most recently created map in order to identify one or more movable obstacles on the basis of differences between the two maps.
  • Alternatively, a map of the environment can be created and compared with one or more previous maps, for example after each working routine, in particular after processing a deployment plan containing predefined time and location specifications.
  • In this way, changes in the position of movable obstacles are detected. For example, a chair that was used at an earlier point in time may no longer be in exactly the same position as before its use.
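The following sketch illustrates the idea of such a map comparison under the assumption that each stored map reduces to a dictionary of obstacle identifiers and positions; the 10 cm tolerance is an arbitrary illustrative value, not one specified by the patent.

```python
# Flag obstacles whose position differs between successive maps by more than
# a tolerance as movable obstacles.
from math import hypot

def movable_obstacles(map_history: list, tolerance_m: float = 0.10) -> set:
    movable = set()
    for older, newer in zip(map_history, map_history[1:]):
        for obstacle_id, (x_new, y_new) in newer.items():
            if obstacle_id in older:
                x_old, y_old = older[obstacle_id]
                if hypot(x_new - x_old, y_new - y_old) > tolerance_m:
                    movable.add(obstacle_id)
    return movable

# A chair shifted by roughly 30 cm between two runs is flagged; a cupboard is not.
history = [{"chair": (1.0, 2.0), "cupboard": (4.0, 0.5)},
           {"chair": (1.3, 2.1), "cupboard": (4.0, 0.5)}]
print(movable_obstacles(history))  # -> {'chair'}
```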
  • According to the invention, the message to the user, which is output in the event of a match between a position of a movable obstacle and a position of incomplete cleaning, contains a request to remove the obstacle manually from the defined partial area of the environment, namely to relocate the movable obstacle to a specific location defined in the message.
  • The user of the cleaning device thus preferably receives a suggestion to move the movable obstacle directly to a different position in the environment after its next use. For example, he can be asked not to put a chair back in the same position as before, for instance no longer to push it back up to a table, but to place it away from the table.
  • The cleaning device can then remove the dirt at the relevant location once the user has cleared the obstacle from it.
  • The cleaning device preferably has a dirt sensor which is designed to determine the degree of soiling of a partial area of the environment.
  • In particular, the dirt sensor is designed to identify those partial areas of the environment which could not be cleaned at all, or only incompletely, unevenly and/or with an increased expenditure of time compared to a norm. In this way, partial areas of the environment that are difficult to clean can be identified and stored in the environment map.
  • The dirt sensor can have, for example, a camera with an associated image processing device, or a particle sensor.
  • In the first case, the cleaning device has a camera that takes images of the environment, which are compared with reference images of the environment by means of the associated image processing device.
  • Alternatively or additionally, the cleaning device can have a particle sensor which measures, in particular quantifies, the dirt and/or dust particles removed from the surface to be cleaned by the cleaning device, for example by a suction fan or a wiping cloth.
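As a rough illustration only, both sensing variants could classify the degree of soiling along the following lines; the thresholds, array layout and class labels are assumptions, not values from the patent.

```python
# Classify soiling either from a camera frame compared against a stored clean
# reference image, or from a particle count measured in the suction channel.
import numpy as np

def soiling_from_image(frame: np.ndarray, clean_reference: np.ndarray) -> str:
    """Classify soiling by the mean absolute difference to a clean reference image."""
    diff = np.mean(np.abs(frame.astype(float) - clean_reference.astype(float)))
    if diff < 5.0:
        return "below average"
    if diff < 20.0:
        return "average"
    return "above average"

def soiling_from_particles(particles_per_second: float) -> str:
    """Classify soiling from a particle sensor reading."""
    if particles_per_second < 50:
        return "low"
    if particles_per_second < 500:
        return "medium"
    return "high"
```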
  • The message to the user particularly preferably contains an image or a graphic representation of the movable obstacle.
  • If the cleaning device has a camera, for example, an image recorded by the camera can be displayed directly to the user, for instance on a display of the cleaning device itself or on an external terminal device.
  • Alternatively, the message can contain a graphic representation of the movable obstacle, for example an icon representing a specific type of movable object, such as a chair, an armchair, a side table or the like.
  • It is further proposed that the movable obstacle is a piece of furniture, the cleaning device having a memory with reference parameters of defined pieces of furniture, and the computing device being set up to compare parameters of a detected movable obstacle with the stored reference parameters.
  • In this way, a detected obstacle can be identified with regard to its type, and a message can be output to the user which directly reflects the type of obstacle to be moved.
  • For example, the message can directly contain a text component such as "chair" or a graphically represented chair symbol.
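A simple way such a comparison against stored reference parameters could work is sketched below; the chosen parameters (footprint and number of legs) and the reference values are purely illustrative assumptions.

```python
# Match the measured footprint of a detected movable obstacle against stored
# reference parameters of known furniture and use the best match in the message.
REFERENCE_FURNITURE = {
    "chair":      {"width_m": 0.45, "depth_m": 0.45, "n_legs": 4},
    "side table": {"width_m": 0.60, "depth_m": 0.60, "n_legs": 4},
    "armchair":   {"width_m": 0.85, "depth_m": 0.90, "n_legs": 4},
}

def identify_furniture(width_m: float, depth_m: float, n_legs: int) -> str:
    def score(ref):
        # Smaller score means a closer match to the reference parameters.
        return (abs(ref["width_m"] - width_m)
                + abs(ref["depth_m"] - depth_m)
                + 0.1 * abs(ref["n_legs"] - n_legs))
    return min(REFERENCE_FURNITURE, key=lambda name: score(REFERENCE_FURNITURE[name]))

obstacle_type = identify_furniture(0.44, 0.47, 4)
print(f"Please do not put the {obstacle_type} back at the table.")  # -> chair
```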
  • It is further proposed that the communication interface be designed to output the message directly to a user as an optical or acoustic signal, or to transmit the message via wireless data communication to an external terminal device of the user that is in communication connection with the cleaning device.
  • For the first alternative, the communication interface of the cleaning device has a display or a loudspeaker, for example.
  • An image or a symbol representing the movable obstacle can be shown to the user on the display of the cleaning device, or alternatively text describing the movable obstacle.
  • The user can also be given an acoustic signal, for example in the form of a voice output containing a request such as "Please do not put the chair back at the table".
  • Alternatively, the communication interface of the cleaning device can transmit the message to an external terminal device of the user, for example a mobile phone, a tablet computer, a laptop or the like.
  • The data are preferably transmitted wirelessly via a home network or the Internet.
  • A display of the mobile terminal device can then be used to show text, an image and/or a symbol.
  • The mobile terminal device can also produce a corresponding voice output containing the message for the user.
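Purely as an illustration of the wireless variant, a message could be pushed to a companion app over the home network roughly as follows; the endpoint URL, payload fields and the use of HTTP are assumptions and not part of the patent, which only requires some wireless data connection between the cleaning device and the terminal device.

```python
# Push message 4 to the user's terminal device; fall back to on-device output on failure.
import requests

def push_message(text: str,
                 endpoint: str = "http://192.168.0.42:8080/robot/notifications") -> bool:
    try:
        response = requests.post(endpoint,
                                 json={"type": "obstacle_hint", "text": text},
                                 timeout=5)
        return response.ok
    except requests.RequestException:
        return False  # network unreachable, use the robot's own display or voice output

if not push_message("Please do not put the chair back at the table."):
    print("Please do not put the chair back at the table.")  # stand-in for on-device output
```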
  • The obstacle detection device of the cleaning device is preferably also set up to detect a removal of the movable obstacle from the partial area of the environment and then to report this to the computing device, the computing device being set up to subsequently control a cleaning of that partial area.
  • For this purpose, the obstacle detection device can again detect obstacles in the environment and transmit corresponding information to the computing device, which then determines, by comparison with older maps of the environment, that a movable obstacle is no longer present in the relevant partial area.
  • The computing device can then control the cleaning device in such a way that it moves to the previous position of the movable obstacle and carries out a cleaning there.
  • The user can then receive a message via the communication interface of the cleaning device stating that the corresponding position in the environment has been cleaned and that he can now put the movable obstacle back in the desired location.
  • Alternatively, the cleaning device or its obstacle detection device does not itself have to detect that the movable obstacle has been removed. Instead, the user makes a corresponding entry via the communication interface of the cleaning device, preferably again with the aid of an external terminal device.
  • The computing device of the cleaning device then processes this information from the user in such a way that the partial area cleared of the movable obstacle is cleaned. Once this has been done, a corresponding message can again be output to the user.
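The two trigger paths described above, autonomous detection of the removal or a confirmation entered by the user, could be combined in control logic along the following lines; all function and parameter names are illustrative placeholders.

```python
# Start the follow-up cleaning of a flagged area once it is known to be free,
# either because the obstacle detector no longer sees the obstacle or because
# the user has confirmed its removal via the communication interface.
def handle_area(area_id: str,
                obstacle_still_present: bool,
                user_confirmed_removal: bool,
                start_cleaning, notify_user):
    if user_confirmed_removal or not obstacle_still_present:
        start_cleaning(area_id)
        notify_user(f"Area '{area_id}' has been cleaned. "
                    f"You can put the furniture back now.")
    else:
        notify_user(f"Area '{area_id}' is still blocked by a movable obstacle.")

# Example wiring with simple stand-ins for the robot's actual control functions:
handle_area("under_dining_chair",
            obstacle_still_present=False,
            user_confirmed_removal=False,
            start_cleaning=lambda area: print("cleaning", area),
            notify_user=print)
```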
  • Figure 1 shows an example of an automatically moving cleaning device 1.
  • The cleaning device 1 is here a robotic vacuum cleaner with a cleaning element 16 in the form of a cleaning brush which rotates about a substantially horizontal axis.
  • The cleaning device 1 has wheels 15 and a drive device 2 for the wheels 15, which comprises an electric motor (not shown).
  • Furthermore, the cleaning device 1 has a communication interface 3 for wireless communication with an external terminal device 14 (see Figure 3).
  • Via the communication interface 3, the cleaning device 1 can, for example, be integrated into a home network or also be connected to the Internet.
  • An obstacle detection device 5 of the cleaning device 1 is designed to detect obstacles 6, 7 in the vicinity of the cleaning device 1.
  • The obstacles 6, 7 can be stationary obstacles 6, for example large pieces of furniture such as cupboards, beds and the like, or movable obstacles 7, such as small pieces of furniture, including chairs, side tables and the like.
  • The obstacle detection device 5 is here, for example, a distance measuring device which measures distances from the cleaning device 1 to the obstacles 6, 7.
  • The obstacle detection device 5 can be an optical measuring device, for example a triangulation measuring device that measures in a 360-degree angular range around the obstacle detection device 5.
  • The cleaning device 1 has a computing device 8 which uses the distances determined by the obstacle detection device 5 to create an environment map 9 (see Figure 2) that represents a floor plan of the environment.
  • The floor plan corresponds here to a floor of an apartment or a house with several rooms, which in turn are divided into partial areas 10, 11.
  • The environment map 9 also shows the detected obstacles 6, 7 and a current location of the cleaning device 1.
  • The computing device 8 can use the environment map 9 to determine the current position of the cleaning device 1 and continuously update it in the map.
  • The cleaning device 1 can also use a dirt sensor 12 to detect a degree of soiling in the environment.
  • The dirt sensor 12 can, for example, have a camera in combination with image processing software which uses an image comparison to classify a current soiling condition as average, below average or above average. Other classifications can be, for example, low, medium, high or the like.
  • The computing device 8 can also enter the determined degrees of soiling in the environment map 9.
  • A memory 13 of the cleaning device 1 can hold images of reference degrees of soiling with which an image currently recorded by the dirt sensor 12 can be compared in order to determine a degree of soiling.
  • The environment map 9 can also be stored in the memory 13.
  • Figure 2 shows an exemplary environment map 9 created by the cleaning device 1.
  • This environment map 9 can be created, for example, as part of a so-called SLAM method (Simultaneous Localization and Mapping).
  • In this method, the obstacle detection device 5 of the cleaning device 1 continuously measures distances to obstacles 6, 7 in the environment and uses them to build the environment map 9, while the cleaning device 1 simultaneously localizes itself by continuously matching the currently detected obstacle data against the data already stored in the environment map 9.
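For orientation only, the following strongly simplified sketch shows the SLAM principle named above: a few pose hypotheses are scored by how well the current distance scan matches obstacles already stored in the map, the best pose is kept, and the scan is then merged into the map. Grid resolution, pose candidates and data layout are assumptions, not the patent's method.

```python
# Simplified scan-matching localization followed by a map update.
from math import cos, sin

def scan_to_points(pose, scan):
    """pose = (x, y, heading); scan = list of (angle_rad, distance_m) measurements."""
    x, y, theta = pose
    return [(x + d * cos(theta + a), y + d * sin(theta + a)) for a, d in scan]

def localize_and_update(obstacle_cells: set, last_pose, scan, candidates, cell_size=0.05):
    def to_cell(point):
        return (round(point[0] / cell_size), round(point[1] / cell_size))

    def score(pose):
        # Count scan endpoints that fall on cells already known to be obstacles.
        return sum(to_cell(p) in obstacle_cells for p in scan_to_points(pose, scan))

    best_pose = max(candidates or [last_pose], key=score)
    # Merge the current scan into the map using the best pose estimate.
    obstacle_cells.update(to_cell(p) for p in scan_to_points(best_pose, scan))
    return best_pose, obstacle_cells
```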
  • The environment map 9 shown here contains, purely by way of example, a plurality of obstacles 6, 7, in which stationary obstacles 6 (which the user usually does not move) are distinguished from movable obstacles 7 (which the user frequently moves) in order to better illustrate the invention.
  • In addition, a plurality of partial areas 10, 11 of the environment are marked in the environment map 9, the partial areas 10 representing areas of the environment that contain a movable obstacle 7, and the partial areas 11 identifying areas of the environment in which the dirt sensor 12 of the cleaning device 1 has established an increased degree of soiling compared to a reference.
  • Figure 3 shows an external terminal device 14 of a user of the cleaning device 1 in the form of a mobile phone.
  • The external terminal device 14 has a display 17 in the usual way, on which messages 4 are displayed to the user.
  • Here, the display 17 shows a request to the user to keep a specific obstacle 7 (a chair) away from a defined partial area 10, 11 (the surroundings of the table).
  • During operation, the cleaning device 1 first creates an environment map 9 using its computing device 8.
  • In the environment map 9, as shown by way of example in Figure 2, obstacles 6, 7 are noted which the obstacle detection device 5 of the cleaning device 1 has determined during a reconnaissance or working run.
  • In addition, the cleaning device 1 uses its dirt sensor 12 to determine those partial areas 11 of the environment that are soiled beyond a standard value.
  • Such partial areas 11 of the environment lie, for example, in the region of smaller pieces of furniture whose surroundings cannot be completely cleaned by the cleaning device 1.
  • Partial areas 11 with such obstacles 7 are, for example, areas under chairs, between whose legs the cleaning device 1 cannot drive.
  • As a result, the partial area 11 in which such an obstacle 7 is located cannot be optimally cleaned.
  • Small pieces of furniture such as chairs are among the movable obstacles 7 that are frequently carried or pushed back and forth by a user. For this reason, they are marked as movable obstacles 7 in the environment map 9, in contrast to larger, non-movable pieces of furniture, which are marked here as obstacles 6, for example.
  • These non-mobile pieces of furniture include kitchen cabinets, sideboards, beds and the like.
  • On the basis of the obstacles 6, 7 detected by the obstacle detection device 5 and the detection signals of the dirt sensor 12, the cleaning device 1 uses the computing device 8 to continuously create a new, up-to-date environment map 9 and preferably stores it in the memory 13.
  • The created environment map 9 is then compared with one or more older environment maps 9 and analyzed with respect to changes in the location of movable obstacles 7.
  • In this way, the computing device 8 can determine that a movable obstacle 7 has been placed at a different position within the environment.
  • Such a movable obstacle 7 can be, for example, a chair that previously stood at a dining table and was then put back in a slightly different position after the user left the table.
  • The environment maps 9 created in chronological succession are also analyzed with respect to partial areas 11 of the environment whose degrees of soiling deviate from a norm. These can be, for example, partial areas 11 which could not be cleaned at all, or only incompletely, inhomogeneously or with great expenditure of time.
  • The computing device 8 then compares the positions of the detected partial areas 11 of the environment with an incomplete or otherwise suboptimal cleaning result with the partial areas 10 of the environment that contain movable obstacles 7. If the computing device 8 determines that a partial area 10, 11 of the environment belongs both to the partial areas 10 with movable obstacles 7 and to the partial areas 11 with incomplete or suboptimal cleaning, this is marked in the environment map 9. By comparing the partial areas 11, which are difficult to clean, with the partial areas 10, which contain positions of movable obstacles 7, the computing device 8 determines a proposed solution which, as shown in Figure 3, is output as a message 4 on the display 17 of the user's external terminal device 14.
  • the message 4 "Please do not put the chair back at the table” is issued, prompting the user not to put the movable obstacle 7 "chair” back in its previous position, namely not in the partial area 10, after the next use .
  • This can be done in that the user no longer puts the chair back in the dining area or moves it closer to the table, but places it somewhat apart from it.
  • the fact that the movable obstacles 7 are thus moved in a targeted manner ensures that all partial areas 10, 11 of the surroundings are cleaned evenly, regularly and homogeneously.
  • As an alternative to a purely textual message, the user can also receive a message 4 in the form of an image or an icon representing the obstacle 7 to be moved.
  • An image of the room containing this movable obstacle 7 can also be displayed.
  • The image itself can have been captured by the obstacle detection device 5 of the cleaning device 1, in particular by a camera.
  • As an alternative to outputting messages 4 on an external terminal device 14, the cleaning device 1 itself can also have a display 17 on which messages 4 are displayed to the user.
  • The communication interface 3 of the cleaning device 1 can also output an optical or acoustic message 4, for example a voice message, a signal projected onto a wall or the like.
  • The user can, for example via a touch-sensitive display 17 of the external terminal device 14, transmit information to the computing device 8 of the cleaning device 1 that the movable obstacle 7 has been removed from the partial area 11 of the environment to be cleaned.
  • The computing device 8 can then start a cleaning of the respective partial area 11 of the environment.
  • Alternatively, the cleaning device 1 can also determine autonomously, in particular with the aid of its obstacle detection device 5, that the movable obstacle 7 has been removed from the partial area 11 of the environment and then control the cleaning of that partial area 11 accordingly.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing And Processing Devices For Dough (AREA)
  • Power Steering Mechanism (AREA)
  • Processing Of Meat And Fish (AREA)
EP21203056.3A 2020-11-04 2021-10-18 Appareil de nettoyage autonome Active EP3995065B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102020129026.8A DE102020129026A1 (de) 2020-11-04 2020-11-04 Sich selbsttätig fortbewegendes Reinigungsgerät

Publications (2)

Publication Number Publication Date
EP3995065A1 (fr) 2022-05-11
EP3995065B1 (fr) 2023-03-01

Family

ID=78463387

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21203056.3A Active EP3995065B1 (fr) 2020-11-04 2021-10-18 Appareil de nettoyage autonome

Country Status (5)

Country Link
US (1) US20220133112A1 (fr)
EP (1) EP3995065B1 (fr)
CN (1) CN114431770A (fr)
DE (1) DE102020129026A1 (fr)
ES (1) ES2941634T3 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114869175A (zh) * 2022-05-26 2022-08-09 美智纵横科技有限责任公司 清洁避障方法、装置、电子设备及存储介质

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2508957A2 (fr) * 2011-04-07 2012-10-10 LG Electronics Inc. Robot nettoyeur, système de commande à distance et procédé correspondant
EP3441840A1 (fr) * 2017-08-11 2019-02-13 Vorwerk & Co. Interholding GmbH Procédé de fonctionnement d'un appareil de nettoyage en mouvement de manière automatique

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101984214B1 (ko) 2012-02-09 2019-05-30 삼성전자주식회사 로봇 청소기의 청소 작업을 제어하기 위한 장치 및 방법
US11400595B2 (en) * 2015-01-06 2022-08-02 Nexus Robotics Llc Robotic platform with area cleaning mode
DE102015100419A1 (de) 2015-01-13 2016-07-14 Miele & Cie. Kg Verfahren und Anordnung zum Bearbeiten von Bodenflächen
US11449061B2 (en) * 2016-02-29 2022-09-20 AI Incorporated Obstacle recognition method for autonomous robots
CN107368079B (zh) * 2017-08-31 2019-09-06 珠海市一微半导体有限公司 机器人清扫路径的规划方法及芯片
DE102019101338A1 (de) 2019-01-18 2020-07-23 Vorwerk & Co. Interholding Gmbh System aus einem ausschließlich manuell geführten Bodenbearbeitungsgerät und einem ausschließlich automatisch betriebenen Bodenbearbeitungsgerät sowie Verfahren zum Betrieb eines solchen Systems
US11191407B2 (en) * 2019-01-31 2021-12-07 Irobot Corporation Cleaning of pet areas by autonomous cleaning robots
US11537141B2 (en) * 2019-12-19 2022-12-27 Diversey, Inc. Robotic cleaning device with dynamic area coverage

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2508957A2 (fr) * 2011-04-07 2012-10-10 LG Electronics Inc. Robot nettoyeur, système de commande à distance et procédé correspondant
EP3441840A1 (fr) * 2017-08-11 2019-02-13 Vorwerk & Co. Interholding GmbH Procédé de fonctionnement d'un appareil de nettoyage en mouvement de manière automatique

Also Published As

Publication number Publication date
ES2941634T3 (es) 2023-05-24
US20220133112A1 (en) 2022-05-05
EP3995065B1 (fr) 2023-03-01
CN114431770A (zh) 2022-05-06
DE102020129026A1 (de) 2022-05-05

Similar Documents

Publication Publication Date Title
EP2898382B1 (fr) Robot et procédé d'inspection autonome ou de traitement de surface
EP3685722B1 (fr) Système composé d'un appareil de traitement du sol guidé exclusivement à la main et d'un appareil de traitement du sol à commande exclusivement automatique ainsi que procédé de fonctionnement d'un tel système
EP3441840B1 (fr) Procédé de fonctionnement d'un appareil de nettoyage en mouvement de manière automatique
EP3814067A1 (fr) Exploration d'une zone d'intervention d'un robot par un robot mobile autonome
EP3408719B1 (fr) Procédé de création d'une carte d'environnement pour un appareil de traitement à déplacement autonome
EP3416018B1 (fr) Système pourvu d'au moins deux appareils de traitement du sol
EP2741483A2 (fr) Appareil de nettoyage automobile et procédé de fonctionnement d'un tel
EP3367200A1 (fr) Procédé de fonctionnement d'un robot en mouvement de manière automatique
EP3708058B1 (fr) Appareil de traitement du sol et système comprenant l'appareil de traitement du sol et un terminal externe
DE102016121320A1 (de) Verfahren zum Betrieb eines sich selbsttätig fortbewegenden Roboters
DE102015100419A1 (de) Verfahren und Anordnung zum Bearbeiten von Bodenflächen
DE102011006062A1 (de) Verfahren zur autonomen Inspektion einer Umgebung oder Bearbeitung von Bodenflächen
EP3733037B1 (fr) Système comprenant un appareil de traitement du sol guidé à la main, un appareil de traitement du sol à fonctionnement entièrement automatique et un dispositif de calcul
EP3355147A1 (fr) Procédé de fonctionnement d'un véhicule en mouvement de manière automatique
DE102016124856A1 (de) Verfahren zur Erstellung einer Umgebungskarte für ein Bearbeitungsgerät
EP3995065B1 (fr) Appareil de nettoyage autonome
EP3683645B1 (fr) Système doté d'un premier appareil de traitement du sol et d'un second appareil de traitement du sol ainsi que procédé de fonctionnement d'un tel système
DE102017113612A1 (de) Autonomes Reinigungssystem und Reinigungsverfahren
DE102021206786B4 (de) Verfahren zur autonomen Bearbeitung von Bodenflächen
EP4116788A1 (fr) Appareil de traitement du sol autonome
DE102017113609A1 (de) Autonomes Reinigungssystem und Reinigungsverfahren
DE102017113606A1 (de) Autonomes Reinigungssystem und Reinigungsverfahren
DE102017113605A1 (de) Autonomes Reinigungssystem und Reinigungsverfahren
EP3984434A2 (fr) Procédé de fonctionnement d'un appareil mobile autonome
EP4188183A1 (fr) Système de nettoyage de plancher

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220617

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20221026

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

Ref country code: AT

Ref legal event code: REF

Ref document number: 1550439

Country of ref document: AT

Kind code of ref document: T

Effective date: 20230315

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502021000466

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REG Reference to a national code

Ref country code: SE

Ref legal event code: TRGR

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2941634

Country of ref document: ES

Kind code of ref document: T3

Effective date: 20230524

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230517

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230601

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230602

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230703

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230701

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502021000466

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: ES

Payment date: 20231117

Year of fee payment: 3

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: SE

Payment date: 20231025

Year of fee payment: 3

Ref country code: IT

Payment date: 20231031

Year of fee payment: 3

Ref country code: FR

Payment date: 20231023

Year of fee payment: 3

Ref country code: DE

Payment date: 20231018

Year of fee payment: 3

26N No opposition filed

Effective date: 20231204

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20231018