EP3524114B1 - Cleaning robot and method for controlling the same - Google Patents

Cleaning robot and method for controlling the same

Info

Publication number
EP3524114B1
EP3524114B1 (application EP18214814.8A)
Authority
EP
European Patent Office
Prior art keywords
distance
cleaning robot
control
capture
distance sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP18214814.8A
Other languages
German (de)
French (fr)
Other versions
EP3524114A1 (en)
Inventor
Markus Kühnel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BSH Hausgeraete GmbH
Original Assignee
BSH Hausgeraete GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BSH Hausgeraete GmbH filed Critical BSH Hausgeraete GmbH
Publication of EP3524114A1 publication Critical patent/EP3524114A1/en
Application granted granted Critical
Publication of EP3524114B1 publication Critical patent/EP3524114B1/en

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2857 User input or output elements for control, e.g. buttons, switches or displays
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection

Definitions

  • the invention relates to a method for the efficient, robust and comfortable control of a cleaning robot.
  • the invention also relates to a cleaning robot with a cost-effective, robust and comfortable user interface.
  • a cleaning robot in particular a vacuum robot, typically has different sensors in order to be able to navigate automatically in a room and to clean the floor of the room.
  • in order to be able to drive under and clean under pieces of furniture, such as tables or chairs, cleaning robots usually have a relatively low height (for example about 10 cm or less).
  • the user interface of a cleaning robot is typically arranged on the upper side of the cleaning robot facing away from the floor to be cleaned.
  • the user interface can have one or more buttons that can be pressed by a user of the robot. In order to make an input via the user interface, the user has to bend down to the cleaning robot, which is typically not very comfortable.
  • a user interface with voice input can be provided.
  • This document deals with the technical task of providing a cost-effective, robust and comfortable user interface for a cleaning robot.
  • a cleaning robot for cleaning a subsurface or a floor
  • the cleaning robot can be set up to move independently and / or automatically in a room or site in order to clean (in particular vacuum) the floor or subsurface of the room or site.
  • the cleaning robot can comprise one or more drive units (e.g. with one or more drive wheels).
  • the cleaning robot typically includes one or more cleaning units with which the substrate can be cleaned.
  • the cleaning robot can comprise one or more environment sensors in order to orientate itself within the room or site.
  • the one or more drive units or cleaning units are typically arranged on the underside of the cleaning robot which, when the cleaning robot is in operation, faces the subsurface.
  • the one or more environment sensors are typically arranged on a side wall of the cleaning robot, which can be essentially perpendicular to the underside of the cleaning robot. Furthermore, a user interface of the cleaning robot can be arranged on the upper side of the cleaning robot, which is oriented upwards during operation. It enables a user of the cleaning robot to pass control instructions to the cleaning robot (e.g. via one or more buttons). Exemplary control instructions are the setting of a specific operating mode and / or the activation or stopping of the cleaning robot.
  • the cleaning robot comprises at least one distance sensor which is set up to acquire distance data relating to an object that is guided into a detection area of the distance sensor by a user.
  • the distance sensor is set up to emit an optical and / or an acoustic distance measurement signal, in particular an ultrasonic signal and / or an infrared signal, in order to acquire the distance data.
  • the distance sensor can comprise a transmission module which is set up to transmit a (for example, pulsed) distance measurement signal.
  • the distance sensor can comprise a receiving module which is set up to receive the distance measurement signal reflected on an object. The distance data can indicate whether a reflected distance measurement signal is received (and thus an object is in the detection range of the distance sensor).
  • the distance data can, where applicable, indicate the transit time of the distance measurement signal that was emitted and received again.
  • a specific value of the distance to an object that is located in the detection range of the distance sensor can be determined from the transit time.
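The transit-time relation can be sketched as follows. This is a minimal illustration, assuming an ultrasonic (acoustic) distance measurement signal in air; the constant and function name are illustrative and not taken from the patent.

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 °C

def distance_from_transit_time(transit_time_s: float) -> float:
    """Distance to the reflecting object from the round-trip transit time.

    The distance measurement signal travels to the object and back,
    so the one-way distance is half the total path covered.
    """
    return SPEED_OF_SOUND_M_S * transit_time_s / 2.0

# A round trip of about 5.83 ms corresponds to roughly 1 m.
d = distance_from_transit_time(0.00583)
```

For an optical (infrared) signal the same formula applies with the speed of light instead of the speed of sound.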
  • the distance data can thus indicate whether or not an object is located in the detection range of the distance sensor at a specific point in time.
  • the distance data can possibly indicate the distance at which the object is located from the distance sensor (with a specific spatial resolution of, for example, 1 cm, 5 mm, 1 mm or less).
  • the distance sensor can have a certain sampling rate (e.g. 10Hz, 50Hz, 100Hz, 1000Hz or more) with which distance values are recorded.
  • the distance data can thus comprise a time sequence of distance values.
  • the distance value for a specific point in time indicates whether or not an object is located in the detection range of the distance sensor at the specific point in time.
  • the distance value can indicate the distance at which the object is located at the specific point in time from the distance sensor (with a specific spatial resolution of e.g. 1 cm, 5 mm, 1 mm or less).
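The time sequence of distance values described above can be represented as a list of samples taken at the sensor's sampling rate. A minimal sketch, with illustrative names; `None` stands for "no object in the detection area at this point in time":

```python
from typing import List, Optional, Tuple

SAMPLING_RATE_HZ = 100  # e.g. 10, 50, 100 or 1000 Hz

def record_sequence(samples: List[Optional[float]]) -> List[Tuple[float, Optional[float]]]:
    """Pair each distance value with its sampling timestamp in seconds."""
    dt = 1.0 / SAMPLING_RATE_HZ
    return [(round(i * dt, 6), d) for i, d in enumerate(samples)]

# Object appears at 0.42 m, moves slightly closer, then leaves the area.
seq = record_sequence([None, 0.42, 0.40, None])
```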
  • the distance sensor is preferably designed in such a way that the detection area is accessible to a hand and / or foot of the user of the cleaning robot during operation of the cleaning robot.
  • the detection area can extend at least partially vertically upward from the cleaning robot during operation of the cleaning robot.
  • the detection area can extend, for example, along a detection axis, the detection axis running upwards at least partially perpendicular to the top of the cleaning robot.
  • the detection area can correspond, for example, to a circular cylinder around the detection axis, the height of the circular cylinder corresponding to a detection distance of the distance sensor up to which an object can be detected.
  • the radius of the detection area (designed as a circular cylinder) is typically smaller by a factor of 10, 100 or more than the detection distance.
  • the distance sensor can have a beam-shaped detection area which extends along a detection axis.
  • the (beam-shaped) detection area is preferably designed such that a user can comfortably guide a hand and / or foot through the detection area.
  • the cleaning robot further comprises a control unit which is set up to detect, on the basis of the distance data, a control gesture that is associated with a control instruction to the cleaning robot.
  • in response to the detected control gesture, the control unit is further configured to operate the cleaning robot in accordance with the control instruction associated with the control gesture.
  • a cleaning robot with one or more distance sensors enables cost-efficient and robust recognition of gestures.
  • a convenient user interface for a cleaning robot can thus be provided in a cost-efficient and robust manner.
  • the distance data can comprise a time sequence of distance values, the distance value at a specific point in time depending on the distance between the object used for gesture control and the distance sensor at the specific point in time.
  • the control unit can be configured to detect the control gesture on the basis of the sequence of distance values.
  • the control unit can be set up to compare a recorded temporal sequence of distance values with a plurality of different reference sequences of distance values in order to detect the control gesture.
  • the plurality of different reference sequences can correspond to a corresponding plurality of different control gestures and / or be associated with a corresponding plurality of different control instructions.
  • a distance measure between the captured sequence and the different reference sequences can be calculated.
  • the reference sequence with the lowest distance measure can then be selected in order to detect a specific control gesture.
  • the different reference sequences and / or the different control gestures can differ with regard to the one or more distance values contained.
  • the different reference sequences and / or the different control gestures can differ with regard to the temporal sequence of distance values and / or with regard to the speed with which different distance values follow one another. Different parameters (values, value profile and / or speed) of a time sequence of distance values can thus be taken into account in order to distinguish different control gestures for different control instructions. This enables robust and comfortable gesture control.
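The reference-sequence matching described above can be sketched as a nearest-template search. The reference sequences, gesture names and the mean-squared-difference distance measure below are illustrative assumptions, not values from the patent:

```python
from typing import Dict, List

# Illustrative reference sequences of distance values (in meters),
# each associated with a different control gesture.
REFERENCES: Dict[str, List[float]] = {
    "towards_sensor": [0.8, 0.6, 0.4, 0.2],    # hand moving towards the sensor
    "away_from_sensor": [0.2, 0.4, 0.6, 0.8],  # hand moving away from the sensor
}

def distance_measure(a: List[float], b: List[float]) -> float:
    """Mean squared difference between two equally long sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def detect_gesture(captured: List[float]) -> str:
    """Select the reference sequence with the lowest distance measure."""
    return min(REFERENCES, key=lambda name: distance_measure(captured, REFERENCES[name]))

g = detect_gesture([0.75, 0.58, 0.41, 0.22])
```

For sequences of differing length or speed, a warping-tolerant distance measure (such as dynamic time warping) could be substituted for the simple pointwise comparison.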
  • the control unit can be set up to detect a control gesture by means of a machine-learned classifier.
  • the classifier can be designed to identify a plurality of subspaces for a corresponding plurality of different control gestures within a value range of possible distance data.
  • the classifier can e.g. comprise a neural network. The use of a machine-learned classifier enables particularly robust user control based on different control gestures.
  • the control unit can be set up to determine, on the basis of the distance data and as a function of a distance threshold value, whether the control gesture was caused by the hand or the foot of the user.
  • the distance threshold value can correspond, for example, to the typical height of the knee and / or the hip of a person.
  • the value of the distance of an object guided into the detection range of the distance sensor can thus be taken into account and compared with the distance threshold value.
  • a relatively small distance can be interpreted as a control input by means of a foot and a relatively large distance can be interpreted as a control input by means of a hand.
  • the control instruction associated with the control gesture can then depend on whether the control gesture was effected by the hand or by the foot of the user. It can thus be made possible in a comfortable manner for a user to effect different control instructions with his foot or with his hand. The convenience of a user interface of a cleaning robot can thus be further increased.
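The hand/foot distinction described above reduces to a threshold comparison. A minimal sketch; the threshold value and names are illustrative (the patent only suggests typical knee or hip height as a reference):

```python
KNEE_HEIGHT_M = 0.5  # illustrative distance threshold, roughly knee height

def gesture_source(object_distance_m: float) -> str:
    """Classify the gesturing body part by distance above the robot.

    A relatively small distance suggests a foot passing over the robot,
    a relatively large distance suggests a hand held above it.
    """
    return "foot" if object_distance_m <= KNEE_HEIGHT_M else "hand"
```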
  • the control unit can be set up to determine on the basis of the distance data whether an object is moving towards the distance sensor or whether an object is moving away from the distance sensor.
  • the control gesture caused by a user can then be detected based on this.
  • control gestures and control instructions associated therewith
  • Such control gestures can be carried out by a user in a comfortable and intuitive manner, so that the comfort of a user interface of a cleaning robot is further increased.
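Whether an object moves towards or away from the distance sensor can be read off the net change of the distance values. A minimal sketch with illustrative names:

```python
from typing import List

def movement_direction(distances: List[float]) -> str:
    """Direction of motion from the net change over a sequence of distance values.

    A negative net change means the object approached the sensor,
    a positive one means it moved away.
    """
    delta = distances[-1] - distances[0]
    if delta < 0:
        return "towards"
    if delta > 0:
        return "away"
    return "static"
```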
  • the cleaning robot can have several distance sensors for gesture control.
  • the cleaning robot can comprise a first distance sensor for capturing first distance data and a second distance sensor for capturing second distance data.
  • the first distance sensor can have a first detection area along a first detection axis and the second distance sensor can have a second detection area along a second detection axis.
  • the control unit can then be set up to detect the control gesture on the basis of the first distance data and on the basis of the second distance data.
  • the first detection axis and the second detection axis can run essentially parallel to one another. Furthermore, the two detection axes can be offset from one another in such a way that the first detection area and the second detection area overlap only partially or not at all. A state can thus be detected in which an object is located in the first detection area but not in the second detection area, and / or vice versa. Reliable recognition of different control gestures on the basis of the first and second distance data can thus be made possible.
  • the control unit can in particular be set up to determine, on the basis of the first and second distance data, a time sequence in which an object enters the first detection area of the first distance sensor and the second detection area of the second distance sensor.
  • the control gesture can then be determined as a function of the chronological sequence. The number of distinguishable control gestures (and thus the number of possible control instructions) can thus be increased further.
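The chronological order of entry into the two detection areas can be evaluated as sketched below, e.g. to distinguish swipe directions. The timestamps and gesture names are illustrative assumptions:

```python
def swipe_direction(first_entry_s1: float, first_entry_s2: float) -> str:
    """Swipe direction from the times (in seconds) at which an object
    first enters the detection areas of two offset distance sensors."""
    if first_entry_s1 < first_entry_s2:
        return "sensor1_to_sensor2"  # e.g. a left-to-right swipe
    if first_entry_s2 < first_entry_s1:
        return "sensor2_to_sensor1"  # e.g. a right-to-left swipe
    return "simultaneous"            # e.g. a hand covering both areas at once
```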
  • the detection range of a distance sensor used for gesture control can extend at least partially horizontally away from the cleaning robot during operation of the cleaning robot.
  • a distance sensor can be used which is arranged on the side wall of the cleaning robot.
  • a cleaning robot typically comprises a plurality of such distance sensors (as environment sensors) in order to detect during the movement or navigation of the cleaning robot whether the cleaning robot is approaching an obstacle.
  • the control unit can thus be set up to navigate or move the cleaning robot as a function of the distance data.
  • the distance data of a distance sensor which has an essentially horizontally running (beam-shaped) detection area, and which is typically used for the environment recognition of the cleaning robot, can thus also be used to provide a user interface.
  • the control unit can be set up to distinguish, on the basis of the distance data, in particular on the basis of the time sequence of distance values indicated by the distance data, whether an object in the detection range of the distance sensor is an obstacle or an object effecting a control gesture. This can be done using a machine-learned classifier, for example.
  • the gesture control can take place in a comfortable manner by a user's foot.
  • the foot can be guided in a defined way into, through and / or out of the detection area in order to perform a control gesture (which differs from the movement of another obstacle).
  • a robust and comfortable user interface for a cleaning robot can thus be provided in a particularly cost-efficient manner.
  • one or more distance sensors that are already present and used for navigation can also be used as a user interface in a cost-neutral manner.
  • the distance sensor used for gesture control can be a switching sensor which is set up to generate a switching signal as distance data when an object enters or exits the detection area of the distance sensor.
  • the use of inexpensive switching sensors is a cost-efficient way of enabling gesture control for a cleaning robot, particularly when several distance sensors are used.
  • the distance data can thus (possibly only) indicate in a binary manner whether or not an object is in the detection area.
  • the distance data can only indicate whether an object is located at a distance from the distance sensor that is the same as or smaller than the detection distance.
  • in this case the distance data do not indicate a specific value of the distance to an object lying in the detection area, but only the object's presence within the detection distance. Nevertheless, in particular when several such distance sensors (in particular switching sensors) are used, different control gestures for different control instructions can be distinguished in a robust manner.
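With a switching sensor, a gesture has to be reconstructed from binary entry/exit events rather than from distance values. A minimal sketch, assuming the sensor's binary output is sampled into a list (names illustrative):

```python
from typing import List

def entry_events(binary_samples: List[int]) -> List[int]:
    """Sample indices at which an object enters the detection area.

    An entry is a 0 -> 1 transition in the binary presence signal of
    the switching sensor.
    """
    return [i for i in range(1, len(binary_samples))
            if binary_samples[i] and not binary_samples[i - 1]]

# Object enters at sample 2, leaves, and enters again at sample 5.
events = entry_events([0, 0, 1, 1, 0, 1])
```

Comparing such event patterns across several switching sensors (for instance the order and timing of entries) then allows different control gestures to be told apart.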
  • a method for controlling a cleaning robot comprises the acquisition, by means of a distance sensor of the cleaning robot, of distance data relating to an object that is guided into a detection area of the distance sensor.
  • the detection area is preferably accessible to a hand and / or foot of a user of the cleaning robot during operation of the cleaning robot.
  • the method comprises the detection, on the basis of the distance data, of a control gesture which is associated with a control instruction to the cleaning robot.
  • the method further comprises operating the cleaning robot in accordance with the detected control instruction.
  • Fig. 1a shows an exemplary cleaning robot 100 in a perspective view.
  • the underside 122 of the cleaning robot 100 typically has one or more drive units 101 (with one or more drive wheels) by means of which the cleaning robot 100 can be moved in order to clean different areas of a floor.
  • a cleaning robot 100 typically comprises one or more cleaning units 102 (eg with a cleaning brush) which are set up to clean the floor under the cleaning robot 100.
  • a cleaning robot 100 can comprise one or more distance sensors 110 which are set up to record distance data relating to a distance between the underside 122 of the cleaning robot 100 and the floor to be cleaned.
  • a control unit 130 (see Figure 1b ) of the cleaning robot 100 can be set up to recognize, on the basis of the distance data, a drop-off (for example a step) lying in front of the cleaning robot 100 in its direction of movement.
  • a user interface can be arranged on the upper side 121 of the cleaning robot 100, which enables a user of the cleaning robot 100 to make control inputs.
  • one or more environment sensors, which are set up to detect the environment of the cleaning robot 100, can be arranged on a side wall 123 of the cleaning robot 100 that connects the top side 121 with the underside 122.
  • the control unit 130 of the cleaning robot 100 can be set up to navigate the cleaning robot 100 through the environment on the basis of the sensor data of the one or more environment sensors.
  • Figure 1b shows schematically the top side 121 of a cleaning robot 100.
  • the cleaning robot 100 can have at least one distance sensor 120 on the top side 121, which is set up to detect distance data in relation to a distance of an object arranged above the cleaning robot 100 to the top side 121.
  • a distance sensor 120 can include a transmission module that is set up to emit a distance measurement signal.
  • a distance sensor 120 can comprise a receiving module which is set up to receive the distance measurement signal reflected on an object. The distance to the object can then be determined on the basis of the transit time of the distance measurement signal emitted and received again.
  • the distance measurement signal can be an optical and / or acoustic signal.
  • Figure 1c shows the side wall 123 of a cleaning robot 100 in a schematic manner. Figure 1c also shows a distance sensor 120 which is arranged on the top side 121 and is configured to capture vertical distance data relating to a distance in the vertical direction upwards.
  • the cleaning robot 100 can include one or more distance sensors 120 which are set up to detect horizontal distance data with regard to a distance in the horizontal direction. The horizontal distance data can be used for the navigation of the cleaning robot 100, for example.
  • the distance sensor 120 of a cleaning robot 100 can be used to provide a robust and comfortable user interface for the cleaning robot 100 in a cost-effective manner.
  • a time profile of distance values can be determined for this purpose.
  • by means of a distance sensor 120, the distance to an object in front of the sensor can be determined with a certain sampling frequency (e.g. 10 Hz, 100 Hz or more).
  • This temporal sequence of distance values can be evaluated by the control unit 130 of the cleaning robot 100 in order to detect a gesture that is associated with a control instruction for controlling the cleaning robot 100.
  • the cleaning robot 100 can then be operated as a function of the control instruction.
  • a distance sensor 120 can thus be provided on the cleaning robot 100 (possibly in addition to existing sensors) which can, in particular, measure vertically upwards.
  • the cleaning robot 100 can be controlled via gestures.
  • a movement 200 with the hand 202 in the direction of the distance sensor 120 can indicate to the cleaning robot 100 that a switch should be made from a power mode to a silent mode.
  • a movement 200 with the hand 202 away from the distance sensor 120 can indicate that a switch should be made from the silent mode to the power mode.
  • moving 200 the foot 201 over the cleaning robot 100 can, for example, cause the cleaning robot 100 to switch to a pause mode. Inputs with the hand 202 or with the foot 201 can thereby be distinguished on the basis of their different characteristic distances from the distance sensor 120.
  • Figure 2b shows the use of several distance sensors 120 with several distance measurement signals 221, 222.
  • the number of recognizable gestures can be increased by using several distance sensors in the vertical direction (which are at a certain distance from one another).
  • transverse movements 200 can also be detected by using a plurality of distance sensors 120 arranged next to one another (for example, wiping from left to right or wiping from top left to bottom right).
  • the control unit 130 can thus be set up to recognize a movement 200 of a body part 201, 202 of a user of the cleaning robot 100 within the detection range of the distance sensor 120 on the basis of the distance data of at least one distance sensor 120 of a cleaning robot 100.
  • a distinction can be made between different movements or gestures 200 from a plurality of movements or gestures 200.
  • the different movements or gestures 200 can be associated with corresponding different control instructions to the cleaning robot 100.
  • the control unit 130 of a cleaning robot 100 can thus detect a movement or a gesture 200 in the detection range of the distance sensor 120 on the basis of the distance data from a distance sensor 120.
  • the control unit 130 can operate the cleaning robot 100 in accordance with the control instruction associated with the movement or the gesture 200.
  • Fig. 2c shows exemplary distance data that may be captured by a distance sensor 120.
  • the distance data can comprise a time sequence 231 of distance values 230.
  • the distance values 230 can each display the distance of an object 201, 202 in the detection range of the distance sensor 120.
  • a temporal profile of the distance to an object 201, 202 can thus be recorded as distance data.
  • a specific control gesture 200 can then be detected on the basis of the time sequence 231 of distance values 230.
  • Different control gestures 200 can differ, for example, by the value range of the distance values 230. For example, by comparison with a distance threshold 232 it can be recognized whether it is a control gesture 200 that was performed with a hand 202 or with a foot 201.
  • the speed of a movement or gesture 200 can be taken into account in order to differentiate between different control gestures 200.
  • different control gestures 200 for different control instructions to a cleaning robot 100 can be distinguished in order to provide a comfortable and extensive user interface.
  • Fig. 3 shows a flow chart of an exemplary method 300 for controlling a cleaning robot 100, in particular a vacuum robot.
  • the method 300 comprises the acquisition 301, by means of a distance sensor 120 of the cleaning robot 100, of distance data relating to an object 201, 202 which is guided into a detection area of the distance sensor 120 and / or is moved in a detection area of the distance sensor 120.
  • the distance sensor 120 is preferably designed such that the detection area of the distance sensor 120 is accessible to a hand 202 and / or to a foot 201 of a user of the cleaning robot 100 while the cleaning robot 100 is in operation.
  • the distance sensor 120 can have a detection area which extends essentially or at least partially vertically upwards away from the top side 121 of the cleaning robot 100.
  • the detection area can be limited to a detection distance, wherein the detection distance can correspond to the typical height of a person (e.g. 2 meters or less).
  • the distance sensor 120 can thus be designed in such a way that a user of the cleaning robot 100 can comfortably guide a hand 202 and / or a foot 201 into the detection area, lead them out of the detection area and / or move them within the detection area (in particular in order to perform a certain gesture with the hand 202 and / or the foot 201).
  • the method 300 comprises detecting 302, based on the distance data, a control gesture 200 that is associated with a control instruction to the cleaning robot 100.
  • the distance data can indicate a specific chronological sequence of distance values of the object 201, 202 introduced into the detection area or located in the detection area. On the basis of the distance data it can thus be recognized that a user has performed a specific gesture, in particular a control gesture 200, with his hand 202 and / or his foot 201.
  • the recognized control gesture 200 can correspond to a specific control instruction to the cleaning robot 100.
  • Exemplary control instructions are: the activation or the deactivation of a specific operating mode of the cleaning robot 100 and / or the stopping or the activation of the cleaning robot 100.
  • the method 300 further comprises operating 303 the cleaning robot 100 in accordance with the detected control instruction. In this way, comfortable and robust control of a cleaning robot 100 by a user can be made possible in a cost-efficient manner (by using a relatively inexpensive distance sensor).
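The three steps of method 300 (capture 301, detect 302, operate 303) can be combined into one end-to-end sketch. The toy detector, the gesture names and the gesture-to-instruction mapping below are illustrative assumptions based on the examples given for Fig. 2a (hand towards the sensor switches to silent mode, hand away switches to power mode, foot over the robot pauses it):

```python
from typing import Dict, List

GESTURE_TO_INSTRUCTION: Dict[str, str] = {
    "towards_sensor": "silent_mode",
    "away_from_sensor": "power_mode",
    "foot_over_robot": "pause_mode",
}

def detect(sequence: List[float]) -> str:
    """Toy gesture detector (step 302) over a sequence of distance values.

    A consistently low sequence is taken as a foot passing over the
    robot; otherwise the net change in distance decides the gesture.
    """
    if max(sequence) < 0.5:  # illustrative foot-height threshold in meters
        return "foot_over_robot"
    return "towards_sensor" if sequence[-1] < sequence[0] else "away_from_sensor"

def control_robot(sequence: List[float]) -> str:
    """Steps 302 and 303: detect the gesture, return the operating mode."""
    gesture = detect(sequence)
    return GESTURE_TO_INSTRUCTION[gesture]

# Hand moving down towards the sensor -> switch to silent mode.
mode = control_robot([0.9, 0.7, 0.4])
```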
  • the measures described in this document enable a user to conveniently control a cleaning robot 100 using gestures.
  • the gestures can be recognized in a cost-efficient manner by means of at least one distance sensor 120.
  • the control unit 130 can enable a user to learn individual gestures and / or to configure gestures and the control instructions associated therewith. This enables intuitive, individualized operation (without using an additional input device).

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Description

The invention relates to a method for the efficient, robust and comfortable control of a cleaning robot. The invention also relates to a cleaning robot with a cost-efficient, robust and comfortable user interface.

A cleaning robot, in particular a vacuum robot, typically has different sensors in order to be able to navigate automatically in a room and to clean the floor of the room. In order to be able to drive under and clean under pieces of furniture, such as tables or chairs, cleaning robots usually have a relatively low height (for example of about 10 cm or less). The user interface of a cleaning robot is typically arranged on the upper side of the cleaning robot, facing away from the floor to be cleaned. The user interface can have one or more buttons that can be pressed by a user of the robot.
In order to make an input via the user interface, the user therefore has to bend down to the cleaning robot, which is typically not very comfortable. Alternatively, a user interface with voice input can be provided. However, such a user interface is usually not very robust due to the relatively high noise emission of a cleaning robot. Furthermore, a remote control can be provided for a cleaning robot, which, however, can be associated with relatively high additional costs. In addition, EP 2 680 097 A1 discloses a cleaning robot with an optical sensor for gesture recognition.

The present document addresses the technical problem of providing a cost-efficient, robust and convenient user interface for a cleaning robot.

This problem is solved by the subject matter of the independent patent claims. Advantageous embodiments are defined in particular in the dependent claims, described in the following description or shown in the accompanying drawing.

According to one aspect of the invention, a cleaning robot, in particular a robot vacuum cleaner, for cleaning a surface or floor is described. The cleaning robot can be set up to move autonomously and/or automatically in a room or area in order to clean (in particular vacuum) the floor or surface of that room or area. For this purpose, the cleaning robot can comprise one or more drive units (e.g. with one or more drive wheels). In addition, the cleaning robot typically comprises one or more cleaning units with which the surface can be cleaned. Furthermore, the cleaning robot can comprise one or more environment sensors in order to orient itself within the room or area. The one or more drive units and cleaning units are typically arranged on the underside of the cleaning robot, which faces the surface when the robot is in operation. The one or more environment sensors are typically arranged on a side wall of the cleaning robot, which can be essentially perpendicular to the underside. Furthermore, a user interface of the cleaning robot can be arranged on its top side, which is oriented upward during operation; this interface enables a user to pass control instructions to the robot (e.g. via one or more buttons). Example control instructions are setting a specific operating mode and/or starting or stopping the cleaning robot.

The cleaning robot comprises at least one distance sensor which is set up to capture distance data relating to an object that a user guides into a detection area of the distance sensor. Typically, the distance sensor is set up to emit an optical and/or acoustic distance-measuring signal, in particular an ultrasonic and/or infrared signal, in order to capture the distance data. For example, the distance sensor can comprise a transmitter module set up to emit a (e.g. pulsed) measuring signal, and a receiver module set up to receive the measuring signal reflected off an object. The distance data can indicate whether a reflected measuring signal is received (and thus whether an object is in the detection area of the sensor). The distance data can also indicate the round-trip time of the emitted and re-received measuring signal; from this round-trip time, a concrete value of the distance to an object in the detection area can be determined. The distance data can thus indicate whether or not an object is in the detection area of the sensor at a given point in time, and can further indicate at what distance from the sensor the object is located (with a certain spatial resolution of e.g. 1 cm, 5 mm, 1 mm or less).

The distance sensor can have a certain sampling rate (e.g. 10 Hz, 50 Hz, 100 Hz, 1000 Hz or more) at which distance values are captured. The distance data can thus comprise a temporal sequence of distance values. The distance value for a given point in time indicates whether or not an object is in the detection area of the sensor at that time, and can further indicate at what distance from the sensor the object is located at that time (with a certain spatial resolution of e.g. 1 cm, 5 mm, 1 mm or less).
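The sampling scheme described above can be sketched as follows. This is a minimal illustration, not part of the claimed subject matter; the sampling rate, detection distance and raw readings are invented, and `inf` stands for "no echo received".

```python
# Turning periodic raw sensor readings into a temporal sequence of distance
# values; None means "no object in the detection area at this sample".
import math

SAMPLE_RATE_HZ = 50          # assumed sampling rate of the distance sensor
DETECTION_DISTANCE_M = 1.0   # assumed maximum detection distance

def to_sequence(raw_readings):
    """Map raw readings to distance values, None meaning 'no object detected'."""
    return [r if r <= DETECTION_DISTANCE_M else None for r in raw_readings]

raw = [math.inf, 0.42, 0.41, math.inf]   # e.g. from time-of-flight measurements
print(to_sequence(raw))  # expected: [None, 0.42, 0.41, None]
```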

The distance sensor is preferably designed such that the detection area is accessible to a hand and/or foot of the user of the cleaning robot during operation. In particular, for this purpose the detection area can extend at least partially vertically upward from the cleaning robot during operation. The detection area can extend, for example, along a detection axis that runs upward, at least partially perpendicular to the top side of the robot. The detection area can correspond, for example, to a circular cylinder around the detection axis, the height of the cylinder corresponding to a detection distance of the sensor up to which an object can be detected. The radius of this cylindrical detection area is typically smaller than the detection distance by a factor of 10, 100 or more. In other words, the distance sensor can have a beam-shaped detection area extending along a detection axis. The (beam-shaped) detection area is preferably designed such that a user can comfortably move a hand and/or foot through it.

The cleaning robot further comprises a control unit which is set up to detect, on the basis of the distance data, a control gesture associated with a control instruction to the cleaning robot. In particular, on the basis of a temporal sequence of distance values it can be recognized that a user has guided an object into, through and/or out of the detection area of the distance sensor and, in doing so, has performed a certain control gesture with the object (e.g. with a hand and/or foot). Different control instructions to the cleaning robot (e.g. for activating different operating modes) can be associated with different control gestures. The control unit can be set up to determine, on the basis of the distance data, which control gesture the user performed.

In response to the detected control gesture, the control unit is further set up to operate the cleaning robot in accordance with the control instruction associated with that gesture.

Providing a cleaning robot with one or more distance sensors enables cost-efficient and robust gesture recognition. A convenient user interface for a cleaning robot can thus be provided in a cost-efficient and robust manner.

As explained above, the distance data can comprise a temporal sequence of distance values, where the distance value at a given point in time depends on the distance between the object used for gesture control and the distance sensor at that time. The control unit can be set up to detect the control gesture on the basis of this sequence of distance values. Taking a temporal sequence of distance values into account makes it possible to recognize relatively complex control gestures, which in turn allows a large number of different control gestures to be used for a large number of different control instructions. The convenience of user control of a cleaning robot can thus be increased further.

The control unit can be set up to compare a captured temporal sequence of distance values with a plurality of different reference sequences of distance values in order to detect the control gesture. The plurality of different reference sequences can correspond to a corresponding plurality of different control gestures and/or be associated with a corresponding plurality of different control instructions. As part of the comparison, a distance measure between the captured sequence and each reference sequence can be calculated, for example; the reference sequence with the lowest distance measure can then be selected in order to detect a specific control gesture. Taking reference sequences into account enables particularly robust and convenient control based on different control gestures.
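The reference-sequence matching described above can be sketched as follows. The gesture labels, reference sequences and the particular distance measure (mean squared difference with a mismatch penalty) are illustrative assumptions, not taken from the claims.

```python
# Match a captured sequence of distance values (None = no object detected)
# against stored reference sequences; pick the reference with the lowest
# distance measure, as described in the text.

def distance_measure(captured, reference):
    """Mean squared difference between two equal-length sequences."""
    BIG = 1e6  # penalty when only one of the two sequences sees an object
    total = 0.0
    for c, r in zip(captured, reference):
        if c is None and r is None:
            continue
        if c is None or r is None:
            total += BIG
        else:
            total += (c - r) ** 2
    return total / len(captured)

def classify_gesture(captured, references):
    """Return the gesture label whose reference sequence is closest."""
    return min(references, key=lambda label: distance_measure(captured, references[label]))

references = {
    "start": [None, 0.40, 0.40, 0.40, None],   # hand held briefly above the robot
    "stop":  [None, 0.40, 0.30, 0.20, 0.10],   # hand moving toward the robot
}

captured = [None, 0.38, 0.29, 0.21, 0.12]
print(classify_gesture(captured, references))  # expected: "stop"
```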

The different reference sequences and/or the different control gestures can differ with regard to the one or more distance values they contain. Alternatively or additionally, they can differ with regard to the temporal order of the distance values and/or the speed at which different distance values follow one another. Different parameters of a temporal sequence of distance values (values, value profile and/or speed) can thus be taken into account in order to distinguish different control gestures for different control instructions. This enables robust and convenient gesture control.

The control unit can be set up to detect a control gesture by means of a machine-learned classifier. The classifier can be designed to identify, within a value space of possible distance data, a plurality of subspaces for a corresponding plurality of different control gestures. The classifier can comprise a neural network, for example. Using a machine-learned classifier enables particularly robust user control based on different control gestures.
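As a minimal stand-in for such a machine-learned classifier, a nearest-centroid model can be trained on labeled distance sequences: each centroid then represents one subspace of the value space. A real implementation might use a neural network as mentioned above; the training data here is invented.

```python
# Nearest-centroid classifier over distance sequences: the 'learning' step
# averages labeled training sequences, prediction assigns a new sequence to
# the subspace whose centroid is nearest.

def train_centroids(labeled_sequences):
    """Average the training sequences per gesture label."""
    centroids = {}
    for label, seqs in labeled_sequences.items():
        n, length = len(seqs), len(seqs[0])
        centroids[label] = [sum(s[i] for s in seqs) / n for i in range(length)]
    return centroids

def predict(centroids, sequence):
    """Assign the sequence to the gesture label with the nearest centroid."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sqdist(centroids[label], sequence))

training = {
    "approach": [[0.5, 0.4, 0.3, 0.2], [0.6, 0.45, 0.3, 0.15]],
    "retreat":  [[0.2, 0.3, 0.4, 0.5], [0.15, 0.3, 0.45, 0.6]],
}
model = train_centroids(training)
print(predict(model, [0.55, 0.4, 0.28, 0.18]))  # expected: "approach"
```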

The control unit can be set up to determine, on the basis of the distance data and as a function of a distance threshold, whether the control gesture was performed with the user's hand or foot. The distance threshold can correspond, for example, to the typical height of a person's knee and/or hip. The value of the distance of an object guided into the detection area of the distance sensor can thus be taken into account and compared with the threshold: a relatively small distance can be interpreted as a control input with a foot, and a relatively large distance as a control input with a hand. The control instruction associated with the control gesture can then depend on whether the gesture was performed with the hand or with the foot. A user can thus conveniently trigger different control instructions with the foot or with the hand, which further increases the convenience of the user interface of a cleaning robot.
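The threshold comparison described above is simple to sketch. The concrete threshold value (roughly knee height above the robot) is an assumption for illustration only.

```python
# Hand/foot discrimination via a distance threshold: objects measured close
# above the robot are interpreted as a foot, objects further up as a hand.

KNEE_HEIGHT_M = 0.4  # assumed threshold; distances below this -> foot input

def input_source(distance_m, threshold_m=KNEE_HEIGHT_M):
    """Interpret a measured object distance as a foot or hand input."""
    return "foot" if distance_m < threshold_m else "hand"

print(input_source(0.15))  # object just above the robot -> "foot"
print(input_source(0.90))  # object at hip height or above -> "hand"
```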

The control unit can be set up to determine, on the basis of the distance data, whether an object is moving toward the distance sensor or away from it. The control gesture performed by the user can then be detected based on this. In other words, a distinction can be made between different control gestures (and the control instructions associated with them) that involve a movement toward the cleaning robot or a movement away from it. Such control gestures can be performed by a user in a convenient and intuitive manner, so that the convenience of the user interface of a cleaning robot is further increased.
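The toward/away distinction can be derived from the trend of consecutive distance values, as in the following sketch; the noise margin is an illustrative assumption.

```python
# Direction of motion from a sequence of distance values: a falling trend
# means the object approaches the sensor, a rising trend means it moves away.

def motion_direction(distances, margin_m=0.01):
    """Classify a sequence of distance values as 'toward', 'away' or 'unclear'."""
    trend = distances[-1] - distances[0]
    if trend < -margin_m:
        return "toward"
    if trend > margin_m:
        return "away"
    return "unclear"

print(motion_direction([0.50, 0.42, 0.33, 0.25]))  # expected: "toward"
print(motion_direction([0.25, 0.33, 0.42, 0.50]))  # expected: "away"
```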

As explained above, the cleaning robot can have several distance sensors for gesture control. In particular, the cleaning robot can comprise a first distance sensor for capturing first distance data and a second distance sensor for capturing second distance data, where the first distance sensor has a first detection area along a first detection axis and the second distance sensor has a second detection area along a second detection axis. The control unit can then be set up to detect the control gesture on the basis of both the first and the second distance data. Using several distance sensors with different detection areas allows further differentiation between control gestures, and thus the input of a larger number of control instructions. The convenience of the user interface can thus be increased further.

The first detection axis and the second detection axis can run essentially parallel to one another. Furthermore, they can be offset from one another such that the first detection area and the second detection area do not overlap, or at least do not overlap completely. A state can thus be detected in which an object is in the first detection area but not in the second, and/or in which an object is in the second detection area but not in the first. Reliable recognition of different control gestures on the basis of the first and second distance data can thus be made possible.

The control unit can in particular be set up to determine, on the basis of the first and second distance data, the temporal order in which an object enters the first detection area of the first distance sensor and the second detection area of the second distance sensor. The control gesture can then be determined as a function of this order. The number of distinguishable control gestures (and thus the number of possible control instructions) can thus be increased further.
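Such an entry-order evaluation can be sketched as follows. Here each sensor delivers boolean presence samples; the sample data and the gesture names ("left-to-right", "right-to-left") are illustrative assumptions.

```python
# Derive a swipe direction from the order in which two offset distance
# sensors first detect the object.

def first_entry_index(samples):
    """Index of the first sample in which the sensor sees the object (or None)."""
    for i, present in enumerate(samples):
        if present:
            return i
    return None

def swipe_direction(first_sensor, second_sensor):
    """'left-to-right' if the first sensor triggers first, 'right-to-left' otherwise."""
    t1 = first_entry_index(first_sensor)
    t2 = first_entry_index(second_sensor)
    if t1 is None or t2 is None or t1 == t2:
        return "unclear"
    return "left-to-right" if t1 < t2 else "right-to-left"

# Object crosses sensor 1 before sensor 2:
print(swipe_direction([False, True, True, False, False],
                      [False, False, False, True, True]))  # expected: "left-to-right"
```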

The detection area of a distance sensor used for gesture control can extend at least partially horizontally away from the cleaning robot during operation. In particular, a distance sensor arranged on the side wall of the cleaning robot can be used. A cleaning robot typically comprises several such distance sensors (as environment sensors) in order to detect, while the robot is moving and navigating, whether it is approaching an obstacle. The control unit can thus be set up to navigate and move the cleaning robot as a function of the distance data.

The distance data of a distance sensor that has an essentially horizontal (beam-shaped) detection area and is typically used for the robot's environment recognition can thus also be used to provide a user interface. For this purpose, the control unit can be set up to distinguish, on the basis of the distance data, in particular on the basis of the temporal sequence of distance values indicated by the distance data, whether an object in the detection area of the sensor is an obstacle or an object performing the control gesture. This can be done using a machine-learned classifier, for example.

With a distance sensor having a horizontal (beam-shaped) detection area, gesture control can conveniently be performed with the user's foot. The foot can be guided in a defined manner into, through and/or out of the detection area in order to perform a control gesture (one that differs from the movement of an ordinary obstacle). A robust and convenient user interface for a cleaning robot can thus be provided in a particularly cost-efficient manner. In particular, one or more distance sensors that are already present and used for navigation can also serve as a user interface at no extra cost.

The distance sensor used for gesture control can be a switching sensor that is set up to generate a switching signal as distance data when an object enters or exits its detection area. Using such a cost-efficient switching sensor is an economical way of enabling gesture control for a cleaning robot, particularly when several distance sensors are used.

The distance data can thus indicate in a binary manner (possibly only) whether or not an object is in the detection area. Alternatively or additionally, the distance data may only indicate whether an object is at a distance from the sensor that is equal to or smaller than the detection distance; in particular, the distance data may not indicate any distance value, for an object within the detection area, other than the detection distance itself. Nevertheless, different control gestures for different control instructions can be distinguished in a robust manner, particularly when several such distance sensors (in particular switching sensors) are used.
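Even with a purely binary switching signal, gestures can be distinguished, for example by counting how often the object enters the detection area within a time window. The mapping of entry counts to gesture names below is an illustrative assumption.

```python
# Distinguish gestures from a binary switching signal by counting
# False->True transitions (object entering the detection area).

def count_entries(samples):
    """Count how often the object newly enters the detection area."""
    entries, previous = 0, False
    for present in samples:
        if present and not previous:
            entries += 1
        previous = present
    return entries

def binary_gesture(samples):
    entries = count_entries(samples)
    if entries == 1:
        return "single swipe"
    if entries == 2:
        return "double swipe"
    return "no gesture"

print(binary_gesture([False, True, True, False, False]))        # expected: "single swipe"
print(binary_gesture([False, True, False, False, True, False])) # expected: "double swipe"
```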

According to a further aspect, a method for controlling a cleaning robot is described. The method comprises capturing, by means of a distance sensor of the cleaning robot, distance data relating to an object that is guided into a detection area of the distance sensor. The detection area is preferably accessible to a hand and/or a foot of a user of the cleaning robot during operation of the cleaning robot. In addition, the method comprises detecting, on the basis of the distance data, a control gesture that is associated with a control instruction to the cleaning robot. The method further comprises operating the cleaning robot in accordance with the detected control instruction.

It should be noted that any aspects of the method and/or cleaning robot described in this document can be combined with one another in many ways. In particular, the features of the claims can be combined with one another in many ways.

The invention is described in more detail below with reference to the exemplary embodiments shown in the accompanying drawing, in which:
Figure 1a
the underside of an exemplary cleaning robot in a perspective view;
Figure 1b
the top of an exemplary cleaning robot;
Figure 1c
a cleaning robot in a side view;
Figure 2a
exemplary gestures over a distance sensor;
Figure 2b
exemplary gestures over a sensor unit with multiple distance sensors;
Figure 2c
exemplary control gestures; and
Figure 3
a flowchart of an exemplary method for controlling a cleaning robot.

As stated at the outset, the present document deals with the reliable, convenient and efficient control and operation of a cleaning robot. In this context, Fig. 1a shows an exemplary cleaning robot 100 in a perspective view. In particular, Fig. 1a shows the underside 122 of a cleaning robot 100, which faces the floor to be cleaned when the cleaning robot 100 is in cleaning operation.
The underside 122 of the cleaning robot 100 typically has one or more drive units 101 (with one or more drive wheels) by means of which the cleaning robot 100 can be moved in order to clean different areas of a floor. In addition, a cleaning robot 100 typically comprises one or more cleaning units 102 (e.g. with a cleaning brush) which are set up to clean the floor under the cleaning robot 100. Furthermore, a cleaning robot 100 can comprise one or more distance sensors 110 which are set up to capture distance data relating to a distance between the underside 122 of the cleaning robot 100 and the floor to be cleaned. A control unit 130 (see Fig. 1b) of the cleaning robot 100 can be set up to recognize, on the basis of the distance data, a drop-off (e.g. a stair step) lying in front of the cleaning robot 100 in its direction of movement.

A user interface can be arranged on the top side 121 of the cleaning robot 100, which enables a user of the cleaning robot 100 to make control inputs. Furthermore, one or more environment sensors, which are set up to detect the environment of the cleaning robot 100, can be arranged on a side wall 123 of the cleaning robot 100 that connects the top side 121 with the underside 122. The control unit 130 of the cleaning robot 100 can be set up to navigate the cleaning robot 100 through the environment on the basis of the sensor data of the one or more environment sensors.

Fig. 1b schematically shows the top side 121 of a cleaning robot 100. The cleaning robot 100 can have at least one distance sensor 120 on the top side 121, which is set up to capture distance data relating to the distance of an object arranged above the cleaning robot 100 from the top side 121. A distance sensor 120 can comprise a transmission module that is set up to emit a distance measurement signal. Furthermore, a distance sensor 120 can comprise a receiving module that is set up to receive the distance measurement signal reflected by an object. The distance of the object can then be determined from the transit time of the emitted and re-received distance measurement signal. The distance measurement signal can be an optical and/or acoustic signal.

Fig. 1c schematically shows the side wall 123 of a cleaning robot 100. Fig. 1c also shows a distance sensor 120 arranged on the top side 121, which is set up to capture vertical distance data relating to a distance in the vertical direction upwards. In addition, the cleaning robot 100 can comprise one or more distance sensors 120 that are set up to capture horizontal distance data relating to a distance in the horizontal direction. The horizontal distance data can be used, for example, for the navigation of the cleaning robot 100.

The distance sensor 120 of a cleaning robot 100 can be used to provide a robust and convenient user interface for the cleaning robot 100 in a cost-efficient manner. In particular, a time profile of distance values can be determined for this purpose. For example, the distance to an object in front of the distance sensor 120 can be determined at a certain sampling frequency (e.g. 10 Hz, 100 Hz or more) by means of a distance sensor 120. This temporal sequence of distance values can be evaluated by the control unit 130 of the cleaning robot 100 in order to detect a gesture that is associated with a control instruction for controlling the cleaning robot 100. The cleaning robot 100 can then be operated as a function of the control instruction.
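A minimal sketch of such an evaluation might classify the overall trend of the sampled sequence as an approach towards the sensor or a retreat away from it. The 5 cm trend threshold and the function name are illustrative assumptions, not values taken from the patent.

```python
def classify_motion(distances, min_change=0.05):
    """Classify a time sequence of distance values (in metres) as an
    approach, a retreat, or no gesture, based on the overall trend.
    The 5 cm minimum change is an assumed, illustrative threshold."""
    if len(distances) < 2:
        return "none"
    change = distances[-1] - distances[0]
    if change <= -min_change:
        return "approach"  # object moved towards the sensor
    if change >= min_change:
        return "retreat"   # object moved away from the sensor
    return "none"

print(classify_motion([0.60, 0.45, 0.30, 0.20]))  # approach
print(classify_motion([0.20, 0.35, 0.50]))        # retreat
```

In the example from the description, "approach" could then trigger the switch to the silent mode and "retreat" the switch back to the power mode.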

A distance sensor 120 can thus be provided on the cleaning robot 100 (possibly in addition to existing sensors), which can in particular measure vertically (upwards). As shown in Figures 2a and 2b, the cleaning robot 100 can be controlled via gestures by recognizing body parts 201, 202 of a user, e.g. a hand 202 or a foot 201. For example, a movement 200 with the hand 202 towards the distance sensor 120 can indicate to the cleaning robot 100 that a switch should be made from a power mode to a silent mode. Conversely, a movement 200 with the hand 202 away from the distance sensor 120 can indicate that a switch should be made from the silent mode to the power mode. Moving 200 the foot 201 over the cleaning robot 100 can, for example, cause the cleaning robot 100 to switch to a pause mode.
Inputs with the hand 202 and inputs with the foot 201 can be distinguished on the basis of their different characteristic distances from the distance sensor 120.

Fig. 2b shows the use of several distance sensors 120 with several distance measurement signals 221, 222. The number of recognizable gestures can be increased by using several distance sensors measuring in the vertical direction (arranged at a certain distance from one another). In addition to an up-and-down movement 200, transverse movements 200 can also be detected by using a plurality of distance sensors 120 arranged next to one another (e.g. a swipe from left to right or a swipe from top left to bottom right).
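With two side-by-side sensors, the direction of a transverse swipe follows from the order in which the object first enters each capture range. The following sketch assumes hypothetical timestamps of the first "enter" event per sensor; sensor labels and names are illustrative.

```python
def swipe_direction(left_enter_time, right_enter_time):
    """Infer the swipe direction from the times (in seconds) at which an
    object first entered the capture ranges of two sensors arranged
    next to one another (labels 'left'/'right' are assumptions)."""
    if left_enter_time < right_enter_time:
        return "left-to-right"  # left sensor triggered first
    if right_enter_time < left_enter_time:
        return "right-to-left"  # right sensor triggered first
    return "ambiguous"          # simultaneous triggering

print(swipe_direction(0.10, 0.25))  # left-to-right
```

This is the idea behind claim 12: the temporal order of entry into the first and second capture ranges determines the control gesture.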

The control unit 130 can thus be set up to recognize, on the basis of the distance data of at least one distance sensor 120 of a cleaning robot 100, a movement 200 of a body part 201, 202 of a user of the cleaning robot 100 within the detection range of the distance sensor 120. A distinction can be made between different movements or gestures 200 from a plurality of movements or gestures 200. The different movements or gestures 200 can be associated with correspondingly different control instructions to the cleaning robot 100. The control unit 130 of a cleaning robot 100 can thus recognize a movement or gesture 200 in the detection range of the distance sensor 120 on the basis of the distance data of the distance sensor 120. Furthermore, the control unit 130 can operate the cleaning robot 100 in accordance with the control instruction associated with the movement or gesture 200.

Fig. 2c shows exemplary distance data that can be captured by a distance sensor 120. The distance data can comprise a time sequence 231 of distance values 230. The distance values 230 can each indicate the distance of an object 201, 202 in the detection range of the distance sensor 120. A temporal profile of the distance of an object 201, 202 can thus be captured as distance data. A specific control gesture 200 can then be detected on the basis of the time sequence 231 of distance values 230. Different control gestures 200 can differ, for example, in the value range of the distance values 230.
For example, a comparison with a distance threshold value 232 can reveal whether a control gesture 200 was performed with a hand 202 or with a foot 201. Alternatively or in addition, the speed of a movement or gesture 200 can be taken into account in order to differentiate between different control gestures 200. Thus, on the basis of the distance data, different control gestures 200 for different control instructions to a cleaning robot 100 can be distinguished in order to provide a convenient and extensive user interface.
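The threshold comparison can be sketched as follows. The concrete threshold value and the assignment of "below the threshold means foot, above means hand" are assumptions made for this sketch; the patent only states that hand and foot gestures occur at different characteristic distances relative to a threshold 232.

```python
DISTANCE_THRESHOLD_M = 0.35  # illustrative value for the threshold 232

def gesture_body_part(distances):
    """Guess whether a gesture was performed with a foot or a hand by
    comparing the smallest observed distance value with a threshold.
    Which side of the threshold corresponds to which body part is an
    assumption made for this sketch."""
    closest = min(distances)
    return "foot" if closest <= DISTANCE_THRESHOLD_M else "hand"

print(gesture_body_part([0.30, 0.25, 0.28]))  # foot
print(gesture_body_part([0.90, 0.75, 0.80]))  # hand
```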

Fig. 3 shows a flow chart of an exemplary method 300 for controlling a cleaning robot 100, in particular a vacuum robot. The method 300 comprises capturing 301, by means of a distance sensor 120 of the cleaning robot 100, distance data relating to an object 201, 202 which is guided into a detection area of the distance sensor 120 and/or is moved within a detection area of the distance sensor 120.
The distance sensor 120 is preferably designed such that its detection area is accessible to a hand 202 and/or a foot 201 of a user of the cleaning robot 100 while the cleaning robot 100 is in operation. In particular, the distance sensor 120 can have a detection area which extends essentially or at least partially vertically upwards away from the top side 121 of the cleaning robot 100. The detection area can be limited to a detection distance, wherein the detection distance can correspond to the typical height of a person (e.g. 2 metres or less). The distance sensor 120 can thus be designed in such a way that a user of the cleaning robot 100 can comfortably guide a hand 202 and/or a foot 201 into the detection area, lead it out of the detection area and/or move it within the detection area (in particular in order to perform a certain gesture with the hand 202 and/or the foot 201).

In addition, the method 300 comprises detecting 302, on the basis of the distance data, a control gesture 200 that is associated with a control instruction to the cleaning robot 100. The distance data can indicate a specific chronological sequence of distance values of the object 201, 202 introduced into the detection area or located in the detection area. It can thus be recognized on the basis of the distance data that a user has performed a specific gesture, in particular a control gesture 200 (e.g. with the hand 202 and/or the foot 201). The recognized control gesture 200 can correspond to a specific control instruction to the cleaning robot 100. Exemplary control instructions are: activating or deactivating a specific operating mode of the cleaning robot 100 and/or stopping or activating the cleaning robot 100.

The method 300 further comprises operating 303 the cleaning robot 100 in accordance with the detected control instruction. In this way, convenient and robust control of a cleaning robot 100 by a user can be enabled in a cost-efficient manner (by using a relatively inexpensive distance sensor).
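The final step can be sketched as a simple lookup from a detected gesture to its associated control instruction. The gesture and mode names below merely echo the examples given in the description and are otherwise hypothetical.

```python
# Hypothetical mapping of detected control gestures to control
# instructions; the names echo the examples from the description.
GESTURE_TO_INSTRUCTION = {
    "hand_approach": "silent_mode",
    "hand_retreat": "power_mode",
    "foot_over_robot": "pause_mode",
}

def operate(gesture):
    """Return the control instruction associated with a detected
    gesture, or None if the gesture has no associated instruction."""
    return GESTURE_TO_INSTRUCTION.get(gesture)

print(operate("foot_over_robot"))  # pause_mode
```

Such a table is also where user-configured gestures (mentioned further below) could be registered.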

The measures described in this document enable a user to conveniently control a cleaning robot 100 using gestures. The gestures can be recognized in a cost-efficient manner by means of at least one distance sensor 120. The control unit 130 can enable a user to train individual gestures and/or to configure gestures and the control instructions associated with them. This enables intuitive, individualized operation (without using an additional input device).

The present invention is not restricted to the exemplary embodiments shown. In particular, it should be noted that the description and the figures are only intended to illustrate the principle of the proposed cleaning robot 100 and/or method 300.

Claims (15)

  1. Cleaning robot (100) for cleaning an undersurface, wherein the cleaning robot (100) comprises
    - at least one distance sensor (120), which is configured to capture distance data relating to an object (201, 202) moved by a user into a capture range of the distance sensor (120), wherein the distance sensor (120) is preferably designed such that the capture range is accessible to a hand (202) and/or a foot (201) of the user of the cleaning robot (100) during operation of the cleaning robot (100); and
    - a control unit (130), which is configured
    - to detect, on the basis of the distance data, a control gesture (200) that is associated with a control instruction to the cleaning robot (100); and
    - to operate the cleaning robot (100) in accordance with the control instruction in response.
  2. Cleaning robot (100) according to claim 1, wherein the capture range during operation of the cleaning robot (100) extends at least partially vertically upwards from the cleaning robot (100).
  3. Cleaning robot (100) according to one of the preceding claims, wherein the distance sensor (120) is configured to emit an optical and/or an acoustic distance measurement signal, in particular an ultrasound signal and/or an infrared signal, in order to capture the distance data.
  4. Cleaning robot (100) according to one of the preceding claims, wherein
    - the distance data comprises a time sequence (231) of distance values (230);
    - the distance value (230) at a specific point in time depends on a distance of the object (201, 202) from the distance sensor (120) at the specific point in time; and
    - the control unit (130) is configured to detect the control gesture (200) on the basis of the sequence (231) of distance values (230).
  5. Cleaning robot (100) according to claim 4, wherein
    - the control unit (130) is configured to compare the time sequence (231) of distance values (230) with a plurality of different reference sequences of distance values (230) in order to detect the control gesture (200); and
    - the plurality of different reference sequences corresponds to a corresponding plurality of different control gestures (200) and is associated with a corresponding plurality of different control instructions.
  6. Cleaning robot (100) according to claim 5, wherein the different reference sequences differ
    - in respect of the one or more distance values (230) that they contain; and/or
    - in respect of a time sequence of different distance values (230); and/or
    - in respect of a speed with which different distance values (230) follow one another.
  7. Cleaning robot (100) according to one of the preceding claims, wherein
    - the control unit (130) is configured to detect the control gesture (200) by means of a machine-learning classifier;
    - the classifier is embodied to identify, within a value range of possible distance data, a plurality of sub-spaces for a corresponding plurality of different control gestures (200); and
    - the classifier comprises in particular a neural network.
  8. Cleaning robot (100) according to one of the preceding claims, wherein
    - the control unit (130) is configured to determine, on the basis of the distance data and depending on a distance threshold value (232), whether the control gesture (200) was performed by the user's hand (202) or foot (201); and
    - the control instruction associated with the control gesture (200) depends on whether the control gesture (200) was performed by the user's hand (202) or foot (201).
  9. Cleaning robot (100) according to one of the preceding claims, wherein the control unit (130) is configured
    - to determine, on the basis of the distance data, whether an object (201, 202) is moving towards the distance sensor (120) or whether an object (201, 202) is moving away from the distance sensor (120); and
    - to detect the control gesture (200) on this basis.
  10. Cleaning robot (100) according to one of the preceding claims, wherein
    - the cleaning robot (100) comprises a first distance sensor (120) for capturing first distance data and a second distance sensor (120) for capturing second distance data; and
    - the control unit (130) is configured to detect the control gesture (200) on the basis of the first distance data and on the basis of the second distance data.
  11. Cleaning robot (100) according to claim 10, wherein
    - the first distance sensor (120) has a first capture range along a first capture axis;
    - the second distance sensor (120) has a second capture range along a second capture axis;
    - the first capture axis and the second capture axis extend in particular in parallel to one another; and
    - the first capture axis and the second capture axis extend offset to one another such that the first capture range and the second capture range do not overlap.
  12. Cleaning robot (100) according to one of claims 10 to 11, wherein the control unit (130) is configured
    - to determine, on the basis of the first and second distance data, a temporal order in which an object (201, 202) enters into the first capture range of the first distance sensor (120) and into the second capture range of the second distance sensor (120); and
    - to determine the control gesture (200) depending on the temporal order.
  13. Cleaning robot (100) according to one of the preceding claims, wherein
    - the capture range during operation of the cleaning robot (100) extends at least partially horizontally away from the cleaning robot (100); and/or
    - the control unit (130) is configured to navigate the cleaning robot (100) depending on the distance data; and/or
    - the control unit (130) is configured to make a distinction, on the basis of the distance data, in particular on the basis of a time sequence (231) of distance values (230) indicated by the distance data, as to whether an object (201, 202) in the capture range of the distance sensor (120) is an obstacle or an object (201, 202) performing the control gesture (200).
  14. Cleaning robot (100) according to one of the preceding claims, wherein
    - the distance data indicates in binary form whether or not an object (201, 202) is located in the capture range; and/or
    - the capture range extends as far as a capture distance from the distance sensor (120); and/or
    - the capture distance is equal to or less than a typical human shoulder height; and/or
    - the distance data only indicates whether an object (201, 202) is located within a distance from the distance sensor (120), which distance is equal to or less than the capture distance; and/or
    - the distance data does not indicate a value deviating from the capture distance for the distance of an object (201, 202) located in the capture range; and/or
    - the distance sensor (120) is a switch sensor, which is configured to generate a switching signal as distance data when an object (201, 202) enters into the capture range of the distance sensor (120).
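Claim 14 reduces the distance data to a binary form: the sensor reports only whether an object is within the capture distance, not how far away it is. A minimal sketch of that reduction, with an assumed capture distance (the claim only bounds it by a typical human shoulder height):

```python
CAPTURE_DISTANCE_M = 1.5  # assumed value for illustration

def switching_signal(raw_distances_m: list[float]) -> list[int]:
    """Reduce raw range readings to the binary distance data of claim 14.

    1 means an object is within the capture distance, 0 means it is not;
    no value deviating from the capture distance is reported.
    """
    return [1 if d <= CAPTURE_DISTANCE_M else 0 for d in raw_distances_m]

print(switching_signal([2.0, 1.2, 0.4, 1.9]))  # [0, 1, 1, 0]
```

This is what allows the distance sensor (120) to be a simple switch sensor: the downstream gesture logic needs only the switching signal, not a calibrated range measurement.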
  15. Method (300) for controlling a cleaning robot (100), wherein the method (300) comprises
    - capturing (301), by means of a distance sensor (120) of the cleaning robot (100), distance data relating to an object (201, 202) moved into a capture range of the distance sensor (120); wherein the distance sensor (120) is preferably designed such that the capture range is accessible to a hand (202) and/or a foot (201) of a user of the cleaning robot (100) during operation of the cleaning robot (100); and
    - detecting (302), on the basis of the distance data, a control gesture (200) that is associated with a control instruction to the cleaning robot (100); and
    - operating (303) the cleaning robot (100) in accordance with the detected control instruction.
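The three steps of method (300) — capturing (301), detecting (302), and operating (303) — can be sketched as one control step. The gesture names, the instruction table, and the callback interface are illustrative assumptions layered on top of the claimed method, not part of it.

```python
# Assumed mapping from detected control gestures to control instructions.
GESTURE_TO_INSTRUCTION = {
    "swipe_1_to_2": "start_cleaning",
    "swipe_2_to_1": "return_to_dock",
}

def control_step(distance_data, detect_gesture, execute) -> None:
    """Run one pass of method (300) over already-captured distance data."""
    gesture = detect_gesture(distance_data)       # step (302): detect gesture
    if gesture in GESTURE_TO_INSTRUCTION:
        execute(GESTURE_TO_INSTRUCTION[gesture])  # step (303): operate robot

executed = []
control_step([0, 1, 1, 0], lambda data: "swipe_1_to_2", executed.append)
print(executed)  # ['start_cleaning']
```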
EP18214814.8A 2018-01-17 2018-12-20 Cleaning robot and method for controlling the same Active EP3524114B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102018200726.8A DE102018200726A1 (en) 2018-01-17 2018-01-17 Cleaning robot and method for controlling a cleaning robot

Publications (2)

Publication Number Publication Date
EP3524114A1 EP3524114A1 (en) 2019-08-14
EP3524114B1 true EP3524114B1 (en) 2020-09-30

Family

ID=64755262

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18214814.8A Active EP3524114B1 (en) 2018-01-17 2018-12-20 Cleaning robot and method for controlling the same

Country Status (2)

Country Link
EP (1) EP3524114B1 (en)
DE (1) DE102018200726A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113303708A (en) * 2020-02-27 2021-08-27 佛山市云米电器科技有限公司 Control method for maintenance device, and storage medium
CN111918163B (en) * 2020-07-24 2022-06-21 青岛歌尔智能传感器有限公司 Man-machine interaction control method, device, equipment and computer readable storage medium
CN113934307B (en) * 2021-12-16 2022-03-18 佛山市霖云艾思科技有限公司 Method for starting electronic equipment according to gestures and scenes
DE102022203701A1 (en) 2022-04-12 2023-10-12 Volkswagen Aktiengesellschaft Cleaning robot for cleaning a surface
CN118760162A (en) * 2023-11-14 2024-10-11 科沃斯家用机器人有限公司 Self-mobile device control method, self-mobile device, and storage medium

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101842459B1 (en) * 2011-04-12 2018-05-14 엘지전자 주식회사 Robot cleaner and method for controlling the same
DE202012005255U1 (en) * 2012-05-29 2012-06-26 Youse Gmbh Operating device with a gesture monitoring unit
DE102012105608A1 (en) * 2012-06-27 2014-01-02 Miele & Cie. Kg Self-propelled cleaning device and method for operating a self-propelled cleaning device
KR101385981B1 (en) * 2012-08-14 2014-05-07 (주)동부로봇 Cleaning robot for having gesture recignition function, and the contol method
DE102012108008A1 (en) * 2012-08-30 2014-03-06 Miele & Cie. Kg Self-propelled suction device for automated cleaning of surface, has sensor for detecting characteristics of environment of suction device, where sensor is arranged to detect liquid located on surface to be cleaned
US9375847B2 (en) * 2013-01-18 2016-06-28 Irobot Corporation Environmental management systems including mobile robots and methods using same
KR101306501B1 (en) * 2013-04-26 2013-09-09 주식회사 모뉴엘 A robot vacuum cleaner and its control method
KR101551576B1 (en) * 2014-06-16 2015-09-08 엘지전자 주식회사 Robot cleaner, apparatus and method for recognizing gesture
DE102014212418A1 (en) * 2014-06-27 2015-12-31 Robert Bosch Gmbh Autonomous service robot
US9860077B2 (en) * 2014-09-17 2018-01-02 Brain Corporation Home animation apparatus and methods
KR101649665B1 (en) * 2015-04-29 2016-08-30 엘지전자 주식회사 Moving robot and controlling method thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
DE102018200726A1 (en) 2019-07-18
EP3524114A1 (en) 2019-08-14

Similar Documents

Publication Publication Date Title
EP3524114B1 (en) Cleaning robot and method for controlling the same
EP3412191B1 (en) Automatic soil preparation equipment
EP3441840B1 (en) Method for operating a self-propelled cleaning device
EP3497529B1 (en) Method for operating an independently moving surface treatment device
EP2680097B1 (en) Self-propelled cleaning device and method for operating the same
DE10157016A1 (en) Robot cleaner, calculates travel distance and travel trajectory based on metal component detector output to regulate robot driving mechanism
DE102013200457B4 (en) Operating device for a motor vehicle with a gesture monitoring unit
EP3144436A1 (en) Method and assembly for operating a sanitary device
EP3733037B1 (en) System comprising a manually guided soil working implement, an exclusively automatically operated soil working implement and a computing device
DE102016101954A1 (en) Electrically adjustable furniture
EP3683645A1 (en) System comprising a first ground processing device and a second ground processing device and method for operating such a system
EP3162266B1 (en) Cleaning device and method for use of said cleaning device
WO2018114432A1 (en) Operator control apparatus, which can be operated in a contact-free manner, for a motor vehicle, and also motor vehicle and operating method for the operator control apparatus
DE102021203447A1 (en) Method for adjusting a vehicle door and system for adjusting a vehicle door
DE102007032533B4 (en) Method for operating a medical diagnostic and / or therapeutic device and medical diagnostic and / or therapeutic device
EP3437535B1 (en) Vacuum cleaner
EP2929367B1 (en) Sensor device for a computer system, computer system comprising a sensor device and operating method for a sensor device
EP4182764A1 (en) Control of a cleaning robot
EP3995065A1 (en) Automatically moving cleaning device
DE102015110292A1 (en) Floor element, floor structure, cooking appliance, system with floor element or floor construction and method for controlling such a system
EP3646126B1 (en) Mobile unit, mobile terminal, and associated method
WO2021244862A1 (en) Method for bypassing impassable obstacles by a robot
DE102013208999B4 (en) Method and device for a capacitive non-contact input system based on a single sensor surface
DE102016110494A1 (en) Operating system for operating a medical device, and medical device
EP4410166A1 (en) Method for operating a mobile self-propelled device

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200214

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20200520

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1317900

Country of ref document: AT

Kind code of ref document: T

Effective date: 20201015

Ref country code: DE

Ref legal event code: R096

Ref document number: 502018002612

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201231

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201230

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201230

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20200930

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210201

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210130

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502018002612

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20201231

26N No opposition filed

Effective date: 20210701

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201220

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210130

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201231

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211231

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211231

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20231220

Year of fee payment: 6

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20231220

Year of fee payment: 6

Ref country code: DE

Payment date: 20231231

Year of fee payment: 6