WO2020131687A2 - Methods and systems for defining virtual boundaries for a robotic device - Google Patents

Methods and systems for defining virtual boundaries for a robotic device

Info

Publication number
WO2020131687A2
Authority
WO
WIPO (PCT)
Prior art keywords
robotic device
boundary
detectable
environment
virtual boundary
Prior art date
Application number
PCT/US2019/066516
Other languages
French (fr)
Other versions
WO2020131687A3 (en)
Inventor
Aurle Y. GAGNE
Stephen D. HERR
Original Assignee
Diversey, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Diversey, Inc. filed Critical Diversey, Inc.
Publication of WO2020131687A2 publication Critical patent/WO2020131687A2/en
Publication of WO2020131687A3 publication Critical patent/WO2020131687A3/en


Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors

Definitions

  • TITLE: METHODS AND SYSTEMS FOR DEFINING VIRTUAL BOUNDARIES FOR A ROBOTIC DEVICE
  • the present disclosure is in the technical field of robotic device navigation, particularly navigation of robotic devices (e.g., robotic cleaning devices). More particularly, the present disclosure is directed to a method for defining virtual boundaries for the robotic devices when navigating in environments.
  • Robotic devices have the ability to minimize the human effort involved in performing everyday tasks.
  • robotic devices may be used as cleaning devices to help maintain and clean surfaces, such as hard floor surfaces, carpets, and the like. While robotic devices are useful, it can be challenging for robotic devices to operate in a variety of different locations.
  • the system may include a robotic device that includes a user interface, one or more sensors, a processor, and a non-transitory computer readable medium.
  • the non-transitory computer readable medium may include programming instructions that are configured to cause the processor to execute the methods described in this disclosure.
  • the method may include receiving a selection of a mode corresponding to a virtual boundary (for example, via the user interface of the robotic device).
  • the virtual boundary may delineate an avoid area to be avoided by the robotic device during navigation in the environment.
  • the method may also include identifying at least one detectable boundary in the environment for defining the virtual boundary, defining the virtual boundary relative to the at least one detectable boundary based on the mode, and operating the robotic device in the environment by causing the robotic device to move in the environment while ensuring that the robotic device does not cross the virtual boundary to enter the avoid area.
  • the robotic device may receive the selection of the mode via the user interface.
  • the robotic device may identify the at least one detectable boundary by analyzing sensor data collected by the one or more sensors.
  • defining the virtual boundary relative to the at least one detectable boundary may include identifying one or more rule sets corresponding to the mode.
  • the one or more rule sets may include rules for defining one or more of the following: a shape of the virtual boundary, a position of the virtual boundary relative to the robotic device, and/or a position of the virtual boundary relative to the at least one detectable boundary.
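  • the disclosure does not fix a concrete encoding for such rule sets; purely as a hypothetical sketch (the mode names, field names, and values below are illustrative assumptions, not the patented implementation), a mode-to-rule-set mapping might look like:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class RuleSet:
    """Hypothetical encoding of one mode's boundary-definition rules."""
    ignore_side: Optional[str]  # sensor side to discard ("left"/"right"), if any
    boundary_side: str          # side of the robot on which the virtual boundary is placed
    shape: str                  # "linear", "replicate", or "mirror"
    distance_rule: str          # "predefined" or "equal_to_detectable"

# One rule set per user-selectable mode; the entries paraphrase the mode
# descriptions given later in this document.
MODE_RULE_SETS = {
    "ignore_left":  RuleSet("left",  "left",  "linear", "predefined"),
    "ignore_right": RuleSet("right", "right", "linear", "predefined"),
    "mirror_left":  RuleSet(None,    "right", "mirror", "equal_to_detectable"),
    "mirror_right": RuleSet(None,    "left",  "mirror", "equal_to_detectable"),
}

print(MODE_RULE_SETS["mirror_left"].boundary_side)  # -> right
```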
  • identifying at least one detectable boundary in the environment for defining the virtual boundary may include identifying one or more detectable boundaries in the environment; and identifying the at least one detectable boundary from amongst the one or more detectable boundaries based on one or more rules corresponding to at least one of the following: position of the avoid area in the environment, position of the at least one detectable boundary, characteristics of sensor data collected from the avoid area, and/or shape of the at least one detectable boundary.
  • the one or more detectable boundaries in the environment may be identified based on sensor data collected by the robotic device, and/or map data corresponding to the environment.
  • the detectable boundary may not be located in the avoid area if characteristics of sensor data collected from the environment are not reliable.
  • the virtual boundary may be defined relative to the at least one detectable boundary by traversing an initial path through the permissible area to determine a shape and a position of the virtual boundary relative to the robotic device or relative to the at least one detectable boundary.
  • the virtual boundary may be defined relative to the at least one detectable boundary by defining a linear virtual boundary at a location relative to the robotic device and/or relative to the at least one detectable boundary.
  • a shape of the virtual boundary may be a replication of a shape of the at least one detectable boundary, a mirror image of a shape of the at least one detectable boundary, and/or linear.
  • the at least one detectable boundary may be identified to be located to a left side of the robotic device, and the virtual boundary may be defined relative to the at least one detectable boundary by defining the virtual boundary to a right side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment.
  • the predetermined distance may be equal to a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
  • the at least one detectable boundary may be identified to be located to a left side of the robotic device; and the virtual boundary may be defined relative to the at least one detectable boundary by defining the virtual boundary to the left side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment.
  • the predetermined distance may be less than a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
  • the at least one detectable boundary may be identified to be located to a right side of the robotic device; and the virtual boundary may be defined relative to the at least one detectable boundary by defining the virtual boundary to a left side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment.
  • the predetermined distance may be equal to a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
  • the at least one detectable boundary may be identified to be located to a right side of the robotic device; and the virtual boundary may be defined relative to the at least one detectable boundary by defining the virtual boundary to the right side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment.
  • the predetermined distance may be less than a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
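  • taken together, the four cases above reduce to a simple geometric rule; the sketch below is a non-authoritative illustration of that rule (all function and parameter names are assumptions), with distances measured from the robot’s initial position:

```python
from typing import Optional, Tuple

def place_virtual_boundary(detectable_side: str, detectable_dist: float,
                           same_side: bool,
                           predefined_dist: Optional[float] = None) -> Tuple[str, float]:
    """Return (side, distance from the robot's initial position) of the virtual boundary.

    Opposite-side placement mirrors the detectable boundary at an equal
    distance; same-side placement uses a predefined distance that must be
    less than the distance to the detectable boundary itself.
    """
    if same_side:
        if predefined_dist is None or predefined_dist >= detectable_dist:
            raise ValueError("same-side boundary must lie short of the detectable boundary")
        return detectable_side, predefined_dist
    opposite = "right" if detectable_side == "left" else "left"
    return opposite, detectable_dist

# Detectable boundary 2.0 m to the robot's left, mirrored to the right:
print(place_virtual_boundary("left", 2.0, same_side=False))                      # ('right', 2.0)
# Same detectable boundary, virtual boundary placed 1.5 m to the left instead:
print(place_virtual_boundary("left", 2.0, same_side=True, predefined_dist=1.5))  # ('left', 1.5)
```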
  • FIG. 1 depicts a block diagram of an example robotic device, in accordance with the embodiments described herein;
  • FIG. 2 depicts an embodiment of an example system that includes the robotic device shown in FIG. 1, in accordance with the embodiments described herein;
  • FIG. 3 depicts a perspective view of an environment in which a robotic device can operate, in accordance with the embodiments described herein;
  • FIGS. 4A - 4I illustrate various example modes of operation of a robotic device, in accordance with the embodiments described herein;
  • FIG. 5 depicts an example method for defining a virtual boundary in an environment based on one or more detectable boundaries, in accordance with the embodiments described herein; and
  • FIG. 6 depicts an example of internal hardware that may be included in any of the electronic components of the system, in accordance with the embodiments described herein.
  • the present disclosure describes embodiments for the creation of virtual boundaries for aiding in the navigation of robotic devices such as robotic cleaning devices in an environment.
  • the virtual boundaries can be generated based on environmental data of the environments (e.g., one or more detectable boundaries) in which the robotic devices are intended to navigate.
  • the robotic devices can then navigate autonomously within the environments based on the virtual boundary.
  • the terms “computing device” and “electronic device” refer to a device having a processor and a non-transitory, computer-readable medium (i.e., memory).
  • the memory may contain programming instructions in the form of a software application that, when executed by the processor, causes the device to perform one or more processing operations according to the programming instructions.
  • An electronic device also may include additional components such as a touch-sensitive display device that serves as a user interface, as well as a camera for capturing images.
  • An electronic device also may include one or more communication hardware components such as a transmitter and/or receiver that will enable the device to send and/or receive signals to and/or from other devices, whether via a communications network or via near-field or short-range communication protocols. If so, the programming instructions may be stored on the remote device and executed on the processor of the computing device as in a thin client or Internet of Things (IoT) arrangement.
  • Example components of an electronic device are discussed below in the context of FIG. 6.
  • the terms “memory,” “memory device,” “computer-readable medium” and “data store” each refer to a non-transitory device on which computer-readable data, programming instructions or both are stored. Unless the context specifically states that a single device is required or that multiple devices are required, the terms “memory,” “memory device,” “computer-readable medium” and “data store” include both the singular and plural embodiments, as well as portions of such devices such as memory sectors.
  • A “processor” or “processing device” is a hardware component of an electronic device that is configured to execute programming instructions.
  • the term “processor” may refer to a single processor or to multiple processors that together implement various steps of a process. Unless the context specifically states that a single processor is required or that multiple processors are required, the term “processor” includes both the singular and plural embodiments.
  • the term “robot” or “robotic device” refers to an electro-mechanical machine guided by a processor.
  • the robotic device may also include a memory that may contain programming instructions in the form of a software application that, when executed by the processor, causes the device to perform one or more processing operations according to the programming instructions.
  • one example of a robotic device is an automated guided vehicle (AGV). An AGV is generally a mobile robot that follows markers or wires in the floor, or uses electromagnetic emitter-detectors, including for example sonar, a vision system or lasers, for navigation.
  • Mobile robots can be found in industry, military and security environments. They also appear as consumer products, for entertainment or to perform certain tasks like vacuum cleaning and home assistance.
  • Mobile robotic devices may interact or interface with humans to provide a number of services that range from home assistance to commercial assistance and more.
  • a mobile robotic device can assist elderly people with everyday tasks, including, but not limited to, maintaining a medication regime, mobility assistance, communication assistance (e.g., video conferencing, telecommunications, Internet access, etc.), home or site monitoring (inside and/or outside), person monitoring, and/or providing a personal emergency response system (PERS).
  • the mobile robotic device can provide videoconferencing (e.g., in a hospital setting), a point of sale terminal, interactive information/marketing terminal, etc.
  • Mobile robotic devices need to navigate in a robust or reliable manner, for example, to avoid obstacles and reach intended destinations.
  • the term “virtual boundary” refers to a non-physical border that acts as a demarcation between two or more areas of an environment.
  • the virtual boundary of the current disclosure may be an arbitrarily shaped border such as, without limitation, a straight line, a curved line, a zigzag line, or any other shape.
  • Virtual barriers can keep a robotic device from exiting or entering a particular area, e.g., to prevent a cleaning robot from moving from a hallway into a cubicle area, from a tiled area onto a carpeted floor, etc.
  • the virtual boundary may be temporary in that, upon satisfaction of one or more conditions, the robotic device may be permitted to cross the virtual boundary.
  • a first component may be an “upper” component and a second component may be a “lower” component when a light fixture is oriented in a first direction.
  • the relative orientations of the components may be reversed, or the components may be on the same plane, if the orientation of a light fixture that contains the components is changed.
  • the claims are intended to include all orientations of a device containing such components.
  • the embodiments disclosed herein can be used to navigate a robotic device in an environment that does not include a landmark, boundary, or other physical parameters to delineate or demarcate an area of the environment that should be avoided (referred to in this document as an “avoid area”) and/or an area within which travel is permitted (referred to in this document as a “permissible area”) by the robotic device.
  • the environment to be navigated may include a tiled floor area that can be cleaned by water (permissible area) adjacent to a carpeted area that cannot be cleaned by water (avoid area), with no detectable boundary or landmark between the tiled floor and the carpeted area that the robotic device’s sensors can detect.
  • the boundary may be detectable by hardware such as cameras and processes such as edge detection or other image processing techniques, but doing so to detect the boundary between the tiled floor and the carpeted area may add processing time, require prior learning and/or memory space, or the like.
  • defining the virtual boundary may be performed in addition to detections performed by one or more sensors of the robotic device.
  • the environment to be navigated may include an area that should be avoided by the robotic device, but the data collected by the sensors of the robotic device corresponding to the area to be avoided may be unreliable such that it cannot be used to navigate the robotic device.
  • the permissible area may be bordered by dynamic moving objects that would provide unreliable data for navigation and localization.
  • the avoid area may include objects that are not easily detectable by the sensors of the robotic device due to their size, composition, shape, etc.
  • the avoid area may be an area of an otherwise contiguous surface that is considered low-traffic. It may be desirable to avoid cleaning the low-traffic areas to, for example, preserve a floor coating finish.
  • an avoid area may simply be an area that includes one or more objects that can act as detectable boundaries, but which does not need to be cleaned for certain reasons.
  • the embodiments disclosed in this disclosure include methods that allow a robotic device to define a virtual boundary between permissible areas and avoid areas based at least partly on detectable boundaries or landmarks in the environment.
  • the detectable boundaries and/or landmarks may or may not be located in the avoid area.
  • This enables the robotic device to avoid certain areas in the environment in the absence of and/or without using complex sensing devices that require computation time or power.
  • the embodiments disclosed herein also can be used to help ensure that the navigation methods do not overwhelm the memory and/or processing capabilities of the robotic device since not all robotic devices have sufficient memory or processing power to handle or process large numbers of data points gathered by complex three-dimensional sensors.
  • the methods disclosed in this disclosure may obviate the need for an environment to be mapped before operating a robotic device in the environment and/or for a user to manually define a permissible area.
  • FIG. 1 illustrates a block diagram of components of an example embodiment of a robotic device 100.
  • the components and interaction of components described with respect to the robotic device 100 may be implemented in any other embodiments of robotic devices.
  • the embodiments of robotic devices described herein are also not limited to the components and interaction of components described with respect to the robotic device 100, but can be implemented in a number of other ways.
  • the robotic device 100 may be an autonomous device that is capable of automatically navigating its environment.
  • the robotic device may include, without limitation, one or more sensors 102, a processing device 104, a memory 106, a power source 108, a communications interface 110, a user interface 112, and one or more vehicle function devices 114.
  • the robotic device 100 is located on the floor 116.
  • the robotic device 100 is configured to move across the floor 116.
  • the one or more sensors 102 may include one or more sensors located on the robotic device 100 and may be configured to provide information about the robotic device 100 itself and/or the environment around the robotic device 100.
  • the one or more sensors 102 may include a proximity sensor configured to detect a distance from the robotic device to any object in a field of the proximity sensor.
  • examples of proximity sensors include infrared sensors, light detection and ranging (LIDAR) sensors, sonar sensors, and the like.
  • the one or more sensors 102 may also include sensors to detect an orientation or heading of the robotic device 100, such as a gyroscope or a compass, or to detect a speed and/or acceleration of the robotic device 100, such as an accelerometer or encoders.
  • the one or more sensors 102 may also include sensors that detect characteristics about the environment around the robotic device 100, such as a temperature sensor (e.g., a thermocouple or a thermistor), a humidity sensor (e.g., a hygrometer), a pressure sensor (e.g., a barometer or a piezoelectric sensor), an infrared (IR) sensor, or any other sensor.
  • the one or more sensors 102 may include a landmark sensor that is configured to sense a detectable boundary or landmark in an environment and/or detect the position, orientation, etc., of the robotic device 100 with respect to the detectable boundary or landmark, such as a wall, fence, shelf, line on the floor, etc.
  • landmark sensors may include, without limitation, LIDAR sensors, sonar sensors, cameras, infrared sensors, laser detection and ranging (LADAR) sensors, radio frequency (RF) sensors, or the like.
  • the processing device 104 may be configured to control one or more functions of the robotic device 100 such as, without limitation, navigation in an environment, cleaning (if a cleaning robotic device), communication with a user or an external system, or the like.
  • the processing device 104 is configured to control the movements of the robotic device 100 based on, without limitation, readings from the one or more sensors 102, a digital map of the environment, readings from one or more sensors in the environment, a predefined path of movement, or any other information, or combinations thereof.
  • the processing device may receive information from the one or more sensors 102 and analyze it to control the navigation of the robotic device 100.
  • the robotic device 100 also includes memory 106 and the processing device may write information to and/or read information from the memory 106. For example, one or more rules for generating a virtual boundary may be stored in the memory 106 and the processing device 104 may read the data from the memory 106 to aid in controlling movements of the robotic device.
  • the processing device 104 may communicate with each of the other components of the robotic device 100, via for example, a communication bus or any other suitable mechanism.
  • a communications interface 110 may be configured to facilitate communication of data into and out of the robotic device 100.
  • the communications interface 110 may include, without limitation, a WiFi transceiver, a Bluetooth transceiver, an Ethernet port, a USB port, and/or any other type of wired and/or wireless communication interfaces.
  • the communications interface 110 is configured to transmit data to and receive data from computing devices and/or networks that are not included in the robotic device 100.
  • the user interface 112 may include any type of input and/or output devices that permit a user to input commands into or receive information from the robotic device 100.
  • the user input/output devices 112 may include, without limitation, a push button, a toggle switch, a touchscreen display, an LED light interface, a keyboard, a microphone, a speaker, or any other kind of input and/or output device.
  • the user input/output devices 112 may permit a user to control the operation of the robotic device 100, define settings (e.g., modes) of the robotic device 100, receive information about operations of the robotic device 100, troubleshoot problems with the robotic device 100, or the like.
  • the vehicle functional devices 114 of the robotic device 100 may include any device that is capable of causing the robotic device 100 to function in a particular way.
  • the vehicle functional devices 114 may include one or more motors that drive wheels of the robotic device 100 to cause it to move.
  • the vehicle functional devices 114 may include a steering mechanism to control a direction of movement of the robotic device 100.
  • the vehicle functional devices 114 may include a cleaning device configured to clean a surface on which the robotic device 100 moves (e.g., a sweeper, vacuum, mop, polisher, fluid dispenser, squeegee, or the like).
  • the vehicle functional devices 114 can include any number of other functional devices that cause the robotic device 100 to function.
  • the processing device 104 may also be configured to control operation of the vehicle functional devices 114.
  • the power source 108 is configured to provide power to the other components of the robotic device 100.
  • the power source 108 may be coupled to and capable of providing power to each of the one or more sensors 102, the processing device 104, the memory 106, the communications interface 110, the user interface 112, and/or the vehicle function devices 114.
  • the power source 108 may include one or more of a rechargeable battery, a non-rechargeable battery, a solar cell panel, an internal combustion engine, a chemical reaction power generator, or any other device configured to provide power to the robotic device 100 and its components.
  • FIG. 2 illustrates an example embodiment of a system 200 that includes the robotic device 100.
  • the system may include a network 210 that is in communication with the communications interface 110 of the robotic device 100.
  • the network 210 may include a wireless network, a wired network, or any combination of wired and/or wireless networks.
  • the system 200 also includes a remote computing device 220 that is located remotely from the robotic device 100, and is in communication with the robotic device 100 via the network 210.
  • the remote computing device 220 may include, without limitation, a laptop computer, a desktop computer, a server, a mobile phone, a tablet, or any other type of computing device.
  • the robotic device 100 may operate in a facility (e.g., a building, a campus of buildings, etc.), where the network 210 may include a private network to the facility (e.g., a WiFi network associated with the facility), and the remote computing device 220 may be a computing device located in the facility at a location different from the operation of the robotic device 100.
  • the robotic device 100 may operate in a facility (e.g., a building, a campus of buildings, etc.) where the network 210 may include a public network (e.g., the Internet), and the remote computing device 220 may be located somewhere other than the facility (e.g., in a “cloud” data center, in a facility of a distributor of the robotic device 100, etc.). It will be understood that many other arrangements of the network 210 and the remote computing device 220 are within the scope of this disclosure. It will be understood that the remote computing device 220 may be a single computing device or may be a number of computing devices that are capable of interacting with each other.
  • FIG. 3 illustrates a perspective view of an example environment 300 in which a robotic device 100 such as a robotic cleaning device is configured to move and operate.
  • the environment 300 is in the form of a room that includes a floor 310 and one or more detectable boundaries and/or landmarks (e.g., a wall 312, a wall 314, and a shelf 316).
  • the terms “detectable boundary” or “landmark” interchangeably refer to any object, detectable boundary, landmark, or other differentiating boundary (e.g., a detectable boundary formed by a change in color, a change in texture, a change in material, a change in orientation, a change in height, a change in density, or the like) that is detectable by one or more sensors of the robotic device using any now or hereafter known methods.
  • Examples of the detectable boundary may include, without limitation, furniture pieces, walls, stairwell edges, a line painted on the floor, or an edge between types of flooring (e.g., between hard floor and carpeted floor, between different color tiles, between different types of carpet, or the like).
  • the environment 300 may also include a ceiling, other walls, doors, windows, and the like, which are not depicted in the view shown in FIG. 3.
  • the environment 300 may include at least one avoid area 320 that should be avoided during navigation of the robotic device.
  • the avoid area 320 may include, without limitation, a carpeted area; a cluttered area that does not have enough navigable space and/or has objects that cannot be relied upon as landmarks during navigation (e.g., wires, toys, or the like); an entrance to another area (e.g., a cubicle); an area that includes moving objects (e.g., people) that renders sensor data corresponding to the area unreliable for navigation purposes; or any other area that has no detectable boundary on at least one side but should be avoided during navigation of the robotic device.
  • the avoid area 320 is bounded on at least one side by a virtual boundary 321. As described below, the virtual boundary is defined by the robotic device relative to at least one of the one or more detectable boundaries in the environment.
  • the virtual boundary 321 is a demarcation between a permissible area where the robotic device can travel and an avoid area where the robotic device is not allowed to travel.
  • a virtual boundary may correspond in whole or in part to one or more detectable boundaries but it does not need to do so.
  • the system may use a detectable boundary or landmark to define the virtual boundary, where the detectable boundary or landmark may be located anywhere in the environment 300 such as, without limitation, in a permissible area, in an avoid area, and/or any other area.
  • the robotic device 100 is located on the floor 310 and is configured to move across the floor 310.
  • the robotic device 100 may be, for example, a SWINGBOT, an AEROBOT, or a DUOBOT cleaning robot.
  • the virtual boundary 321 may be defined as a function of distance, orientation, and/or shape of a detectable boundary or landmark.
  • the robotic device 100 is configured to define the virtual boundary 321 relative to one or more detectable boundaries based on one or more rule sets. Each rule set may correspond to a user-selectable mode of the robotic device.
  • a user may select the mode based on, for example and without limitation, the location of one or more detectable boundaries or landmarks in the environment; location of the avoid area; reliability of sensor data collected from the avoid area; size and/or configuration of the permissible area and/or the avoid area; current position and/or orientation of the robotic device; or combinations thereof.
  • the robotic device may define a virtual boundary based on at least the following rules: (1) ignore sensor data collected from the left side of the robotic device; and (2) define a virtual boundary to the left side of the robotic device at a predefined distance from an initial position of the robotic device, and relative to a detectable boundary or landmark. Since sensor data collected from the left side of the robotic device is ignored, the selected detectable boundary or landmark will be located on the right side of the robotic device.
  • the robotic device may define a virtual boundary based on at least the following rules: (1) select a detectable boundary or landmark located on the left side of the robotic device; and (2) define a virtual boundary to the right side of the robotic device relative to the detectable boundary or landmark such that the virtual boundary is at a distance from the robotic device that is equal to an initial distance of the detectable boundary or landmark from the robotic device.
  • the robotic device may define a virtual boundary based on at least the following rules: (1) ignore sensor data collected from the right side of the robotic device; and (2) define a virtual boundary to the right side of the robotic device at a predefined distance from an initial position of the robotic device, and relative to a detectable boundary or landmark. Since sensor data collected from the right side of the robotic device is ignored, the selected detectable boundary or landmark will be located on the left side of the robotic device.
  • the robotic device may define a virtual boundary based on at least the following rules: (1) select a detectable boundary or landmark located on the right side of the robotic device; and (2) define a virtual boundary to the left side of the robotic device relative to the detectable boundary or landmark such that the virtual boundary is at a distance from the robotic device that is equal to an initial distance of the detectable boundary or landmark from the robotic device.
  • the robotic device may define a virtual boundary based on at least the following rules: (1) use all sensor data collected from the environment; (2) define a virtual boundary, relative to a detectable boundary or landmark, in an area between the robotic device’s current position and the avoid area (that is located to the right side of the robotic device); and (3) the virtual boundary is defined at a predefined distance from an initial position of the robotic device in the area between the robotic device’s current position and the avoid area. Since all sensor data is used, the detectable boundary or landmark may be located on either side of the robotic device.
  • the robotic device may define a virtual boundary based on at least the following rules: (1) ignore sensor data collected from right side of the robotic device; and (2) define a virtual boundary in an area between the robotic device’s current position and the avoid area (that is located on the right side of the robotic device) relative to a detectable boundary or landmark that is located on the left side of the robotic device.
  • the virtual boundary may be at a predefined distance from an initial position of the robotic device.
  • the robotic device may define a virtual boundary based on at least the following rules: (1) use all sensor data collected from the environment; (2) define a virtual boundary, relative to a detectable boundary or landmark, in an area between the robotic device’s current position and the avoid area (that is located to the left side of the robotic device); and (3) the virtual boundary is defined at a predefined distance from an initial position of the robotic device in the area between the robotic device’s current position and the avoid area. Since all sensor data is used, the detectable boundary or landmark may be located on either side of the robotic device.
  • the robotic device may define a virtual boundary based on at least the following rules: (1) ignore sensor data collected from the left side of the robotic device; and (2) define a virtual boundary in an area between the robotic device’s current position and the avoid area (that is located on the left side of the robotic device) relative to a detectable boundary or landmark that is located on the right side of the robotic device.
  • the virtual boundary may be at a predefined distance from an initial position of the robotic device.
  • the robotic device may define a virtual boundary based on at least the following rules: (1) use all sensor data collected from the environment; (2) define a virtual boundary relative to a detectable boundary or landmark that is closest to the robotic device; (3) define the virtual boundary in an area that is on a side of the robotic device that does not include the detectable boundary or landmark; and (4) the virtual boundary is defined such that it is at a distance that is equal to the distance of the detectable boundary or landmark from an initial position of the robotic device.
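  • as a minimal, hypothetical sketch of how the “ignore” rules in these rule sets might act on landmark-sensor data (the reading format and names below are assumptions, not the disclosed implementation):

```python
from typing import List, Optional, Tuple

def select_detectable_boundary(readings: List[Tuple[str, float]],
                               ignore_side: Optional[str] = None) -> Tuple[str, float]:
    """Pick the nearest usable boundary reading, discarding an ignored side.

    `readings` holds (side, distance_m) pairs from the landmark sensors;
    the returned pair identifies the selected detectable boundary.
    """
    usable = [r for r in readings if r[0] != ignore_side]
    if not usable:
        raise RuntimeError("no detectable boundary on a usable side")
    return min(usable, key=lambda r: r[1])

# "Ignore Left": left-side data is discarded, so the selected detectable
# boundary is necessarily on the robot's right, as the text notes.
print(select_detectable_boundary([("left", 1.2), ("right", 2.5)], ignore_side="left"))
# -> ('right', 2.5)
```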
  • the shape of the virtual boundary may be arbitrary.
  • the virtual boundary may be a straight line or any other shape irrespective of the shape or configuration of the corresponding detectable boundary or landmark.
  • the shape of the virtual boundary may be the same as the shape or configuration of the corresponding detectable boundary or landmark.
  • the shape of the virtual boundary may be a mirror image of the shape or configuration of the corresponding detectable boundary or landmark.
  • Each of the above described modes may also include rules for defining the shape of the virtual boundary.
  • the distance of the virtual boundary from the robotic device may be a function of (e.g., equal to) the distance of the robotic device from the detectable boundary or landmark.
  • in the mirror left mode, the robotic device may define a virtual boundary that is a straight line parallel to a detectable boundary on the left of the robotic device, such that the virtual boundary is located on the right side of the robotic device at a distance from the robotic device that is equal to the distance between the robotic device and the detectable boundary.
  • the robotic device may detect the detectable boundary and determine its distance from the detectable boundary using one or more of its sensors (e.g., an ultrasonic sensor).
  • the distance of the virtual boundary from the robotic device may be predetermined based on a mode, adjustable by the user, and/or automatically determined by the robotic device based on sensor data collected by the robotic device.
  • a user may provide the distance to an avoid area and the robotic device may define the virtual boundary at a distance that is less than or equal to the distance to the avoid area.
  • the virtual boundary may be defined at a default distance such as at a distance equal to the distance to the detectable boundary.
  • the distance of a virtual boundary is always less than the distance to a detectable boundary in modes where the virtual boundary and the detectable boundary are on the same side of the robotic device. In other words, a virtual boundary will not be placed beyond a detectable boundary.
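  • a small sketch of this distance logic follows; the safety margin that keeps a same-side virtual boundary strictly short of the detectable boundary is a hypothetical choice, not a value from the disclosure:

```python
from typing import Optional

def resolve_boundary_distance(detectable_dist: float,
                              user_avoid_dist: Optional[float] = None,
                              same_side: bool = False) -> float:
    """Choose the virtual boundary's distance from the robot's initial position.

    Defaults to the distance of the detectable boundary; a user-supplied
    distance to the avoid area caps it; a same-side virtual boundary is
    never placed at or beyond the detectable boundary itself.
    """
    dist = detectable_dist if user_avoid_dist is None else min(user_avoid_dist, detectable_dist)
    if same_side:
        dist = min(dist, 0.95 * detectable_dist)  # hypothetical 5% margin
    return dist

print(resolve_boundary_distance(3.0, user_avoid_dist=2.0))  # 2.0
print(resolve_boundary_distance(3.0, same_side=True))       # 2.85
```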
  • FIGS. 4A - 4D illustrate the respective “Ignore Left”, “Ignore Right”, “Mirror Right”, and “Mirror Left” modes of a robotic device.
  • in FIG. 4A, the virtual boundary 421 is defined on the left of the robotic device 100 such that it is at a predetermined distance from an initial position of the robotic device 100 and follows a straight line path parallel to the detectable boundary.
  • FIG. 4C illustrates a particular case of the embodiment shown in FIG. 4A where the distance of the virtual boundary 421 from the robotic device 100 is equal to the distance between the robotic device 100 in its initial position and the detectable boundary.
  • an operator may place the robotic device 100 at a desired position within the permissible area 422 and select the appropriate mode; the robotic device 100 then defines the virtual boundary 421 on the left side of the robotic device 100 relative to its initial position, e.g., adjacent to or a predetermined distance from the robotic device 100, as illustrated.
  • the virtual boundary 421 prevents the robotic device 100 from moving outside the permissible area 422 and into the avoid area 420.
  • FIG. 4B illustrates a virtual boundary 421 that is defined on the right side of the robotic device 100 such that it is at a predetermined distance from an initial position of the robotic device 100 and follows a straight line path parallel to the detectable boundary.
  • FIG. 4D illustrates a particular case of the embodiment shown in FIG. 4B where the distance of the virtual boundary 421 from the robotic device 100 is equal to the distance between the robotic device 100 in its initial position and the detectable boundary.
  • Setup for the modes illustrated in FIGS. 4B and 4D is similar to the setup for the modes illustrated in FIGS. 4A and 4C, albeit with the virtual boundary 421 being located on the right side of the robotic device 100.
  • the virtual boundary 421 prevents the robotic device 100 from moving outside the permissible area 422 and into an avoid area 420 on the right of the virtual boundary 421. It will be understood that the straight line shape of the virtual boundary is shown as an example and is not limiting.
  • FIGS. 4E - 4G illustrate the “Allow Left” mode of a robotic device 100.
  • the robotic device 100 defines a straight line virtual boundary 421 using the left detectable boundary and/or the right detectable boundary, at a predetermined distance from an initial position of the robotic device and on the right side of the robotic device.
  • the distance may be provided by a user such that the virtual boundary 421 exists between the robotic device’s current position and the avoid area 420.
  • the robotic device 100 defines a straight line virtual boundary 421 parallel to a right detectable boundary, and at a predetermined distance from an initial position of the robotic device 100 on the right side of the robotic device 100.
  • the predetermined distance may be included in a mode and/or defined by a user such that the virtual boundary 421 exists between the robotic device’s current position and an avoid area 420 on the right of the robotic device so as to prevent the robotic device 100 from moving outside the permissible area 422 and into the avoid area 420.
  • the robotic device 100 defines a straight line virtual boundary 421 parallel to a left detectable boundary, and at a predetermined distance from an initial position of the robotic device 100 on the right side of the robotic device 100, where the shape of the virtual boundary is the same as that of the left detectable boundary.
  • the predetermined distance may be included in a mode and/or defined by a user such that the virtual boundary 421 exists between the robotic device’s current position and an avoid area 420 on the right of the robotic device so as to prevent the robotic device 100 from moving outside the permissible area 422 and into the avoid area 420.
  • FIGS. 4H and 4I illustrate the “Mirror” mode of a robotic device.
  • the detectable boundary is not in the shape of a straight line.
  • the robotic device 100 defines a virtual boundary 421 on the right side of the robotic device 100.
  • the virtual boundary 421 in these example embodiments either replicates (FIG. 4H) or mirrors (FIG. 4I) the shape of a detectable boundary that is closest to the robotic device 100 and is located at a distance from the robotic device that is equal to the distance between the robotic device and the selected detectable boundary.
  • an operator may place the robotic device 100 at a desired position within the permissible area 422 and select the appropriate “Mirror” mode; the robotic device 100 will then traverse an initial path through the permissible area 422 to define the virtual boundary 421 on the right side of the robotic device 100.
  • the robotic device 100 will use the one or more sensors 102 to follow the contour of the detectable boundary.
  • the robotic device 100 replicates the detectable boundary so that the virtual boundary 421 follows a similar contour and is positioned on the right side of the robotic device 100 at a distance that is the same or similar to the distance between the robotic device 100 and the detectable boundary.
  • the robotic device 100 travels forward in a straight line instead of following the contour of the detectable boundary.
  • the robotic device 100 will still use the one or more sensors 102 to detect a distance to the detectable boundary and mirror the detectable boundary so that the virtual boundary 421 follows a mirrored contour positioned on the right side of the robotic device 100 at a distance that is the same or similar to the distance between the robotic device 100 and the detectable boundary.
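  • assuming, purely for illustration, a straight initial pass along +x with the detectable boundary on the robot’s left (+y), the replicate/mirror distinction can be sketched as follows (a simplified hypothetical construction, not the patented algorithm):

```python
from typing import List, Tuple

def boundary_from_left_wall(path: List[Tuple[float, float]],
                            left_dists: List[float],
                            mode: str = "mirror") -> List[Tuple[float, float]]:
    """Build virtual boundary points on the robot's right from left-wall ranges.

    `path` holds (x, y) samples along a straight pass in +x; `left_dists`
    holds the sensed distance to the left detectable boundary at each sample.
    Mirroring reflects the wall's contour about the path; replication keeps
    the same contour, translated to the right of the path.
    """
    d0 = left_dists[0]  # initial robot-to-wall distance fixes the offset
    out = []
    for (x, y), d in zip(path, left_dists):
        if mode == "mirror":
            out.append((x, y - d))           # reflected contour
        else:  # "replicate"
            out.append((x, y + d - 2 * d0))  # same contour, shifted right
    return out

path = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
dists = [1.5, 1.2, 1.5]  # the wall bows toward the path in the middle
print(boundary_from_left_wall(path, dists, "mirror"))     # [(0.0, -1.5), (1.0, -1.2), (2.0, -1.5)]
print(boundary_from_left_wall(path, dists, "replicate"))  # [(0.0, -1.5), (1.0, -1.8), (2.0, -1.5)]
```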
  • FIG. 5 illustrates an example method 500 for defining a virtual boundary in an environment based on one or more detectable boundaries. The method 500 is described as a series of steps for convenience and not to limit the disclosure: the steps need not be performed as a series or in the order shown in FIG. 5; the process may be integrated; and one or more steps may be performed together, simultaneously, or in an alternate order. Likewise, any setup processes described above may be integrated, and one or more of their steps may be performed together, simultaneously, or in an alternate order.
  • a robotic device may be positioned in an environment such that it can observe or monitor at least one detectable boundary or landmark in the environment (e.g., using one or more sensors). This may be in a permissible area of the environment.
  • the robotic device may be configured to position itself automatically so it can observe or monitor a detectable boundary or landmark. Additionally and/or alternatively, it may be positioned by the user accordingly.
  • the robotic device may receive a selection of a virtual boundary definition mode from a user.
  • the virtual boundary definition modes may be one or more of those described above (e.g., mirror left, mirror right, allow left, allow right, etc.) and/or other modes.
  • the mode selected will cause the robotic device to access and use a mode-specific rule set to define the virtual boundary.
  • a user may select the mode based on, for example and without limitation, the location of one or more detectable boundaries or landmarks in the environment; location of the avoid area; reliability of sensor data collected from the avoid area; size and/or configuration of the permissible area and/or the avoid area; current position and/or orientation of the robotic device; or combinations thereof.
  • the robotic device may automatically select an appropriate mode without any user input and/or based on a user input that does not include a mode selection. For example, if the sensor data collected from the left side of the robotic device is unreliable, constantly changing, and/or unavailable, the robotic device may choose a mode (e.g., Ignore Left mode) that uses sensor data corresponding to a detectable boundary from the right side of the robotic device. Similarly, the robotic device may receive an identification of an avoid area and automatically select a mode that creates a virtual boundary between the robotic device and the avoid area based on, for example, the reliability of sensor data corresponding to detectable boundaries in the environment.
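  • a toy illustration of such automatic selection appears below; the boolean reliability flags stand in for whatever data-quality measure the device actually uses, and the mode names match the hypothetical ones used earlier:

```python
def auto_select_mode(left_reliable: bool, right_reliable: bool) -> str:
    """Pick a boundary-definition mode from per-side sensor-data reliability."""
    if not left_reliable and right_reliable:
        return "ignore_left"   # rely on a detectable boundary to the right
    if not right_reliable and left_reliable:
        return "ignore_right"  # rely on a detectable boundary to the left
    if left_reliable and right_reliable:
        return "mirror"        # both sides usable; mirror the nearest boundary
    raise RuntimeError("no reliable landmark data on either side")

print(auto_select_mode(left_reliable=False, right_reliable=True))  # ignore_left
```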
  • the robotic device may identify at least one detectable boundary or landmark to be used for defining the virtual boundary (506).
  • each mode includes rules for identification of the detectable boundary or landmark. For example, if an “Ignore Left” mode selection is received, the robotic device cannot use a detectable boundary or landmark located to the left of the robotic device.
  • a user may provide a selection of the detectable boundary or landmark to be used for defining the virtual boundary.
  • the robotic device may prompt a user to confirm the identification of the detectable boundary or landmark to be used for defining the virtual boundary.
  • the robotic device may use one or more of its sensors (e.g., the landmark sensors) to detect the position, shape, orientation, distance, or other similar information about a plurality of detectable boundaries or landmarks in the environment.
  • the robotic device may use pre-loaded and/or pre-mapped data corresponding to the environment to identify detectable boundaries and landmarks in the environment. For example, the robotic device may receive a mode selection as mirror left and use the rule set corresponding to the mirror left mode to define the virtual boundary.
  • this may include detecting a detectable boundary on the left side of the robotic device (using one or more sensors), determining a distance between the robotic device and the detectable boundary, and defining a virtual boundary on the right side of the robotic device that is at a distance equal to the determined distance between the robotic device and the detectable boundary.
  • the defined virtual boundary may also mirror or replicate the shape of the detected detectable boundary, or may be any other arbitrary shape.
  • the robotic device may then use the rules corresponding to the received mode selection to identify the at least one detectable boundary or landmark to be used for defining the virtual boundary.
  • the robotic device may define a virtual boundary relative to the identified detectable boundary or landmark based on the rules corresponding to the received mode selection.
  • the robotic device may identify, without limitation, the shape, location, distance, etc. of the virtual boundary based on the rules corresponding to the received mode selection (described above). In certain embodiments, the robotic device may save the defined virtual boundary for future operations of the robotic device.
  • the step of defining the virtual boundary relative to the identified detectable boundary or landmark may be performed over the course of making an initial pass through the permissible area to appropriately characterize the identified detectable boundary or landmark in order to define the shape of the virtual boundary.
  • the robotic device may navigate the environment such that it does not cross into an avoid area demarcated by the defined virtual boundary.
  • the robotic device may use the landmark sensors to maintain its desired position and orientation relative to the identified detectable boundary or landmark.
  • the robotic device need not actually define a virtual boundary and/or save it. Rather, it may use the rules corresponding to the received mode selection to navigate the environment relative to the identified detectable boundary or landmark without actually defining the virtual boundary.
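  • whether the boundary is stored explicitly or only implied by the rules, enforcement amounts to rejecting any planned motion that would cross it; the sketch below uses a standard segment-intersection test (an illustrative choice, not taken from the disclosure), with the boundary represented as a polyline:

```python
from typing import List, Tuple

Point = Tuple[float, float]

def _ccw(a: Point, b: Point, c: Point) -> float:
    """Signed area test: >0 if a->b->c turns counter-clockwise."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1: Point, p2: Point, q1: Point, q2: Point) -> bool:
    """True if segment p1-p2 strictly crosses segment q1-q2."""
    d1, d2 = _ccw(q1, q2, p1), _ccw(q1, q2, p2)
    d3, d4 = _ccw(p1, p2, q1), _ccw(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def move_allowed(current: Point, target: Point, boundary: List[Point]) -> bool:
    """Reject any planned move that would cross the virtual boundary polyline."""
    return not any(segments_cross(current, target, a, b)
                   for a, b in zip(boundary, boundary[1:]))

virtual = [(2.0, -5.0), (2.0, 5.0)]  # a straight virtual boundary at x = 2
print(move_allowed((0.0, 0.0), (1.5, 0.0), virtual))  # True: stays in the permissible area
print(move_allowed((0.0, 0.0), (3.0, 0.0), virtual))  # False: would enter the avoid area
```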
  • FIG. 6 depicts an example of internal hardware that may be included in any of the electronic components of the system, such as a robotic device, sensor, etc. having a processing capability, or a local or remote computing device that is in communication with the robotic device.
  • An electrical bus 600 serves as an information highway interconnecting the other illustrated components of the hardware.
  • Processor 605 is a central processing device of the system, configured to perform calculations and logic operations required to execute programming instructions.
  • the terms “processor” and “processing device” may refer to a single processor or any number of processors in a set of processors that collectively perform a set of operations, such as a central processing unit (CPU), a graphics processing unit (GPU), a remote server, or a combination of these.
  • Random access memory (RAM), flash memory, hard drives, and other devices capable of storing electronic data constitute examples of memory devices 625 that may store the programming instructions.
  • a memory device may include a single device or a collection of devices across which data and/or instructions are stored.
  • Various embodiments of the invention may include a computer- readable medium containing programming instructions that are configured to cause one or more processors, robotic devices and/or sensors to perform the functions described in the context of the previous figures.
  • An optional display interface 630 may permit information from the bus 600 to be displayed on a display device 635 in visual, graphic or alphanumeric format.
  • An audio interface and audio output (such as a speaker) also may be provided.
  • Communication with external devices may occur using various communication devices 640 such as a wireless antenna, an RFID tag and/or short-range or near-field communication transceiver, each of which may optionally communicatively connect with other components of the device via one or more communication systems.
  • the communication device(s) 640 may be configured to be communicatively connected to a communications network, such as the Internet, a local area network or a cellular telephone data network.
  • the hardware may also include a user interface sensor 645 that allows for receipt of data from input devices 650 such as a keyboard, a mouse, a joystick, a touchscreen, a touch pad, a remote control, a pointing device and/or microphone.
  • when the electronic device is a smartphone or another image-capturing device, digital images of a document or other image content may be acquired via a camera 620 that can capture video and/or still images.
  • one or more components of the system 600 may be located remotely from other components of the system 600, such as in a distributed system. Furthermore, one or more of the components may be combined, and additional components performing functions described herein may be included in the system 600. Thus, the system 600 can be adapted to accommodate a variety of needs and circumstances.
  • the depicted and described architectures and descriptions are provided for exemplary purposes only and are not limiting to the various embodiments described herein.
  • Embodiments described herein may be implemented in various ways, including as computer program products that comprise articles of manufacture.
  • a computer program product may include a non-transitory computer-readable storage medium storing programming instructions.
  • Such non-transitory computer-readable storage media include all computer-readable media (including volatile and non-volatile media).
  • the embodiments described herein may also be implemented as methods, apparatus, systems, computing devices, and the like. As such, embodiments described herein may take the form of an apparatus, system, computing device, and the like executing instructions stored on a computer readable storage medium to perform certain steps or operations. Thus, embodiments described herein may be implemented entirely in hardware, entirely in a computer program product, or in an embodiment that comprises a combination of computer program products and hardware performing certain steps or operations.
  • Embodiments described herein may be made with reference to block diagrams and flowchart illustrations.
  • blocks of a block diagram and flowchart illustrations may be implemented in the form of a computer program product, in an entirely hardware embodiment, in a combination of hardware and computer program products, or in apparatus, systems, computing devices, and the like carrying out instructions, operations, or steps.
  • Such instructions, operations, or steps may be stored on a computer readable storage medium for execution by a processing element in a computing device.
  • retrieval, loading, and execution of code may be performed sequentially such that one instruction is retrieved, loaded, and executed at a time.
  • retrieval, loading, and/or execution may be performed in parallel such that multiple instructions are retrieved, loaded, and/or executed together.
  • embodiments can produce specifically configured machines performing the steps or operations specified in the block diagrams and flowchart illustrations. Accordingly, the block diagrams and flowchart illustrations support various combinations of embodiments for performing the specified instructions, operations, or steps

Abstract

A robotic device includes a user interface, one or more sensors, a processor, and a non-transitory computer readable medium. The robotic device is configured to receive, via the user interface, a selection of a mode corresponding to a virtual boundary. The virtual boundary may delineate an avoid area to be avoided by the robotic device during navigation in the environment. The robotic device is also configured to identify at least one detectable boundary in the environment for defining the virtual boundary, and define the virtual boundary relative to the at least one detectable boundary based on the mode. The robotic device may move in the environment without crossing the virtual boundary to enter the avoid area.

Description

TITLE: METHODS AND SYSTEMS FOR DEFINING VIRTUAL BOUNDARIES FOR
A ROBOTIC DEVICE
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to U.S. Provisional Patent Application No.
62/780,518, filed December 17, 2018, the disclosure of which is incorporated herein by reference in its entirety.
BACKGROUND
[0001] The present disclosure is in the technical field of robotic device navigation, particularly navigation of robotic devices (e.g., robotic cleaning devices). More
particularly, the present disclosure is directed to a method for defining virtual boundaries for the robotic devices when navigating in environments.
[0002] Robotic devices have the ability to minimize the human effort involved in performing everyday tasks. For example, robotic devices may be used as cleaning devices to help maintain and clean surfaces, such as hard floor surfaces, carpets, and the like. While robotic devices are useful, it can be challenging for robotic devices to operate in a variety of different locations.
[0003] Navigation in environments is typically done with the aid of sensors that are on board the robotic devices. In the case of floor cleaning, such robots are generally confined within (i) vertical walls and other obstacles within the rooms of a dwelling, (ii) IR-detected staircases (cliffs) leading downward; and/or (iii) user-placed detectable barriers such as directed IR beams, physical barriers or magnetic tape. Walls provide much of the confinement perimeter. Some robots may try to map the dwelling using a complex system of sensors and/or active or passive beacons (e.g., sonar, RFID or bar code detection, or various kinds of machine vision). These sensors provide information by which the robotic device is able to navigate. However, real-time processing of sensor data can be slow, causing the robotic device to move slowly to allow time for processing of the sensor data.
SUMMARY
[0004] This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
[0005] In an embodiment, systems and methods for controlling the navigation of a robotic device in an environment are disclosed. The system may include a robotic device that includes a user interface, one or more sensors, a processor, and a non-transitory computer readable medium. The non-transitory computer readable medium may include programming instructions that are configured to cause the processor to execute the methods described in this disclosure. The method may include receiving a selection of a mode corresponding to a virtual boundary (for example, via the user interface of the robotic device). The virtual boundary may delineate an avoid area to be avoided by the robotic device during navigation in the environment. The method may also include identifying at least one detectable boundary in the environment for defining the virtual boundary, defining the virtual boundary relative to the at least one detectable boundary based on the mode, and operating the robotic device in the environment by causing the robotic device to move in the environment while ensuring that the robotic device does not cross the virtual boundary to enter the avoid area.
[0006] Optionally, the robotic device may receive the selection of the mode via the user interface. Furthermore, the robotic device may identify the at least one detectable boundary by analyzing sensor data collected by the one or more sensors.
[0007] In certain embodiments, defining the virtual boundary relative to the at least one detectable boundary (based on the mode) may include identifying one or more rule sets corresponding to the mode. Optionally, the one or more rule sets may include rules for defining one or more of the following: a shape of the virtual boundary, a position of the virtual boundary relative to the robotic device, and/or a position of the virtual boundary relative to the at least one detectable boundary.
[0008] In one or more embodiments, identifying at least one detectable boundary in the environment for defining the virtual boundary may include identifying one or more detectable boundaries in the environment; and identifying the at least one detectable boundary from amongst the one or more detectable boundaries based on one or more rules corresponding to at least one of the following: position of the avoid area in the environment, position of the at least one detectable boundary, characteristics of sensor data collected from the avoid area, and/or shape of the at least one detectable boundary.
Optionally, the one or more detectable boundaries in the environment may be identified based on sensor data collected by the robotic device, and/or map data corresponding to the environment. In certain embodiments, the detectable boundary may not be located in the avoid area if characteristics of sensor data collected from the environment are not reliable.
[0009] In certain embodiments, the virtual boundary may be defined relative to the at least one detectable boundary by traversing an initial path through the permissible area to determine a shape and a position of the virtual boundary relative to the robotic device or relative to the at least one detectable boundary.
[0010] In certain other embodiments, the virtual boundary may be defined relative to the at least one detectable boundary by defining a linear virtual boundary at a location relative to the robotic device and/or relative to the at least one detectable boundary. Optionally, a shape of the virtual boundary may be a replication of a shape of the at least one detectable boundary, a mirror image of a shape of the at least one detectable boundary, and/or linear.
[0011] In at least one embodiment, the at least one detectable boundary may be identified to be located to a left side of the robotic device, and the virtual boundary may be defined relative to the at least one detectable boundary by defining the virtual boundary to a right side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment. Optionally, the predetermined distance may be equal to a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
[0012] In one or more embodiments, the at least one detectable boundary may be identified to be located to a left side of the robotic device; and the virtual boundary may be defined relative to the at least one detectable boundary by defining the virtual boundary to the left side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment. Optionally, the predetermined distance may be less than a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
[0013] In at least one embodiment, the at least one detectable boundary may be identified to be located to a right side of the robotic device; and the virtual boundary may be defined relative to the at least one detectable boundary by defining the virtual boundary to a left side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment. Optionally, the predetermined distance may be equal to a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
[0014] In one or more embodiments, the at least one detectable boundary may be identified to be located to a right side of the robotic device; and the virtual boundary may be defined relative to the at least one detectable boundary by defining the virtual boundary to the right side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment. Optionally, the predetermined distance may be less than a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
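By way of illustration only, the placement rules of paragraphs [0011]-[0014] can be sketched in a few lines of Python. This sketch is not part of the disclosed embodiments, and all names in it (Side, LinearBoundary, place_linear_boundary) are hypothetical:

    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional


    class Side(Enum):
        LEFT = "left"
        RIGHT = "right"


    @dataclass
    class LinearBoundary:
        side: Side       # side of the robot on which the virtual boundary lies
        offset_m: float  # distance from the robot's initial position, in meters


    def place_linear_boundary(detectable_side: Side,
                              detectable_distance_m: float,
                              mirror: bool,
                              predefined_offset_m: Optional[float] = None) -> LinearBoundary:
        """Place a linear virtual boundary relative to one detectable boundary.

        mirror=True:  boundary on the opposite side of the robot, at a distance
                      equal to the distance to the detectable boundary ([0011], [0013]).
        mirror=False: boundary on the same side, at a predetermined distance less
                      than the distance to the detectable boundary ([0012], [0014]).
        """
        if mirror:
            opposite = Side.RIGHT if detectable_side == Side.LEFT else Side.LEFT
            return LinearBoundary(side=opposite, offset_m=detectable_distance_m)
        offset = predefined_offset_m if predefined_offset_m is not None else detectable_distance_m
        # Never place the virtual boundary beyond the detectable boundary itself.
        return LinearBoundary(side=detectable_side, offset_m=min(offset, detectable_distance_m))


    # Detectable boundary 3.0 m to the robot's left:
    print(place_linear_boundary(Side.LEFT, 3.0, mirror=True))   # boundary 3.0 m to the right
    print(place_linear_boundary(Side.LEFT, 3.0, mirror=False, predefined_offset_m=1.5))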
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The foregoing aspects and many of the attendant advantages of the disclosed subject matter will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
[0016] FIG. 1 depicts a block diagram of an example robotic device, in accordance with the embodiments described herein;
[0017] FIG. 2 depicts an embodiment of an example system that includes the robotic device shown in FIG. 1, in accordance with the embodiment described herein;
[0018] FIG. 3 depicts a perspective view of an environment in which a robotic device can operate, in accordance with the embodiment described herein;
[0019] FIGS. 4A - 4I illustrate various example modes of operation of a robotic device, in accordance with the embodiment described herein;
[0020] FIG. 5 depicts an example method for defining a virtual boundary in an
environment, in accordance with the embodiment described herein;
[0021] FIG. 6 depicts an example of internal hardware that may be included in any of the electronic components of the system, in accordance with the embodiments described herein.
DETAILED DESCRIPTION
[0022] The present disclosure describes embodiments for the creation of virtual boundaries for aiding in the navigation of robotic devices such as robotic cleaning devices in an environment. The virtual boundaries can be generated based on environmental data of the environments (e.g., one or more detectable boundaries) in which the robotic devices are intended to navigate. The robotic devices can then navigate autonomously within the environments based on the virtual boundary.
[0023] As used in this document, any word in singular form, along with the singular forms “a,” “an” and “the,” include the plural reference unless the context clearly dictates otherwise. Unless defined otherwise, all technical and scientific terms used herein have the same meanings as commonly understood by one of ordinary skill in the art. All publications mentioned in this document are incorporated by reference. Nothing in this document is to be construed as an admission that the embodiments described in this document are not entitled to antedate such disclosure by virtue of prior invention. As used in this document, the term “comprising” means “including, but not limited to.”
[0024] The terms “computing device” and “electronic device” refer to a device having a processor and a non-transitory, computer-readable medium (i.e., memory). The memory may contain programming instructions in the form of a software application that, when executed by the processor, causes the device to perform one or more processing operations according to the programming instructions. An electronic device also may include additional components such as a touch-sensitive display device that serves as a user interface, as well as a camera for capturing images. An electronic device also may include one or more communication hardware components such as a transmitter and/or receiver that will enable the device to send and/or receive signals to and/or from other devices, whether via a communications network or via near-field or short-range communication protocols. If so, the programming instructions may be stored on the remote device and executed on the processor of the computing device as in a thin client or Internet of Things (IoT)
arrangement. Example components of an electronic device are discussed below in the context of FIG. 6.
[0025] The terms “memory,” “memory device,” “computer-readable medium” and “data store” each refer to a non-transitory device on which computer-readable data, programming instructions or both are stored. Unless the context specifically states that a single device is required or that multiple devices are required, the terms “memory,” “memory device,” “computer-readable medium” and “data store” include both the singular and plural embodiments, as well as portions of such devices such as memory sectors.
[0026] A “processor” or “processing device” is a hardware component of an electronic device that is configured to execute programming instructions. The term “processor” may refer to a single processor or to multiple processors that together implement various steps of a process. Unless the context specifically states that a single processor is required or that multiple processors are required, the term “processor” includes both the singular and plural embodiments.
[0027] As used herein, the term “robot” or “robotic device” refers to an electro-mechanical machine guided by a processor. The robotic device may also include a memory that may contain programming instructions in the form of a software application that, when executed by the processor, causes the device to perform one or more processing operations according to the programming instructions. A robotic device also may include one or more
communication hardware components such as a transmitter and/or receiver that will enable the device to send and/or receive signals to and/or from other devices, whether via a communications network or via near-field or short-range communication protocols. If so, the programming instructions may be stored on the remote device and executed on the processor of the robotic device as in a thin client or Internet of Things (IoT) arrangement. Mobile robotic devices have the capability to move around in their environment and are not fixed to one physical location. An example of a mobile robotic device that is in common use today is an automated guided vehicle or automatic guided vehicle (AGV). An AGV is generally a mobile robot that follows markers or wires in the floor, or uses electromagnetic emitter-detectors, including for example sonar, a vision system or lasers for navigation. Mobile robots can be found in industry, military and security environments. They also appear as consumer products, for entertainment or to perform certain tasks like vacuum cleaning and home assistance.
[0028] Mobile robotic devices may interact or interface with humans to provide a number of services that range from home assistance to commercial assistance and more. In the example of home assistance, a mobile robotic device can assist elderly people with everyday tasks, including, but not limited to, maintaining a medication regime, mobility assistance, communication assistance (e.g., video conferencing, telecommunications, Internet access, etc.), home or site monitoring (inside and/or outside), person monitoring, and/or providing a personal emergency response system (PERS). For commercial assistance, the mobile robotic device can provide videoconferencing (e.g., in a hospital setting), a point of sale terminal, interactive information/marketing terminal, etc. Mobile robotic devices need to navigate in a robust or reliable manner, for example, to avoid obstacles and reach intended destinations.
[0029] The term “virtual boundary” refers to a non-physical border that acts as a demarcation between two or more areas of an environment. The virtual boundary of the current disclosure may be an arbitrarily shaped border such as, without limitation, a straight line, a curved line, a zigzag line, or any other shape. Virtual boundaries can keep a robotic device from exiting or entering a particular area, e.g., to prevent a cleaning robot from moving from a hallway into a cubicle area, from a tiled area onto a carpeted floor, etc. The virtual boundary may be temporary in that, upon satisfaction of one or more conditions, the robotic device may be permitted to cross the virtual boundary.
[0030] When used in this document, terms such as “top” and “bottom,” “upper” and “lower,” or “front” and “rear,” are not intended to have absolute orientations but are instead intended to describe relative positions of various components with respect to each other.
For example, a first component may be an “upper” component and a second component may be a “lower” component when a device is oriented in a first direction. The relative orientations of the components may be reversed, or the components may be on the same plane, if the orientation of the device that contains the components is changed.
The claims are intended to include all orientations of a device containing such components.
[0031] The embodiments disclosed herein can be used to navigate a robotic device in an environment that does not include a landmark, boundary, or other physical parameters to delineate or demarcate an area of the environment that should be avoided (referred to in this document as an “avoid area”) and/or an area within which travel is permitted (referred to in this document as a “permissible area”) by the robotic device. For example, the environment to be navigated may include a tiled floor area that can be cleaned by water (permissible area) adjacent to a carpeted area that cannot be cleaned by water (avoid area), with no detectable boundary or landmark between the tiled floor and the carpeted area that the robotic device’s sensors can detect. Alternatively, the boundary may be detectable by hardware such as cameras and processes such as edge detection or other image processing techniques, but doing so to detect the boundary between the tiled floor and the carpeted area may add processing time, require prior learning and/or memory space, or the like.
Optionally, defining the virtual boundary may be performed in addition to detections performed by one or more sensors of the robotic device. Similarly, the environment to be navigated may include an area that should be avoided by the robotic device, but the data collected by the sensors of the robotic device corresponding to the area to be avoided may be unreliable such that it cannot be used to navigate the robotic device. For example, the permissible area may be bordered by dynamic moving objects that would provide unreliable data for navigation and localization. Alternatively and/or additionally, the avoid area may include objects that are not easily detectable by the sensors of the robotic device due to their size, composition, shape, etc. In yet another embodiment, the avoid area may be an area of an otherwise contiguous surface that is considered low-traffic. It may be desirable to avoid cleaning the low-traffic areas to, for example, preserve a floor coating finish. Finally, an avoid area may simply be an area that includes one or more objects that can act as detectable boundaries, but which does not need to be cleaned for certain reasons.
[0032] As such, the embodiments disclosed in this disclosure include methods that allow a robotic device to define a virtual boundary between permissible areas and avoid areas based, at least partly, on detectable boundaries or landmarks in the environment. The detectable boundaries and/or landmarks may or may not be located in the avoid area. This enables the robotic device to avoid certain areas in the environment in the absence of and/or without using complex sensing devices that require computation time or power. Hence, the embodiments disclosed herein also can be used to help ensure that the navigation methods do not overwhelm the memory and/or processing capabilities of the robotic device since not all robotic devices have sufficient memory or processing power to handle or process large numbers of data points gathered by complex three-dimensional sensors. Furthermore, the methods disclosed in this disclosure may obviate the need for an environment to be mapped before operating a robotic device in the environment and/or for a user to manually define a permissible area.
[0033] FIG. 1 illustrates a block diagram of components of an example embodiment of a robotic device 100. The components and interaction of components described with respect to the robotic device 100 may be implemented in any other embodiments of robotic devices. In addition, the embodiments of robotic devices described herein are also not limited to the components and interaction of components described with respect to the robotic device 100, but can be implemented in a number of other ways. In an embodiment, the robotic device 100 may be an autonomous device that is capable of automatically navigating its
environment.
[0034] The robotic device may include, without limitation, one or more sensors 102, a processing device 104, a memory 106, a power source 108, a communications interface 110, a user interface 112, and one or more vehicle function devices 114. In the depiction shown in FIG. 1, the robotic device 100 is located on the floor 116. The robotic device 100 is configured to move across the floor 116.
[0035] In certain embodiments, the one or more sensors 102 may include one or more sensors located on the robotic device 100 and may be configured to provide information about the robotic device 100 itself and/or the environment around the robotic device 100.
For example, the one or more sensors 102 may include a proximity sensor configured to detect a distance from the robotic device to any object in a field of the proximity sensor. Examples of proximity sensors include infrared sensors, light detection and ranging
(LIDAR) sensors, global positioning system (GPS) devices, cameras, other electromagnetic energy sensors, sonar sensors, other forms of acoustic sensors, and other forms of proximity sensors. The one or more sensors 102 may also include sensors to detect an orientation or heading of the robotic device 100, such as a gyroscope or a compass, or to detect a speed and/or acceleration of the robotic device 100, such as an accelerometer or encoders. The one or more sensors 102 may also include sensors that detect characteristics about the environment around the robotic device 100, such as a temperature sensor (e.g., a
thermocouple or a thermistor), a humidity sensor (e.g., a hygrometer), a pressure sensor (e.g., a barometer, a piezoelectric sensor), an infrared (IR) sensor, or any other sensor.
[0036] In at least one embodiment, the one or more sensors 102 may include a landmark sensor that is configured to sense a detectable boundary or landmark in an environment and/or detect the position, orientation, etc., of the robotic device 100 with respect to the detectable boundary or landmark, such as a wall, fence, shelf, line on the floor, etc.
Examples of landmark sensors may include, without limitation, LIDAR sensors, sonar sensors, cameras, infrared sensors, laser detection and ranging (LADAR) sensors, radio frequency (RF) sensors, or the like.
[0037] In some embodiments, the processing device 104 may be configured to control one or more functions of the robotic device 100 such as, without limitation, navigation in an environment, cleaning (if a cleaning robotic device), communication with a user or an external system, or the like. In some embodiments, the processing device 104 is configured to control the movements of the robotic device 100 based on, without limitation, readings from the one or more sensors 102, a digital map of the environment, readings from one or more sensors in the environment, a predefined path of movement, or any other information, or combinations thereof. For example, in an embodiment, the processing device may receive information from the one or more sensors 102 and analyze it to control the navigation of the robotic device 100. The robotic device 100 also includes memory 106 and the processing device may write information to and/or read information from the memory 106. For example, one or more rules for generating a virtual boundary may be stored in the memory 106 and the processing device 104 may read the data from the memory 106 to aid in controlling movements of the robotic device.
[0038] The processing device 104 may communicate with each of the other components of the robotic device 100, via for example, a communication bus or any other suitable mechanism.
[0039] In one or more embodiments, a communications interface 110 may be configured to facilitate communication of data into and out of the robotic device 100. In some
embodiments, the communications interface 110 may include, without limitation, a WiFi transceiver, a Bluetooth transceiver, an Ethernet port, a USB port, and/or any other type of wired and/or wireless communication interfaces. The communications interface 110 is configured to transmit data to and receive data from computing devices and/or networks that are not included in the robotic device 100.
[0040] In certain embodiments, the user interface 112 may include any type of input and/or output devices that permit a user to input commands into or receive information from the robotic device 100. In some embodiments, the user input/output devices 112 may include, without limitation, a push button, a toggle switch, a touchscreen display, an LED light interface, a keyboard, a microphone, a speaker, or any other kind of input and/or output device. The user input/output devices 112 may permit a user to control the operation of the robotic device 100, define settings (e.g., modes) of the robotic device 100, receive information about operations of the robotic device 100, troubleshoot problems with the robotic device 100, or the like.
[0041] In one or more embodiments, the vehicle functional devices 114 of the robotic device 100 may include any device that is capable of causing the robotic device 100 to function in a particular way. In some embodiments, the vehicle functional devices 114 may include one or more motors that drive wheels of the robotic device 100 to cause it to move. In some other embodiments, the vehicle functional devices 114 may include a steering mechanism to control a direction of movement of the robotic device 100. In some embodiments, the vehicle functional devices 114 may include a cleaning device configured to clean a surface on which the robotic device 100 moves (e.g., a sweeper, vacuum, mop, polisher, fluid dispenser, squeegee, or the like). The vehicle functional devices 114 can include any number of other functional devices that cause the robotic device 100 to function. In some embodiments, the processing device 104 may also be configured to control operation of the vehicle functional devices 114.
[0042] In some embodiments, the power source 108 is configured to provide power to the other components of the robotic device 100. The power source 108 may be coupled to and capable of providing power to each of the one or more sensors 102, the processing device 104, the memory 106, the communications interface 110, the user interface 112, and/or the vehicle function devices 114. The power source 108 may include one or more of a rechargeable battery, a non-rechargeable battery, a solar cell panel, an internal combustion engine, a chemical reaction power generator, or any other device configured to provide power to the robotic device 100 and its components.
[0043] FIG. 2 illustrates an example embodiment of a system 200 that includes the robotic device 100. The system may include a network 210 that is in communication with the communications interface 110 of the robotic device 100. The network 210 may include a wireless network, a wired network, or any combination of wired and/or wireless networks. The system 200 also includes a remote computing device 220 that is located remotely from the robotic device 100, and is in communication with the robotic device 100 via the network 210. In some embodiments, the remote computing device 220 may include, without limitation, a laptop computer, a desktop computer, a server, a mobile phone, a tablet, or any other type of computing device.
[0044] In some embodiments, the robotic device 100 may operate in a facility (e.g., a building, a campus of buildings, etc.), where the network 210 may include a private network of the facility (e.g., a WiFi network associated with the facility), and the remote computing device 220 may be a computing device located in the facility at a location different from the operation of the robotic device 100. In some other embodiments, the robotic device 100 may operate in a facility (e.g., a building, a campus of buildings, etc.), where the network 210 may include a public network (e.g., the Internet), and the remote computing device 220 may be located somewhere other than the facility (e.g., in a “cloud” data center, in a facility of a distributor of the robotic device 100, etc.). It will be understood that many other arrangements of the network 210 and the remote computing device 220 are within the scope of this disclosure. It will be understood that the remote computing device 220 may be a single computing device or may be a number of computing devices that are capable of interacting with each other.
[0045] FIG. 3 illustrates a perspective view of an example environment 300 in which a robotic device 100 such as a robotic cleaning device is configured to move and operate. In the depicted embodiment, the environment 300 is in the form of a room that includes a floor 310 and one or more detectable boundaries and/or landmarks (e.g., a wall 312, a wall 314, and a shelf 316). As used herein, the terms “detectable boundary” or “landmark” interchangeably refer to any object, detectable boundary, landmark, or other differentiating boundary (e.g., a detectable boundary formed by, for example, a change in color, a change in texture, a change in material, a change in orientation, a change in height, a change in density, or the like) that is detectable by one or more sensors of the robotic device. In certain embodiments, the detectable boundary or landmark may be detected by the one or more sensors of the robotic device using any now or hereafter known methods. Examples of the detectable boundary may include, without limitation, furniture pieces, walls, stairwell edges, a line painted on the floor, and edges between types of flooring (e.g., between hard floor and carpeted floor, between different color tiles, between different types of carpet, or the like).
[0046] The environment 300 may also include a ceiling, other walls, doors, windows, and the like, which are not depicted in the view shown in FIG. 3.
[0047] As shown in FIG. 3, the environment 300 may include at least one avoid area 320 that should be avoided during navigation of the robotic device. Examples of the avoid area 320 may include, without limitation, a carpeted area; a cluttered area that does not have enough navigable space and/or has objects that cannot be relied upon as landmarks during navigation (e.g., wires, toys, or the like); an entrance to another area (e.g., a cubicle); an area that includes moving objects (e.g., people) that renders sensor data corresponding to the area unreliable for navigation purposes; or any other area that has no detectable boundary on at least one side but should be avoided during navigation of the robotic device. In an embodiment, the avoid area 320 is bounded on at least one side by a virtual boundary 321. As described below, the virtual boundary is defined by the robotic device relative to at least one of the one or more detectable boundaries in the environment.
[0048] As discussed above, the virtual boundary 321 is a demarcation between a permissible area where the robotic device can travel and an avoid area where the robotic device is not allowed to travel. A virtual boundary may correspond in whole or in part to one or more detectable boundaries, but it does not need to do so. However, in certain embodiments, the system may use a detectable boundary or landmark to define the virtual boundary, where the detectable boundary or landmark may be located anywhere in the environment 300 such as, without limitation, in a permissible area, in an avoid area, and/or any other area.
[0049] In the embodiment shown in FIG. 3, the robotic device 100 is located on the floor 310 and is configured to move across the floor 310. In some examples, the robotic device 100 may be one of the SWINGBOT, AEROBOT, or DUOBOT cleaning robots.
[0050] In one or more embodiments, the virtual boundary 321 may be defined as a function of distance, orientation, and/or shape of a detectable boundary or landmark. Specifically, the robotic device 100 is configured to define the virtual boundary 321 relative to one or more detectable boundaries based on one or more rule sets. Each rule set may correspond to a user-selectable mode of the robotic device. In an embodiment, a user may select the mode based on, for example and without limitation, the location of one or more detectable boundaries or landmarks in the environment; location of the avoid area; reliability of sensor data collected from the avoid area; size and/or configuration of the permissible area and/or the avoid area; current position and/or orientation of the robotic device; or combinations thereof.
[0051] Non-limiting examples of user-selectable modes and corresponding rule sets are described below.
[0052] Ignore left mode: If the sensor data from an avoid area is unreliable and a detectable boundary or landmark is located to the right side of the robotic device, the robotic device may define a virtual boundary based on at least the following rules: (1) ignore sensor data collected from the left side of the robotic device; and (2) define a virtual boundary to the left side of the robotic device at a predefined distance from an initial position of the robotic device, and relative to a detectable boundary or landmark. Since sensor data collected from the left side of the robotic device is ignored, the selected detectable boundary or landmark will be located on the right side of the robotic device. Alternatively and/or additionally, if the sensor data from an avoid area is unreliable and the avoid area is located to the left side of the robotic device, the robotic device may define a virtual boundary based on at least the following rules: (1) ignore sensor data collected from the left side of the robotic device; and (2) define a virtual boundary to the left of the robotic device at a predefined distance from an initial position of the robotic device, and relative to a detectable boundary or landmark. Since sensor data collected from the left side of the robotic device is ignored, the selected detectable boundary or landmark will be located on the right side of the robotic device.
[0053] Mirror left mode: If a detectable boundary or landmark is located to the left side of the robotic device and/or an avoid area is located to the right side of the robotic device, the robotic device may define a virtual boundary based on at least the following rules: (1) select a detectable boundary or landmark located on the left side of the robotic device; and (2) define a virtual boundary to the right side of the robotic device relative to the detectable boundary or landmark such that the virtual boundary is at a distance from the robotic device that is equal to an initial distance of the detectable boundary or landmark from the robotic device.
[0054] Ignore right mode: If the sensor data from an avoid area is unreliable and a detectable boundary or landmark is located to the left side of the robotic device, the robotic device may define a virtual boundary based on at least the following rules: (1) ignore sensor data collected from the right side of the robotic device; and (2) define a virtual boundary to the right side of the robotic device at a predefined distance from an initial position of the robotic device, and relative to a detectable boundary or landmark. Since sensor data collected from the right side of the robotic device is ignored, the selected detectable boundary or landmark will be located on the left side of the robotic device. Alternatively and/or additionally, if the sensor data from an avoid area is unreliable and the avoid area is located to the right side of the robotic device, the robotic device may define a virtual boundary based on at least the following rules: (1) ignore sensor data collected from the right side of the robotic device; and (2) define a virtual boundary to the right side of the robotic device at a predefined distance from an initial position of the robotic device, and relative to a detectable boundary or landmark. Since sensor data collected from the right side of the robotic device is ignored, the selected detectable boundary or landmark will be located on the left side of the robotic device.
[0055] Mirror right mode: If a detectable boundary or landmark is located to the right side of the robotic device and/or an avoid area is located to the left side of the robotic device, the robotic device may define a virtual boundary based on at least the following rules: (1) select a detectable boundary or landmark located on the right side of the robotic device; and (2) define a virtual boundary to the left side of the robotic device relative to the detectable boundary or landmark such that the virtual boundary is at a distance from the robotic device that is equal to an initial distance of the detectable boundary or landmark from the robotic device.
[0056] Allowed left mode: If the sensor data from an avoid area is reliable, and the avoid area is located to the right side of the robotic device, the robotic device may define a virtual boundary based on at least the following rules: (1) use all sensor data collected from the environment; (2) define a virtual boundary, relative to a detectable boundary or landmark, in an area between the robotic device’s current position and the avoid area (that is located to the right side of the robotic device); and (3) the virtual boundary is defined at a predefined distance from an initial position of the robotic device in the area between the robotic device’s current position and the avoid area. Since all sensor data is used, the detectable boundary or landmark may be located on either side of the robotic device.
[0057] Allowed left, ignore right mode: If the sensor data from an avoid area is unreliable and the avoid area is located to the right side of the robotic device, the robotic device may define a virtual boundary based on at least the following rules: (1) ignore sensor data collected from the right side of the robotic device; and (2) define a virtual boundary in an area between the robotic device’s current position and the avoid area (that is located on the right side of the robotic device) relative to a detectable boundary or landmark that is located on the left side of the robotic device. The virtual boundary may be at a predefined distance from an initial position of the robotic device.
[0058] Allowed right mode: If the sensor data from an avoid area is reliable, and the avoid area is located to the left side of the robotic device, the robotic device may define a virtual boundary based on at least the following rules: (1) use all sensor data collected from the environment; (2) define a virtual boundary, relative to a detectable boundary or landmark, in an area between the robotic device’s current position and the avoid area (that is located to the left side of the robotic device); and (3) the virtual boundary is defined at a predefined distance from an initial position of the robotic device in the area between the robotic device’s current position and the avoid area. Since all sensor data is used, the detectable boundary or landmark may be located on either side of the robotic device.
[0059] Allowed right, ignore left mode: If the sensor data from an avoid area is unreliable and the avoid area is located to the left side of the robotic device, the robotic device may define a virtual boundary based on at least the following rules: (1) ignore sensor data collected from the left side of the robotic device; and (2) define a virtual boundary in an area between the robotic device’s current position and the avoid area (that is located on the left side of the robotic device) relative to a detectable boundary or landmark that is located on the right side of the robotic device. The virtual boundary may be at a predefined distance from an initial position of the robotic device.
[0060] Mirror mode: If the sensor data from an avoid area is reliable, the robotic device may define a virtual boundary based on at least the following rules: (1) use all sensor data collected from the environment; (2) define a virtual boundary relative to a detectable boundary or landmark that is closest to the robotic device; (3) define the virtual boundary in an area that is on a side of the robotic device that does not include the detectable boundary or landmark; and (4) define the virtual boundary such that it is at a distance that is equal to the distance of the detectable boundary or landmark from an initial position of the robotic device.
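By way of illustration only, the mode-specific rule sets described above lend themselves to a compact tabular representation. The following Python sketch is not part of the disclosed embodiments, and its names (RuleSet, MODE_RULES, and the individual fields) are hypothetical:

    from dataclasses import dataclass
    from typing import Optional


    @dataclass(frozen=True)
    class RuleSet:
        ignore_side: Optional[str]  # side whose sensor data is ignored, if any
        landmark_side: str          # where the detectable boundary is expected:
                                    # "left", "right", "any", or "nearest"
        boundary_side: str          # side of the robot on which the virtual boundary is defined
        equal_distance: bool        # True: offset equals the measured distance to the landmark;
                                    # False: offset is a predefined or user-supplied distance


    MODE_RULES = {
        "ignore_left":               RuleSet("left",  "right",   "left",     False),
        "ignore_right":              RuleSet("right", "left",    "right",    False),
        "mirror_left":               RuleSet(None,    "left",    "right",    True),
        "mirror_right":              RuleSet(None,    "right",   "left",     True),
        "allowed_left":              RuleSet(None,    "any",     "right",    False),
        "allowed_left_ignore_right": RuleSet("right", "left",    "right",    False),
        "allowed_right":             RuleSet(None,    "any",     "left",     False),
        "allowed_right_ignore_left": RuleSet("left",  "right",   "left",     False),
        "mirror":                    RuleSet(None,    "nearest", "opposite", True),
    }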
[0061] As discussed above, the shape of the virtual boundary may be arbitrary. In certain embodiments, the virtual boundary may be a straight line or any other shape irrespective of the shape or configuration of the corresponding detectable boundary or landmark.
Alternatively and/or additionally, the shape of the virtual boundary may be the same as the shape or configuration of the corresponding detectable boundary or landmark. In other embodiments, the shape of the virtual boundary may be a mirror image of the shape or configuration of the corresponding detectable boundary or landmark. Each of the above-described modes may also include rules for defining the shape of the virtual boundary.
[0062] In some embodiments, the distance of the virtual boundary from the robotic device may be a function of (e.g., equal to) the distance of the robotic device from the detectable boundary or landmark. For example, in the mirror left mode, the robotic device may define a virtual boundary that is a straight line parallel to a detectable boundary on the left of the robotic device, such that the virtual boundary is located on the right side of the robotic device at a distance from the robotic device that is equal to the distance between the robotic device and the detectable boundary. The robotic device may detect the detectable boundary and determine its distance from the detectable boundary using one or more of its sensors (e.g., an ultrasonic sensor). Alternatively, the distance of the virtual boundary from the robotic device may be predetermined based on a mode, adjustable by the user, and/or automatically determined by the robotic device based on sensor data collected by the robotic device. In certain embodiments, a user may provide the distance to an avoid area and the robotic device may define the virtual boundary at a distance that is less than or equal to the distance to the avoid area. In some embodiments, if a distance is not specified, the virtual boundary may be defined at a default distance, such as a distance equal to the distance to the detectable boundary.
[0063] It should be noted that the distance of a virtual boundary is always less than the distance to a detectable boundary in modes where the virtual boundary and a detectable boundary are on the same side of the robotic device. In other words, a virtual boundary will not be placed beyond a detectable boundary.
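Continuing the illustrative Python sketch above (all names remain hypothetical), the offset selection and the clamping rule of paragraphs [0062] and [0063] might look like:

    from typing import Optional


    def boundary_offset(equal_distance: bool,
                        measured_distance_m: float,
                        user_distance_m: Optional[float] = None) -> float:
        """Choose how far from the robot's initial position the virtual boundary lies.

        Mirror-style modes use the measured distance to the detectable boundary;
        other modes use a mode- or user-supplied distance, defaulting to the
        measured distance. The result is clamped so that a virtual boundary on
        the same side as the detectable boundary is never placed beyond it.
        """
        if equal_distance:
            return measured_distance_m
        offset = user_distance_m if user_distance_m is not None else measured_distance_m
        return min(offset, measured_distance_m)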
[0064] FIGS. 4A - 4D illustrate the respective “Ignore Left”, “Ignore Right”, “Mirror Right”, and “Mirror Left” modes of a robotic device. As shown in FIG. 4A, the virtual boundary 421 is defined on the left of the robotic device 100 such that it is at a
predetermined distance from an initial position of the robotic device 100 and follows a straight line path parallel to the detectable boundary. FIG. 4C illustrates a particular case of the embodiment shown in FIG. 4A where the distance of the virtual boundary 421 from the robotic device 100 is equal to the distance between the robotic device 100 in its initial position and the detectable boundary. In an initial setup process for the modes illustrated in FIGS. 4A and 4C, an operator may place the robotic device 100 at a desired position within the permissible area 422, then an operator can select the appropriate mode, and then the robotic device 100 defines the virtual boundary 421 on the left side of the robotic device 100 relative to its initial position, e.g., adjacent to or a predetermined distance from the robotic device 100, as illustrated. In FIGS. 4A and 4C, the virtual boundary 421 prevents the robotic device 100 from moving outside the permissible area 422 and into the avoid area
420 on the left of the virtual boundary 421. FIG. 4B illustrates a virtual boundary 421 that is defined on the right side of the robotic device 100 such that it is at a predetermined distance from an initial position of the robotic device 100 and follows a straight line path parallel to the detectable boundary. FIG. 4D illustrates a particular case of the embodiment shown in FIG. 4B where the distance of the virtual boundary 421 from the robotic device 100 is equal to the distance between the robotic device 100 in its initial position and the detectable boundary. Setup for the modes illustrated in FIGS. 4B and 4D is similar to the setup for the modes illustrated in FIGS. 4A and 4C, albeit with the virtual boundary 421 being located on the right side of the robotic device 100. In FIGS. 4B and 4D, the virtual boundary 421 prevents the robotic device 100 from moving outside the permissible area 422 and into an avoid area 420 on the right of the virtual boundary 421. It will be understood that the straight line shape of the virtual boundary is shown as an example and is not limiting.
[0065] FIGS. 4E - 4G illustrate the “Allowed Left” mode of a robotic device 100. As shown in FIG. 4E, the robotic device 100 defines a straight line virtual boundary 421 using either the left detectable boundary and/or the right detectable boundary at a predetermined distance from an initial position of the robotic device and on the right side of the robotic device. For example, the distance may be provided by a user such that the virtual boundary
421 exists between the robotic device’s current position and an avoid area 420 on the right of the robotic device so as to prevent the robotic device 100 from moving outside the permissible area 422 and into the avoid area 420. As shown in FIG. 4F, the robotic device 100 defines a straight line virtual boundary 421 parallel to a right detectable boundary, and at a predetermined distance from an initial position of the robotic device 100 on the right side of the robotic device 100. For example, the predetermined distance may be included in a mode and/or defined by a user such that the virtual boundary 421 exists between the robotic device’s current position and an avoid area 420 on the right of the robotic device so as to prevent the robotic device 100 from moving outside the permissible area 422 and into the avoid area 420. As shown in FIG. 4G, the robotic device 100 defines a straight line virtual boundary 421 parallel to a left detectable boundary, and at a predetermined distance from an initial position of the robotic device 100 on the right side of the robotic device 100, where the shape of the virtual boundary is the same as that of the left detectable boundary. For example, the predetermined distance may be included in a mode and/or defined by a user such that the virtual boundary 421 exists between the robotic device’s current position and an avoid area 420 on the right of the robotic device so as to prevent the robotic device 100 from moving outside the permissible area 422 and into the avoid area 420.
[0066] FIGS. 4H and 4I illustrate the “Mirror” mode of a robotic device. In certain embodiments, in the mirror mode the detectable boundary is not in the shape of a straight line. As shown in FIGS. 4H and 4I, the robotic device 100 defines a virtual boundary 421 on the right side of the robotic device 100. The virtual boundary 421 in these example embodiments either replicates (FIG. 4H) or mirrors (FIG. 4I) the shape of a detectable boundary that is closest to the robotic device 100 and is located at a distance from the robotic device that is equal to the distance between the robotic device and the selected detectable boundary. In an initial setup process for the “Mirror” modes illustrated in FIGS. 4H and 4I, an operator may place the robotic device 100 at a desired position within the permissible area 422, then an operator can select the appropriate “Mirror” mode, and then the robotic device 100 will traverse an initial path through the permissible area 422 to define the virtual boundary 421 on the right side of the robotic device 100. In FIG. 4H, the robotic device 100 will use the one or more sensors 102 to follow the contour of the detectable boundary. In following this initial path, the robotic device 100 replicates the detectable boundary so that the virtual boundary 421 follows a similar contour and is positioned on the right side of the robotic device 100 at a distance that is the same or similar to the distance between the robotic device 100 and the detectable boundary. In FIG. 4I, the robotic device 100 travels forward in a straight line instead of following the contour of the detectable boundary. In following this initial path, the robotic device 100 will still use the one or more sensors 102 to detect a distance to the detectable boundary and mirror the detectable boundary so that the virtual boundary 421 follows a mirrored contour positioned on the right side of the robotic device 100 at a distance that is the same or similar to the distance between the robotic device 100 and the detectable boundary.
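For illustration only, the replicate and mirror behaviors of FIGS. 4H and 4I can be sketched as simple transformations of a sampled contour. The sketch below assumes the detectable boundary has been sampled as 2D points in the robot's initial frame; the names and the point convention are hypothetical:

    from typing import List, Tuple

    Point = Tuple[float, float]  # (x forward, y lateral), meters; +y is the robot's left


    def replicate_contour(landmark_points: List[Point], lateral_shift_m: float) -> List[Point]:
        """FIG. 4H style: the virtual boundary follows the same contour as the
        detectable boundary, shifted toward the robot's right (e.g., by twice
        the robot's standoff distance from the detectable boundary)."""
        return [(x, y - lateral_shift_m) for x, y in landmark_points]


    def mirror_contour(landmark_points: List[Point]) -> List[Point]:
        """FIG. 4I style: the virtual boundary is the detectable boundary's
        contour mirrored across the robot's straight-line path (the x-axis)."""
        return [(x, -y) for x, y in landmark_points]


    # A detectable boundary sampled roughly 2 m to the robot's left:
    left_wall = [(0.0, 2.0), (1.0, 2.2), (2.0, 1.8)]
    print(replicate_contour(left_wall, lateral_shift_m=4.0))  # same shape, now on the right
    print(mirror_contour(left_wall))                          # flipped shape, on the right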
[0067] FIG. 5 illustrates an example method for defining a virtual boundary in an environment based on one or more detectable boundaries. While the method 500 is described for the sake of convenience and not with an intent of limiting the disclosure as comprising a series and/or a number of steps, it is to be understood that the process does not need to be performed as a series of steps and/or the steps do not need to be performed in the order shown and described with respect to FIG. 5 but the process may be integrated and/or one or more steps may be performed together, simultaneously, or the steps may be performed in the order disclosed or in an alternate order. Likewise, any setup processes described above may be integrated and/or one or more steps may be performed together, simultaneously, or the steps may be performed in the order disclosed or in an alternate order.
[0068] At 502, a robotic device may be positioned in an environment such that it can observe or monitor at least one detectable boundary or landmark in the environment (e.g., using one or more sensors). This may be in a permissible area of the environment. The robotic device may be configured to position itself automatically so it can observe or monitor a detectable boundary or landmark. Additionally and/or alternatively, it may be positioned by the user accordingly.
[0069] At 504, the robotic device may receive a selection of a virtual boundary definition mode from a user. The virtual boundary definition modes may be one or more of those described above (e.g., mirror left, mirror right, allow left, allow right, etc.) and/or other modes. The mode selected will cause the robotic device to access and use a mode-specific rule set to define the virtual boundary. As discussed above, a user may select the mode based on, for example and without limitation, the location of one or more detectable boundaries or landmarks in the environment; location of the avoid area; reliability of sensor data collected from the avoid area; size and/or configuration of the permissible area and/or the avoid area; current position and/or orientation of the robotic device; or combinations thereof.
[0070] While the above disclosure describes receiving the mode selection from a user, the disclosure is not so limiting. In certain embodiments, the robotic device may automatically select an appropriate mode without any user input and/or based on a user input that does not include a mode selection. For example, if the sensor data collected from the left side of the robotic device is unreliable, constantly changing, and/or unavailable, the robotic device may choose a mode (e.g., Ignore Left mode) that uses sensor data corresponding to a detectable boundary from the right side of the robotic device. Similarly, the robotic device may receive an identification of an avoid area and automatically select a mode that creates a virtual boundary between the robotic device and the avoid area based on, for example, the reliability of sensor data corresponding to detectable boundaries in the environment.
[0071] Based on the mode selection received, the robotic device may identify at least one detectable boundary or landmark to be used for defining the virtual boundary (506). As described above, each mode includes rules for identification of the detectable boundary or landmark. For example, if an “Ignore Left” mode selection is received, the robotic device cannot use a detectable boundary or landmark located to the left of the robotic device. In certain embodiments, a user may provide a selection of the detectable boundary or landmark to be used for defining the virtual boundary. Alternatively and/or additionally, the robotic device may prompt a user to confirm the identification of the detectable boundary or landmark to be used for defining the virtual boundary.
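By way of illustration only, the automatic mode selection described in paragraph [0070] might be sketched as follows; the function name and mode strings are hypothetical:

    from typing import Optional


    def auto_select_mode(left_data_reliable: bool,
                         right_data_reliable: bool,
                         avoid_area_side: Optional[str] = None) -> str:
        """Pick a boundary-definition mode without user input, based on which
        side of the robot yields reliable sensor data and, when known, on
        which side the avoid area lies."""
        if not left_data_reliable and right_data_reliable:
            return "ignore_left"   # rely on a detectable boundary to the right
        if not right_data_reliable and left_data_reliable:
            return "ignore_right"  # rely on a detectable boundary to the left
        if avoid_area_side == "right":
            return "allowed_left"
        if avoid_area_side == "left":
            return "allowed_right"
        return "mirror"            # all data reliable: mirror the nearest landmark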
[0072] In certain embodiments, the robotic device may use one or more of its sensors (e.g., the landmark sensors) to detect the position, shape, orientation, distance, or other similar information about a plurality of detectable boundaries or landmarks in the environment. Alternatively and/or additionally, the robotic device may use pre-loaded and/or pre-mapped data corresponding to the environment to identify detectable boundaries and landmarks in the environment. For example, the robotic device may receive a mode selection as mirror left and use the rule set corresponding to the mirror left mode to define the virtual boundary. As discussed above, this may include detecting a detectable boundary on the left side of the robotic device (using one or more sensors), determining a distance between the robotic device and the detectable boundary, and defining a virtual boundary on the right side of the robotic device that is at a distance equal to the determined distance between the robotic device and the detectable boundary. The defined virtual boundary may also mirror or replicate the shape of the detected detectable boundary or may be any other arbitrary shape. The robotic device may then use the rules corresponding to the received mode selection to identify the at least one detectable boundary or landmark to be used for defining the virtual boundary.
[0073] At 508, the robotic device may define a virtual boundary relative to the identified detectable boundary or landmark based on the rules corresponding to the received mode selection. Specifically, the robotic device may identify, without limitation, the shape, location, distance, etc. of the virtual boundary based on the rules corresponding to the received mode selection (described above). In certain embodiments, the robotic device may save the defined virtual boundary for future operations of the robotic device. The step of defining the virtual boundary relative to the identified detectable boundary or landmark may be performed over the course of making an initial pass through the permissible area to appropriately characterize the identified detectable boundary or landmark in order to define the shape of the virtual boundary.
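The rule-based identification of the detectable boundary (step 506, paragraphs [0071]-[0072]) might be sketched, again for illustration only and with hypothetical names, as:

    from dataclasses import dataclass
    from typing import List, Optional


    @dataclass
    class Landmark:
        side: str        # "left" or "right" of the robot
        distance_m: float


    def identify_landmark(candidates: List[Landmark],
                          ignore_side: Optional[str] = None,
                          expected_side: str = "any") -> Optional[Landmark]:
        """Select the detectable boundary used to anchor the virtual boundary:
        discard landmarks on an ignored side, restrict to the side the mode
        expects, then keep the nearest remaining landmark (None if none qualify)."""
        usable = [c for c in candidates if c.side != ignore_side]
        if expected_side in ("left", "right"):
            usable = [c for c in usable if c.side == expected_side]
        return min(usable, key=lambda c: c.distance_m, default=None)


    # "Ignore Left" style selection among two sensed landmarks:
    landmarks = [Landmark("left", 1.2), Landmark("right", 2.5)]
    print(identify_landmark(landmarks, ignore_side="left", expected_side="right"))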
[0074] At 510, the robotic device may navigate the environment such that it does not cross into an avoid area demarcated by the defined virtual boundary. During navigation, the robotic device may use the landmark sensors to maintain its desired position and orientation relative to the identified detectable boundary or landmark.
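One hypothetical way to realize the guarantee at 510 is a pre-motion guard that rejects any commanded move whose projected position would cross the virtual boundary. The linear-boundary representation (a point plus a unit normal), the safety margin, and the names below are assumptions for illustration only:

```python
# Hypothetical pre-motion guard for step 510: reject any commanded
# move whose projected position would cross a linear virtual boundary.
# Representation, margin, and names are illustrative assumptions.

def signed_distance(point, boundary_point, boundary_normal):
    """Signed distance from `point` to the boundary line; positive on
    the permitted side (the unit normal points into the permitted side)."""
    px, py = point
    bx, by = boundary_point
    nx, ny = boundary_normal
    return (px - bx) * nx + (py - by) * ny

def motion_allowed(proposed, boundary_point, boundary_normal, margin=0.1):
    """Permit the move only if the proposed position stays at least
    `margin` meters inside the permitted side of the virtual boundary."""
    return signed_distance(proposed, boundary_point, boundary_normal) >= margin

# Example: virtual boundary along x = 3.0, permitted side x < 3.0.
print(motion_allowed((2.5, 1.0), (3.0, 0.0), (-1.0, 0.0)))  # True
print(motion_allowed((3.2, 1.0), (3.0, 0.0), (-1.0, 0.0)))  # False
```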
[0075] While the current disclosure describes that a virtual boundary is defined and the robotic device navigates the environment such that it does not cross into an avoid area demarcated by the defined virtual boundary, it will be understood by those skilled in the art that the robotic device need not actually define a virtual boundary and/or save it. Rather, it may use the rules corresponding to the received mode selection to navigate the environment relative to the identified detectable boundary or landmark without actually defining the virtual boundary.
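As a hedged sketch of this boundary-free variant (the controller structure and gain are illustrative assumptions, not part of the disclosure), the robotic device might simply regulate its lateral offset from the identified detectable boundary or landmark on each control cycle, without ever materializing boundary geometry:

```python
# Hedged sketch of the variant in paragraph [0075]: regulate the lateral
# offset from the identified detectable boundary each control cycle
# instead of storing boundary geometry. Gain and names are assumptions.

def steering_correction(measured_offset, desired_offset, gain=0.8):
    """Steering adjustment driving the measured lateral offset toward
    the desired offset; positive values steer away from the landmark."""
    return gain * (desired_offset - measured_offset)

# Example: landmark sensed 1.5 m away while 2.0 m is desired, so the
# device steers away from it.
print(steering_correction(1.5, 2.0))  # 0.4
```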
[0076] FIG. 6 depicts an example of internal hardware that may be included in any of the electronic components of the system, such as a robotic device, sensor, etc. having a processing capability, or a local or remote computing device that is in communication with the robotic device. An electrical bus 600 serves as an information highway interconnecting the other illustrated components of the hardware. Processor 605 is a central processing device of the system, configured to perform calculations and logic operations required to execute programming instructions. As used in this document and in the claims, the terms "processor" and "processing device" may refer to a single processor or any number of processors in a set of processors that collectively perform a set of operations, such as a central processing unit (CPU), a graphics processing unit (GPU), a remote server, or a combination of these. Read only memory (ROM), random access memory (RAM), flash memory, hard drives and other devices capable of storing electronic data constitute examples of memory devices 625 that may store the programming instructions. A memory device may include a single device or a collection of devices across which data and/or instructions are stored. Various embodiments of the invention may include a computer-readable medium containing programming instructions that are configured to cause one or more processors, robotic devices and/or sensors to perform the functions described in the context of the previous figures.
[0077] An optional display interface 630 may permit information from the bus 600 to be displayed on a display device 635 in visual, graphic or alphanumeric format. An audio interface and audio output (such as a speaker) also may be provided. Communication with external devices may occur using various communication devices 640 such as a wireless antenna, an RFID tag and/or short-range or near-field communication transceiver, each of which may optionally communicatively connect with other components of the device via one or more communication systems. The communication device(s) 640 may be configured to be communicatively connected to a communications network, such as the Internet, a local area network or a cellular telephone data network.

[0078] The hardware may also include a user interface sensor 645 that allows for receipt of data from input devices 650 such as a keyboard, a mouse, a joystick, a touchscreen, a touch pad, a remote control, a pointing device and/or microphone. In embodiments where the electronic device is a smartphone or another image capturing device, digital images of a document or other image content may be acquired via a camera 620 that can capture video and/or still images.
[0079] The features and functions disclosed above, as well as alternatives, may be combined into many other different systems or applications. Various presently unforeseen or unanticipated alternatives, modifications, variations or improvements may be made by those skilled in the art, each of which is also intended to be encompassed by the disclosed embodiments.
[0080] As will be appreciated by those skilled in the art, one or more components of the system 600 may be located remotely from other components of the system 600, such as in a distributed system. Furthermore, one or more of the components may be combined, and additional components performing functions described herein may be included in the system 600. Thus, the system 600 can be adapted to accommodate a variety of needs and circumstances. The depicted and described architectures and descriptions are provided for exemplary purposes only and are not limiting to the various embodiments described herein.
[0081] It will be appreciated that the above-disclosed and other features and functions may be combined into many other different systems or applications. All such applications and alternatives are also intended to be encompassed by the disclosure of this patent document.
[0082] Embodiments described herein may be implemented in various ways, including as computer program products that comprise articles of manufacture. A computer program product may include a non-transitory computer-readable storage medium storing applications, programs, program modules, scripts, source code, program code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like (also referred to herein as executable instructions, instructions for execution, computer program products, program code, and/or similar terms used herein interchangeably). Such non-transitory computer-readable storage media include all computer-readable media (including volatile and non-volatile media).
[0083] As should be appreciated, the various embodiments described herein may also be implemented as methods, apparatus, systems, computing devices, and the like. As such, embodiments described herein may take the form of an apparatus, system, computing device, and the like executing instructions stored on a computer readable storage medium to perform certain steps or operations. Thus, embodiments described herein may be implemented entirely in hardware, entirely in a computer program product, or in an embodiment that comprises a combination of computer program products and hardware performing certain steps or operations.
[0084] Embodiments described herein may be made with reference to block diagrams and flowchart illustrations. Thus, it should be understood that blocks of a block diagram and flowchart illustrations may be implemented in the form of a computer program product, in an entirely hardware embodiment, in a combination of hardware and computer program products, or in apparatus, systems, computing devices, and the like carrying out instructions, operations, or steps. Such instructions, operations, or steps may be stored on a computer readable storage medium for execution by a processing element in a computing device. For example, retrieval, loading, and execution of code may be performed sequentially such that one instruction is retrieved, loaded, and executed at a time. In some exemplary embodiments, retrieval, loading, and/or execution may be performed in parallel such that multiple instructions are retrieved, loaded, and/or executed together. Thus, such embodiments can produce specifically configured machines performing the steps or operations specified in the block diagrams and flowchart illustrations. Accordingly, the block diagrams and flowchart illustrations support various combinations of embodiments for performing the specified instructions, operations, or steps.
[0085] The principles, representative embodiments, and modes of operation of the present disclosure have been described in the foregoing description. However, aspects of the present disclosure which are intended to be protected are not to be construed as limited to the particular embodiments disclosed. Further, the embodiments described herein are to be regarded as illustrative rather than restrictive. It will be appreciated that variations and changes may be made by others, and equivalents employed, without departing from the spirit of the present disclosure. Accordingly, it is expressly intended that all such variations, changes, and equivalents fall within the spirit and scope of the present disclosure, as claimed.

Claims

CLAIMS

What is claimed is:
1. A method comprising, by a processor of a robotic device:
receiving a selection of a mode corresponding to a virtual boundary, wherein the virtual boundary delineates an avoid area to be avoided by the robotic device during navigation in an environment;
identifying at least one detectable boundary in the environment for defining the virtual boundary;
defining, based on the mode, the virtual boundary relative to the at least one detectable boundary; and
operating the robotic device in the environment by causing the robotic device to move in the environment while ensuring that the robotic device does not cross the virtual boundary to enter the avoid area.
2. The method of claim 1, wherein defining, based on the mode, the virtual boundary relative to the at least one detectable boundary comprises identifying one or more rule sets corresponding to the mode, wherein the one or more rule sets include rules for defining at least one of the following: a shape of the virtual boundary, a position of the virtual boundary relative to the robotic device, or a position of the virtual boundary relative to the at least one detectable boundary.
3. The method of claim 1, wherein identifying, based on the mode, at least one detectable boundary in the environment for defining the virtual boundary comprises:
identifying one or more detectable boundaries in the environment; and identifying the at least one detectable boundary from amongst the one or more detectable boundaries based on one or more rules corresponding to at least one of the following: position of the avoid area in the environment, position of the at least one detectable boundary, characteristics of sensor data collected from the avoid area, or shape of the at least one detectable boundary.
4. The method of claim 3, wherein identifying the one or more detectable boundaries in the environment comprises identifying the one or more detectable boundaries in the environment based on at least one of the following:
sensor data collected by the robotic device, or
map data corresponding to the environment.
5. The method of claim 3, wherein the one or more rules comprise a rule that the detectable boundary cannot be located in the avoid area if characteristics of sensor data collected from the environment are not reliable.
6. The method of claim 1, wherein defining the virtual boundary relative to the at least one detectable boundary comprises traversing, by the robotic device, an initial path through the permissible area to determine a shape and a position of the virtual boundary relative to the robotic device or relative to the at least one detectable boundary.
7. The method of claim 1, wherein defining the virtual boundary relative to the at least one detectable boundary comprises defining a linear virtual boundary at a location relative to the robotic device or relative to the at least one detectable boundary.
8. The method of claim 7, wherein a shape of the virtual boundary is selected from the group comprising: replication of a shape of the at least one detectable boundary, mirror image of a shape of the at least one detectable boundary, and linear.
9. The method of claim 1, wherein:
identifying the at least one detectable boundary in the environment for defining the virtual boundary comprises identifying the at least one detectable boundary to a left side of the robotic device; and
defining the virtual boundary relative to the at least one detectable boundary comprises defining the virtual boundary to a right side of the robotic device at a
predetermined distance from an initial position of the robotic device in the environment.
10. The method of claim 9, wherein the predetermined distance is equal to a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
11. The method of claim 1, wherein:
identifying the at least one detectable boundary in the environment for defining the virtual boundary comprises identifying the at least one detectable boundary to a left side of the robotic device; and
defining the virtual boundary relative to the at least one detectable boundary comprises defining the virtual boundary to the left side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment.
12. The method of claim 11, wherein the predetermined distance is less than a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
13. The method of claim 1, wherein:
identifying the at least one detectable boundary in the environment for defining the virtual boundary comprises identifying the at least one detectable boundary to a right side of the robotic device; and
defining the virtual boundary relative to the at least one detectable boundary comprises defining the virtual boundary to a left side of the robotic device at a
predetermined distance from an initial position of the robotic device in the environment.
14. The method of claim 13, wherein the predetermined distance is equal to a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
15. The method of claim 1, wherein:
identifying the at least one detectable boundary in the environment for defining the virtual boundary comprises identifying the at least one detectable boundary to a right side of the robotic device; and
defining the virtual boundary relative to the at least one detectable boundary comprises defining the virtual boundary to the right side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment.
16. The method of claim 15, wherein the predetermined distance is less than a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
17. A robotic device comprising:
a user interface;
one or more sensors;
a processor; and
a non-transitory computer readable medium comprising programming instructions that when executed cause the processor to:
receive, via the user interface, a selection of a mode corresponding to a virtual boundary, wherein the virtual boundary delineates an avoid area to be avoided by the robotic device during navigation in an environment,
identify, using the one or more sensors and based on the mode, at least one detectable boundary in the environment for defining the virtual boundary,
define, based on the mode, the virtual boundary relative to the at least one detectable boundary, and
cause the robotic device to move in the environment while ensuring that the robotic device does not cross the virtual boundary to enter the avoid area.
18. The robotic device of claim 17, wherein the programming instructions that cause the processor to define, using the one or more sensors and based on the mode, the virtual boundary relative to the at least one detectable boundary further comprise programming instructions to cause the processor to identify one or more rule sets corresponding to the mode, wherein the one or more rule sets include rules for defining at least one of the following: a shape of the virtual boundary, a position of the virtual boundary relative to the robotic device, or a position of the virtual boundary relative to the at least one detectable boundary.
19. The robotic device of claim 18, wherein the one or more rules comprise a rule that the detectable boundary cannot be located in the avoid area if characteristics of sensor data collected from the environment are not reliable.
20. The robotic device of claim 17, wherein the programming instructions that cause the processor to identify, based on the mode, at least one detectable boundary in the
environment for defining the virtual boundary further comprise programming instructions to cause the processor to:
identify one or more detectable boundaries in the environment; and
identify the at least one detectable boundary from amongst the one or more detectable boundaries based on one or more rules corresponding to at least one of the following: position of the avoid area in the environment, position of the at least one detectable boundary, characteristics of sensor data collected from the avoid area, or shape of the at least one detectable boundary.
21. The robotic device of claim 20, wherein the programming instructions that cause the processor to identify the one or more detectable boundaries in the environment further comprise programming instructions to cause the processor to identify the one or more detectable boundaries in the environment based on at least one of the following:
sensor data collected by the robotic device; or map data corresponding to the environment.
22. The robotic device of claim 17, wherein the one or more rules comprise a rule that the detectable boundary cannot be located in the avoid area if characteristics of sensor data collected from the environment are not reliable.
23. The robotic device of claim 17, wherein the programming instructions that cause the processor to define the virtual boundary relative to the at least one detectable boundary further comprise programming instructions to cause the processor to cause the robotic device to traverse an initial path through the permissible area to determine a shape and a position of the virtual boundary relative to the robotic device or relative to the at least one detectable boundary.
24. The robotic device of claim 17, wherein the programming instructions that cause the processor to define the virtual boundary relative to the at least one detectable boundary further comprise programming instructions to cause the processor to define a linear virtual boundary at a location relative to the robotic device or relative to the at least one detectable boundary.
25. The robotic device of claim 24, wherein a shape of the virtual boundary is selected from the group comprising: replication of a shape of the at least one detectable boundary, mirror image of a shape of the at least one detectable boundary, and linear.
26. The robotic device of claim 17, wherein: the programming instructions that cause the processor to identify the at least one detectable boundary in the environment for defining the virtual boundary further comprise programming instructions to cause the processor to identify the at least one detectable boundary to a left side of the robotic device; and
the programming instructions that cause the processor to define the virtual boundary relative to the at least one detectable boundary further comprise programming instructions to cause the processor to define the virtual boundary to a right side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment.
27. The robotic device of claim 26, wherein the predetermined distance is equal to a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
28. The robotic device of claim 17, wherein:
the programming instructions that cause the processor to identify the at least one detectable boundary in the environment for defining the virtual boundary further comprise programming instructions to cause the processor to identify the at least one detectable boundary to a left side of the robotic device; and
the programming instructions that cause the processor to define the virtual boundary relative to the at least one detectable boundary further comprise programming instructions to cause the processor to define the virtual boundary to the left side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment.
29. The robotic device of claim 28, wherein the predetermined distance is less than a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
30. The robotic device of claim 17, wherein:
the programming instructions that cause the processor to identify the at least one detectable boundary in the environment for defining the virtual boundary further comprise programming instructions to cause the processor to identify the at least one detectable boundary to a right side of the robotic device; and
the programming instructions that cause the processor to define the virtual boundary relative to the at least one detectable boundary further comprise programming instructions to cause the processor to define the virtual boundary to a left side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment.
31. The robotic device of claim 30, wherein the predetermined distance is equal to a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
32. The robotic device of claim 17, wherein:
the programming instructions that cause the processor to identify the at least one detectable boundary in the environment for defining the virtual boundary further comprise programming instructions to cause the processor to identify the at least one detectable boundary to a right side of the robotic device; and
the programming instructions that cause the processor to define the virtual boundary relative to the at least one detectable boundary further comprise programming instructions to cause the processor to define the virtual boundary to the right side of the robotic device at a predetermined distance from an initial position of the robotic device in the environment.
33. The robotic device of claim 32, wherein the predetermined distance is less than a distance between the initial position of the robotic device in the environment and the at least one detectable boundary.
PCT/US2019/066516 2018-12-17 2019-12-16 Methods and systems for defining virtual boundaries for a robotic device WO2020131687A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862780518P 2018-12-17 2018-12-17
US62/780,518 2018-12-17

Publications (2)

Publication Number Publication Date
WO2020131687A2 true WO2020131687A2 (en) 2020-06-25
WO2020131687A3 WO2020131687A3 (en) 2020-08-06

Family

ID=69174596

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/066516 WO2020131687A2 (en) 2018-12-17 2019-12-16 Methods and systems for defining virtual boundaries for a robotic device

Country Status (1)

Country Link
WO (1) WO2020131687A2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112540612A (en) * 2020-09-28 2021-03-23 深圳市银星智能科技股份有限公司 Virtual wall signal adjusting method, virtual wall equipment, robot and navigation system thereof
CN113093743A (en) * 2021-03-30 2021-07-09 西北农林科技大学 Navigation control method based on virtual radar model and deep neural network
WO2024051705A1 (en) * 2022-09-08 2024-03-14 云鲸智能(深圳)有限公司 Cleaning robot and control method therefor, device, system and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150104311A (en) * 2014-03-05 2015-09-15 엘지전자 주식회사 Robor cleaner and method for controlling the same
TWM532256U (en) * 2016-03-15 2016-11-21 群耀光電科技(蘇州)有限公司 Compound type virtual wall and lighthouse system for self-propelled device

Also Published As

Publication number Publication date
WO2020131687A3 (en) 2020-08-06

Similar Documents

Publication Publication Date Title
US11465284B2 (en) Restricting movement of a mobile robot
US11830618B2 (en) Interfacing with a mobile telepresence robot
US20190332114A1 (en) Robot Contextualization of Map Regions
EP3552072B1 (en) Robotic cleaning device with operating speed variation based on environment
US20200306989A1 (en) Magnetometer for robot navigation
WO2020131687A2 (en) Methods and systems for defining virtual boundaries for a robotic device
Zhang et al. An indoor navigation aid for the visually impaired
US11537141B2 (en) Robotic cleaning device with dynamic area coverage
US20210191415A1 (en) Area profile map learning for robotic device
Konam et al. Uav and service robot coordination for indoor object search tasks

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19839206

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19839206

Country of ref document: EP

Kind code of ref document: A2