GB2545134A - Discovery and monitoring of an environment using a plurality of robots - Google Patents


Info

Publication number
GB2545134A
GB2545134A
Authority
GB
United Kingdom
Prior art keywords
robot
robots
environment
unvisited
tile
Prior art date
Legal status
Granted
Application number
GB1704304.3A
Other versions
GB201704304D0 (en)
GB2545134B (en)
Inventor
Shang Q. Guo
Isci Canturk
Lenchner Jonathan
Mukherjee Maharaj
Current Assignee
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date
Filing date
Publication date
Application filed by International Business Machines Corp filed Critical International Business Machines Corp
Publication of GB201704304D0
Publication of GB2545134A
Application granted
Publication of GB2545134B
Current status: Expired - Fee Related
Anticipated expiration


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0088 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287 Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0291 Fleet control
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41815 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the cooperation between machine tools, manipulators and conveyor or other workpiece supply system, workcell
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287 Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39146 Swarm, multiagent, distributed multitask fusion, cooperation multi robots
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39168 Multiple robots searching an object
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/20 Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)
  • Mathematical Physics (AREA)

Abstract

A method and apparatus for navigating several robots is provided. An environment is discretised into several discrete regions, and a breadth-first search is used to determine a next unvisited region for one of the robots to explore. The robots may each determine the next unvisited region to explore by taking a hypothetical step (510) in all directions and maintaining a breadth-first search tree of paths until one declaring robot finds a next unvisited region (520). This declaring robot may then declare to the other robots that an unvisited region has been found, and the other robots may then determine whether there is a conflict between their planned paths and that of the declaring robot (540). The declaring robot's search tree may then be collapsed to a single point corresponding to the found unvisited region (560). The search method may then repeat (570) until there are no regions left to explore (580).

Description

DISCOVERY AND MONITORING OF AN ENVIRONMENT USING A PLURALITY OF ROBOTS
Field of the Invention
The present invention relates to automated techniques for the coordination of multiple mobile robots for exploring and monitoring a given environment or region.
Background of the Invention
Data centers are consuming ever more energy. Recognizing that cooling is a significant contributor to energy consumption, data center operators are beginning to tolerate higher operating temperatures. While this practice saves substantial amounts of energy, running closer to allowable operating temperature limits increases the risk that temperature problems will result in equipment failures that wipe out the financial benefits of saving energy. Vigilance is needed, and increasingly that vigilance is being provided by data center energy management software that monitors data center environmental conditions, such as temperature, and alerts operators when troublesome hot spots develop. A number of techniques have been proposed or suggested for employing one or more robots to automatically navigate, map and monitor data centers. For example, J. Lenchner et al., “Towards Data Center Self-Diagnosis Using a Mobile Robot,” ACM Int’l Conf. on Autonomic Computing (ICAC ’11) (2011), incorporated by reference herein, discloses a robot that serves as a physical autonomic element to automatically navigate, map and monitor data centers. The disclosed robot navigates a data center, mapping its layout and monitoring its temperature and other quantities of interest with little, if any, human assistance. In addition, United States Patent Application Serial No. 12/892,532, filed September 28, 2010, entitled “Detecting Energy and Environmental Leaks in Indoor Environments Using a Mobile Robot,” incorporated by reference herein, discloses techniques for energy and environmental leak detection in an indoor environment using one or more mobile robots.
While the use of robots has greatly improved the ability to automatically monitor indoor environments, existing techniques suffer from a number of limitations which, if overcome, could further extend the utility and efficiency of robots that monitor an indoor environment. For example, it is challenging for a plurality of robots to efficiently navigate an indoor environment without getting in each other’s way, especially towards the end of the exploration. A number of existing navigation techniques employ the well-known Frontier-Based A* incremental navigation method, first described for a single robot in Peter Hart et al., “A Formal Basis for the Heuristic Determination of Minimum Cost Paths,” SIGART Newsletter, 37: 28-29 (1972), and more recently described in the context of multiple robots by Yamauchi, “Frontier-Based Exploration Using Multiple Robots,” Proc. of the Int’l Conf. on Autonomous Agents (1998). In addition, a number of existing navigation techniques have also integrated the idea of each robot carrying a "potential field" so that robots are forced to stay at some manually tuned distance from one another. See, e.g., Yong K. Hwang and Narendra Ahuja, "A Potential Field Approach to Path Planning," IEEE Trans. on Robotics and Automation, Vol. 8, Issue 1 (IEEE, 1992). A need remains for more efficient navigation methods for robots that automatically navigate, map and monitor environments, particularly well-structured indoor environments such as data centers.
Summary of the Invention
The invention is defined by the claims.
Generally, aspects of the invention provide discovery and monitoring of an environment using a plurality of robots.
According to an aspect of the invention, a plurality of robots navigate an environment by obtaining a discretization of the environment to a plurality of discrete regions; and determining a next unvisited discrete region for one of the plurality of robots to explore in the environment using a breadth-first search. The plurality of discrete regions can be, for example, a plurality of real or virtual tiles. A more complete understanding of the present invention, as well as further features and advantages of the present invention, will be obtained by reference to the following detailed description and drawings.
Brief Description of the Drawings
FIG. 1 illustrates an exemplary indoor environment in which the present invention can be employed; FIG. 2 is an exemplary flowchart for a potential field radius determination process incorporating aspects of the present invention; FIGS. 3 and 4 illustrate the navigation by exemplary robots R1 and R2 through the exemplary indoor environment of FIG. 1 using the disclosed varying potential field navigation technique; FIG. 5 is an exemplary flowchart for a breadth-first search path determination process incorporating aspects of the present invention; FIGS. 6 through 13 illustrate the navigation by exemplary robots R1 and R2 through the exemplary indoor environment of FIG. 1 using the breadth-first search path determination process of FIG. 5 to determine the next unvisited tile; and FIG. 14 is a block diagram of a robot navigation system that can implement the processes of the present invention.
Detailed Description of Preferred Embodiments
The present invention provides improved multi-robot navigation in previously known and also previously unknown environments. According to a varying potential field aspect of the invention, multi-robot navigation of known and unknown environments is improved by varying the radius of the potential field based on the percentage of area that remains to be explored in a known environment, or the estimated percentage of area that remains to be explored in an unknown environment. The potential fields are also referred to herein as “navigation buffers.” Generally, the radius of the exemplary potential field decreases as the percentage (or estimated percentage) of the indoor environment that remains unexplored decreases. While the varying potential field aspect of the invention is illustrated using circles having substantially uniform radii around each robot, navigation buffers of any shape and of varying sizes can be established around each robot, as would be apparent to a person of ordinary skill in the art.
According to a breadth-first search aspect of the invention, multi-robot navigation of known environments is improved using breadth-first searching (BFS) to determine paths through the known environment for a plurality of robots. The disclosed breadth-first search technique employs a polynomial-time recursive heuristic that prevents two or more robots from trying to visit the same portion of the environment. Generally, each robot incrementally creates a breadth-first search tree and they collectively attempt to find the next unvisited location within the environment. Each robot updates its respective BFS tree to accommodate the robot that was the first one to successfully find an unvisited location.
Generally, a breadth-first search (BFS) is a graph search algorithm that begins at the root node and explores all the neighboring nodes. Then, for each of those nearest nodes, the graph search algorithm explores all of their unexplored neighbor nodes, and so on, until the desired node is found (i.e., a previously unvisited tile).
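To make the search concrete, the following is a minimal sketch (ours, not taken from the patent text) of a breadth-first search over a set of grid tiles; the (row, column) tile representation, the 4-connectivity neighbor rule and the helper name bfs_to_unvisited are illustrative assumptions:

```python
# Illustrative sketch only: BFS from a robot's current tile to the nearest
# unvisited tile. Tiles are (row, col) pairs; "visitable" excludes obstacle
# tiles; 4-connectivity is assumed, matching orthogonal robot movement.
from collections import deque

def bfs_to_unvisited(start, visitable, visited):
    """Return the shortest tile path from start to the nearest tile not in
    visited, or None when every reachable tile has already been visited."""
    frontier = deque([start])
    parent = {start: None}
    while frontier:
        tile = frontier.popleft()
        if tile not in visited:           # desired node: an unvisited tile
            path = []
            while tile is not None:       # walk parent links back to start
                path.append(tile)
                tile = parent[tile]
            return path[::-1]
        r, c = tile
        for nbr in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if nbr in visitable and nbr not in parent:
                parent[nbr] = tile
                frontier.append(nbr)
    return None
```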
The term “building,” as used herein, is intended to refer to a variety of facilities, including, but not limited to, data centers hosting large amounts of information technology (IT) equipment, as well as industrial office space and residential buildings. FIG. 1 illustrates an exemplary indoor environment 100 in which the present invention can be employed. Let a set of robots {R1,...,Rk} be given and suppose that the space to be explored within the exemplary indoor environment 100 has been discretized into a set of square "tiles." In some practical environments, such as computer data centers, the natural discretization unit is in fact a physical floor or ceiling tile. In other environments, the discretization unit may be virtual tiles. The exemplary indoor environment 100 of FIG. 1 comprises an exemplary array of 6-by-6 tiles, being explored by two exemplary robots R1 and R2. Tiles marked with an x in the exemplary indoor environment 100 remain unvisited. Tiles filled with a cross-hatched pattern indicate the presence of obstacles. As discussed further below in conjunction with FIGS. 2 through 4, the exemplary robots R1 and R2 navigate paths through the exemplary indoor environment 100 using the varying potential field aspect of the invention. As discussed further below in conjunction with FIGS. 5 through 13, the exemplary robots R1 and R2 navigate paths through the exemplary indoor environment 100 using the breadth-first searching aspect of the invention.
For a detailed discussion of suitable exemplary robots, see, for example, United States Patent Application Serial No. 12/892,532, filed September 28, 2010, entitled “Detecting Energy and Environmental Leaks in Indoor Environments Using a Mobile Robot,” incorporated by reference herein. The term “robot,” as used herein, refers generally to any form of mobile electro-mechanical device that can be controlled by electronic or computer programming. In this basic form, as will be described in detail below, the exemplary robots move throughout the designated portions of the building 100 and take temperature, air flow and/or airborne matter measurements as well as time and positioning data (so as to permit the temperature, air flow and/or airborne matter data to be associated with a given position in the building 100 at a particular time). The robots should be capable of moving in various directions along the floor of the building, so as to navigate where the robots need to go and to maneuver around obstacles, such as equipment, furniture, walls, etc. in the building 100.
It is preferable that the robots (e.g., R1 and R2 of the environment 100) have the capability to collect and store the data, i.e., temperature, air flow and/or airborne matter measurements and time/positioning data, to allow for analysis at a later time, though it is also possible that the data be streamed to a controlling or server computer, where the data may be processed and/or stored.
As discussed hereinafter, the exemplary indoor environment 100 can be a known or unknown environment. As indicated above, the varying potential field aspect of the invention can be used to navigate known and unknown environments. Likewise, the breadth-first search aspect of the invention can be used to navigate known environments. The varying potential field and breadth-first search aspects of the invention can be combined in the case of known environments, especially with many robots where the computation time and space costs of the breadth-first search may be prohibitive. In a known grid-space, for example, the varying potential field approach can be used until, for example, 5R grid points remain unvisited, where R is the number of robots.
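The hand-off just described can be captured in a line of code; the helper below is a hypothetical sketch (the function name and parameterization are ours, the 5R threshold comes from the text above):

```python
def choose_strategy(unvisited_count, num_robots, threshold_factor=5):
    """Hybrid rule sketched above: use the cheaper varying-potential-field
    method until roughly 5R grid points remain unvisited, then switch to
    the breadth-first search method for the endgame."""
    if unvisited_count <= threshold_factor * num_robots:
        return "breadth_first_search"
    return "potential_field"
```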
Navigation Buffers Based on Percentage of Unexplored Area
FIG. 2 is an exemplary flowchart for a potential field radius determination process 200 incorporating aspects of the present invention, in which the navigation buffers are all circles of uniform radius. As shown in FIG. 2, the potential field radius determination process 200 initially starts with a polygon P, determined during step 210, describing the perimeter of the environment to be navigated. Next, the potential field radius determination process 200 computes, during step 220, the area, A, of the region to be navigated and initializes itself to the fact that there are k robots {R1,...,Rk}.
The potential field radius determination process 200 then obtains an approximately maximum (starting) radius during step 230 for packing k uniform-radius disks inside P by trying disk packings at different radii using a binary search over plausible radii and, for example, using standard grid-shifting methods (see, e.g., D. Hochbaum and W. Maass, “Approximation Schemes for Covering and Packing Problems in Image Processing and VLSI,” J. ACM, 32(1): 130-136 (1985)) to achieve each packing. The determined maximum radius is then reduced during step 240 by a configurable amount, such as 10-20%, to ensure that the robots have room to move and that the determined maximum radius is an underestimate. It is again noted that navigation buffers of any shape and of varying sizes can be established around each robot, as would be apparent to a person of ordinary skill in the art.
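One way to picture steps 230 and 240 is the simplified sketch below. It is an assumption on our part: it replaces the general polygon P and the cited grid-shifting packing method with a rectangle and a plain square lattice, which is enough to show the binary search and the configurable reduction:

```python
def starting_radius(width, height, k, safety=0.85, tol=1e-3):
    """Binary-search the largest radius r at which k disjoint disks of
    radius r fit in a width x height rectangle (step 230), then shrink r
    by a configurable safety factor, here 15%, as in step 240."""
    def packs(r):
        # Disks centered on a square lattice with spacing 2r are disjoint;
        # they fit if enough lattice rows and columns fit in the rectangle.
        return int(width // (2 * r)) * int(height // (2 * r)) >= k

    lo, hi = tol, min(width, height) / 2
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if packs(mid):
            lo = mid          # mid still packs; try a larger radius
        else:
            hi = mid
    return safety * lo
```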
In step 250, the robots collectively try to find the next not-yet-visited area, and, as the percentage of area that has been explored increases, the potential field radius is reduced during step 260, for example in linear proportion to the current percentage of unexplored area. In the event that the region to be explored is unknown in advance, the exact percentage of unexplored area at any point in time must of course be estimated, utilizing the area of the bounding polygon and the fraction of area so-far explored that has turned out to contain obstacles. When there is no next unvisited area, in other words when the area has been completely explored, the process completes in step 270.

FIGS. 3 and 4 illustrate the navigation by the exemplary robots R1 and R2 through the exemplary indoor environment 100 using the varying potential field aspect of the invention. FIG. 3 illustrates the navigation by the exemplary robots R1 and R2 at a first time, t1, and FIG. 4 illustrates the navigation by the exemplary robots R1 and R2 at a later time, tn, after a certain fraction of tiles have been explored. The tiles marked with X2s indicate that robot R2 has explored those tiles, while those marked with X1s indicate that robot R1 has explored them. It is noted that the radius, rt1, at time t1 is greater than the radius, rtn, at time tn, since the percentage of the exemplary indoor environment 100 that remains unexplored has reduced over time. The number of tiles that remain unexplored at time t1 of FIG. 3 is 27 tiles, while the number of tiles that remain unexplored at time tn of FIG. 4 is 15 tiles. The potential fields or navigation buffers associated with the various robots are given by the associated circles, or disks, around them at the given time, in other words the disks of radius rt1 at time t1 in FIG. 3 and those of radius rtn at the later time, tn, in FIG. 4. At any given time t, these disks serve to keep the robots at least a distance of 2rt apart. In a further variation, the potential fields or navigation buffers can serve to keep the robots at least a distance of rt apart (i.e., a given robot can continue moving as long as another robot itself is not within the navigation zone of the given robot). In yet another variation, the radii or shape area can be proportional to the speed of each robot (as well as proportional to the amount of remaining/unvisited space).
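The radius update of step 260 then reduces to a one-liner; the sketch below, with hypothetical tile counts in the comment, shrinks the buffer in linear proportion to the unexplored fraction:

```python
def buffer_radius(r_start, unexplored_fraction):
    """Step 260: scale the starting radius by the (exact or estimated)
    fraction of the environment that remains unexplored."""
    return r_start * unexplored_fraction

# Hypothetical numbers: if 27 of 33 explorable tiles are unvisited at t1
# and 15 at tn, the radius falls from ~0.82*r_start to ~0.45*r_start.
```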
Breadth-First Search Navigation
FIG. 5 is an exemplary flowchart for a breadth-first search path determination process 500 incorporating aspects of the present invention. As discussed further below in conjunction with FIGS. 6 through 13, each exemplary robot R1 and R2 executes the breadth-first search path determination process 500 to determine the next unvisited tile in navigating paths through the exemplary indoor environment 100.
As shown in FIG. 5, during step 510, each robot takes one hypothetical step (in software) in all directions, maintaining a breadth-first search (BFS) tree of its complete set of paths. BFS trees are discussed further below in conjunction with FIG. 6A. In step 520, the robots develop their BFS trees to one additional level of depth, and any robot that finds an as-yet unvisited tile declares this fact, along with the path by which the tile was found, to the other robots. Call any robots which find such previously unvisited tiles the “declaring” robots. The declaration is made using whatever type of robot-to-robot, or robot-to-server-to-robot, communication is in place. In step 530, a decision/control point is reached. If no new tile has been found by any robot, step 510 is repeated and the robots all develop their BFS trees to an additional level of depth. If, on the other hand, an unvisited tile has been encountered by one or more of the robots, steps 540 and 550 are performed by the robots not finding the unvisited tile, and step 560 is performed by robots finding a tile.
These sets of steps can be performed in parallel by the two sets of robots. If two robots find the same tile, an arbitrary method may be used for deciding which robot gets to be the declaring robot and hence follow step 560, and which robot gets to be the non-declaring robot. For example, the robots may be pre-numbered (indexed) R1,...,Rk, as previously done in the potential field radius determination process 200, with the lower-numbered robot becoming the “declaring” robot. In step 540, the non-declaring robots go through their respective BFS trees to see if there is a conflict with any of the paths of the declaring robots. Any time a non-declaring robot is at a given tile T at the same time as a declaring robot, it is considered a conflict; moreover, if a non-declaring robot is moving from a tile T to a tile T’ at the same time as a declaring robot is moving from tile T’ to tile T, that is also considered a conflict. Any robots finding a conflict in step 540 must regenerate their BFS trees avoiding any conflicts in step 550. In parallel with these activities by the non-declaring robots, the declaring robots each collapse their BFS trees to the single path taken to reach their just-found, previously unvisited, tile in step 560. Upon completion of steps 540 and 550 by all non-declaring robots, and step 560 by all declaring robots, control returns to step 510, where the robots again develop their BFS trees to an additional level of depth. The process terminates in step 580 when all tiles have been explored and so the respective BFS trees cannot be further developed.
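A compressed sketch of this loop follows. It is ours, not the patent's implementation: it re-runs each robot's BFS to completion instead of growing the trees one level at a time, abstracts away the physical travel, and reuses bfs_to_unvisited from the earlier sketch; the two conflict rules of step 540 are checked position by position:

```python
def paths_conflict(p, q):
    """p and q are tile paths starting at two robots' current tiles.
    Rule 1 (step 540): same tile at the same time step is a conflict.
    Rule 2: swapping tiles T and T' in the same step (head-on) conflicts."""
    for t in range(min(len(p), len(q))):
        if p[t] == q[t]:
            return True
        if t > 0 and p[t] == q[t - 1] and p[t - 1] == q[t]:
            return True
    return False

def explore(robots, visitable, visited):
    """robots: dict name -> current tile; visited should initially contain
    the robots' starting tiles. Lower-indexed names win ties, mirroring
    the pre-numbering R1,...,Rk described above."""
    while len(visited) < len(visitable):
        plans = {n: bfs_to_unvisited(robots[n], visitable, visited)
                 for n in sorted(robots)}
        declarer = next((n for n, p in plans.items() if p), None)
        if declarer is None:
            break                                  # nothing reachable: done
        goal = plans[declarer]
        for n, p in plans.items():                 # step 540: conflict check
            if n != declarer and p and paths_conflict(p, goal):
                plans[n] = None                    # step 550: re-plan later
        visited.add(goal[-1])                      # step 560: tile is taken
        robots[declarer] = goal[-1]                # travel abstracted away
```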
For K robots and N total tiles, with tiles having constant connectivity (4-connectivity for robots constrained to moving orthogonally between rectangular tiles, and 8-connectivity for robots free to traverse diagonally across rectangular tiles), the running time of the breadth-first search path determination process 500 is O(KN²), since a BFS tree at any node for any robot takes O(N) time to generate, and at each node there is time N + O(N) = O(N) to first check for a path conflict and then regenerate the BFS starting at a given depth. Given N total nodes and K robots, a total running time of O(KN²) can be expected, which can be completely parallelized down to O(N²) time per robot. The expected running time is likely much less, since two robots will only have a conflict if their BFS trees discover the same tile at the same time increment.
The total space complexity of the breadth-first search path determination process 500 is also O(KN²), since a robot maintains information of size O(N) at each of up to O(N) tiles, and this space requirement is again parallelizable down to O(N²).
The breadth-first search path determination process 500 can optionally be computed by any one robot (or all of the robots) and hence be performed without communication between the robots, or between robots and a server, other than at startup (although at greater cost in terms of computational time and space).

FIGS. 6 through 13 illustrate the execution over time of the breadth-first search path determination process 500 to determine the next unvisited tile in navigating paths through the exemplary indoor environment 100, for robots constrained to moving orthogonally between square tiles, beginning in FIG. 6 with just four as-yet unvisited tiles, where the unvisited tiles are denoted with Xs. FIG. 6 illustrates two iterations of the breadth-first search path determination process 500 by robot R1 as it continues to expand its BFS tree. In the exemplary notation used in FIGS. 6 through 13, “n, Direction” indicates a number (n) of iterations associated with a robot movement and the concatenated direction of movements required by the robot to reach a given tile from a starting tile. For example, “1, U” indicates a first movement in an upward direction. Likewise, D indicates a move in a downward direction, R indicates a move in a right-hand direction and L indicates a move in a left-hand direction. Thus, the notation “2, RU” indicates that the robot can move to the indicated tile in two iterations with a right-hand movement followed by an upward movement. Similarly, the notation “2, DR” indicates that the robot can move to the indicated tile in two iterations with a downward movement followed by a right-hand movement.

FIG. 6A illustrates the same two iterations of the BFS tree 600 as discussed above in conjunction with FIG. 6, as implemented by robot R1, but rendering the tracing of the hypothetical paths in the more traditional “tree structure.” Note, for example, that the robot R1 could have reached the tile diagonally below it via a downward (D) motion followed by a leftward (L) motion, as well as the given leftward (L) motion followed by downward (D) motion. The choice of which of these to use is arbitrary.

FIG. 7 illustrates the second iteration of the breadth-first search path determination process 500 by robot R2 as it continues to expand its BFS tree. As shown in FIG. 7, robot R2 hits an unvisited tile 710 during the second iteration using a first downward movement, followed by a right-hand movement. The declaring robot R2 broadcasts the fact that the unvisited tile 710 has been found to the other robot, along with the path (DR). Robot R1 receives notification of the previously unvisited tile 710 and changes the tile indicator from unvisited (X) to visited by R2 (VR2).

FIG. 8 illustrates the continued execution of the breadth-first search path determination process 500 by robot R2 after detection of the previously unvisited tile 710. As shown in FIG. 8, robot R2 collapses its search tree to a single point associated with previously unvisited tile 710, so that on the next iteration the BFS of robot R2 can start over at that point.

FIG. 9 illustrates the continued execution of the breadth-first search path determination process 500 during a fourth iteration by robot R1 as R1 expands its BFS to find a new unvisited tile 910. As shown in FIG. 9, robot R1 hits the unvisited tile 910 during the fourth iteration using a right-hand movement, followed by an upward movement, followed by two successive right-hand movements (RURR). The declaring robot R1 broadcasts the fact that the unvisited tile 910 has been found to the other robot, along with the path (RURR). Robot R2 receives notification of the previously unvisited tile 910 and changes the tile indicator from unvisited (X) to visited by R1 (VR1).

FIG. 10 illustrates the continued execution of the breadth-first search path determination process 500 by robot R1 after detection of the previously unvisited tile 910. As shown in FIG. 10, robot R1 collapses its search tree to a single point associated with previously unvisited tile 910, so that on the next iteration the BFS of robot R1 can start over at that point.

FIG. 11 illustrates the continued execution of the breadth-first search path determination process 500 during a fifth iteration by robot R1 as R1 expands its BFS to find a new unvisited tile 1110. As shown in FIG. 11, robot R1 hits the unvisited tile 1110 during the fifth iteration using the sequence of movements (RURRR). The declaring robot R1 broadcasts the fact that the unvisited tile 1110 has been found to the other robot, along with the path (RURRR). Robot R2 receives notification of the previously unvisited tile 1110 and changes the tile indicator from unvisited (X) to visited by R1 (VR1).

FIG. 12 illustrates the continued execution of the breadth-first search path determination process 500 during a fifth iteration by robot R2. As shown in FIG. 12, robot R2 has changed the tile indicator for previously unvisited tile 1110 from unvisited (X) to visited by R1 (VR1). FIG. 13 indicates the ultimate paths of the robots R1 and R2 following completion of the breadth-first search path determination process 500, with all tiles now marked as visited.
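The “n, Direction” notation maps directly onto tile offsets; the following small decoder (a hypothetical helper, not part of the patent) expands a move string such as the RURR path above into the sequence of tiles visited:

```python
# Row index grows downward here, an assumption about the figures' layout.
MOVES = {"U": (-1, 0), "D": (1, 0), "L": (0, -1), "R": (0, 1)}

def decode_path(start, moves):
    """start: (row, col); moves: e.g. "DR" for one step down, one right."""
    tiles = [start]
    for m in moves:
        dr, dc = MOVES[m]
        r, c = tiles[-1]
        tiles.append((r + dr, c + dc))
    return tiles

# decode_path((2, 3), "RURR") -> [(2, 3), (2, 4), (1, 4), (1, 5), (1, 6)]
```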
According to an exemplary embodiment, each robot also has a vision component, e.g., a mounted camera. In the context of a regularly gridded (e.g., tiled) room such as a data center, the vision component of the robot is responsible for detecting a “pose” of the robot with respect to the center of a tile, and for determining whether the next tile which the robot wishes to investigate is visitable or blocked (for example, because the tile is occupied by equipment or otherwise obstructed). The pose of the robot is the location and orientation of the robot relative to the forward-pointing “orthogonal” orientation at the center of the tile. The forward-pointing orthogonal orientation is the orientation that is exactly aligned with the intended reckoning of the robot (from the center of one tile to the center of a second, adjacent tile), such that if the robot moved straight ahead it would cross the boundary between the tiles along a path which is perpendicular (orthogonal) to the tile boundary and reach the center of the second tile, which it either intends to move to or intends to inspect for the purpose of determining whether that tile is visitable. This assumes that a (theoretical) straight line connecting the centers of two adjacent tiles is perpendicular (orthogonal) to the boundary between the two tiles, which is typically the case in data centers.
In the data center context, the vision component specializes in detecting tile boundaries, determining a distance of the robot from a tile boundary (and thereby, a distance of the robot from the center of the tile), determining an angle the robot currently makes with the line orthogonal to the next tile boundary, and determining whether the next tile in the direction the robot is headed is occupied or visitable. According to an exemplary embodiment, the robot automatically determines, e.g., tile boundaries and whether a tile is visitable or obstructed. The programming of the robot to perform this task would be apparent to one of skill in the art and thus is not described further herein.
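One plausible realization of the boundary-detection step, which the text leaves to the skilled reader, is edge detection followed by a probabilistic Hough transform; the OpenCV sketch below is our assumption, and its thresholds are tuning parameters:

```python
# Our sketch of tile-boundary detection: the patent does not prescribe an
# algorithm. Straight floor seams show up as long line segments in the
# edge map of a camera frame.
import cv2
import numpy as np

def tile_boundaries(frame):
    """frame: BGR image from the robot's camera. Returns candidate tile
    boundary segments as (x1, y1, x2, y2) tuples."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)          # thresholds are tuning knobs
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 80,
                            minLineLength=100, maxLineGap=10)
    return [tuple(l[0]) for l in lines] if lines is not None else []
```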
For orientation purposes, the robot has a forward-pointing direction determined by the direction in which the vision component, e.g., camera, faces. This forward-pointing direction is also aligned with a forward wheel direction when the robot is instructed to move forward (i.e., when the robot rotates, it is not just the wheels that rotate but the entire assembly).
In a more general facility where there is no guarantee of a gridded layout of tiles, one option is to lay down a fine rectangular grid (e.g., a grid with cell dimensions of 15.24 cm by 15.24 cm (6 inches by 6 inches)) of alpha- or beta-emitting particles to simulate tiles and subsequently (upon backtracking) have the robot detect the grid of alpha- or beta-emitting particles using methods akin to those used with a mounted webcam. While a webcam by itself would not be able to detect alpha or beta particles, once the location of the alpha or beta particles is known (e.g., using an alpha or beta detector such as a thin-film Geiger-Müller counter), the webcam could take a snapshot of the vicinity around the alpha or beta particles and the robot could keep a record of the square determined by the alpha or beta particles and the surroundings, so that the next time the robot could do a reasonable job of navigating back.
This artificially placed grid, i.e., virtual tiles, can serve to mark where the robot has been and to keep track of, for example, a best-first A* search (see Peter Hart et al., “A Formal Basis for the Heuristic Determination of Minimum Cost Paths,” SIGART Newsletter, 37: 28-29 (1972)) or a depth-first search tree on the virtual tiles, to ensure a complete navigation of the environment, if that is desirable. A depth-first search tree is a software data structure that keeps track of an attempted complete exploration of a graph. In the case of these virtual tiles, the nodes of the graph are the virtual tiles and, in one implementation of the depth-first search tree, two tiles are connected by an edge of the graph if they are neighbors in the tile layout, in other words if the robot can travel from tile1 to tile2 without passing through additional tiles.
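For instance, the depth-first search tree over virtual tiles can be kept as a child-to-parent map; a minimal sketch (ours), using the same 4-connected neighbor relation as in the earlier sketches:

```python
def dfs_tree(start, visitable):
    """Return a dict child -> parent recording a depth-first exploration
    of the virtual-tile graph; edges join tiles that are layout neighbors."""
    tree, stack = {start: None}, [start]
    while stack:
        r, c = stack.pop()
        for nbr in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if nbr in visitable and nbr not in tree:
                tree[nbr] = (r, c)
                stack.append(nbr)
    return tree
```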
To provide free movement throughout the building, in one exemplary embodiment, the robots run on battery power. Preferably, the battery is rechargeable and the system further includes one or more charging stations (not shown). That way, if the robot runs low on power during a scan, the robot can proceed to the charging station, recharge and then resume the scan at the last location visited (see below). Techniques for configuring a mobile robot to return to a charging station to recharge are known to those of skill in the art and thus are not described further herein.
Techniques that may be employed in accordance with the present teachings to coordinate movement of the robot(s) around the building, while at the same time performing the necessary sensor measurements, will now be described. In a data center, for example, coordinating movement of the robot(s) is facilitated somewhat by the fact that the typical data center floor consists entirely of industry-standard 60.96 cm by 60.96 cm (two-foot by two-foot) tiles. In this case, the localization of the robot can be accomplished using video means, as long as still pictures of the floor provided by the robot (see above) can be accurately analyzed and tile boundaries thereby determined. By way of example only, a computer or a human operator thereof can analyze still images taken by the robot(s) and can determine where the outer boundaries of a given tile reside.
As will be apparent from the following description, the system can utilize recognition of the boundaries of industry standard rectilinear tiles to accurately generate a floor plan previously unknown to it.
According to an exemplary embodiment, the system leverages existing location-awareness technology employing one or more of on-board sonar, laser and video, employing the methods of Simultaneous Localization and Mapping (SLAM). The heart of the SLAM process is to use the environment to update the position of the robot. Since the odometry of the robot, which can be used to give an estimate of the position of the robot, generally accumulates error or “drift” over time, it cannot be solely relied upon to obtain the position of the robot. In addition to odometry, laser, sonar and/or video can be used to correct the position of the robot. This is accomplished using Extended Kalman Filters to extract features of the environment and then re-observe these features as the robot moves around. In the SLAM literature, features are generally called “landmarks.” The Extended Kalman Filter keeps track of an estimate of the uncertainty in the position of the robot, as well as the uncertainty in the landmarks it has seen in the environment. The case of a robot navigating around a data center (or other building/facility) using a webcam, guided by tile boundaries, is just a special case of the more generic SLAM framework.
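To illustrate the predict/re-observe cycle just described, here is a deliberately tiny Kalman step (our sketch: a 2-D position-only state and a landmark at a known location, so the extended filter reduces to its linear special case):

```python
import numpy as np

def ekf_step(x, P, u, z, landmark, Q, R):
    """x: (2,) position estimate; P: 2x2 covariance; u: odometry increment;
    z: measured offset of the known landmark relative to the robot;
    Q, R: motion and measurement noise covariances."""
    # Predict: dead-reckon on odometry; uncertainty grows by Q (the "drift").
    x_pred = x + u
    P_pred = P + Q
    # Update: re-observing the landmark shrinks the uncertainty again.
    H = -np.eye(2)                      # z = landmark - x  =>  dz/dx = -I
    y = z - (landmark - x_pred)         # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new
```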
Once readings are taken at a particular location, the robot moves to the next location, using the navigation techniques of the present invention.
The techniques described herein extend naturally to cases which have heretofore not been considered in any detail in the literature, but which are of practical significance, namely: (i) robots have varying speeds; (ii) robots have varying quality factors, i.e., robot Ri does a more effective job, or is more thorough in its monitoring, than some fixed standard robot R by a factor f, and all monitoring locations must be covered with a minimum total quality factor; and (iii) subregions of the environment have varying priorities. A sketch of the coverage constraint in case (ii) appears below.
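The following is a hypothetical check (names and data shapes are ours) of the quality-factor constraint from case (ii): every monitored location must accumulate at least a minimum total quality:

```python
def coverage_ok(visits, quality, q_min):
    """visits: dict tile -> list of robot names that monitored that tile;
    quality: dict robot name -> factor f relative to the standard robot.
    Returns True when every tile meets the minimum total quality factor."""
    return all(sum(quality[r] for r in robots) >= q_min
               for robots in visits.values())
```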
While FIGS. 2 and 5 show an exemplary sequence of steps, it is also an embodiment of the present invention that these sequences may be varied. Various permutations of the algorithms are contemplated as alternate embodiments of the invention.
While exemplary embodiments of the present invention have been described with respect to processing steps in a software program, as would be apparent to one skilled in the art, various functions may be implemented in the digital domain as processing steps in a software program, in hardware by a programmed general-purpose computer, circuit elements or state machines, or in combination of both software and hardware. Such software may be employed in, for example, a hardware device, such as a digital signal processor, application specific integrated circuit, micro-controller, or general-purpose computer. Such hardware and software may be embodied within circuits implemented within an integrated circuit.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java (RTM), Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a standalone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

FIG. 14 is a block diagram of a robot navigation system 1400 that can implement the processes of the present invention. As shown in FIG. 14, memory 1430 configures the processor 1420 to implement the robot navigation methods, steps, and functions disclosed herein (collectively, shown as 1480 in FIG. 14). The memory 1430 could be distributed or local and the processor 1420 could be distributed or singular. The memory 1430 could be implemented as an electrical, magnetic or optical memory, or any combination of these or other types of storage devices. It should be noted that each distributed processor that makes up processor 1420 generally contains its own addressable memory space. It should also be noted that some or all of the robot navigation system 1400 can be incorporated into a personal computer, laptop computer, handheld computing device, application-specific circuit or general-use integrated circuit.
The flowcharts and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowcharts or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It is to be understood that the embodiments and variations shown and described herein are merely illustrative of the principles of this invention and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention.
The present disclosure includes the following: [1] A method for navigating a plurality of robots in an environment, comprising: determining a navigation buffer for each of said robots; and allowing each of said plurality of robots to navigate within said environment while maintaining a substantially minimum distance from other robots, wherein said substantially minimum distance corresponds to a size of said navigation buffers, wherein a size of each of said navigation buffers is reduced over time based on a percentage of said environment that remains to be navigated.
[2] The method of [1], wherein said navigation buffers are reduced in direct proportion to said percentage of said environment that remains to be navigated.
[3] The method of [1], wherein said plurality k of navigation buffers are initialized to starting navigation buffers for fitting k substantially uniform navigation buffers inside said environment.
[4] The method of [1], wherein said percentage of said environment that remains is an estimated value.
[5] The method of [1], wherein said size of at least one of said navigation buffers is based on a speed of said corresponding robot.
[6] The method of [1], wherein each of said navigation buffers comprises a circle of a given radius around said corresponding robot.
[7] A method for navigating a plurality of robots in an environment, comprising: obtaining a discretization of said environment to a plurality of discrete regions; and determining a next unvisited discrete region for one of said plurality of robots to explore in said environment using a breadth-first search.
[8] The method of [7], wherein said determining step is performed by each of said plurality of robots.
[9] The method of [7], wherein said determining step is performed by at least one processor and a result of said determining step is provided to each of said plurality of robots.
[10] The method of [7], wherein said plurality of discrete regions comprise a plurality of real or virtual tiles.
[11] The method of [7], wherein said environment comprises a known environment.
[12] The method of [7], wherein said determining step further comprises the steps of each robot taking a hypothetical step into one of said discrete regions at a time in all possible directions, and maintaining a breadth-first search tree of paths until one robot reaches said next unvisited discrete region.
[13] The method of [12], wherein said determining step further comprises the steps of said one robot declaring to others of said plurality of robots that said one robot has reached said next unvisited discrete region in said breadth-first search tree; said others of said plurality of robots determining if there is a conflict with said one robot; and said one robot collapsing said breadth-first search tree to a single point of said next unvisited discrete region.
[14] An apparatus for navigating a plurality of robots in an environment, the apparatus comprising: a memory; and at least one hardware device, coupled to the memory, operative to: determine a navigation buffer for each of said robots; and allow each of said plurality of robots to navigate within said environment while maintaining a substantially minimum distance from other robots, wherein said substantially minimum distance corresponds to a size of said navigation buffers, wherein a size of each of said navigation buffers is reduced over time based on a percentage of said environment that remains to be navigated.
[15] The apparatus of [14], wherein said navigation buffers are reduced in direct proportion to said percentage of said environment that remains to be navigated.
[16] The apparatus of [14], wherein said robots employ one or more sensors to perform one or more measurements in said environment.
[17] The apparatus of [14], wherein said plurality k of navigation buffers are initialized to starting navigation buffers for fitting k substantially uniform navigation buffers inside said environment.
[18] The apparatus of [14], wherein said percentage of said environment that remains is an estimated value.
[19] The apparatus of [14], wherein a size of at least one of said navigation buffers is based on a speed of said corresponding robot.
[20] The apparatus of [14], wherein each of said navigation buffers comprise a circle of a given radius around said corresponding robot.
[21] An apparatus for navigating a plurality of robots in an environment, comprising: a memory; and at least one hardware device, coupled to the memory, operative to: obtain a discretization of said environment to a plurality of discrete regions; and determine a next unvisited discrete region for one of said plurality of robots to explore in said environment using a breadth-first search.
[22] The apparatus of [21], wherein said determination is performed by each of said plurality of robots.
[23] The apparatus of [21], wherein said determination is performed by at least one processor and a result of said determination is provided to each of said plurality of robots.
[24] The apparatus of [21], wherein said environment comprises a known environment.
[25] The apparatus of [21], wherein said determining step further comprises the steps of each robot taking a hypothetical step into one of said discrete regions at a time in all possible directions, and maintaining a breadth-first search tree of paths until one robot reaches said next unvisited discrete region.

Claims (12)

What is claimed is:
1. A method for navigating a plurality of robots in an environment, comprising: obtaining a discretization of said environment to a plurality of discrete regions; and determining a next unvisited discrete region for one of said plurality of robots to explore in said environment using a breadth-first search.
2. The method of claim 1, wherein said determining step is performed by each of said plurality of robots.
3. The method of claim 1, wherein said determining step is performed by at least one processor and a result of said determining step is provided to each of said plurality of robots.
4. The method of claim 1, wherein said plurality of discrete regions comprise a plurality of real or virtual tiles.
5. The method of claim 1, wherein said environment comprises a known environment.
6. The method of claim 1, wherein said determining step further comprises the steps of each robot taking a hypothetical step into one of said discrete regions at a time in all possible directions, and maintaining a breadth-first search tree of paths until one robot reaches said next unvisited discrete region.
7. The method of claim 6, wherein said determining step further comprises the steps of said one robot declaring to others of said plurality of robots that said one robot has reached said next unvisited discrete region in said breadth-first search tree; said others of said plurality of robots determining if there is a conflict with said one robot; and said one robot collapsing said breadth-first search tree to a single point of said next unvisited discrete region.
8. An apparatus for navigating a plurality of robots in an environment, comprising: a memory; and at least one hardware device, coupled to the memory, operative to: obtain a discretization of said environment to a plurality of discrete regions; and determine a next unvisited discrete region for one of said plurality of robots to explore in said environment using a breadth-first search.
9. The apparatus of claim 8, wherein said determination is performed by each of said plurality of robots.
10. The apparatus of claim 8, wherein said determination is performed by at least one processor and a result of said determination is provided to each of said plurality of robots.
11. The apparatus of claim 8, wherein said environment comprises a known environment.
12. The apparatus of claim 8, wherein said determining step further comprises the steps of each robot taking a hypothetical step into one of said discrete regions at a time in all possible directions, and maintaining a breadth-first search tree of paths until one robot reaches said next unvisited discrete region.
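
The declare / conflict-check / collapse sequence of claim 7 might be coordinated through a shared claims table, along the following lines. This is a sketch only: the claims do not specify a tie-breaking policy, so the lowest-identifier-wins rule and all names here (claim_tile, claims) are assumptions.

```python
def claim_tile(robot_id, target, claims):
    """Declare `target` as a robot's next unvisited tile (claim 7).

    claims -- shared mapping: tile -> identifier of the claiming robot

    Returns True if the claim stands; the winning robot then collapses
    its breadth-first search tree to this single tile and drives to it.
    Returns False if another robot's earlier claim wins the conflict.
    """
    holder = claims.get(target)
    if holder is None or holder == robot_id:
        claims[target] = robot_id   # no conflict: the declaration stands
        return True
    # Conflict: an assumed tie-break (lowest identifier wins); the
    # claims leave the resolution policy unspecified.
    if robot_id < holder:
        claims[target] = robot_id
        return True
    return False
```

A robot whose claim is rejected would treat the contested tile as already taken and re-run its breadth-first search to find the next-nearest unvisited tile.
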
GB1704304.3A 2012-01-12 2012-11-27 Discovery and monitoring of an environment using a plurality of robots Expired - Fee Related GB2545134B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/348,846 US9606542B2 (en) 2012-01-12 2012-01-12 Discovery and monitoring of an environment using a plurality of robots
GB1410495.4A GB2511966B (en) 2012-01-12 2012-11-27 Discovery and monitoring of an environment using a plurality of robots

Publications (3)

Publication Number Publication Date
GB201704304D0 GB201704304D0 (en) 2017-05-03
GB2545134A true GB2545134A (en) 2017-06-07
GB2545134B GB2545134B (en) 2017-10-11

Family

ID=48780546

Family Applications (2)

Application Number Title Priority Date Filing Date
GB1410495.4A Expired - Fee Related GB2511966B (en) 2012-01-12 2012-11-27 Discovery and monitoring of an environment using a plurality of robots
GB1704304.3A Expired - Fee Related GB2545134B (en) 2012-01-12 2012-11-27 Discovery and monitoring of an environment using a plurality of robots

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB1410495.4A Expired - Fee Related GB2511966B (en) 2012-01-12 2012-11-27 Discovery and monitoring of an environment using a plurality of robots

Country Status (8)

Country Link
US (5) US9606542B2 (en)
JP (1) JP2015505410A (en)
KR (2) KR101912001B1 (en)
CN (1) CN104040560B (en)
CA (1) CA2862213C (en)
DE (1) DE112012005193T5 (en)
GB (2) GB2511966B (en)
WO (1) WO2013106135A1 (en)

Families Citing this family (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9606542B2 (en) * 2012-01-12 2017-03-28 International Business Machines Corporation Discovery and monitoring of an environment using a plurality of robots
US9320074B2 (en) 2012-04-06 2016-04-19 Suitable Technologies, Inc. Method for wireless connectivity continuity and quality
WO2013152360A1 (en) 2012-04-06 2013-10-10 Suitable Technologies, Inc. System for wireless connectivity continuity and quality
US9307568B2 (en) 2012-04-06 2016-04-05 Suitable Technologies, Inc. System for wireless connectivity continuity and quality
US20130279411A1 (en) 2012-04-06 2013-10-24 Suitable Technologies, Inc. Method for wireless connectivity continuity and quality
US9320076B2 (en) 2012-04-06 2016-04-19 Suitable Technologies, Inc. System for wireless connectivity continuity and quality
US20130265885A1 (en) 2012-04-06 2013-10-10 Suitable Technologies, Inc. Method for wireless connectivity continuity and quality
US20130279473A1 (en) 2012-04-06 2013-10-24 Suitable Technologies, Inc. Method for wireless connectivity continuity and quality
US20130279472A1 (en) 2012-04-06 2013-10-24 Suitable Technologies, Inc. System for wireless connectivity continuity and quality
US9344935B2 (en) 2012-04-06 2016-05-17 Suitable Technologies, Inc. System for wireless connectivity continuity and quality
US20130343344A1 (en) 2012-04-06 2013-12-26 Suitable Technologies, Inc. Method for wireless connectivity continuity and quality
US20130279487A1 (en) 2012-04-06 2013-10-24 Suitable Technologies, Inc. System for wireless connectivity continuity and quality
US20130279479A1 (en) 2012-04-06 2013-10-24 Suitable Technologies, Inc. Method for wireless connectivity continuity and quality
US20160164976A1 (en) * 2012-09-24 2016-06-09 Suitable Technologies, Inc. Systems and methods for remote presence
US9420238B2 (en) 2014-04-10 2016-08-16 Smartvue Corporation Systems and methods for automated cloud-based 3-dimensional (3D) analytics for surveillance systems
US9405979B2 (en) 2014-04-10 2016-08-02 Smartvue Corporation Systems and methods for automated cloud-based analytics and 3-dimensional (3D) display for surveillance systems
US9407879B2 (en) 2014-04-10 2016-08-02 Smartvue Corporation Systems and methods for automated cloud-based analytics and 3-dimensional (3D) playback for surveillance systems
US10057546B2 (en) 2014-04-10 2018-08-21 Sensormatic Electronics, LLC Systems and methods for automated cloud-based analytics for security and/or surveillance
US10217003B2 (en) 2014-04-10 2019-02-26 Sensormatic Electronics, LLC Systems and methods for automated analytics for security surveillance in operation areas
US11093545B2 (en) 2014-04-10 2021-08-17 Sensormatic Electronics, LLC Systems and methods for an automated cloud-based video surveillance system
US10084995B2 (en) 2014-04-10 2018-09-25 Sensormatic Electronics, LLC Systems and methods for an automated cloud-based video surveillance system
US9686514B2 (en) 2014-04-10 2017-06-20 Kip Smrt P1 Lp Systems and methods for an automated cloud-based video surveillance system
US11120274B2 (en) 2014-04-10 2021-09-14 Sensormatic Electronics, LLC Systems and methods for automated analytics for security surveillance in operation areas
US9426428B2 (en) 2014-04-10 2016-08-23 Smartvue Corporation Systems and methods for automated cloud-based analytics and 3-dimensional (3D) display for surveillance systems in retail stores
US9407880B2 (en) 2014-04-10 2016-08-02 Smartvue Corporation Systems and methods for automated 3-dimensional (3D) cloud-based analytics for security surveillance in operation areas
JP6700198B2 (en) 2014-05-05 2020-05-27 ジョージア テック リサーチ コーポレイション Multi-robot system and control method thereof
US20160085887A1 (en) * 2014-09-18 2016-03-24 Siemens Industry Software Ltd. Method for improving efficiency of industrial robotic energy consumption and cycle time by handling location orientation
EP3198580A4 (en) * 2014-09-22 2018-05-23 Sikorsky Aircraft Corporation Coordinated planning with graph sharing over networks
US9921578B2 (en) 2016-03-09 2018-03-20 International Business Machines Corporation Automatic database filtering system utilizing robotic filters
US10040195B2 (en) 2016-06-21 2018-08-07 International Business Machines Corporation Recognizing a location of a robot shared in multiple data centers
JP6899505B2 (en) * 2016-09-16 2021-07-07 株式会社日立国際電気 Broadcasting device and received waveform evacuation method
US11270592B2 (en) * 2016-12-20 2022-03-08 Nec Corporation Vehicle control device, method for control of vehicle, and program for control of vehicle control device
US11347241B2 (en) 2017-02-21 2022-05-31 Nec Corporation Control device, control method, and non-transitory program recording medium
JP6858046B2 (en) * 2017-03-24 2021-04-14 シャープ株式会社 Driving management device, autonomous driving device, driving management method and driving management program
US11691264B2 (en) 2017-06-02 2023-07-04 Pixart Imaging Inc. Mobile robot performing multiple detections using image frames of same optical sensor
US11821985B2 (en) 2017-06-02 2023-11-21 Pixart Imaging Inc. Mobile robot performing multiple detections using image frames of same optical sensor
US10627518B2 (en) 2017-06-02 2020-04-21 Pixart Imaging Inc Tracking device with improved work surface adaptability
US11752635B2 (en) * 2017-06-02 2023-09-12 Pixart Imaging Inc. Mobile robot performing multiple detections using image frames of same optical sensor
CN109426222A (en) * 2017-08-24 2019-03-05 中华映管股份有限公司 Unmanned handling system and its operating method
CN107898393B (en) * 2017-11-17 2020-12-04 北京奇虎科技有限公司 Block adjusting method and device for cleaning robot and robot
US20190311373A1 (en) * 2018-04-04 2019-10-10 Hitachi, Ltd. System and method of taking over customer service
US20230076609A1 (en) * 2020-02-27 2023-03-09 Nec Corporation Control system, control apparatus, control method, and recording medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110010083A1 (en) * 2007-07-03 2011-01-13 Jae-Yeong Lee Path search method

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6577906B1 (en) 1999-08-05 2003-06-10 Sandia Corporation Distributed optimization system and method
US6507771B2 (en) 2000-07-10 2003-01-14 Hrl Laboratories Method and apparatus for controlling the movement of a plurality of agents
US6408226B1 (en) * 2001-04-24 2002-06-18 Sandia Corporation Cooperative system and method using mobile robots for testing a cooperative search controller
US6687571B1 (en) 2001-04-24 2004-02-03 Sandia Corporation Cooperating mobile robots
US7305371B2 (en) 2001-07-06 2007-12-04 Newvectors Llc Swarming agents for distributed pattern detection and classification
JP4087104B2 (en) * 2001-11-20 2008-05-21 シャープ株式会社 Group robot system
JP2003266349A (en) 2002-03-18 2003-09-24 Sony Corp Position recognition method, device thereof, program thereof, recording medium thereof, and robot device provided with position recognition device
US7844364B2 (en) * 2002-04-16 2010-11-30 Irobot Corporation Systems and methods for dispersing and clustering a plurality of robotic devices
JP4043289B2 (en) 2002-05-27 2008-02-06 シャープ株式会社 Search robot system
JP4007947B2 (en) 2002-12-20 2007-11-14 シャープ株式会社 Group robot system, sensing robot included in group robot system, base station included in group robot system, and control robot included in group robot system
US6907336B2 (en) 2003-03-31 2005-06-14 Deere & Company Method and system for efficiently traversing an area with a work vehicle
DE10317044A1 (en) 2003-04-11 2004-10-21 Daimlerchrysler Ag Optical monitoring system for use in maneuvering road vehicles provides virtual guide surfaces to ensure collision free movement
JP4251545B2 (en) 2003-07-11 2009-04-08 独立行政法人科学技術振興機構 Route planning system for mobile robot
US20060161405A1 (en) * 2004-06-04 2006-07-20 Munirajan Vignesh K Methods for locating targets and simulating mine detection via a cognitive, swarm intelligence-based approach
JP2006085369A (en) * 2004-09-15 2006-03-30 Sony Corp Traveling object device and its control method
US20060235610A1 (en) * 2005-04-14 2006-10-19 Honeywell International Inc. Map-based trajectory generation
US8355818B2 (en) 2009-09-03 2013-01-15 Battelle Energy Alliance, Llc Robots, systems, and methods for hazard evaluation and visualization
CN101549498B (en) * 2009-04-23 2010-12-29 上海交通大学 Automatic tracking and navigation system of intelligent aid type walking robots
US9014848B2 (en) 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
US20120078417A1 (en) * 2010-09-28 2012-03-29 International Business Machines Corporation Detecting Energy and Environmental Leaks In Indoor Environments Using a Mobile Robot
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
CN102231082B (en) * 2011-04-08 2013-06-12 中国船舶重工集团公司第七○二研究所 Underwater object detection and autonomous underwater vehicle (AUV) automatic collision prevention method and system based on mini sonar
US9606542B2 (en) * 2012-01-12 2017-03-28 International Business Machines Corporation Discovery and monitoring of an environment using a plurality of robots

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110010083A1 (en) * 2007-07-03 2011-01-13 Jae-Yeong Lee Path search method

Also Published As

Publication number Publication date
KR20150129043A (en) 2015-11-18
CA2862213C (en) 2021-01-05
US10705537B2 (en) 2020-07-07
US20170139422A1 (en) 2017-05-18
DE112012005193T5 (en) 2014-09-04
US20170131724A1 (en) 2017-05-11
US10712749B2 (en) 2020-07-14
CN104040560A (en) 2014-09-10
CA2862213A1 (en) 2013-07-18
US8751043B2 (en) 2014-06-10
GB2511966B (en) 2017-05-10
US20200301443A1 (en) 2020-09-24
US20130184864A1 (en) 2013-07-18
GB201704304D0 (en) 2017-05-03
KR20140117519A (en) 2014-10-07
WO2013106135A1 (en) 2013-07-18
US9606542B2 (en) 2017-03-28
KR101599176B1 (en) 2016-03-02
JP2015505410A (en) 2015-02-19
GB2545134B (en) 2017-10-11
GB201410495D0 (en) 2014-07-30
US20130184865A1 (en) 2013-07-18
KR101912001B1 (en) 2018-10-25
CN104040560B (en) 2017-05-31
GB2511966A (en) 2014-09-17

Similar Documents

Publication Publication Date Title
US20200301443A1 (en) Discovery and monitoring of an environment using a plurality of robots
CN108507578B (en) Navigation method of robot
Choset Coverage for robotics–a survey of recent results
US11747825B2 (en) Autonomous map traversal with waypoint matching
EP1504277A2 (en) Real-time target tracking of an unpredictable target amid unknown obstacles
Kurazume et al. Automatic large-scale three dimensional modeling using cooperative multiple robots
Ravankar et al. A hybrid topological mapping and navigation method for large area robot mapping
O'Kane et al. Localization with limited sensing
Faigl et al. A sensor placement algorithm for a mobile robot inspection planning
CN111609853A (en) Three-dimensional map construction method, sweeping robot and electronic equipment
He et al. Camera-odometer calibration and fusion using graph based optimization
Choi et al. Cellular Communication-Based Autonomous UAV Navigation with Obstacle Avoidance for Unknown Indoor Environments.
O'Kane Global localization using odometry
AlDahak et al. Frontier-based exploration for unknown environments using incremental triangulation
Yamauchi et al. Magellan: An integrated adaptive architecture for mobile robotics
Lebedev et al. Method for inspecting high-voltage power lines using UAV based on the RRT algorithm
KR101297608B1 (en) Method and system for robot coverage of unknown environment
Cieslewski et al. Exploration without global consistency using local volume consolidation
Joshi et al. Simultaneous Navigator for Autonomous Identification and Localization Robot
US20230277027A1 (en) System and method of minimum turn coverage of arbitrary non-convex regions
Pannu Path Traversal Around Obstacles by a Robot Using Terrain Marks for Guidance

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20221127