WO2019104133A1 - Map-based framework for the integration of robots and smart devices

Map-based framework for the integration of robots and smart devices

Info

Publication number
WO2019104133A1
WO2019104133A1 (PCT/US2018/062196)
Authority
WO
WIPO (PCT)
Prior art keywords: data, robots, robot, devices, map
Prior art date
Application number
PCT/US2018/062196
Other languages
French (fr)
Inventor
Gregory P. Scott
Karoline P. PERSHELL
Original Assignee
Service Robotics & Technologies, Inc.
Priority date
Filing date
Publication date
Application filed by Service Robotics & Technologies, Inc. filed Critical Service Robotics & Technologies, Inc.
Priority to US16/763,710 priority Critical patent/US10960548B2/en
Priority to EP18882221.7A priority patent/EP3713720A4/en
Publication of WO2019104133A1 publication Critical patent/WO2019104133A1/en
Priority to US17/204,176 priority patent/US20210221001A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/04Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423Input/output
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/023Optical sensing devices including video camera means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N35/00Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N35/00584Control arrangements for automatic analysers
    • G01N35/00722Communications; Identification
    • G01N35/00871Communications between instruments or with remote terminals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B19/4185Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by the network communication
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00Testing or monitoring of control systems or parts thereof
    • G05B23/02Electric testing or monitoring
    • G05B23/0205Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
    • G05B23/0208Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterized by the configuration of the monitoring system
    • G05B23/0216Human interface functionality, e.g. monitoring system providing help to the user in the selection of tests or in its configuration
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D1/0027Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement involving a plurality of vehicles, e.g. fleet or convoy travelling
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D1/0044Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0291Fleet control
    • G05D1/0297Fleet control by controlling means in a control room
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N35/00Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N35/00584Control arrangements for automatic analysers
    • G01N35/00722Communications; Identification
    • G01N35/00871Communications between instruments or with remote terminals
    • G01N2035/00881Communications between instruments or with remote terminals network configurations
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/21Pc I-O input output
    • G05B2219/21116Universal cabling; control interface between processor and devices
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • Embodiments of the subject matter disclosed herein generally relate to methods and systems for allowing users to monitor, control and report on a deployed fleet of robots and smart devices.
  • the industrial sector (e.g., auto manufacturing) employs robotics integration companies to build customized software to control robots from multiple manufacturers, often at the cost of millions of dollars.
  • Ford Motor Company may hire an integration company to develop software to coordinate activities of a dozen welding arms and conveyers from disparate manufacturers into a single assembly line.
  • in the service sector (e.g., hospitality, custodial, medical), which does not use robotics to perform repetitive actions on a large scale, customized software is cost prohibitive.
  • the service sector has not been able to take advantage of the growing variety of service robotics systems as anything more than single robot deployments.
  • the custodial sector is rife with robots from disparate manufacturers, as well as custodial managers without the expertise to employ multiple, specialized robotics systems.
  • the service robotics industry was valued at $3.77 billion with more than 24,000 units sold for professional use.
  • These systems include (but are not limited to) large vacuuming and scrubbing robots from companies like Intellibot, Avidbot and CleanFix; small vacuuming and mopping robots from iRobot and Neato; environmental monitoring robots from Ecovacs;
  • a central controller for robotics and connected devices includes a first communication interface configured to receive data from a plurality of robots or connected devices, at least some of which plurality of robots or connected devices are of different types.
  • the data generated by robots or connected devices of different types are generated in different native data formats.
  • a processor is configured to translate the received data from the different native data formats into a common protocol format.
  • a storage framework is configured to store the data translated into the common protocol format.
  • a second communication interface is configured to transmit commands based on data stored in the common protocol format and translated to the native data format of one or more of the plurality of robots or connected devices.
  • a method for controlling robots or connected devices of different types includes the steps of receiving data from a plurality of robots or connected devices, at least some of which plurality of robots or connected devices are of different types; wherein data generated by robots or connected devices of different types are generated in different native data formats; translating said received data from the different native data formats into a common protocol format; storing the data translated into the common protocol format; and transmitting commands based on data stored in the common protocol format and translated to the native data format of one or more of the plurality of robots or connected devices.
  • a robotics deployment of a plurality of robots and connected devices system includes a first robot having a first type of sensor configured to detect a parameter and a second robot having a second type of sensor configured to detect said parameter, wherein said first type of sensor is different than said second type of sensor.
  • the deployment also includes a central controller which receives data associated with said detected parameter from both said first robot and said second robot and which is further configured to translate said data received from said first robot from a first native data format into a common central controller protocol and to translate said data received from said second robot from a second native data format into said common central controller protocol, wherein said first native data format is different than said second native data format.
  • a multi-device control interface for controlling a plurality of robots and connected devices, comprising a user interface, includes at least one display window for each of the plurality of devices, the at least one device display feature illustrating one or more conditions of a respective one of the plurality of devices; at least one device command window for each of the plurality of devices, the at least one device command feature configured to receive one or more commands for sending to the respective one of the plurality of devices; a multi- device common window comprised of a fusion of information representing a collaborative workspace between at least one user and more than one concurrently operating device of the plurality of devices, the collaborative workspace including map information added by the at least one user and map information received from the more than one concurrently operating robots of the plurality of devices and configured for presenting the fusion of information as a coherent picture of an emerging map of an environment of the plurality of devices, including overlay of any data collected by the plurality of devices for display on the emerging map of an environment; and a multi-device common window comprised of
  • a robotics system comprises a first robot having a first type of humidity sensor as one of its perceptors, a second robot having a second type of humidity sensor, and a third type of humidity sensor that is not built into a robot, but is a stand-alone sensor mounted to a wall, wherein the first type of humidity sensor is different than the second type of humidity sensor and both are different from the third type of humidity sensor.
  • the robotics system further comprises a central controller (either cloud-based or locally deployed) which receives humidity data from both the first and second robots and the third stand-alone sensor via first, second and third application programming interfaces (API), respectively.
  • the first API translates the humidity data received from the first robot from a first native data format into a common central controller protocol.
  • the second API translates the humidity data received from the second robot from a second native data format into the common central controller protocol.
  • the third API translates the humidity data received from the third stand-alone sensor from a third native data format into the common central controller protocol. Data from both robots and the stand-alone sensor are stored in the common central controller protocol format.
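  • As a rough illustration only, the per-device translation described above might look like the following sketch; the class name HumiditySample, its fields, and the wrapper functions are invented for this example and are not taken from the patent or from any real device API:

    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class HumiditySample:
        """Common central controller record for a humidity reading (hypothetical schema)."""
        device_id: str
        relative_humidity_pct: float   # always stored as percent, one decimal place
        calibration_offset: float      # 0.0 if the device has no onboard calibration
        timestamp: str                 # ISO-8601, UTC

    def wrap_robot1(raw: dict) -> HumiditySample:
        # Robot 1's hypothetical native format: {"hum": 0.354} as a 0..1 fraction, no calibration.
        return HumiditySample("robot-1", round(raw["hum"] * 100.0, 1), 0.0,
                              datetime.now(timezone.utc).isoformat())

    def wrap_robot2(raw: dict) -> HumiditySample:
        # Robot 2's hypothetical native format: {"rh": 35, "cal": -0.8} in percent, with calibration.
        return HumiditySample("robot-2", round(raw["rh"] + raw["cal"], 1), raw["cal"],
                              datetime.now(timezone.utc).isoformat())

    def wrap_wall_sensor(raw: str) -> HumiditySample:
        # The stand-alone wall sensor is assumed to report a plain text line such as "RH=35.4%".
        value = float(raw.split("=")[1].rstrip("%"))
        return HumiditySample("wall-sensor-1", value, 0.0,
                              datetime.now(timezone.utc).isoformat())
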
  • FIG. 1 depicts an exemplary top view of a facility wherein a plurality of robots and connected devices or sensors have been deployed;
  • FIG. 2 is a block diagram illustrating various structural elements associated with a robot
  • FIG. 3 is a block diagram illustrating various structural elements associated with a central controller for robots and devices or sensors;
  • FIG. 4 depicts an integrated system of connected devices according to an embodiment
  • FIGS. 5A and 5B depict an integrated system of connected devices from a data perspective according to an embodiment
  • FIGS. 6A-6C depict a relational database according to an embodiment as shown in FIGS. 5A and 5B in more detail.
  • FIG. 7 illustrates a facility map with a representation of devices deployed within that facility.
  • a map-based software and hardware framework is provided that can quickly and easily link the smart hardware at a work site (e.g., vacuuming robots and mold sensors) and provide a dashboard that allows users (e.g., custodial managers) to task and monitor all connected hardware in their facility.
  • this embodiment is described in the context of custodial services, those skilled in the art will appreciate that the concepts associated with this embodiment are broadly applicable and can be applied to a wide array of other service sectors.
  • This software framework serves as a universal receptor for service robotics systems and smart devices (also referred to herein as “connected devices”, which can be mounted in a fixed, stationary position or mounted to a robot). This framework takes in data from each individual robot or sensor, shares key
  • the central controller will be flexible enough to integrate any connected device, such as smart sensors (e.g., temperature sensors and smoke detectors) or other smart dispensers (e.g., bathroom soap and paper towel dispensers). All components will be part of the same working ecosystem that can be controlled and monitored from a single dashboard.
  • Figure 1 illustrates a top-down view of an operating area 100 for an integrated robotics and smart sensor system.
  • three different robots R 102, 104 and 106 are capable of moving about the operating area 100 where the central controller 108 calculates and transmits location, schedule and action to be taken.
  • the operating area 100 also includes a plurality of smart sensors S 112-116 which can also be homogeneous or heterogeneous.
  • the robots 102-106 and smart sensors 112-116 can be the same type of robot or smart sensor, however in order to illustrate the manner in which robotic systems according to these embodiments are able to integrate different types of devices, different types of robots are used in the illustrative examples herein.
  • the other lines and shapes illustrated in Figure 1 represent walls and other obstacles disposed within operating area 100.
  • the robots 102-106 can perform operations which, for example, might otherwise be performed by humans. In the context of custodial services, this might include vacuuming, for example.
  • An exemplary (but non-limiting) high level architecture of a robot 102 is shown in Figure 2.
  • the robot 102 may include, for example, a controller 200 including a system bus 202 which communicatively couples the controller 200 to: (1) one or more communication devices 204 which enable communications with other devices via communications channels, (2) one or more perceptors 206, (3) one or more manipulators 208, and (4) one or more locomotors 210.
  • the communication channels 206 may be adaptable to both wired and wireless communication, as well as supporting various communication protocols.
  • the communication channels 206 may be configured as serial and/or parallel communication channels, such as, for example, USB, IEEE-1394, 802.11, BLE, cellular (e.g., LTE or 5G), and other wired and wireless communication protocols. If wireless communication channels are used, then the communication devices 204 will include a wireless transceiver and antenna (not shown in Figure 2).
  • the perceptors 206 may, for example, include any number of different sensors such as: optical sensors, inertial sensors (e.g., gyroscopes, accelerometers, etc.), thermal sensors, tactile sensors, compasses, range sensors, sonar, Global Positioning System (GPS), Ground Penetrating Radar (GPR), lasers for object detection and range sensing, imaging devices, magnetometers and the like.
  • a perceptor could also be any other existing sensor within a deployment, that would otherwise be static, but could be mounted onto a robot to get the same data distributed across a facility, instead of from a single location (e.g., temperature or humidity sensors).
  • sensors may include both a source and a sensor to combine sensor inputs into meaningful, actionable perceptions.
  • sonar perceptors and GPR may generate sound waves or sub-sonic waves and sense reflected waves.
  • perceptors including lasers may include sensors configured for detecting reflected waves from the lasers for determining interruptions or phase shifts in the laser beam.
  • Imaging devices may be any suitable device for capturing images, such as, for example, an infrared imager, a video camera, a still camera, a digital camera, a Complementary Metal Oxide Semiconductor (CMOS) imaging device, a charge coupled device (CCD) imager, and the like.
  • the imaging device may include optical devices for modifying the image to be captured, such as: lenses, collimators, filters, and mirrors.
  • a robot 102 may also include pan and tilt mechanisms coupled to the imaging device.
  • the manipulators 208 may include, for example, vacuum devices, magnetic pickup devices, arm manipulators, scoops, grippers, camera pan and tilt manipulators, individual or coupled actuators, and the like.
  • the locomotors 210 may include, for example, one or more wheels, tracks, legs, rollers, propellers, and the like. For providing the locomotive power and steering capabilities, the locomotors 210 may be driven by motors, actuators, levers, relays and the like.
  • perceptors 206 may be configured in conjunction with the manipulators 208 or locomotors 210, such as, for example, odometers and pedometers.
  • Smart sensors 110-116 will typically have a subset of the hardware elements associated with robots 102-106.
  • smart sensors 110-116 may not typically include manipulators 208 and locomotors 210, but may include a controller 200, one or more perceptors 206 and one or more communication devices 204.
  • the central controller 108 for controlling the robots 102-106 and smart sensors 110-116 will also include various hardware elements as shown in Figure 3.
  • the central controller 108 is where device data is stored and system decisions are made. More specifically, decisions, artificial intelligence, machine learning, data analytics, and basic deduction are performed in the central controller 108 to pull together all the data from all the connected devices, make sense of it all, and recommend or automate actions based on that data.
  • the central controller 108 can be located in the cloud, locally deployed in or near to the operating area or a hybrid system including both.
  • central controller 108 includes one or more processors 300, one or more memory devices suitable for storing data, software or application code, one or more communication devices 304, including either wired or wireless communication capabilities, one or more input/output (I/O) devices 306 and a communication bus 308 to interconnect these various hardware elements.
  • embodiments described herein provide for integrated communication and control of disparate robots 102-106, smart sensors 110-116, and other devices by providing a central controller 108 which takes in data from each individual robot and sensor, shares that information across the fleet, and controls their movements and operations.
  • A high-level example of such a system 400 is illustrated in Figure 4.
  • each piece of hardware (i.e., robot or sensor), despite having disparate communication protocols as well as disparate idiosyncrasies with respect to other hardware-related parameters, can be plugged into the central controller 108, so all devices are integrated, sharing the same data, dashboard 408, and facility map 410 of the operating area 100, where the dashboard 408 and facility map 410 are non-limiting examples of possible I/O functions that consume the data stored by the central controller 108 in the common central controller protocol.
  • the API wrappers 404 both translate the data being provided by the robot 102-106 or smart sensor 110-116 from its native, proprietary protocol or format into a common central controller protocol and, when needed, adjust the data values themselves to account for the unique hardware idiosyncrasies mentioned above to provide consistent data across the fleet. For example, if robot 102 deploys a first type of humidity sensor as one of its perceptor(s) 206 and robot 104 deploys a second, different type of humidity sensor (e.g., different manufacturer with the same humidity sensing technology, or just different humidity sensing technology), then the API wrappers 404 associated with robot 102 and robot 104 could include
  • the common central controller protocol can, for example, take the following exemplary format (using, purely as an example, a common central controller protocol data unit capturing information about the cleaning distance traveled by a robotic vacuum cleaner or other robotic cleaning device):
  • the id value is a numerical identifier for this particular data element in the protocol
  • the type_name value is a short textual name for the data element
  • the datatype value is one of a predetermined set of value types that the stored value for this particular data element will take
  • the data_units value is the measurement used to capture the collected data
  • the display_name value is a label for the data element that can be displayed to the user
  • a common central controller protocol needs to have a library of data elements which cover the gamut of different values which can be received from all of the different smart sensors or robots in a particular system, as well as commands which need to be transmitted to the smart sensors and robots.
  • An example of such a common central controller protocol is attached hereto as Appendix A. Note that the data element exemplified above can be found in Appendix A in the row numbered based on the id value, i.e., 60121.
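  • Although Appendix A itself is not reproduced here, a single data element such as the one referenced above (id 60121, capturing cleaning distance traveled) might be declared roughly as follows; the concrete units, names and values shown are assumptions made for illustration, not the contents of the appendix:

    # Hypothetical rendering of one common-protocol data element (fields per the text above).
    CLEANING_DISTANCE_ELEMENT = {
        "id": 60121,                       # numerical identifier of this data element
        "type_name": "cleaning_distance",  # short textual name
        "datatype": "float",               # one of the predetermined set of value types
        "data_units": "meters",            # measurement used to capture the collected data (assumed)
        "display_name": "Cleaning distance traveled",  # label shown to the user
    }

    # A concrete data point transmitted by a robotic vacuum would then reference the element id:
    sample_point = {"element_id": 60121, "device_id": "robot-1", "value": 182.4}
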
  • a facility has a humidity device 1 which is a static device mounted to a wall at a specific (x,y,z) location on the facility map.
  • Humidity device 1 has a single sensor and returns values in percent with one significant digit and has no onboard calibration to adapt to sensor deterioration over time (ex: 35.4%, 0 calibration).
  • the facility also includes a humidity device 2, mounted to the back of a robot that drives around the facility (this robot may also vacuum, or make deliveries within the building as its primary function), which makes it a mobile humidity monitoring device.
  • the system 400 stores the raw humidity values and existing calibration values in the relational database 502, in order to make sure sensors are providing proper information, and also adds its own calibration value to sensors that do not have them onboard, to ensure the most accurate readings.
  • cross-device calibration can be conducted by the central controller using machine learning or data analytics algorithms.
  • the central controller recognizes that the first humidity sensor has no ability to perform onboard calibrations, but can decide whether or not the calibration values on the mobile humidity sensor need to be updated remotely, and can send those updates directly to the device as required.
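  • One very simple way to derive such a server-side calibration value, far simpler than the machine learning and data analytics approaches contemplated above and shown purely as a sketch, is to compare readings that the uncalibrated fixed sensor and the calibrated mobile sensor take at roughly the same place and time; the function below and its inputs are hypothetical:

    from statistics import mean

    def derive_calibration_offset(fixed_readings, mobile_readings):
        """Estimate a calibration offset for a sensor with no onboard calibration.

        fixed_readings  : humidity percentages from the wall-mounted device
        mobile_readings : humidity percentages from the calibrated mobile device,
                          taken while the robot was near the wall-mounted device
        Returns the offset to add to the fixed sensor's raw values.
        """
        if not fixed_readings or not mobile_readings:
            return 0.0
        return round(mean(mobile_readings) - mean(fixed_readings), 2)

    # Example: the wall sensor reads consistently ~1.3% low compared to the mobile reference.
    offset = derive_calibration_offset([34.0, 34.3, 34.1], [35.3, 35.4, 35.5])  # -> 1.27
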
  • the system 400 can be implemented as a hybrid software architecture split between the cloud and a local server.
  • the local server controls all robots and actions that require near-realtime control (e.g., object localization, robot navigation, emergency device shutdowns, building alarms).
  • the cloud server stores and processes all non-realtime data and information from all devices, where it is accessed for back-end analytics, data display on a dashboard, data reports, or schedule planning.
  • Figures 5A and 5B show the arrangement of various aspects of the system 400’s architecture according to an embodiment, e.g., which components communicate with each other and which components are located locally or on the cloud.
  • the relational database 502 is an ever-evolving data management structure that stores all data collected by all devices in all deployments.
  • the relational database can be set up in the cloud, and serves as an easily referenced filing cabinet for device data. This allows users to sort data as needed, such as by facility where the devices are deployed (e.g., what a custodial manager would want to know); or by device, meaning you look at all make and model of a specific type of hardware (e.g., what a hardware manufacturer may want to know).
  • This organization-centric structure requires that all devices (captured in the org_device table 600) be of a specific type (specific manufacturers and model numbers, denoted in the device table 602) and assigned to a specific organization (organization table 604).
  • the device table 602 captures the generic data for each specific type of device (i.e., brand and model of a sensor, dispenser or robot).
  • Only existing types of devices from the device table 602 can be deployed to a facility.
  • any type of smart device can be included in an organization as many times as desired, by pointing to this generic information of the device stored in the device table 602.
  • Each generic device is then assigned device-specific information (e.g., serial number, IP address, firmware version), and thus is made unique within the database at the org_device table 600.
  • Each unique device in the relational database is allocated to one organization, using the organization’s unique identifier.
  • This framework allows for plug-and-play components, meaning the addition or subtraction of devices from an organization will not require software development, but rather just a database update which assigns that unique device to a specific organization. Since each device is also facility-specific, meaning data is organized based on who the recipient of the data is, this structure allows for administrators to review data across clients (to ask questions like, “What is the average charging time of all Roomba 960’s?”), but eliminates the possibility of clients seeing data that is not their own.
  • each individual instance of these org_device entries populates the subordinate tables with the data collected by them. For example, a complete history of this data is collected in the org_device_data_history table 606, where time-based reports can be built and a comparison between data across the lifetime of a device can be evaluated. The current status of all devices is stored in the org_device_data_status table 608. A history of all position data, covering where all devices are either installed or where they have moved through, is stored in the org_device_data_position table 610. These positions allow for the specific devices to be displayed at the correct locations on the map screen of the UI. Additionally, a collection of specific information that is evaluated from the collected data is stored in the org_device_data_card table 612, for display on the fleet overview screen of the UI.
  • These tables represent the core components of the software, but additional database tables are built and, in some cases, data in these tables is duplicated from other tables, in order to provide fast calculations so that the machine learning and big data analytics algorithms can be optimized.
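  • A stripped-down sketch of how a subset of these tables could be declared is shown below; the SQL column names beyond those mentioned in the text, the data types, and the use of SQLite are all assumptions for illustration rather than the actual production schema:

    import sqlite3

    # Illustrative subset of the relational schema described above.
    # org_device_data_status and org_device_data_card are omitted for brevity.
    SCHEMA = """
    CREATE TABLE organization (org_id INTEGER PRIMARY KEY, name TEXT, map_uri TEXT);
    CREATE TABLE device (device_id INTEGER PRIMARY KEY, manufacturer TEXT, model TEXT);
    CREATE TABLE org_device (
        org_device_id INTEGER PRIMARY KEY,
        org_id        INTEGER REFERENCES organization(org_id),
        device_id     INTEGER REFERENCES device(device_id),
        serial_number TEXT, ip_address TEXT, firmware_version TEXT
    );
    CREATE TABLE org_device_data_position (
        org_device_id INTEGER REFERENCES org_device(org_device_id),
        ts TEXT, x REAL, y REAL, z REAL, roll REAL, pitch REAL, yaw REAL
    );
    CREATE TABLE org_device_data_history (
        org_device_id INTEGER REFERENCES org_device(org_device_id),
        ts TEXT, element_id INTEGER, value TEXT
    );
    """

    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
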
  • the dashboard 408 can include a map UI screen 700 as shown in Figure 7, which uses the org_device_data_position table to show the current (or past) positions of the mobile robots and static devices in any particular organization.
  • UI screens can, for example, include a drag-and-drop map of the facility, the current location of all deployed robots, a library of all robots waiting to be deployed (as seen in Figure 7), and a fleet status overview and scheduling screen.
  • the org_device_data_position table 610 is dedicated to capturing all device position data as Cartesian coordinates (x, y, z) and body rotation (roll, pitch, yaw), where the planar map only considers x and y coordinates and the rotation denoted by yaw. These position coordinates are in relation to the specific facility map and displayed with respect to the origin of that map. Stationary devices (e.g., environmental monitoring sensors) have static coordinates, whereas robots have dynamic coordinates, yet all devices store their coordinates in the org_device_data_position table 610.
  • the system 400 pulls data from these position fields via a back-end API for all devices assigned to that facility, interpolates the data in between the position intervals stored on the table, and displays a moving trajectory (for dynamic devices, e.g., robots).
  • the position data displayed on the dashboard is approximately 5 seconds delayed due to latency in the cloud transmission and data processing layer, but optimizations in data management and database architecture are constantly evolving to improve system efficiency.
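  • The interpolation step mentioned above could be as simple as linear interpolation between the two stored position rows that bracket the requested display time; the sketch below assumes a (timestamp, x, y, yaw) row layout and ignores angle wrap-around, which a real implementation would need to handle:

    def interpolate_position(p0, p1, t):
        """Linearly interpolate between two stored (ts, x, y, yaw) rows.

        p0, p1 : tuples (ts, x, y, yaw) with ts in seconds since epoch, ts0 <= t <= ts1
        t      : display time in seconds since epoch
        """
        ts0, x0, y0, yaw0 = p0
        ts1, x1, y1, yaw1 = p1
        if ts1 == ts0:
            return x0, y0, yaw0
        a = (t - ts0) / (ts1 - ts0)
        # Note: yaw interpolation here ignores wrap-around at +/- pi for brevity.
        return (x0 + a * (x1 - x0), y0 + a * (y1 - y0), yaw0 + a * (yaw1 - yaw0))

    # Example: robot moved from (2.0, 1.0) to (4.0, 1.0) between t=100s and t=110s.
    print(interpolate_position((100, 2.0, 1.0, 0.0), (110, 4.0, 1.0, 0.0), 105))  # (3.0, 1.0, 0.0)
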
  • a collection of “cards” is displayed as a device fleet overview, one card for each deployed device.
  • the thumbnail images of the devices, names, general locations, and active tasks being worked are displayed. Additionally, battery life or quantity of product remaining is displayed as a percentage in graphical form, and connection and error reporting status are also reported.
  • Data displayed on these cards is updated at every instance that data is received from those devices. For different devices, data is transmitted at different rates. For example, environmental monitoring sensors may report temperature or humidity every 1 minute, while moving robot position data may be reported every 1 second.
  • the most recent data point from each device is stored in the org_device_data_status table 608, while all previous data from each device is stored in the org_device_data_history table 606. Both can be referenced at any time to build reports and provide data analytics across devices within their facility.
  • Embodiments also provide for a web service which allows the user interface to query and update the database 502, and to pull data as required for display and reporting.
  • data includes information about the organization, the map of the facility, the devices deployed within a facility, and actionable data the devices provide to the facility manager.
  • Although the database illustrated in Figures 6A-6C is a relational database, those skilled in the art will appreciate that these embodiments can be implemented with other types of databases or data storage structures.
  • the write endpoint of the API, which allows the user to update the current location of a device, appends a new row to the org_device_data_position table with the updated position for that specific device.
  • when the read endpoint pulls a device's position data into the user interface, the API returns the most recent position of that device, allowing the new location to be properly displayed on the dashboard 408.
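  • A minimal sketch of those two endpoints is shown below, using Flask purely as an example web framework; the route paths, payload fields and in-memory storage are invented for illustration and do not represent the patent's actual API:

    from flask import Flask, jsonify, request

    app = Flask(__name__)
    positions = {}  # org_device_id -> list of position rows (stand-in for the database table)

    @app.route("/devices/<int:org_device_id>/position", methods=["POST"])
    def write_position(org_device_id):
        # Write endpoint: append a new row to org_device_data_position for this device.
        row = request.get_json()   # e.g. {"x": 3.0, "y": 1.5, "yaw": 0.7, "ts": 100.0}
        positions.setdefault(org_device_id, []).append(row)
        return jsonify({"status": "ok"}), 201

    @app.route("/devices/<int:org_device_id>/position", methods=["GET"])
    def read_position(org_device_id):
        # Read endpoint: return only the most recent position for display on the dashboard.
        rows = positions.get(org_device_id)
        if not rows:
            return jsonify({"error": "no position data"}), 404
        return jsonify(max(rows, key=lambda r: r["ts"]))
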
  • the robotic system embodiments described herein provide for a number of advantages and/or benefits including, but not limited to, enabling robotic system implementations which: create a scalable system to deploy numerous versions of the same robot across multiple facilities with no additional programming requirements (plug and play); incorporate hardware from disparate manufacturers into the same system; combine robots and smart devices into the same monitoring and control framework; integrate data from each device to our cloud database in a common communication protocol so any single data point can be used as required across all devices, such as updating maps that are used by multiple devices; utilize the same localization and navigation algorithms on different mobile devices with different platform kinematics for individual and co-robot navigation; autonomously path plan a robot’s trajectory from one point on the map to another; display on a user interface the data received about the health and sensor data of a diverse collection of devices; allow a Facility Manager to send commands to any deployed robot to stop its task and return to its charging station (and later to any point on the map to complete a task); display the locations of all devices within a facility on a robot
  • robotic systems include a method for building and uploading a map of any facility to the cloud system, utilizing the Robot Operating System (ROS) framework (e.g., as described at http://www.ros.org).
  • embodiments employ a map-building robot which constructs a 2-dimensional map and 3-dimensional map using lidar, accurate robot odometry, stereo cameras, accelerometers, a variety of other onboard sensors, and an open source ROS package (e.g. gmapping).
  • a variety of files and file structures are created during this process.
  • two files are created to store a specific facility’s map: a .PGM file, which is a grayscale probability grid referencing the likelihood of an obstacle being located at each point within that grid; and a .YAML file, which stores the origin and resolution data for that specific map.
  • the map is stored on a local server within the facility where the devices are deployed.
  • the map is also uploaded to an encrypted file store on the cloud from the local server when the local server is booted up, and referenced in the organization table, stored in map_uri.
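  • For reference, the .YAML descriptor of a ROS-style map records the image filename, resolution and origin of the grid; loading such a map pair might look like the sketch below, which assumes PyYAML and Pillow are available and is not prescribed by the patent:

    import yaml                      # PyYAML
    from PIL import Image            # Pillow can read binary .PGM grids

    def load_facility_map(yaml_path):
        """Load a ROS-style map: a .YAML descriptor plus the grayscale .PGM probability grid."""
        with open(yaml_path) as f:
            meta = yaml.safe_load(f)  # typically: image, resolution, origin, occupied_thresh, ...
        # The image path is usually relative to the YAML file; resolve it accordingly if needed.
        grid = Image.open(meta["image"])
        return meta, grid

    meta, grid = load_facility_map("facility_map.yaml")
    print(meta["resolution"], meta["origin"], grid.size)  # e.g. 0.05 m/cell, [x, y, yaw], (w, h)
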
  • the system 400 utilizes a hybrid architecture for all mapping functions.
  • the locally stored maps are used within the ROS installation to localize and navigate locally deployed robots to that map.
  • Embodiments can also use cloud-based storage and processing systems (e.g., Amazon Web Services, Microsoft Azure) to store map files that are used for visualization of the localized robot positions on the user interface. Both maps are synchronized to ensure that the robots’ positions on the ROS map are overlaid on the same map for the user display.
  • the selected map is noted in the organization table of the relational database as the link stored in map_uri, and used by both the local ROS server and the cloud-based UI display.
  • Embodiments further provide for a localization algorithm that allows a research robot to localize itself on an existing facility map.
  • This localization algorithm is applicable to a variety of robots, in addition to the mapping robot.
  • a hybrid data management system has been developed: all non-realtime data (such as device sensor data, where extended processing and storage time is acceptable) is processed on the cloud through the above-defined relational database, while all real-time data (such as localization and navigation, where robot response time is critical) is processed on a local server on the same network as the robots.
  • robots localize on the version of the map stored locally, while each robot’s location is displayed on the user interface through the cloud-based map, at a few seconds delay.
  • localization can be performed onboard the robot itself in order to improve system response time.
  • Embodiments can also employ beaconing hardware that gives off a wireless signal (e.g., Bluetooth, RFID, visual cues), the strength of which can be assessed by the robot to improve its localization.
  • Bluetooth beacons are used to track traffic patterns of personnel movement across a building, specifically which hallways, rooms, or areas are receiving no traffic, less traffic than average, or more traffic than average.
  • Systems designed in accordance with these embodiments take this traffic pattern analysis and determine, for example, which hallways have received significant traffic compared to those that have received limited traffic. Central algorithms then analyze the traffic patterns to determine which areas of the building need cleaning or maintenance more or less than on a normal day of operations.
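  • In its simplest form, the traffic comparison described above could amount to ranking each hallway or room by its beacon hit count against that zone's running average; the thresholds and labels in the sketch below are arbitrary illustrations:

    def classify_traffic(today_counts, average_counts, high=1.5, low=0.5):
        """Classify each hallway/room by today's beacon hits relative to its average."""
        labels = {}
        for zone, avg in average_counts.items():
            hits = today_counts.get(zone, 0)
            if hits == 0:
                labels[zone] = "no traffic"
            elif avg and hits > high * avg:
                labels[zone] = "clean first"        # significantly more traffic than usual
            elif avg and hits < low * avg:
                labels[zone] = "cleaning optional"  # significantly less traffic than usual
            else:
                labels[zone] = "normal schedule"
        return labels

    print(classify_traffic({"hall-A": 240, "hall-B": 0}, {"hall-A": 120, "hall-B": 80}))
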
  • a server-based map allows for different robots to access the same map at different times, allowing for only a single robot to access that map file at any one time.
  • multiple robots will share a map in realtime, usually requiring all robots to have similar onboard sensors and map-building capabilities, and utilizing a map format that is identical across all robots that use it.
  • Embodiments described herein include a method for synchronizing a server-based area map that can be utilized by robots of different footprints, from different manufacturers, utilizing different onboard mapping sensors, and storing map data in different formats.
  • This method utilizes a 3D representation of a mapped area, either created by a specific mapping robot or by any other robot in the system, as the baseline map for all other robots.
  • This 3D map is flattened to a 2D map based on the obstacles expected to obstruct the robot based on its height and footprint. That flattened map is then converted into the mapping format required by that specific robot and used onboard that robot to navigate an area. If that onboard map is updated by the robot, it is uploaded back to the server and integrated back into the global 3D map of that area.
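  • The flattening step can be pictured as projecting every occupied 3D cell at or below the robot's own height onto the 2D plane, since only those obstacles can obstruct that robot; the sketch below is a simplified illustration and ignores the resolution and format conversions the text also calls for:

    import numpy as np

    def flatten_map(occupancy_3d, cell_height_m, robot_height_m):
        """Collapse a 3D occupancy grid (z, y, x) into a 2D grid for one robot.

        A 2D cell is marked occupied if any 3D cell below the robot's height is occupied,
        since only those obstacles can actually obstruct this robot.
        """
        max_layer = min(occupancy_3d.shape[0], int(np.ceil(robot_height_m / cell_height_m)))
        return occupancy_3d[:max_layer].any(axis=0)

    # Example: a 1.2 m tall robot ignores a shelf edge that only exists above 1.5 m.
    grid = np.zeros((20, 10, 10), dtype=bool)   # 20 layers of 0.1 m -> 2.0 m tall map
    grid[16, 5, 5] = True                        # obstacle at 1.6-1.7 m height
    print(flatten_map(grid, 0.1, 1.2)[5, 5])     # False: not an obstacle for this robot
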
  • robotics systems use a single, standardized protocol, meaning that data from any device in the system can be uniformly stored, retrieved, analyzed, compared across other devices, and combined with data from other devices, regardless of the device manufacturers.
  • Cheap temperature sensors can be used to minimize the cost, but strategically placed sensors that jointly measure humidity and temperature and other air quality components should also be included within the facility.
  • the standardized data protocol (also referred to herein as the “common protocol format”)
  • facility-wide heat maps using temperature data from both brands of sensors can be made.
  • the standardized protocol means that longitudinal data (e.g., looking at the temp in this facility on this same date for each of the last five years to determine how the building may be leaking over time) can be immediately retrieved and processed from a cloud- based system.
  • algorithmic translations of the data can be provided that would, for example, involve triangulation of humidity values from different sensors across a facility map, so as to see where additional air flow would be required.
  • robotic systems can determine where peaks and troughs are for humidity levels (or temperature, or signal strength, or CO2, or any other sensor-tracked information), provide a "heat map" of this data plotted against the facility map, and then provide actionable information to the facility manager so that she or he is able to make changes in her facility to address these problems (or to instruct other outside systems, like HVAC, to adapt their behaviors).
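  • Such a facility-wide "heat map" can be approximated by interpolating the sparse point readings over the facility map grid, for example with inverse distance weighting; the patent does not commit to a particular interpolation method, so the following is only one possible sketch:

    import numpy as np

    def idw_heat_map(sensor_points, width, height, power=2.0):
        """Inverse-distance-weighted grid from sparse sensor readings.

        sensor_points : list of (x, y, value) in map grid coordinates
        Returns a (height, width) array of interpolated values.
        """
        ys, xs = np.mgrid[0:height, 0:width]
        num = np.zeros((height, width))
        den = np.zeros((height, width))
        for sx, sy, value in sensor_points:
            d2 = (xs - sx) ** 2 + (ys - sy) ** 2
            w = 1.0 / np.maximum(d2, 1e-6) ** (power / 2.0)
            num += w * value
            den += w
        return num / den

    humidity_map = idw_heat_map([(2, 3, 35.4), (8, 7, 48.0), (5, 1, 40.2)], width=10, height=10)
    print(humidity_map.round(1)[3, 2])   # ~35.4 next to the first sensor
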
  • data from one device can now be both a direct measure and/or a proxy measure for a recommended action.
  • That action can either be completed autonomously by sending commands to connected devices or can be sent as actionable information to the humans in the loop to take action.
  • the garbage can and paper towel dispenser in this example are not connected devices, but are correlated to the soap dispenser’s data.
  • Beacon technology can monitor traffic patterns of personnel movement through hallways of a connected facility. High traffic days can indicate when a deeper clean is required for that hallway. In the case of using a vacuuming robot, it would be commanded to make an additional pass of those high traffic hallways, or could be commanded that no vacuuming is required if no traffic was detected in that hallway.
  • Coupling traffic patterns with a connected outdoor weather station that measures local rainfall gives the system the intelligence to schedule a mopping robot instead of a vacuuming robot, due to the likelihood of mud being present that would not be effectively removed by a vacuuming robot.
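  • A toy version of that scheduling rule might look like the following; the function name, rainfall threshold and traffic input are invented for illustration:

    def choose_floor_robot(rainfall_mm_today, hallway_traffic_hits):
        """Pick a cleaning task for a hallway from rainfall and foot-traffic data."""
        if hallway_traffic_hits == 0:
            return "skip"                 # nobody walked here; no cleaning pass needed
        if rainfall_mm_today > 2.0:
            return "mopping robot"        # likely mud tracked in that a vacuum would smear
        return "vacuuming robot"

    print(choose_floor_robot(rainfall_mm_today=5.5, hallway_traffic_hits=240))  # mopping robot
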
  • Map-based software allows for asset tracking, which may include mobile tools or machinery, consumables, personnel, and/or static or dynamic devices.
  • Both scheduled goods and services (e.g., soap refills, or a planned wipe down of table tops) and autonomous, dynamic systems (like floor cleaning robots) are added to a facility maintenance plan. That is, smart static devices (e.g., bathroom paper towel dispensers) can send alerts when they are near empty, and the appropriate action can be quickly taken (e.g., go to Bathroom 2 and fill the dispenser), but autonomous dynamic systems need to be tracked spatially so that errors can be addressed (e.g., a door that was supposed to be open is closed, so the robot has identified the problem and is waiting for the door to be opened), or so that the device can be located quickly when it needs to be reallocated (e.g., the schedule shows the robot has already started to scrub the cafeteria, but since it was raining today, we will postpone the cafeteria and need to scrub the front hall instead).
  • a server-based map of an area wherein a single or plurality of robots are deployed can be translated from the server-based format to an individual robot’s native format, based on the map file’s format requirements and the dimensions of the robots, and transmitted onboard to each individual robot in that deployment for localization and navigation.
  • the server-based map can be updated by any robot in the deployment, by an individual robot’s natively formatted map being translated into the format of the server-based map within the footprint specifications of the robot’s onboard sensors that built the map and transmitted to the server for integration.
  • a deployment in such a facility could include, for example, the following smart devices:
  • VOC and CO monitor near a loading dock
  • CO2 monitors distributed across classrooms and high-volume rooms (cafeteria, auditorium); particulate matter counters near doorways and windows, as well as within shop/home economics classrooms, maintenance rooms, etc.; some environmental monitoring sensors attached to mobile robots for gathering distributed air quality data across non-static locations; and 10 leak detection sensors and standing water sensors.
  • the leak detection sensors may not detect a leak, so the central controller algorithms give highest priority to the water waste from the toilet, which may also lead to a flood in the bathroom; therefore the following actions are taken: the bathroom’s water shutoff valve is triggered, the floor mopping robot is hailed in case the leak becomes a flood, a text message is sent to the plumber on duty, and an email is sent to the facility manager notifying them of the problem.
  • tracking beacons detect a higher than expected number of people in the building late in the evening due to a scheduled meeting. Although the remaining toilet paper and soap in the nearest bathroom to this meeting could have waited until the next morning to be replaced under normal usage conditions, the increased usage rate is predicted by the number of people in the building and a preventative message is sent to the custodian on-duty with an estimate of when the consumables will run out, so they can be replaced before they are empty.
  • the delivery robot is already pre-loaded with consumables resupply originally scheduled for the morning, and is automatically sent to the bathroom ahead of the on-duty custodian.

Abstract

A central controller for robotics and connected devices includes a first communication interface configured to receive data from a plurality of robots or connected devices, at least some of which plurality of robots or connected devices are of different types. The data generated by robots or connected devices of different types are generated in different native data formats. The central controller also includes a processor configured to translate said received data from the different native data formats into a common protocol format, a storage framework configured to store the data translated into the common protocol format; and a second communication interface configured to transmit commands based on data stored in the common protocol format and translated to the native data format of one or more of the plurality of robots or connected devices.

Description

MAP-BASED FRAMEWORK FOR THE INTEGRATION
OF ROBOTS AND SMART DEVICES
RELATED APPLICATION
[0001] This application is related to, and claims priority from, U.S. Provisional Patent Application No. 62/589,089, entitled “MAP-BASED FRAMEWORK FOR THE INTEGRATION OF ROBOTS AND SMART DEVICES” to Gregory P. Scott and Karoline P. Pershell, the disclosure of which is incorporated here by reference.
TECHNICAL FIELD
[0002] Embodiments of the subject matter disclosed herein generally relate to methods and systems for allowing users to monitor, control and report on a deployed fleet of robots and smart devices.
BACKGROUND
[0003] There currently are no standardized software languages or communication protocols for robotics systems and smart devices, making it difficult for end-users to deploy a variety of robotics systems from different manufacturers. To address this, the industrial sector (e.g., auto manufacturing) employs robotics integration companies to build customized software to control robots from multiple manufacturers, often at the cost of millions of dollars. For example, Ford Motor Company may hire an integration company to develop software to coordinate activities of a dozen welding arms and conveyers from disparate manufacturers into a single assembly line.
[0004] In the service sector (e.g., hospitality, custodial, medical) which does not use robotics to perform repetitive actions on a large scale, customized software is cost prohibitive. As such, the service sector has not been able to take advantage of the growing variety of service robotics systems as anything more than single robot deployments. For example, consider the custodial sector: although an essential service, this sector is under constant pressure to reduce costs because janitorial services add no direct commercial value to an organization. In the U.S., industry cost for contract building cleaning services hit $65.4 billion in 2017 (including contract and personnel management, but neglecting in-house custodial staff). A mere 5% increase in efficiency would result in $3.2 billion in savings annually that could be re-invested into the companies.
[0005] Further, the custodial sector is rife with robots from disparate manufacturers, as well as custodial managers without the expertise to employ multiple, specialized robotics systems. In 2014, the service robotics industry was valued at $3.77 billion with more than 24,000 units sold for professional use. These systems include (but are not limited to) large vacuuming and scrubbing robots from companies like Intellibot, Avidbot and CleanFix; small vacuuming and mopping robots from iRobot and Neato; environmental monitoring robots from Ecovacs;
security robots from Knightscope and RoboteX. Additionally, there is a great variety of Internet of Things (IoT) sensors from hundreds of manufacturers, including sensors from Measurement Computing, Wicked Device, Monnit and Senseware; intelligent bathroom dispensers from Hagleitner, Tork and Purell; and tracking beacons from Estimote and BluVision to name only a handful of examples. Unfortunately, every one of these robots and many smart devices run on their own software and are not designed to share information across a common framework.
[0006] Accordingly, it would be desirable to provide an integrated facility management hardware portal that allows end-users to monitor, control, and report on a deployed fleet of robots and smart devices as they transmit data about themselves and their environment, in a form consistent with the service industry.
SUMMARY
[0007] According to an embodiment, a central controller for robotics and connected devices includes a first communication interface configured to receive data from a plurality of robots or connected devices, at least some of which plurality of robots or connected devices are of different types. The data generated by robots or connected devices of different types are generated in different native data formats.
A processor is configured to translate the received data from the different native data formats into a common protocol format. A storage framework is configured to store the data translated into the common protocol format. A second communication interface is configured to transmit commands based on data stored in the common protocol format and translated to the native data format of one or more of the plurality of robots or connected devices.
[0008] According to another embodiment, a method for controlling robots or connected devices of different types includes the steps of receiving data from a plurality of robots or connected devices, at least some of which plurality of robots or connected devices are of different types; wherein data generated by robots or connected devices of different types are generated in different native data formats; translating said received data from the different native data formats into a common protocol format; storing the data translated into the common protocol format; and transmitting commands based on data stored in the common protocol format and translated to the native data format of one or more of the plurality of robots or connected devices.
[0009] According to another embodiment, a robotics deployment of a plurality of robots and connected devices system includes a first robot having a first type of sensor configured to detect a parameter and a second robot having a second type of sensor configured to detect said parameter, wherein said first type of sensor is different than said second type of sensor. The deployment also includes a central controller which receives data associated with said detected parameter from both said first robot and said second robot and which is further configured to translate said data received from said first robot from a first native data format into a common central controller protocol and to translate said data received from said second robot from a second native data format into said common central controller protocol, wherein said first native data format is different than said second native data format.
[0010] According to another embodiment, a multi-device control interface for controlling a plurality of robots and connected devices, comprising a user interface, includes at least one display window for each of the plurality of devices, the at least one device display feature illustrating one or more conditions of a respective one of the plurality of devices; at least one device command window for each of the plurality of devices, the at least one device command feature configured to receive one or more commands for sending to the respective one of the plurality of devices; a multi-device common window comprised of a fusion of information representing a collaborative workspace between at least one user and more than one concurrently operating device of the plurality of devices, the collaborative workspace including map information added by the at least one user and map information received from the more than one concurrently operating robots of the plurality of devices and configured for presenting the fusion of information as a coherent picture of an emerging map of an environment of the plurality of devices, including overlay of any data collected by the plurality of devices for display on the emerging map of an environment; and a multi-device common window comprised of a fusion of information representing information and insights captured from across more than one concurrently operating device of the plurality of devices through statistical analysis, machine learning, and/or big data analytic techniques, and displayed in a manner most appropriate for the information collected, such as through raw numbers, tables, charts, overlays to maps.
[0011] According to an embodiment, a robotics system comprises a first robot having a first type of humidity sensor as one of its perceptors, a second robot having a second type of humidity sensor, and a third type of humidity sensor that is not built into a robot, but is a stand-alone sensor mounted to a wall, wherein the first type of humidity sensor is different than the second type of humidity sensor and both are different from the third type of humidity sensor. The robotics system further comprises a central controller (either cloud-based or locally deployed) which receives humidity data from both the first and second robots and the third stand-alone sensor via first, second and third application programming interfaces (API), respectively. The first API translates the humidity data received from the first robot from a first native data format into a common central controller protocol. The second API translates the humidity data received from the second robot from a second native data format into the common central controller protocol. The third API translates the humidity data received from the third stand-alone sensor from a third native data format into the common central controller protocol. Data from both robots and the stand-alone sensor are stored in the common central controller protocol format. BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate one or more embodiments and, together with the description, explain these embodiments. In the drawings:
[0013] FIG. 1 depicts an exemplary top view of a facility wherein a plurality of robots and connected devices or sensors have been deployed;
[0014] FIG. 2 is a block diagram illustrating various structural elements associated with a robot;
[0015] FIG. 3 is a block diagram illustrating various structural elements associated with a central controller for robots and devices or sensors;
[0016] FIG. 4 depicts an integrated system of connected devices according to an embodiment;
[0017] FIGS. 5A and 5B depict an integrated system of connected devices from a data perspective according to an embodiment;
[0018] FIGS. 6A-6C depict a relational database according to an embodiment as shown in FIGS. 5A and 5B in more detail; and
[0019] FIG. 7 illustrates a facility map with a representation of devices deployed within that facility.
DETAILED DESCRIPTION
[0020] The following description of the embodiments refers to the accompanying drawings. The same reference numbers in different drawings identify the same or similar elements. The following detailed description does not limit the invention. Instead, the scope of the invention is defined by the appended claims. The embodiments to be discussed next are not limited to the configurations described below but may be extended to other arrangements as discussed later.
[0021] Reference throughout the specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with an embodiment is included in at least one embodiment of the subject matter disclosed. Thus, the appearance of the phrases "in one embodiment" or "in an embodiment" in various places throughout the specification is not necessarily referring to the same embodiment. Further, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
[0022] As mentioned above, when implementing robotic systems, systems using smart sensors, or both, it would be desirable to provide an integrated management system that allows users to monitor, control and report on a deployed fleet of robots and smart devices as they report data about themselves and their environment. Additionally, with such an integrated robotic management system in place, it then becomes feasible to utilize the fleet of robots and smart devices, many of which may be quite different from one another in terms of how they collect and communicate data about themselves and their environment, and which may contain both mobile robots and static devices, to gather various types of data, homogenize the data, and use the homogenized data to create different types of maps associated with their operating area (e.g., a humidity distribution map across a floor of a building). These maps can then, according to other embodiments, be used as inputs to data analytic systems to generate practical scenarios for controlling resource utilization within that operating environment (e.g., reducing cleaning efforts by human and/or robotic cleaners in areas which have recently been less used or entirely unused). In this text, reference is thus made to robots, smart sensors and other connected devices of "different types". In this context, "different types" refers to robots, smart sensors and other connected devices which are, for example, made by different manufacturers, or are made by the same manufacturer but are of different hardware models or versions, or are made by the same manufacturer but use different data formats/protocols and/or commands to communicate information, such that when they are integrated into a particular system, such a system will benefit from the embodiments described herein.
[0023] According to an embodiment, a map-based software and hardware framework is provided that can quickly and easily link the smart hardware at a work site (e.g., vacuuming robots and mold sensors) and provide a dashboard that allows users (e.g., custodial managers) to task and monitor all connected hardware in their facility. Although this embodiment is described in the context of custodial services, those skilled in the art will appreciate that the concepts associated with this embodiment are broadly applicable and can be applied to a wide array of other service sectors.
[0024] This software framework serves as a universal receptor for service robotics systems and smart devices (also referred to herein as "connected devices", which can be mounted in a fixed, stationary position or mounted to a robot). This framework takes in data from each individual robot or sensor, shares key
components from that aggregated information across the fleet of deployed hardware, and controls their movements, schedules or responses. Expanding beyond robots, the central controller according to this embodiment will be flexible enough to integrate any connected device, such as smart sensors (e.g., temperature sensors and smoke detectors) or other smart dispensers (e.g., bathroom soap and paper towel dispensers). All components will be part of the same working ecosystem that can be controlled and monitored from a single dashboard.
[0025] Figure 1 illustrates a top-down view of an operating area 100 for an integrated robotics and smart sensor system. Therein, three different robots R 102, 104 and 106 are capable of moving about the operating area 100, where the central controller 108 calculates and transmits the location, schedule and action to be taken.
The operating area 100 also includes a plurality of smart sensors S 112-116, which can also be homogeneous or heterogeneous. Alternatively, those skilled in the art will appreciate that two or more of the robots 102-106 and smart sensors 112-116 can be the same type of robot or smart sensor; however, in order to illustrate the manner in which robotic systems according to these embodiments are able to integrate different types of devices, different types of robots are used in the illustrative examples herein. The other lines and shapes illustrated in Figure 1 represent walls and other obstacles disposed within operating area 100.
[0026] The robots 102-106 can perform operations which, for example, might otherwise be performed by humans. In the context of custodial services, this might include vacuuming, for example. An exemplary (but non-limiting) high level architecture of a robot 102 is shown in Figure 2. Therein, the robot 102 may include, for example, a controller 200 including a system bus 202 which communicatively couples the controller 200 to: (1) one or more communication devices 204 which enable communications with other devices via communications channels, (2) one or more perceptors 206, (3) one or more manipulators 208, and (4) one or more locomotors 210. The communication channels 206 may be adaptable to both wired and wireless communication, as well as supporting various communication protocols. By way of example and not limitation, the communication channels 206 may be configured as serial and/or parallel communication channels, such as, for example, USB, IEEE-1394, 802.11, BLE, cellular (e.g., LTE or 5G), and other wired and wireless communication protocols. If wireless communication channels are used, then the communication devices 204 will include a wireless transceiver and antenna (not shown in Figure 2).
[0027] The perceptors 206 may, for example, include any number of different sensors such as: optical sensors, inertial sensors (e.g., gyroscopes, accelerometers, etc.), thermal sensors, tactile sensors, compasses, range sensors, sonar, Global Positioning System (GPS), Ground Penetrating Radar (GPR), lasers for object detection and range sensing, imaging devices, magnetometers and the like. A perceptor could also be any other existing sensor within a deployment, that would otherwise be static, but could be mounted onto a robot to get the same data distributed across a facility, instead of from a single location (e.g., temperature or humidity sensors).
[0028] Furthermore, those skilled in the art will understand that many of these sensors may include both a source and a sensor to combine sensor inputs into meaningful, actionable perceptions. For example, sonar perceptors and GPR may generate sound waves or sub-sonic waves and sense reflected waves. Similarly, perceptors including lasers may include sensors configured for detecting reflected waves from the lasers for determining interruptions or phase shifts in the laser beam. Imaging devices may be any suitable device for capturing images, such as, for example, an infrared imager, a video camera, a still camera, a digital camera, a Complementary Metal Oxide Semiconductor (CMOS) imaging device, a charge coupled device (CCD) imager, and the like. In addition, the imaging device may include optical devices for modifying the image to be captured, such as: lenses, collimators, filters, and mirrors. For adjusting the direction at which the imaging device is oriented, a robot 102 may also include pan and tilt mechanisms coupled to the imaging device.
[0029] The manipulators 208 may include, for example, vacuum devices, magnetic pickup devices, arm manipulators, scoops, grippers, camera pan and tilt manipulators, individual or coupled actuators, and the like. The locomotors 210 may include, for example, one or more wheels, tracks, legs, rollers, propellers, and the like. For providing the locomotive power and steering capabilities, the locomotors 210 may be driven by motors, actuators, levers, relays and the like. Furthermore, perceptors 206 may be configured in conjunction with the manipulators 208 or locomotors 210, such as, for example, odometers and pedometers.
[0030] The foregoing discussion of Figure 2 and hardware associated with a typical robot was adapted from U.S. Patent No. 8,073,564 (hereafter the '564 patent), the disclosure of which is incorporated here by reference. Those skilled in the art will appreciate, however, that such elements are purely exemplary. Some robots will not include all of the elements illustrated in Figure 2, whereas other robots may include hardware elements which do not fall into the categories depicted in Figure 2.
[0031] Smart sensors 110-116 will typically have a subset of the hardware elements associated with robots 102-106. For example, smart sensors 110-116 may not typically include manipulators 208 and locomotors 210, but may include a controller 200, one or more perceptors 206 and one or more communication devices 204.
[0032] Similarly, the central controller 108 for controlling the robots 102-106 and smart sensors 110-116 will also include various hardware elements as shown in Figure 3. According to these embodiments, the central controller 108 is where device data is stored and system decisions are made. More specifically, decisions, artificial intelligence, machine learning, data analytics, and basic deduction are performed in the central controller 108 to pull together all the data from all the connected devices, make sense of it all, and recommend or automate actions based on that data. The central controller 108 can be located in the cloud, locally deployed in or near the operating area, or implemented as a hybrid system including both.
[0033] Therein, central controller 108 includes one or more processors 300, one or more memory devices 302 suitable for storing data, software or application code, one or more communication devices 304, including either wired or wireless communication capabilities, one or more input/output (I/O) devices 306 and a communication bus 308 to interconnect these various hardware elements. As mentioned above, embodiments described herein provide for integrated communication and control of disparate robots 102-106, smart sensors 110-116, and other devices by providing a central controller 108 which takes in data from each individual robot and sensor, shares that information across the fleet, and controls their movements and operations.
[0034] A high-level example of such a system 400 is illustrated in Figure 4. Therein, each piece of hardware (i.e., robot or sensor) in the system has a unique API 402 that sends and receives data through its own, typically proprietary, communication protocol. By creating an API wrapper 404 for each unique piece of hardware, the robots and sensors having disparate communication protocols, as well as disparate idiosyncrasies with respect to other hardware related parameters (e.g., calibration, normalization, output range, input range, bias effects, etc., of the perceptor(s) 206, manipulator(s) 208, and/or locomotor(s) 210), can be plugged into the central controller 108, so all devices are integrated, sharing the same data, dashboard 408, and facility map 410 of the operating area 100, where the dashboard 408 and facility map 410 are non-limiting examples of possible I/O functions that consume the data stored by the central controller 108 in the common central controller protocol.
[0035] Thus, the API wrappers 404 both translate the data being provided by the robot 102-106 or smart sensor 110-116 from its native, proprietary protocol or format into a common central controller protocol and, when needed, adjust the data values themselves to account for the unique hardware idiosyncrasies mentioned above to provide consistent data across the fleet. For example, if robot 102 deploys a first type of humidity sensor as one of its perceptor(s) 206 and robot 104 deploys a second, different type of humidity sensor (e.g., different manufacturer with the same humidity sensing technology, or just different humidity sensing technology), then the API wrappers 404 associated with robot 102 and robot 104 could include transformation algorithms or calibration factors which change the sensed humidity values transmitted from either or both of robots 102 and 104 such that the humidity data stored by the central controller 108 in memory device(s) 302 is consistent.
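By way of a non-limiting illustration, the following Python sketch shows how an API wrapper of the kind described above might translate a vendor-specific humidity reading into the common central controller protocol while applying a per-device calibration offset. All class, field and message names (e.g., VendorHumidityWrapper, to_common, the "hum" field) are hypothetical and do not correspond to any particular manufacturer's API.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class CommonDataPoint:
    # Fields mirror the kind of data elements defined in the common central controller protocol.
    id: str
    type_name: str
    datatype: str
    data_units: str
    value: float
    device_id: str
    timestamp: str

class VendorHumidityWrapper:
    # Translates one vendor's native humidity message into the common protocol format.
    def __init__(self, device_id, calibration_offset=0.0):
        self.device_id = device_id
        self.calibration_offset = calibration_offset  # compensates for hardware idiosyncrasies

    def to_common(self, native_message):
        # Assume the native message is a dict such as {"hum": "35.45"} (illustrative only).
        raw = float(native_message["hum"])
        corrected = raw + self.calibration_offset
        return CommonDataPoint(
            id="HUMIDITY",  # placeholder identifier, not an actual protocol id from Appendix A
            type_name="relative_humidity",
            datatype="double",
            data_units="percent",
            value=round(corrected, 2),
            device_id=self.device_id,
            timestamp=datetime.now(timezone.utc).isoformat(),
        )

wrapper = VendorHumidityWrapper("robot_102_humidity", calibration_offset=0.08)
print(wrapper.to_common({"hum": "35.45"}))  # stored consistently alongside other devices' data

A wrapper of this general shape would exist for each unique piece of hardware, so that adding a new device type requires only a new wrapper rather than changes to the central controller itself.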
[0036] According to an embodiment, the common central controller protocol can, for example, take the following exemplary format (using, purely as an example, a common central controller protocol data unit capturing information about the cleaning distance traveled by a robotic vacuum cleaner or other robotic cleaning device):
{
"id": "60121 ",
"type_name": "cleaning_distance",
"datatype": "double",
"data_units": "meters",
"display_name": "Distance cleaned during this current task",
"description": "This represents the distance in meters that the robot traveled while operating in cleaning mode"
}
where the id value is a numerical identifier for this particular data element in the protocol, the type_name value is a short textual name for the data element, the datatype value is one of a predetermined set of value types that the stored value for this particular data element will take, the data_units value is the measurement used to capture the collected data, the display_name is a value that can be displayed
(e.g., on the dashboard 408), and the description value can be a longer text description of the data element having various uses. In order to provide for a comprehensive capability to capture and homogenize data received from disparate smart sensors and/or robotic devices, a common central controller protocol needs to have a library of data elements which cover the gamut of different values which can be received from all of the different smart sensors or robots in a particular system, as well as commands which need to be transmitted to the smart sensors and robots. An example of such a common central controller protocol is attached hereto as Appendix A. Note that the data element exemplified above can be found in
Appendix A in the row numbered based on the id value, i.e., 60121.
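As a minimal sketch only, the library of data elements mentioned above could be represented as a dictionary keyed by the id value and consulted to validate incoming data before storage; the structure and helper names shown here are assumptions for illustration rather than the actual contents of Appendix A.

PROTOCOL_LIBRARY = {
    "60121": {
        "type_name": "cleaning_distance",
        "datatype": "double",
        "data_units": "meters",
    },
    # ... additional data elements, one entry per row of the protocol library ...
}

def validate(element_id, value, units):
    # Reject data whose type or units do not match the protocol's declaration.
    spec = PROTOCOL_LIBRARY[element_id]
    if spec["datatype"] == "double" and not isinstance(value, float):
        raise TypeError("element " + element_id + " expects a double")
    if units != spec["data_units"]:
        raise ValueError("element " + element_id + " expects units " + spec["data_units"])
    return True

validate("60121", 42.7, "meters")  # distance cleaned during the current task, in meters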
[0037] To better understand how the provision of a common central protocol for data management in robotics systems according to these embodiments enhances robotics systems applications, consider the following illustrative example. Suppose that a facility has a humidity device 1 which is a static device mounted to a wall at a specific (x,y,z) location on the facility map. Humidity device 1 has a single sensor and returns values in percent with one significant digit and has no onboard calibration to adapt to sensor deterioration over time (ex: 35.4%, 0 calibration). The facility also includes a humidity device 2, mounted to the back of a robot that drives around the facility (this robot may also vacuum, or make deliveries within the building as its primary function), which makes it a mobile humidity monitoring device.
Humidity device 2 captures humidity through two onboard humidity sensors and onboard calibration values that allow for adapting to deteriorating sensors over time. Both onboard sensors utilize their calibration values to get an accurate reading, and the corrected readings are averaged together to get an even more precise result (ex: 35.45% with +0.08% calibration and 36.56% with -0.06% calibration, resulting in 35.53% and 36.50%, which average to approximately 36.02% humidity). The system 400 stores the raw humidity values and existing calibration values in the relational database 502, in order to make sure sensors are providing proper information, and also adds its own calibration value for sensors that do not have one onboard, to ensure the most accurate readings. As the mobile humidity sensor passes near the wall-mounted humidity sensor, cross-device calibration can be conducted by the central controller using machine learning or data analytics algorithms. The central controller recognizes that the first humidity sensor has no ability to perform onboard calibrations, but can decide whether or not the calibration values on the mobile humidity sensor need to be updated remotely, and can send those updates directly to the device as required.
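The calibration-and-averaging arithmetic of this example can be expressed compactly as follows; this is an illustrative sketch of the computation only, not a prescribed implementation.

from decimal import Decimal

# Humidity device 2: two onboard sensors, each with its own calibration offset.
readings = [(Decimal("35.45"), Decimal("0.08")), (Decimal("36.56"), Decimal("-0.06"))]
corrected = [raw + calibration for raw, calibration in readings]  # 35.53 and 36.50
average = sum(corrected) / len(corrected)                         # 36.015
print(corrected, average)  # reported as approximately 36.02% humidity in the example above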
[0038] According to an embodiment, the system 400 can be implemented as a hybrid software architecture split between the cloud and a local server. The local server controls all robots and actions that require near-realtime control (e.g., object localization, robot navigation, emergency device shutdowns, building alarms). The cloud server stores and processes all non-realtime data and information from all devices, where it is accessed for back-end analytics, data display on a dashboard, data reports, or schedule planning. Figures 5A and 5B show the arrangement of various aspects of the system 400's architecture according to an embodiment, e.g., which components communicate with each other, and which components are located locally or on the cloud. The representative software frameworks used in the diagram are interchangeable with other existing software frameworks that provide similar capabilities, so the architecture is flexible enough to use a variety of inputs and outputs that communicate via formatted data (e.g., JSON, XML, or the like). Note that the example illustrated in Figures 5A and 5B is not a structural requirement and these embodiments are not limited to the specific software frameworks displayed therein.
[0039] An example of the database tables 502 in Figures 5A and 5B used by the system 400 according to an embodiment is illustrated in Figures 6A-6C. The relational database 502 is an ever-evolving data management structure that stores all data collected by all devices in all deployments. The relational database can be set up in the cloud, and serves as an easily referenced filing cabinet for device data. This allows users to sort data as needed, such as by the facility where the devices are deployed (e.g., what a custodial manager would want to know), or by device, meaning all makes and models of a specific type of hardware are examined (e.g., what a hardware manufacturer may want to know).
[0040] This organization-centric structure requires that all devices (captured in the org_device table 600) be of a specific type (specific manufacturers and model numbers, denoted in the device table 602) and assigned to a specific organization (organization table 604). The device table 602 captures the generic data requirements and capabilities of each specific type of device (i.e., brand and model of a sensor, dispenser or robot). Only existing types of devices from the device table 602 can be deployed to a facility. However, any type of smart device can be included in an organization as many times as desired, by pointing to this generic information of the device stored in the device table 602. Each generic device is then assigned device-specific information (e.g., serial number, IP address, firmware version), and thus is made unique within the database at the org_device table 600. Each unique device in the relational database is allocated to one organization, using the organization's unique identifier.
[0041] This framework allows for plug-and-play components, meaning the addition or subtraction of devices from an organization will not require software development, but rather just a database update which assigns that unique device to a specific organization. Since each device is also facility-specific, meaning data is organized based on who the recipient of the data is, this structure allows administrators to review data across clients (to ask questions like, "What is the average charging time of all Roomba 960's?"), but eliminates the possibility of clients seeing data that is not their own.
[0042] From there, each individual instance of these org_device entries populates the subordinate tables with the data collected by them. For example, a complete history of this data is collected in the org_device_data_history table 606, where time-based reports can be built and a comparison between data across the lifetime of a device can be evaluated. The current status of all devices is stored in the org_device_data_status table 608. A history of all position data, from where all devices are either installed or where they have moved through, is stored in the org_device_data_position table 610. These positions allow the specific devices to be displayed at the correct locations on the map screen of the UI. Additionally, a collection of specific information that is evaluated from the collected data is stored in the org_device_data_card table 612, for display on the fleet overview screen of the UI. These tables represent the core components of the software, but additional database tables are built, and in some cases data in these tables is duplicated from other tables, in order to provide fast calculations for the machine learning and big data analytics algorithms.
[0043] For example, the dashboard 408 can include a map UI screen 700, as shown in Figure 7, that uses the org_device_data_position table to show the current (or past) positions of the mobile robots and static devices in any particular organization. These UI screens can, for example, include a drag-and-drop map of the facility, the current location of all deployed robots, a library of all robots waiting to be deployed (as seen in Figure 7), and a fleet status overview and scheduling screen.
[0044] Within the database, the org_device_data_position table 610 is dedicated to capturing all device position data as Cartesian coordinates (x, y, z) and body rotation (roll, pitch, yaw), where the planar map only considers x and y coordinates and the rotation denoted by yaw. These position coordinates are in relation to the specific facility map and displayed with respect to the origin of that map. Stationary devices (e.g., environmental monitoring sensors) have static coordinates, whereas robots have dynamic coordinates, yet all devices store their coordinates in the org_device_data_position table 610. Given one specific facility, the system 400 pulls data from these position fields via a back-end API for all devices assigned to that facility, interpolates the data in between the position intervals stored on the table, and displays a moving trajectory (for dynamic devices, e.g., robots). The position data displayed on the dashboard is approximately 5 seconds delayed due to latency in the cloud transmission and data processing layer, but optimizations in data management and database architecture are constantly evolving to improve system efficiency.
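A minimal sketch of the interpolation step described above, assuming position samples of the kind stored in the org_device_data_position table, might look as follows; the field names and the linear interpolation scheme are illustrative assumptions.

# Illustrative linear interpolation between two stored position samples
# (rows conceptually drawn from the org_device_data_position table).
def interpolate_position(p0, p1, t):
    # p0, p1: dicts with keys 't' (seconds), 'x', 'y', 'yaw'; t: query time between them.
    if p1["t"] == p0["t"]:
        return dict(p0)
    alpha = (t - p0["t"]) / (p1["t"] - p0["t"])
    return {
        "x": p0["x"] + alpha * (p1["x"] - p0["x"]),
        "y": p0["y"] + alpha * (p1["y"] - p0["y"]),
        "yaw": p0["yaw"] + alpha * (p1["yaw"] - p0["yaw"]),  # naive; real code should wrap angles
    }

# A robot sampled at t=10 s and t=12 s; estimate its pose at t=11 s for the map display.
a = {"t": 10, "x": 2.0, "y": 5.0, "yaw": 0.0}
b = {"t": 12, "x": 3.0, "y": 5.5, "yaw": 0.2}
print(interpolate_position(a, b, 11))  # {'x': 2.5, 'y': 5.25, 'yaw': 0.1}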
[0045] From the dashboard 408, a collection of "cards" is displayed as a device fleet overview, one card for each deployed device. On the fleet overview, the thumbnail images of the devices, names, general locations, and active tasks being worked are displayed. Additionally, battery life or quantity of product remaining is displayed as a percentage in graphical form, and connection and error reporting status are also reported. Data displayed on these cards is updated at every instance that data is received from those devices. For different devices, data is transmitted at different rates. For example, environmental monitoring sensors may report temperature or humidity every 1 minute, while moving robot position data may be reported every 1 second. The most recent data point from each device is stored in the org_device_data_status table 608, while all previous data from each device is stored in the org_device_data_history table 606. Both can be referenced at any time to build reports and provide data analytics across devices within their facility.
[0046] Embodiments also provide for a web service which allows the user interface to query and update the database 502, and to pull data as required for display and reporting. Such data includes information about the organization, the map of the facility, the devices deployed within a facility, and actionable data the devices provide to the facility manager. Although the database illustrated in Figures 6A-C is a relational database, those skilled in the art will appreciate that these embodiments can be implemented with other types of databases or data storage structures.
[0047] Different API endpoints were created for different tables in the relational database and for different screens on the GUI. Some of these endpoints are read-only (e.g., the information displayed about each device on the Fleet screen), while other endpoints are read-and-write (e.g., the position data of a static device is readable so that it can be displayed on the map, but is also writeable for instances where that device is physically relocated to a different location). The write endpoints are much more detailed, and involve utilizing POST, PUT, or DELETE HTTP commands to modify specific columns within specific tables. For example, the write endpoint of the API that allows the user to update the current location of a device will append a new row to the org_device_data_position table with the updated position for that specific device. When the read endpoint pulls a device's position data into the user interface, that API returns the most recent position of that device, allowing the new location to be properly displayed on the dashboard 408.
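Purely as an illustration of the read and write behavior described above, the following sketch uses an in-memory list to stand in for the org_device_data_position table; the function names are hypothetical and no particular web framework is implied.

# Each row: {"org_device_id": ..., "x": ..., "y": ..., "yaw": ..., "t": ...}
positions = []

def write_position(org_device_id, x, y, yaw, t):
    # Mirrors a write endpoint: append a new row rather than overwrite history.
    positions.append({"org_device_id": org_device_id, "x": x, "y": y, "yaw": yaw, "t": t})

def read_latest_position(org_device_id):
    # Mirrors a read-only endpoint: return the most recent position for display on the map.
    rows = [r for r in positions if r["org_device_id"] == org_device_id]
    return max(rows, key=lambda r: r["t"]) if rows else None

write_position("sensor_112", x=4.0, y=1.5, yaw=0.0, t=100)
write_position("sensor_112", x=6.0, y=2.0, yaw=0.0, t=200)   # device physically relocated
print(read_latest_position("sensor_112"))                    # returns the t=200 row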
[0048] The robotic system embodiments described herein provide for a number of advantages and/or benefits including, but not limited to, enabling robotic system implementations which: create a scalable system to deploy numerous versions of the same robot across multiple facilities with no additional programming requirements (plug and play); incorporate hardware from disparate manufacturers into the same system; combine robots and smart devices into the same monitoring and control framework; integrate data from each device into the cloud database in a common communication protocol so any single data point can be used as required across all devices, such as updating maps that are used by multiple devices; utilize the same localization and navigation algorithms on different mobile devices with different platform kinematics for individual and co-robot navigation; autonomously path plan a robot's trajectory from one point on the map to another; display on a user interface the health and sensor data received from a diverse collection of devices; allow a Facility Manager to send commands to any deployed robot to stop its task and return to its charging station (and later to any point on the map to complete a task); display the locations of all devices within a facility on a robot-built facility map, and allow the Facility Manager to update the location of any device in the deployment; and structure data storage and retrieval to allow for the development of big data analytics using data from multiple device platforms.
[0049] Mapping of a facility within which a robotic system according to these embodiments operates has been briefly discussed above and is a significant enabler for these embodiments. Those skilled in the art will appreciate that there are a number of different ways in which to generate a facility map that can then be shared amongst the robots operating in the system, some of which are described in the above incorporated by reference '564 patent. According to some embodiments, robotic systems include a method for building and uploading a map of any facility to the cloud system. Utilizing the Robot Operating System (ROS) framework (e.g., as described at http://www.ros.org), embodiments employ a map-building robot which constructs a 2-dimensional map and a 3-dimensional map using lidar, accurate robot odometry, stereo cameras, accelerometers, a variety of other onboard sensors, and an open source ROS package (e.g., gmapping). A variety of files and file structures are created during this process. In preparation for display of the map on the users' dashboard, two files are created to store a specific facility's map: a .PGM file, which is a grayscale probability grid referencing the likelihood of an obstacle being located at each point within that grid; and a .YAML file, which stores the origin and resolution data for that specific map. The map is stored on a local server within the facility where the devices are deployed. The map is also uploaded to an encrypted file store on the cloud from the local server when the local server is booted up, and referenced in the organization table, stored in map_uri.
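For illustration, the following sketch reads map metadata of the kind stored in the .YAML file and converts a cell of the .PGM probability grid into facility coordinates; the field names (image, resolution, origin) follow common ROS map conventions and the specific values shown are assumptions.

import yaml  # PyYAML

yaml_text = """
image: facility_a.pgm
resolution: 0.05        # meters per grid cell
origin: [-10.0, -10.0, 0.0]
"""
meta = yaml.safe_load(yaml_text)

def cell_to_world(col, row, meta):
    # Convert a probability-grid cell (col, row) into (x, y) meters on the facility map.
    x = meta["origin"][0] + col * meta["resolution"]
    y = meta["origin"][1] + row * meta["resolution"]
    return x, y

print(cell_to_world(200, 120, meta))  # (0.0, -4.0) for this example origin and resolution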
[0050] Rather than having only a locally stored map, the system 400 utilizes a hybrid architecture for all mapping functions. The locally stored maps are used within the ROS installation to localize and navigate locally deployed robots relative to that map. Embodiments can also use cloud-based storage and processing systems (e.g., Amazon Web Services, Microsoft Azure) to store map files that are used for visualization of the localized robot positions on the user interface. Both maps are synchronized to ensure that the robots' positions on the ROS map are overlaid on the same map for the user display. When multiple maps are available for a given facility, the selected map is noted in the organization table of the relational database as the link stored in map_uri, and used by both the local ROS server and the cloud-based UI display.
[0051] Embodiments further provide for a localization algorithm that allows a research robot to localize itself on an existing facility map. This localization algorithm is applicable to a variety of robots, in addition to the mapping robot. In order to integrate this algorithm into the central controller architecture, a hybrid data management system has been developed: all non-realtime data (such as device sensor data, where extended processing and storage time is acceptable) is processed on the cloud through the above-defined relational database, while all real-time data (such as localization and navigation, where robot response time is critical) is processed on a local server on the same network as the robots. As such, robots localize on the version of the map stored locally, while each robot's location is displayed on the user interface through the cloud-based map, with a delay of a few seconds. Alternatively, localization can be performed onboard the robot itself in order to improve system response time.
[0052] To further improve the robot’s ability to localize, embodiments provide for the robotics system within the facility to incorporate beaconing hardware that gives off a wireless signal (e.g., Bluetooth, RFID, visual cues), the strength of which can be assessed by the robot to improve its localization. For example, Bluetooth beacons are used to track traffic patterns of personnel movement across a building, specifically which hallways, rooms, or areas are receiving no traffic, less traffic than average, or more traffic than average. Systems designed in accordance with these embodiments take this traffic pattern analysis and determines, for example, which hallways have received significant traffic compared to those that have received limited traffic. Central algorithms then analyze the traffic patterns to determine which areas of the building need cleaning or maintenance more or less than on a normal day of operations. Data through these algorithms will be used to automatically adjust the scheduler that assigns robots (or human personnel) specific cleaning duties so that the deeper cleaning tasks are allocated additional time, while the limited cleaning tasks are allocated less. In cases where zero traffic has been detected within an office or down a specific hallway, then the scheduler completely reallocates the robot (or human staff) away from that area, so as to focus on areas of the facility that need additional attention.
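A simplified sketch of how beacon-derived traffic counts could adjust scheduled cleaning time is shown below; the area names, thresholds and clamping factors are illustrative assumptions rather than values prescribed by these embodiments.

baseline_minutes = {"hall_a": 20, "hall_b": 20, "office_301": 10}
traffic_counts   = {"hall_a": 340, "hall_b": 45, "office_301": 0}   # people detected today
typical_counts   = {"hall_a": 150, "hall_b": 60, "office_301": 8}

def adjusted_schedule(baseline, traffic, typical):
    schedule = {}
    for area, minutes in baseline.items():
        if traffic[area] == 0:
            schedule[area] = 0  # zero traffic: reallocate the robot (or staff) away entirely
        else:
            ratio = traffic[area] / typical[area]
            schedule[area] = round(minutes * min(max(ratio, 0.5), 2.0))  # clamp to 0.5x-2x
    return schedule

print(adjusted_schedule(baseline_minutes, traffic_counts, typical_counts))
# e.g. {'hall_a': 40, 'hall_b': 15, 'office_301': 0}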
[0053] When an individual robot builds a map, localizes on it, and navigates across it, that map is often stored onboard the robot, or may be stored on a local or cloud-based server that only that robot utilizes. In some cases, a server-based map allows for different robots to access the same map at different times, allowing for only a single robot to access that map file at any one time. In rare circumstances, multiple robots will share a map in realtime, usually requiring all robots to have similar onboard sensors and map-building capabilities, and utilizing a map format that is identical across all robots that use it.
Embodiments described herein include a method for synchronizing a server-based area map that can be utilized by robots of different footprints, from different manufacturers, utilizing different onboard mapping sensors, and storing map data in different formats. This method utilizes a 3D representation of a mapped area, either created by a specific mapping robot or by any other robot in the system, as the baseline map for all other robots. This 3D map is flattened to a 2D map according to the obstacles expected to obstruct the robot, given its height and footprint. That flattened map is then converted into the mapping format required by that specific robot and used onboard that robot to navigate an area. If that onboard map is updated by the robot, it is uploaded back to the server and integrated back into the global 3D map of that area.
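The flattening step can be illustrated with the following sketch, which treats the 3D map as a stack of 2D occupancy layers and marks a cell as an obstacle if anything occupies it at or below the robot's height; the grid layout and layer height are assumptions made purely for illustration.

def flatten_map(grid_3d, layer_height_m, robot_height_m):
    # Only layers at or below the robot's height can obstruct it.
    layers_to_check = int(robot_height_m / layer_height_m) + 1
    relevant = grid_3d[:layers_to_check]
    rows, cols = len(grid_3d[0]), len(grid_3d[0][0])
    return [[1 if any(layer[r][c] for layer in relevant) else 0 for c in range(cols)]
            for r in range(rows)]

# A 3-layer grid (0.5 m per layer): an overhanging obstacle at 1.0-1.5 m is ignored
# for a 0.4 m tall vacuuming robot but would block a 1.2 m tall delivery robot.
grid = [
    [[0, 0], [0, 1]],   # 0.0-0.5 m
    [[0, 0], [0, 0]],   # 0.5-1.0 m
    [[1, 0], [0, 0]],   # 1.0-1.5 m
]
print(flatten_map(grid, 0.5, 0.4))   # [[0, 0], [0, 1]]
print(flatten_map(grid, 0.5, 1.2))   # [[1, 0], [0, 1]]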
EXAMPLES
[0054] To understand the unique contributions of robotics systems and software according to these embodiments, various examples of applications of the foregoing principles and embodiments will now be discussed in this context.
[0055] As described above, robotics systems according to these embodiments use a single, standardized protocol, meaning that data from any device in the system can be uniformly stored, retrieved, analyzed, compared across other devices, and combined with data from other devices, regardless of the device manufacturers.
[0056] Example: Consider a climate-controlled storage warehouse which has deployed environmental monitoring sensors throughout the facility. Inexpensive temperature sensors can be used to minimize cost, but strategically placed sensors that jointly measure humidity, temperature and other air quality components should also be included within the facility. With the standardized data protocol, also referred to herein as the "common protocol format", facility-wide heat maps using temperature data from both brands of sensors can be made. The standardized protocol means that longitudinal data (e.g., looking at the temperature in this facility on this same date for each of the last five years to determine how the building may be leaking over time) can be immediately retrieved and processed from a cloud-based system.
[0057] According to an embodiment, algorithmic translations of the data can be provided that would, for example, involve triangulation of humidity values from different sensors across a facility map, so as to see where additional air flow would be required. This is another benefit of the map-based approach according to embodiments described herein: robotic systems can determine where peaks and troughs are for humidity levels (or temperature, or signal strength, or CO2, or any other sensor-tracked information), provide a "heat map" of this data plotted against the facility map, and then provide actionable information to the facility manager so that he or she is able to make changes in the facility to address these problems (or to instruct other outside systems, like HVAC, to adapt their behaviors).
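One simple way to produce such a heat map, shown here purely as a sketch, is to interpolate sensor readings onto arbitrary facility-map coordinates using inverse-distance weighting; the sensor positions and humidity values below are made-up example data.

def idw_humidity(sensors, query, power=2):
    # sensors: list of ((x, y), humidity_percent); query: (x, y) point on the facility map.
    num, den = 0.0, 0.0
    for (sx, sy), h in sensors:
        d2 = (query[0] - sx) ** 2 + (query[1] - sy) ** 2
        if d2 == 0:
            return h  # the query point sits exactly on a sensor
        w = 1.0 / (d2 ** (power / 2))
        num += w * h
        den += w
    return num / den

readings = [((0.0, 0.0), 35.0), ((10.0, 0.0), 48.0), ((5.0, 8.0), 40.0)]
print(round(idw_humidity(readings, (5.0, 2.0)), 1))  # estimated humidity between sensors

Evaluating this estimate over a grid of facility-map points yields the kind of humidity heat map described above, from which peaks and troughs can be identified.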
[0058] The outputs and outcomes of robotics systems according to these embodiments are unique, and are made possible by the standardized inputs, allowing for methods to gather information across devices. That is, the embodiments do not merely combine devices into a single software interface, but allow devices to be collaborative units working in the same ecosystem. Below, a progression of examples is provided that demonstrates how the information and analytics of this integrated system can affect decision making according to certain embodiments.
[0059] By making all data streams in the same recognizable language and structured protocol, data from disparate devices can be understood, reviewed, and processed with the same algorithms, in realtime, on a cloud-based system. That is, data from one device can now be both a direct measure and/or a proxy measure for a recommended action. That action can either be completed autonomously by sending commands to connected devices or can be sent as actionable information to the humans in the loop to take action.
[0060] Example: Data from a connected device used as a direct measure to affect human staff operations. If door counters show that an office was not entered on a given day, cleaning staff can omit that office from their cleaning schedule.
[0061] Example: Data from smart devices extrapolated to provide proxy measurements for non-smart devices. Data analysis can be used so that a connected device serves as a proxy measure for a non-connected device; for example, if a connected bathroom soap dispenser registers 100 uses, this typically corresponds with 250 tri-fold paper towels having been used and one garbage can that is expected to be full. The garbage can and paper towel dispenser in this example are not connected devices, but are correlated to the soap dispenser's data.
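The proxy-measure arithmetic of this example can be sketched as follows, using the ratios stated above; the constant names are illustrative only.

TOWELS_PER_SOAP_USE = 250 / 100       # 100 soap uses ~ 250 tri-fold towels
GARBAGE_CANS_PER_SOAP_USE = 1 / 100   # 100 soap uses ~ one full garbage can

def proxy_estimates(soap_uses):
    # Estimate consumption of the unconnected devices from the connected dispenser's count.
    return {
        "towels_used": round(soap_uses * TOWELS_PER_SOAP_USE),
        "garbage_cans_full": soap_uses * GARBAGE_CANS_PER_SOAP_USE,
    }

print(proxy_estimates(100))  # {'towels_used': 250, 'garbage_cans_full': 1.0}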
[0062] Example: One connected device serves as an activator for another device. Beacon technology can monitor traffic patterns of personnel movement through hallways of a connected facility. High traffic days can indicate when a deeper clean is required for that hallway. In the case of a vacuuming robot, it would be commanded to make an additional pass of those high traffic hallways, or could be commanded that no vacuuming is required if no traffic was detected in that hallway. Coupling traffic patterns with a connected outdoor weather station that measures local rainfall allows the system to schedule a mopping robot instead of a vacuuming robot, due to the likelihood of mud being present that would not be effectively removed by a vacuuming robot. [0063] As described above, various embodiments provide a map-based view of the facility, a natural and intuitive visual for facility managers, as their work requires them to get goods or provide services to specific locations in a facility. Map-based software allows for asset tracking, which may include mobile tools or machinery, consumables, personnel, and/or static or dynamic devices.
[0064] Example: Movement of goods and services (e.g., soap refills, or a planned wipe down of table tops) is done at scheduled intervals, but may also need to be done on-the-fly (e.g., someone got ill in a hallway and it needs to be tended to now, or there is no toilet paper in Bathroom 2). On-the-fly requests are both expected and yet disrupt scheduled maintenance. Map-based facility views of resources (staff and materials) allow managers to quickly plan logistics, so as to minimize disruptions to previously scheduled maintenance.
[0065] Example: A map-based facility view is also necessary when autonomous, dynamic systems, such as floor cleaning robots, are added to a facility maintenance plan. That is, smart static devices (e.g., bathroom paper towel dispensers) can send alerts when they are near empty, and the appropriate action can be quickly taken (e.g., go to Bathroom 2 and fill the dispenser), but autonomous dynamic systems need to be tracked spatially so that errors can be addressed (e.g., a door that was supposed to be open is closed, so the robot has identified the problem and is waiting for the door to be opened), or so that the device can be located quickly when it needs to be reallocated (e.g., the schedule shows the robot has already started to scrub the cafeteria, but since it was raining today, we will postpone the cafeteria and need to scrub the front hall instead). [0066] A server-based map of an area wherein a single robot or plurality of robots are deployed can be translated from the server-based format to an individual robot's native format, based on the map file's format requirements and the dimensions of the robots, and transmitted onboard to each individual robot in that deployment for localization and navigation. The server-based map can be updated by any robot in the deployment, by an individual robot's natively formatted map being translated into the format of the server-based map within the footprint specifications of the robot's onboard sensors that built the map and transmitted to the server for integration.
[0067] In the above examples of interconnected sensors from a variety of manufacturers, these systems can be deployed in various facilities, for example a high school with: multiple standard classrooms, specialty classrooms (like woodshop and home economics kitchens), substantial hallways, large spaces like a gym and cafeteria, several multi-stall bathrooms, loading bays by the cafeteria, an HVAC system broken into zones, an industrial lighting control system, and plumbing monitoring for boilers and supply water.
[0068] A deployment in such a facility could include, for example, the following smart devices:
• 30+ air quality sensors (Temperature and humidity sensors distributed throughout the facility; the VOC and CO monitor near a loading dock; CO2 monitors distributed across classrooms and high-volume rooms (cafeteria, auditorium); particulate matter counter near doorways and windows, as well as within shop/home economics classrooms, maintenance rooms, etc.; some environmental monitoring sensors attached to mobile robots for gathering distributed air quality data across non-static locations.)
• 10 leak detection sensors, standing water sensors
• Individual flow meter sensors on each toilet and sink
• Water shutoff valves for each bathroom, kitchen, and janitors’ closet
• Industrial controllers for boilers, HVAC, and plumbing monitoring
• 2 large floor scrubbing/vacuuming robots
• 10 small vacuuming robots
• 5 small floor scrubbing robots
• 1 delivery robot for relocating supplies within the building
• 50 smart soap dispensers (across 10 bathrooms)
• 50 smart paper towel dispensers (across 10 bathrooms)
• 25 door counters (bathroom doors, entry/exit doors to school, etc.)
• 25 traffic/asset monitoring beacons
• Beacon-tracked badges for all employees within the facility
• 1 weather station located outside of the facility
[0069] In the above example of mobile robots and static devices deployed within a high school, all of the device data is collected in the common central controller protocol described above (and found in Appendix A) and calibrated and normalized to each class of data that is being collected. This generates the single data management framework which allows all of the data collected from the myriad different connected devices to be used in analytics to evaluate whether there are any desirable changes to the usage or behavior of one or more of the connected devices themselves, or to create actionable information for the user of such data to adapt other facility functions, including those outside of the robotic system itself.

[0070] In the above example deployment, flow meters at each toilet measure the amount of water used during a flush. The toilets are expected to use 1.5 gallons per flush. If some threshold (e.g., 5 gallons) is detected within a single flush, then the expectation is that there is a stuck flush valve or bathroom leak. The leak detection sensors may not yet detect a leak, so the central controller algorithms give highest priority to the water waste from the toilet, which may also lead to a flood in the bathroom, and therefore the following actions are taken: the bathroom's water shutoff valve is triggered, a floor mopping robot is hailed in case the leak becomes a flood, a text message is sent to the plumber on duty, and an email is sent to the facility manager notifying them of the problem.
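The flow-meter rule of this example can be sketched as follows; the device identifiers and action names are hypothetical placeholders for whatever notification and command mechanisms a given deployment uses.

EXPECTED_GALLONS_PER_FLUSH = 1.5
LEAK_THRESHOLD_GALLONS = 5.0

def on_flush_reading(bathroom_id, gallons, actions):
    # When a single flush exceeds the threshold, queue the responses from the example above.
    if gallons >= LEAK_THRESHOLD_GALLONS:
        actions.append(("close_shutoff_valve", bathroom_id))
        actions.append(("dispatch_mopping_robot", bathroom_id))
        actions.append(("text_plumber_on_duty", bathroom_id))
        actions.append(("email_facility_manager", bathroom_id))
    return actions

queue = []
on_flush_reading("bathroom_2", 6.2, queue)   # a stuck flush valve or leak is suspected
print(queue)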
[0071] In the above example deployment, tracking beacons detect a higher than expected number of people in the building late in the evening due to a scheduled meeting. Although the remaining toilet paper and soap in the nearest bathroom to this meeting could have waited until the next morning to be replaced under normal usage conditions, the increased usage rate is predicted by the number of people in the building and a preventative message is sent to the custodian on-duty with an estimate of when the consumables will run out, so they can be replaced before they are empty. The delivery robot is already pre-loaded with consumables resupply originally scheduled for the morning, and is automatically sent to the bathroom ahead of the on-duty custodian.
[0072] Not all of the steps of the techniques described herein are necessarily performed in a single deployment or even in a single module.
[0073] It should be understood that this description is not intended to limit the invention. On the contrary, the embodiments are intended to cover alternatives, modifications and equivalents, which are included in the spirit and scope of the invention. Further, in the detailed description of the embodiments, numerous specific details are set forth in order to provide a comprehensive understanding of the invention. However, one skilled in the art would understand that various embodiments may be practiced without such specific details.
[0074] Although the features and elements of the present embodiments are described in the embodiments in particular combinations, each feature or element can be used alone without the other features and elements of the embodiments or in various combinations with or without other features and elements disclosed herein.

Claims

WHAT IS CLAIMED IS:
1. A central controller for robotics and connected devices comprising: a first communication interface configured to receive data from a plurality of robots or connected devices, at least some of which plurality of robots or connected devices are of different types; wherein data generated by robots or connected devices of different types are generated in different native data formats; a processor configured to translate said received data from the different native data formats into a common protocol format; a storage framework configured to store the data translated into the common protocol format; and a second communication interface configured to transmit commands based on data stored in the common protocol format and translated to the native data format of one or more of the plurality of robots or connected devices.
2. The central controller of claim 1, wherein: the data received from each robot or connected device contains a spatial reference to a location point from where the data was collected; wherein the central controller is further configured to calibrate the data received from robots and connected devices for differences in one or more of:
sensor structures, age of hardware from which the data was received, and changes in calibration of the hardware from which the data was received.
3. The central controller of any of claims 1-2, further comprising: an output display configured to display an image of an operating area in which said plurality of robots or connected devices are deployed and wherein the physical location of each of said plurality of robots or connected devices is represented on the image at the location from which the data is collected; wherein said central controller is further configured to include on said image a plurality of position-referenced values generated from said processed data, as well as interpolated values between received data points.
4. The central controller of claim 3, wherein: the image on which the locations of robots and connected devices are displayed, is any image representing the operating area where those devices are located, including to a map built by a robot or other mapping devices, a structural floorplan for the operating area, a utility or architectural rendering of the operating area, or a graphical representation of the operating area showing location-based data distributed across the image; and the images used for said display are processed to align with every other image that represents the same operating area, including the origin reference point, orientation, rotation reference frame, and image resolution related to actual distance.
5. The central controller of any of claims 1-4, wherein: the central controller is further configured to perform analytics, computations, and interpolations on the received data after the received data has been translated from native data formats into the common protocol format.
6. The central controller of any of claims 1-5, wherein: the commands that are based on data stored in the common protocol format and translated to the native data format and are transmitted to any of the plurality of robots or connected devices, or software interfaces, include commands which are initiated or triggered based on: a) data received from any single robot or connected device within the system; or b) data from any plurality of robots or connected devices within the system; or c) data from any additional processing of the data from any single or plurality of robots or connected devices.
7. The central controller of any of claims 1-6, further comprising: a server-based map of an area wherein a single or plurality of robots are deployed, can be translated from the server-based format to an individual robot’s native format, based on the map file’s format requirements and the dimensions of the robots, and transmitted onboard to each individual robot in that deployment for localization and navigation; wherein the server-based map can be updated by any robot in said
deployment, by an individual robot’s natively formatted map being translated into the format of the server-based map within the footprint specifications of the robot’s onboard sensors that built the map, and transmitted to the server for integration.
8. A method for controlling robots or connected devices of different types, the method comprising: receiving data from a plurality of robots or connected devices, at least some of which plurality of robots or connected devices are of different types; wherein data generated by robots or connected devices of different types are generated in different native data formats; translating said received data from the different native data formats into a common protocol format; storing the data translated into the common protocol format; and transmitting commands based on data stored in the common protocol format and translated to the native data format of one or more of the plurality of robots or connected devices.
9. The method of claim 8, wherein the data received from each robot or connected device contains a spatial reference to a location point from where the data was collected, the method further comprising: calibrating the data received from robots and connected devices for differences in one or more of: sensor structures, age of hardware from which the data was received, and changes in calibration of the hardware from which the data was received.
10. The method of any of claims 8-9, further comprising: displaying an image of an operating area in which said plurality of robots or connected devices are deployed and wherein the physical location of each of said plurality of robots or connected devices is represented on the image at the location from which the data is collected; and including on said image a plurality of position-referenced values generated from said processed data, as well as interpolated values between received data points.
11 . The method of claim 10, wherein: the image on which the locations of robots and connected devices are displayed, is any image representing the operating area where those devices are located, including to a map built by a robot or other mapping devices, a structural floorplan for the operating area, a utility or architectural rendering of the operating area, or a graphical representation of the operating area showing location-based data distributed across the image; and the images are processed to align with every other image that represents the same operating area, including the origin reference point, orientation, rotation reference frame, and image resolution related to actual distance.
12. The method of any of claims 8-11, further comprising: performing analytics, computations, and interpolations on the received data after the received data has been translated from native data formats into the common protocol format.
13. The method of any of claims 8-12, wherein: the commands that are based on data stored in the common protocol format and translated to the native data format and are transmitted to any of the plurality of robots or connected devices, or software interfaces, include commands which are initiated or triggered based on: a) data received from any single robot or connected device within the system; or b) data from any plurality of robots or connected devices within the system; or c) data from any additional processing of the data from any single or plurality of robots or connected devices.
14. The method of any of claims 8-13, further comprising: generating a server-based map of an area wherein a single or plurality of robots are deployed, can be translated from the server-based format to an individual robot’s native format, based on the map file’s format requirements and the
dimensions of the robots, and transmitted onboard to each individual robot in that deployment for localization and navigation; wherein the server-based map can be updated by any robot in said
deployment, by an individual robot’s natively formatted map being translated into the format of the server-based map within the footprint specifications of the robot’s onboard sensors that built the map, and transmitted to the server for integration.
15. A robotics deployment of a plurality of robots and connected devices, the deployment comprising: a first robot having a first type of sensor configured to detect a parameter; a second robot having a second type of sensor configured to detect said parameter, wherein said first type of sensor is different than said second type of sensor; and a central controller which receives data associated with said detected parameter from both said first robot and said second robot and which is further configured to translate said data received from said first robot from a first native data format into a common central controller protocol and to translate said data received from said second robot from a second native data format into said common central controller protocol, wherein said first native data format is different than said second native data format.
16. The robotics deployment of claim 15, wherein both said first type of sensor and said second type of sensor detect the same parameter.
17. The robotics deployment of any of claims 15-16, further comprising: one or more static sensors configured to detect said parameter and to communicate the parameter data to said central controller in a third native data format which is different than said first and second native data formats, wherein said central controller is further configured to translate said parameter data received from said one or more static sensors into said common central controller protocol.
18. The robotics deployment of any of claims 15-17, wherein said first and second robots move throughout an operating area and periodically take measurements of said parameter at different positions within said operating area.
19. The robotics deployment of any of claims 15-18, wherein said central controller performs said translations of said parameter data from both said first native data format and said second native data format in a manner which adapts said parameter data to compensate for differences between hardware parameters associated with said first type of sensor and said second type of sensor.
20. A multi-device control interface for controlling a plurality of robots and connected devices, comprising a user interface, including:
a. at least one display window for each of the plurality of devices, the at least one device display feature illustrating one or more conditions of a respective one of the plurality of devices;
b. at least one device command window for each of the plurality of devices, the at least one device command feature configured to receive one or more commands for sending to the respective one of the plurality of devices;
c. a multi-device common window comprised of a fusion of information representing a collaborative workspace between at least one user and more than one concurrently operating device of the plurality of devices, the collaborative workspace including map information added by the at least one user and map information received from the more than one concurrently operating robots of the plurality of devices and configured for presenting the fusion of information as a coherent picture of an emerging map of an environment of the plurality of devices, including overlay of any data collected by the plurality of devices for display on the emerging map of the environment; and
d. a multi-device common window comprised of a fusion of information representing information and insights captured from across more than one concurrently operating device of the plurality of devices through statistical analysis, machine learning, and/or big data analytic techniques, and displayed in a manner most appropriate for the information collected, such as raw numbers, tables, charts, or overlays to maps.
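One way such a collaborative workspace could be represented internally is a layered, position-keyed structure that fuses user annotations with per-robot observations; the sketch below is an assumed data model, not the claimed interface.

```python
from collections import defaultdict

# Assumed data model for the multi-device common window: layered contributions
# keyed by world coordinates, fused into a single picture for display.
workspace = defaultdict(dict)

def add_user_annotation(x: float, y: float, note: str) -> None:
    workspace[(x, y)]["user"] = note

def add_robot_observation(x: float, y: float, robot_id: str, value) -> None:
    workspace[(x, y)].setdefault("robots", {})[robot_id] = value

add_user_annotation(2.0, 3.0, "keep clear: loading dock")
add_robot_observation(2.0, 3.0, "scrubber_1", {"floor": "wet"})
add_robot_observation(2.0, 3.0, "security_2", {"occupancy": 0})

# A renderer would walk this structure and draw each layer on the emerging map.
for position, layers in workspace.items():
    print(position, layers)
```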
PCT/US2018/062196 2017-11-21 2018-11-21 Map-based framework for the integration of robots and smart devices WO2019104133A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/763,710 US10960548B2 (en) 2017-11-21 2018-11-21 Map-based framework for the integration of robots and smart devices
EP18882221.7A EP3713720A4 (en) 2017-11-21 2018-11-21 Map-based framework for the integration of robots and smart devices
US17/204,176 US20210221001A1 (en) 2017-11-21 2021-03-17 Map-based framework for the integration of robots and smart devices

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762589089P 2017-11-21 2017-11-21
US62/589,089 2017-11-21

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/763,710 A-371-Of-International US10960548B2 (en) 2017-11-21 2018-11-21 Map-based framework for the integration of robots and smart devices
US17/204,176 Continuation US20210221001A1 (en) 2017-11-21 2021-03-17 Map-based framework for the integration of robots and smart devices

Publications (1)

Publication Number Publication Date
WO2019104133A1 (en) 2019-05-31

Family

ID=66632147

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/062196 WO2019104133A1 (en) 2017-11-21 2018-11-21 Map-based framework for the integration of robots and smart devices

Country Status (3)

Country Link
US (2) US10960548B2 (en)
EP (1) EP3713720A4 (en)
WO (1) WO2019104133A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190109324A (en) * 2019-07-26 2019-09-25 엘지전자 주식회사 Method, apparatus and system for recommending location of robot charging station
KR20210095359A (en) * 2020-01-23 2021-08-02 엘지전자 주식회사 Robot, control method of the robot, and server for controlling the robot
US11524846B2 (en) 2020-10-19 2022-12-13 Gideon Brothers d.o.o. Pose determination by autonomous robots in a facility context
WO2023168334A1 (en) * 2022-03-02 2023-09-07 Bear Robotics, Inc. Data retention in image-based localization at scale

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7340379B2 (en) 2005-02-18 2008-03-04 Ans, Inc. Automated acquisition and notification system
EP1772746B1 (en) 2005-10-07 2015-03-04 Saab Ab Method and arrangement for data fusion
WO2007098468A1 (en) 2006-02-21 2007-08-30 University Of Florida Research Foundation Inc. Modular platform enabling heterogeneous devices, sensors and actuators to integrate automatically into heterogeneous networks
US9195233B2 (en) 2006-02-27 2015-11-24 Perrone Robotics, Inc. General purpose robotics operating system
KR100877072B1 (en) 2007-06-28 2009-01-07 삼성전자주식회사 Method and apparatus of building map for a mobile robot and cleaning simultaneously
WO2009038772A2 (en) 2007-09-20 2009-03-26 Evolution Robotics Transferable intelligent control device
TWI439671B (en) 2009-12-18 2014-06-01 Ind Tech Res Inst Map building system, building method and computer readable media thereof
DE102010005308A1 (en) 2010-01-21 2011-07-28 Dürr Systems GmbH, 74321 Test system for testing control programs for a robot system
US9146558B2 (en) 2010-11-30 2015-09-29 Irobot Corporation Mobile robot and method of operating thereof
TWI627084B (en) 2011-09-19 2018-06-21 塔塔顧問服務有限公司 A computing platform for development and deployment of sensor-driven vehicle telemetry applications and services
ES2812568T3 (en) 2012-01-25 2021-03-17 Omron Tateisi Electronics Co Autonomous mobile robot to execute work assignments in a physical environment in which there are stationary and non-stationary obstacles
US20160196700A1 (en) 2012-12-18 2016-07-07 Quentin Guhr Universal Software Platform For Work Vehicles
KR102061511B1 (en) 2013-04-26 2020-01-02 삼성전자주식회사 Cleaning robot, home monitoring apparatus and method for controlling the same
US9307368B1 (en) 2013-05-14 2016-04-05 Google Inc. Automatically generating and maintaining a floor plan
US20150338447A1 (en) 2014-05-20 2015-11-26 Allied Telesis Holdings Kabushiki Kaisha Sensor based detection system
US9701018B2 (en) 2014-04-01 2017-07-11 Bot & Dolly, Llc Software interface for authoring robotic manufacturing process
US9756511B1 (en) 2014-05-13 2017-09-05 Senseware, Inc. System, method and apparatus for wireless sensor network configuration
US9420741B2 (en) 2014-12-15 2016-08-23 Irobot Corporation Robot lawnmower mapping
WO2016150500A1 (en) 2015-03-25 2016-09-29 Siemens Aktiengesellschaft Method for operating an automation system, automation system and automation device
EP3185124A1 (en) 2015-12-22 2017-06-28 Tata Consultancy Services Limited System and method for monitoring, deploying, and tracking autonomous software robots
CN106502095B (en) 2016-10-27 2019-11-12 福州大学 A kind of cooperative control method of more industrial robots

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010032268A1 (en) * 1995-05-30 2001-10-18 Brown David W. Distribution of motion control commands over a network
US20080009969A1 (en) 2006-07-05 2008-01-10 Battelle Energy Alliance, Llc Multi-Robot Control Interface
US8073564B2 (en) 2006-07-05 2011-12-06 Battelle Energy Alliance, Llc Multi-robot control interface
US20080133052A1 (en) * 2006-11-29 2008-06-05 Irobot Corporation Robot development platform
US20120271590A1 (en) * 2007-08-30 2012-10-25 Vijay Sakhare Method and apparatus for robot calibrations with a calibrating device
US20130325244A1 (en) * 2011-01-28 2013-12-05 Intouch Health Time-dependent navigation of telepresence robots
WO2013059513A1 (en) * 2011-10-18 2013-04-25 Reconrobotics, Inc. Robot control translation and gateway system
US20130304882A1 (en) * 2012-05-10 2013-11-14 Cognex Corporation Systems and Methods for Dynamically Configuring Communication Data Items
US20150192439A1 (en) * 2014-01-03 2015-07-09 Motorola Mobility Llc Methods and Systems for Calibrating Sensors of a Computing Device
US20170225336A1 (en) 2016-02-09 2017-08-10 Cobalt Robotics Inc. Building-Integrated Mobile Robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3713720A4

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190361623A1 (en) * 2018-05-23 2019-11-28 University-Industry Cooperation Group Of Kyung-Hee University System for providing virtual data storage medium and method of providing data using the same
US10852977B2 (en) * 2018-05-23 2020-12-01 University-Industry Cooperation Group Of Kyung-Hee University System for providing virtual data storage medium and method of providing data using the same
TWI757818B (en) * 2019-08-22 2022-03-11 日商大正天空大樓股份有限公司 Rental system
US20220107633A1 (en) * 2020-10-02 2022-04-07 Dell Products L.P. Transportation Robot Mesh Manufacturing Environment

Also Published As

Publication number Publication date
US10960548B2 (en) 2021-03-30
US20200368913A1 (en) 2020-11-26
EP3713720A4 (en) 2021-08-18
EP3713720A1 (en) 2020-09-30
US20210221001A1 (en) 2021-07-22

Similar Documents

Publication Publication Date Title
US20210221001A1 (en) Map-based framework for the integration of robots and smart devices
US11010501B2 (en) Monitoring users and conditions in a structure
US10671767B2 (en) Smart construction with automated detection of adverse structure conditions and remediation
US10268782B1 (en) System for conducting a service call with orienteering
US10885234B2 (en) Apparatus for determining a direction of interest
US10866157B2 (en) Monitoring a condition within a structure
CA3054299C (en) Improved building model with capture of as built features and experiential data
US11798390B2 (en) Automated robot alert system
US20200250352A1 (en) Spatial self-verifying array of nodes
US10984148B2 (en) Methods for generating a user interface based upon orientation of a smart device
US11481527B2 (en) Apparatus for displaying information about an item of equipment in a direction of interest
CA3054521A1 (en) System for conducting a service call with orienteering
US20190028843A1 (en) Methods and apparatus for orienteering
CA3148476A1 (en) Spatially self-verifying array of nodes
CA3114190A1 (en) Method and apparatus for orienteering
US20240094712A1 (en) Robot staging area management
WO2020091836A1 (en) System for conducting a service call with orienteering

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18882221
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
ENP Entry into the national phase
    Ref document number: 2018882221
    Country of ref document: EP
    Effective date: 20200622