US20210334742A1 - Systems and methods for object replacement - Google Patents

Systems and methods for object replacement

Info

Publication number
US20210334742A1
US20210334742A1 (application US17/366,492)
Authority
US
United States
Prior art keywords
physical objects
replacement
computing system
central computing
facility
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/366,492
Inventor
Ehsan Nazarian
Behzad Nemati
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Walmart Apollo LLC
Original Assignee
Walmart Apollo LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Walmart Apollo LLC filed Critical Walmart Apollo LLC
Priority to US17/366,492
Assigned to WALMART APOLLO, LLC reassignment WALMART APOLLO, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WAL-MART STORES, INC.
Assigned to WAL-MART STORES, INC. reassignment WAL-MART STORES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NAZARIAN, EHSAN, NEMATI, BEHZAD
Publication of US20210334742A1
Priority to US18/243,851 (US20230419252A1)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0088Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0094Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0291Fleet control
    • G05D1/0297Fleet control by controlling means in a control room
    • G06K9/00664
    • G06K9/00671
    • G06K9/00771
    • G06K9/22
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • G06T7/596Depth or shape recovery from multiple images from stereo images from three or more stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/17Image acquisition using hand-held instruments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2201/00Application
    • G05D2201/02Control of position of land vehicles
    • G05D2201/0216Vehicle for transporting goods in a warehouse, factory or similar

Definitions

  • Object replacement can be a slow and error-prone process, causing delays in filling vacant spaces at facilities in which physical objects are designated to be disposed.
  • FIGS. 1A-B are block diagrams illustrating an autonomous robot device navigating in a facility according to exemplary embodiments of the present disclosure.
  • FIG. 2 illustrates a network diagram of an object replacement system in accordance with an exemplary embodiment.
  • FIG. 3 illustrates a block diagram of an exemplary computing device in accordance with an exemplary embodiment.
  • FIG. 4 is a flowchart illustrating a process implemented by the object replacement system according to an exemplary embodiment.
  • FIG. 5 is a flowchart illustrating a process implemented by the object replacement system according to an exemplary embodiment.
  • a central computing system, which can include a data storage facility and can be operatively coupled to remote systems, can be configured to receive data associated with quantities of like physical objects from the remote systems.
  • the central computing system can determine that data corresponding to a first quantity of the like physical objects stored in a first one of the remote systems fails to correspond to data corresponding to a second quantity of the like physical objects stored in at least another one of the remote systems.
  • the central computing system can adjust the data corresponding to the first quantity of the like physical objects stored in the first one of the remote systems based on the data corresponding to the second quantity of the like physical objects stored in the at least another one of the remote systems.
  • the central computing system can generate an expected value for the quantity of the like physical objects at a facility associated with the first one of the remote systems, in response to the execution of the reconciliation of the plurality of quantities of the like physical objects.
  • the central computing system can trigger an alert in response to determining the expected quantity is less than a threshold amount.
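The reconciliation and alert steps above can be sketched as follows; the median-consensus rule, the function names, and the threshold value are illustrative assumptions, not the claimed method:

```python
# Hypothetical sketch of reconciling quantities of like physical objects
# reported by several remote systems, then alerting when the reconciled
# expected value is below a threshold. All names are illustrative.
from statistics import median

ALERT_THRESHOLD = 5  # assumed facility-specific minimum quantity

def reconcile_quantities(quantities):
    """Adjust any quantity that disagrees with the consensus of the others.

    `quantities` maps a remote-system id to its reported count of the like
    physical objects. Returns (adjusted dict, expected value).
    """
    consensus = median(quantities.values())
    adjusted = {
        system: (consensus if count != consensus else count)
        for system, count in quantities.items()
    }
    return adjusted, consensus

def check_alert(expected, threshold=ALERT_THRESHOLD):
    """Trigger an alert when the expected quantity is below the threshold."""
    return expected < threshold
```

An autonomous robot device subscribing to such alerts would then be dispatched to verify the vacancy, as described in the following bullets.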
  • An autonomous robot device can receive the alert indicating the expected value for the quantity of the like physical objects is less than a threshold amount.
  • the autonomous robot device can determine the designated location of the like physical objects within the facility.
  • the autonomous robot device can autonomously navigate to the designated location of the like physical objects, and can detect, via an image capturing device, a vacant space at the designated location at which the like physical objects are supposed to be disposed.
  • the autonomous robot device using the image capturing device, can capture an image of the vacant space.
  • the autonomous robot device can transmit the image to the central computing system, which can extract the physical attributes of the vacant space.
  • the central computing system can query a database to retrieve attributes associated with the like physical objects for the designated location.
  • the central computing system can determine a set of replacement physical objects based on the physical attributes of the vacant space and the attributes associated with the like physical objects that are supposed to be in the vacant space.
  • the physical attributes of the vacant space include the shape, size and dimensions of the vacant space.
  • the central computing system can generate a replaceability score for each of the plurality of replacement physical objects in the set based on a calculated probability that the like physical objects are replaceable by the replacement physical objects in the set.
  • replacement physical objects that have similar dimensions to the vacant space and/or the absent physical objects can result in a higher replaceability score.
  • replacement physical objects having similar ingredients, functions, or uses can result in a higher replaceability score.
  • the central computing system can rank the replacement physical objects based on the replaceability score.
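The scoring and ranking described in the preceding bullets might look like the following sketch; the per-axis dimensional fit, the Jaccard overlap of uses, and the equal weighting are assumptions for illustration:

```python
# Illustrative replaceability scoring: dimensional fit against the vacant
# space plus overlap of attributes such as ingredients, functions, or uses.
# The 50/50 weighting and field names are assumptions.
def replaceability_score(vacant, absent_attrs, candidate):
    """Probability-like score in [0, 1] that `candidate` can replace the
    absent objects in the vacant space."""
    # Dimensional fit: per-axis ratio of candidate size to vacant-space size,
    # scoring zero on any axis where the candidate is too large to fit.
    fits = []
    for axis in ("width", "height", "depth"):
        c, v = candidate["dims"][axis], vacant["dims"][axis]
        fits.append(0.0 if c > v else c / v)
    dim_score = sum(fits) / len(fits)
    # Attribute overlap (Jaccard) between absent and candidate objects.
    a, b = set(absent_attrs["uses"]), set(candidate["uses"])
    attr_score = len(a & b) / len(a | b) if (a | b) else 0.0
    return 0.5 * dim_score + 0.5 * attr_score

def rank_replacements(vacant, absent_attrs, candidates):
    """Rank candidate replacement objects by descending replaceability score."""
    return sorted(
        candidates,
        key=lambda c: replaceability_score(vacant, absent_attrs, c),
        reverse=True,
    )
```

Candidates that both fit the vacant space and share uses with the absent objects rank highest, matching the scoring criteria in the bullets above.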
  • the central computing system can transmit instructions to the autonomous robot device to retrieve a set of like replacement physical objects from a location in the facility based on the ranking and to deposit the set of like replacement physical objects in the vacant space at the designated location.
  • the autonomous robot device can receive the instructions from the central computing system, can navigate to the location in the facility of the set of like replacement objects, can pick up the set of like replacement objects, and can navigate to the designated location.
  • the autonomous robot device can deposit the set of like replacement objects in the vacant space at the designated location to fill the vacant space.
  • FIGS. 1A-B are block diagrams illustrating an autonomous robot device navigating in the facility according to exemplary embodiments of the present disclosure.
  • physical objects 102 A can be disposed in a first area 100 of a facility.
  • the physical objects 102 A can be disposed on a shelving unit 104 .
  • a label 106 can be disposed below the physical objects 102 A.
  • the label 106 can include a string of alphanumeric characters and/or a machine-readable element 108 encoded with an identifier associated with the physical object disposed above the corresponding label 106 .
  • An autonomous robot device 110 can navigate autonomously to the first area 100 of the facility.
  • the autonomous robot device 110 can be a driverless vehicle, an unmanned aerial craft, and/or the like.
  • the autonomous robot device 110 can include an image capturing device 112 , motive assemblies 114 , a picking unit 115 , a controller 116 , an optical scanner 118 , a drive motor 120 , a GPS receiver 122 , accelerometer 124 and a gyroscope 126 , and can be configured to roam autonomously through a facility.
  • the picking unit 115 can be an articulated arm.
  • the autonomous robot device 110 can be an intelligent device capable of performing tasks without human control.
  • the controller 116 can be programmed to control an operation of the image capturing device 112 and the motive assemblies 114 (e.g., via the drive motor 120), in response to various inputs including inputs from the GPS receiver 122, the accelerometer 124, and the gyroscope 126.
  • the drive motor 120 can control the operation of the motive assemblies 114 directly and/or through one or more drive trains (e.g., gear assemblies and/or belts).
  • the motive assemblies 114 are wheels affixed to the bottom end of the autonomous robot device 110.
  • the motive assemblies 114 can be but are not limited to wheels, tracks, rotors, rotors with blades, and propellers.
  • the motive assemblies 114 can facilitate 360 degree movement for the autonomous robot device 110.
  • the image capturing device 112 can be a still image camera or a moving image camera.
  • the controller 116 of the autonomous robot device 110 can be configured to control the drive motor 120 to drive the motive assemblies 114 so that the autonomous robot device 110 can autonomously navigate through the facility based on inputs from the GPS receiver 122 , accelerometer 124 and gyroscope 126 .
  • the GPS receiver 122 can be an L-band radio processor capable of solving the navigation equations to determine the position, velocity, and precise time (PVT) of the autonomous robot device 110 by processing the signals broadcast by GPS satellites.
  • the accelerometer 124 and gyroscope 126 can determine the direction, orientation, position, acceleration, velocity, tilt, pitch, yaw, and roll of the autonomous robot device 110 .
  • the controller 116 can implement one or more algorithms, such as a Kalman filter, for determining a position of the autonomous robot device 110.
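As one illustration of such an algorithm, a one-dimensional Kalman filter step is sketched below; a real controller would track more states (position, velocity, heading), and the noise variances here are assumed values:

```python
# Illustrative 1-D Kalman filter step of the kind the controller could run
# to fuse a GPS position fix with dead-reckoned motion from the accelerometer
# and gyroscope. The process/measurement noise values are assumptions.
def kalman_step(x_est, p_est, motion, z_gps, q=0.1, r=1.0):
    """One predict/update cycle.

    x_est, p_est: prior position estimate and its variance
    motion:       predicted displacement since the last step (odometry)
    z_gps:        new GPS position measurement
    q, r:         process and measurement noise variances (assumed)
    """
    # Predict: apply the motion model and grow the uncertainty.
    x_pred = x_est + motion
    p_pred = p_est + q
    # Update: blend in the GPS measurement, weighted by the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z_gps - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new
```

Each cycle leaves the estimate between the odometry prediction and the GPS fix, with the variance shrinking as measurements accumulate.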
  • the autonomous robot device 110 can roam to the first area 100 in the facility using the motive assemblies 114 and the controller 116 can control the image capturing device 112 to capture images of the set of physical objects 102 A in a designated location 109 and the respective labels 106 including the string and/or machine-readable elements 108 .
  • the autonomous robot device 110 can be programmed with a map of the facility and/or can generate a map of the facility using simultaneous localization and mapping (SLAM).
  • the autonomous robot device 110 can navigate around the facility based on inputs from the GPS receiver 122 , the accelerometer 124 , and/or the gyroscope 126 .
  • the autonomous robot device 110 can be configured to capture images after an amount of time elapses between captures, after a distance traveled within the facility, continuously, and/or the like.
  • the autonomous robot device 110 can determine from the captured image of the designated location 109 that the set of like physical objects 102 A is absent from the shelving unit 104 at the designated location 109 , i.e., there is a vacant space at the designated location 109 .
  • the autonomous robot device 110 can use machine vision to determine the set of like physical objects 102 A is absent from the designated location 109 in the shelving unit. Machine vision can be used to provide imaging-based automatic inspection and analysis of the facility.
  • the autonomous robot device 110 can extract the identifier from the machine-readable element 108 disposed adjacent to the vacant space, and associated with the absent set of like physical objects 102 A, from the captured image using machine vision. Alternatively, or in addition, the autonomous robot device 110 can extract the identifier by scanning the machine-readable element 108 using the optical scanner 118. The autonomous robot device 110 can transmit the identifier to a computing system. The autonomous robot device 110 can also transmit the captured images and/or the detected attributes associated with the absent physical objects 102 A. The computing system will be discussed in greater detail with reference to FIG. 5.
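A minimal sketch of the vacancy check and identifier extraction, assuming a grayscale pixel-list image representation and a "SKU:" label format (both assumptions; production machine vision would use trained detectors and barcode decoders):

```python
# Hypothetical vacant-space check: a bare shelf region is nearly uniform
# (low pixel variance), while packaged objects produce high variance.
# The threshold and grayscale-list representation are assumptions.
def region_variance(pixels):
    """Variance of a list of grayscale pixel values (0-255)."""
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

def is_vacant(region_pixels, vacancy_threshold=50.0):
    """True when the designated location appears bare, i.e. vacant."""
    return region_variance(region_pixels) < vacancy_threshold

def extract_identifier(label_text):
    """Pull the alphanumeric identifier out of a decoded label string,
    e.g. 'SKU:ABC123' -> 'ABC123' (the label format is an assumption)."""
    return label_text.split(":", 1)[1] if ":" in label_text else label_text
```

The identifier recovered from the label adjacent to the vacant space is what the robot transmits to the computing system in the bullet above.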
  • replacement physical objects 102 B can be disposed at a second area 150 of the facility. Similar to the physical objects 102 A disposed in the first area 100, the physical objects 102 B can be disposed on a shelving unit 104. A label 106 can be disposed below the physical objects 102 B. The label 106 can include a string of alphanumeric characters and/or a machine-readable element 108 encoded with an identifier associated with the physical object disposed above the corresponding label 106.
  • the autonomous robot device 110 can receive instructions from the computing system to navigate to a second area 150 of the facility and pick up a set of replacement physical objects 102 B to deposit in the designated location of the set of absent physical objects 102 A from the first location of the facility.
  • the instructions can include identification information associated with the replacement physical object 102 B, and a quantity of the replacement physical object to be picked up by the autonomous robot device 110 .
  • the identification information can include an identifier associated with the replacement physical object 102 B or other attributes (e.g., name, size, type, or color) associated with the replacement physical object 102 B.
  • the autonomous robot device 110 can navigate to the second location 150 of the facility.
  • the autonomous robot device 110 can capture images of the physical objects 102 B disposed on the shelving unit 104 .
  • the autonomous robot device 110 can extract attributes of the replacement physical object 102 B from the captured images.
  • the autonomous robot device 110 can identify the replacement physical object based on the attributes extracted from the captured images and the identification information received in the instructions.
  • the autonomous robot device 110 can also use machine vision to identify the replacement physical object 102 B.
  • the autonomous robot device 110 can extract the identifier from the machine-readable element 108 associated with the set of replacement physical objects 102 B from the captured image using machine vision.
  • the autonomous robot device 110 can extract the identifier by scanning the machine-readable element 108 using the optical scanner 118 .
  • the autonomous robot device 110 can pick up a set of replacement physical objects 102 B, using the picking unit 115 .
  • the autonomous robot device 110 can carry the set of replacement physical objects 102 B and navigate to the first area 100.
  • the autonomous robot device 110 can deposit the set of replacement physical objects 102 B in the designated location (e.g. designated area 109 as shown in FIG. 1A ) of the absent physical objects 102 A.
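The instruction payload and the match check the robot performs before picking might be modeled as below; the field names and identifier value are hypothetical:

```python
# Hypothetical shape of the instructions the robot receives, and the check it
# performs: attributes extracted from the captured image must match the
# identification information in the instructions before picking.
def matches_instruction(extracted_attrs, instruction):
    """True when the object seen on the shelf matches the instructed
    replacement object by identifier or by the listed attributes."""
    ident = instruction["identification"]
    if "identifier" in ident and "identifier" in extracted_attrs:
        return extracted_attrs["identifier"] == ident["identifier"]
    keys = ("name", "size", "type", "color")
    return all(
        extracted_attrs.get(k) == ident[k] for k in keys if k in ident
    )

# Example payload (all values are illustrative assumptions).
instruction = {
    "identification": {"identifier": "RPL-102B"},
    "quantity": 4,
    "pickup_area": "second area 150",
    "deposit_location": "designated location 109",
}
```

Only after this check succeeds would the robot pick up the instructed quantity and navigate back to the vacant space.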
  • FIG. 2 illustrates a network diagram of an object replacement system in accordance with an exemplary embodiment.
  • the object replacement system 250 can include one or more databases 205 , one or more central computing systems 200 , one or more autonomous robotic devices 110 , and one or more remote systems 240 communicating over communication network 215 .
  • the remote systems 240 can include a remote system database 242 .
  • the central computing system 200 can execute one or more instances of a control engine 220 and a decision engine 225 .
  • the control engine 220 and decision engine 225 can be an executable application residing on the computing system 200 to implement the object replacement system 250 as described herein.
  • one or more portions of the communications network 215 can be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless wide area network (WWAN), a metropolitan area network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a WiFi network, a WiMax network, any other type of network, or a combination of two or more such networks.
  • the central computing system 200 includes one or more computers or processors configured to communicate with the databases 205 , autonomous robotic devices 110 and remote systems 240 , via the network 215 .
  • the computing system 200 hosts one or more applications configured to interact with one or more components of the object replacement system 250 .
  • the databases 205 may store information/data, as described herein.
  • the databases 205 can include an events data storage facility 232, a physical objects database 230, and a facilities database 235.
  • the events data storage facility 232 can store information associated with events.
  • the physical objects database 230 can store information associated with physical objects.
  • the facilities database 235 can store information associated with facilities.
  • the databases 205 can be located at one or more geographically distributed locations from the central computing system 200 . Alternatively, the databases 205 can be included within the computing system 200 .
  • multiple remote systems 240 can be in communication with a decision engine 225 residing on a central computing system.
  • Each of the remote systems 240 can be in communication with a remote system database 242 and the decision engine 225 can be in communication with an event data storage facility 232 .
  • Each of the remote systems databases 242 can store data associated with the respective remote system 240 .
  • Various events can occur in different facilities.
  • the events can include actions occurring at one or more remote systems 240 .
  • the actions can be database actions associated with data of physical objects disposed at the facility.
  • the actions can be executed by the one or more remote systems on the remote system databases 242 of the respective one or more remote systems 240.
  • the events can be transmitted from the one or more remote systems 240 to the central computing system 200 .
  • the central computing system 200 can execute the decision engine 225 in response to receiving the events which occur at remote systems 240 .
  • the decision engine 225 can store the received events in the event data storage facility 232.
  • the decision engine 225 can determine whether the actions that occurred at the one or more remote systems 240 from which the events were transmitted prompt any actions which need to be taken on data associated with other remote systems 240.
  • the decision engine 225 can transmit instructions to each of the other remote systems 240 to execute actions on the data stored in the respective databases, based on the actions associated with data which have taken place at the one or more remote systems 240 which transmitted the events.
  • the other remote systems 240 can execute the actions based on the received instructions.
  • the actions can be deleting data, inserting data, merging data, and/or any other database related action.
  • the actions can be adjusting quantities of physical objects disposed at facilities at remote system databases 242 which store data associated with the respective physical objects.
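The database actions named above could be applied to a remote system database as in this sketch, which models the database as a dict of quantity records (an assumption for illustration):

```python
# Sketch (assumed schema) of executing instructed database actions on a
# remote system database, modeled as a dict mapping product identifiers
# to records. Action names are illustrative.
def execute_action(db, action):
    """Apply one instructed action: insert, delete, merge, or quantity
    adjustment of the kind the decision engine transmits."""
    kind = action["kind"]
    if kind == "insert":
        db[action["id"]] = action["record"]
    elif kind == "delete":
        db.pop(action["id"], None)
    elif kind == "merge":
        db.setdefault(action["id"], {}).update(action["record"])
    elif kind == "adjust_quantity":
        rec = db.setdefault(action["id"], {"quantity": 0})
        rec["quantity"] = rec.get("quantity", 0) + action["delta"]
    return db
```

A quantity adjustment with a negative delta is the case the following bullets build on: it can leave a set of like physical objects absent from the facility.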
  • the central computing system 200 can determine that a quantity of a set of like physical objects designated to be disposed at a facility is absent from the facility, based on the adjustment of data corresponding to the quantities of physical objects at remote system databases 242.
  • the central computing system 200 can execute the control engine 220 in response to determining the quantity of the set of like physical objects designated to be disposed at a facility is absent from the facility.
  • the control engine 220 can query the physical objects database 230 to retrieve identification information of the physical object along with the designated location of the set of like physical objects in the facility.
  • the control engine 220 can instruct an autonomous robotic device 110 disposed in the facility to verify that the set of like physical objects is absent from the facility.
  • the instructions can include the identification information of the set of like physical objects and the location of the set of like physical objects in the facility.
  • the autonomous robot device 110 can navigate to the designated location of the set of like physical objects in the facility.
  • the controller 116 can control the image capturing device 112 to capture images of the designated location of the set of like physical objects and the respective labels including the string and/or machine-readable elements.
  • the autonomous robot device 110 can determine from the captured image that the set of like physical objects 102 is absent from the designated location.
  • the autonomous robot device 110 can use machine vision to determine the set of like physical objects is absent from the designated location. Machine vision can be used to provide imaging-based automatic inspection and analysis of the facility.
  • the autonomous robot device 110 can extract the identifier from the machine-readable element disposed adjacent to the designated location, and associated with the absent set of like physical objects from the captured image using machine vision.
  • the autonomous robot device can extract the identifier by scanning the machine-readable element using the optical scanner 118 .
  • the autonomous robot device 110 can transmit a verification that the set of like physical objects is absent from the designated location.
  • the autonomous robot device 110 can also transmit the captured images and/or the detected attributes associated with the designated location of the absent set of like physical objects to the central computing system 200 .
  • the attributes can include size, dimensions and proximity to other physical objects.
  • the control engine 220 can receive the verification that the set of like physical objects is absent from the designated location.
  • the control engine 220 can also receive the captured images and/or detected attributes of the designated location of the set of like physical objects. In the event the control engine 220 receives images of the designated location, the control engine 220 can extract attributes from the captured images of the designated location. The attributes can include size, dimensions and proximity to other physical objects.
  • the control engine 220 can query the physical objects database 230 to retrieve information associated with the absent set of like physical objects. The information can include name, type, size, dimensions, color, and other information associated with the physical objects.
  • the control engine 220 can also query the facilities database 235 to determine rules associated with physical objects disposed in the facility.
  • the control engine 220 can determine a replacement physical object based on the information associated with the absent set of like physical objects, the attributes associated with the designated location, and the rules of the facility associated with physical objects disposed in the facility.
  • the control engine 220 can query the physical objects database 230 to determine a quantity of the replacement physical object disposed at the facility, the identification information of the replacement physical object, and location of replacement physical object disposed at the facility.
  • the control engine 220 can instruct the autonomous robotic device to retrieve a specified quantity of the replacement physical object from the location of the replacement physical object in the facility, and deposit the specified quantity of the replacement physical object at the designated location of the absent set of like physical objects.
  • the control engine 220 can determine a group of replacement physical objects.
  • the control engine 220 can generate a replaceability score for each of the group of replacement objects based on the similarity to the absent set of physical objects, the attributes associated with the designated location and the rules of the facility associated with physical objects disposed in the facility.
  • the control engine 220 can rank each of the replacement physical objects based on the replaceability score.
  • the control engine 220 can select a replacement physical object from the group of replacement physical objects based on rank and quantity of replacement physical objects disposed in the facility.
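The selection step might be sketched as follows, assuming the candidates arrive ranked best-first and stock levels come from the physical objects database (all names are illustrative):

```python
# Hypothetical selection: walk the ranked replacement candidates and take
# the first one with enough quantity disposed in the facility to fill
# the vacant space.
def select_replacement(ranked_candidates, stock, needed_qty):
    """`ranked_candidates` is ordered best-first; `stock` maps a candidate id
    to the quantity disposed in the facility. Returns the chosen id or None."""
    for candidate in ranked_candidates:
        if stock.get(candidate, 0) >= needed_qty:
            return candidate
    return None
```

Combining rank with available quantity ensures the robot is never dispatched for a highly ranked replacement that is itself out of stock.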
  • the autonomous robotic device 110 can receive the instructions from the control engine 220 .
  • the instructions can include identification information associated with the replacement physical object, and a quantity of the replacement physical object to be picked up by the autonomous robot device 110 .
  • the identification information can include an identifier associated with the replacement physical object or other attributes (e.g., name, size, type, or color) associated with the replacement physical object.
  • the autonomous robot device 110 can navigate to the location of the facility where the replacement object is disposed.
  • the autonomous robot device 110 can capture images of the physical objects at the location where the replacement physical object is disposed.
  • the autonomous robot device 110 can extract attributes of the replacement physical object from the captured images.
  • the autonomous robot device 110 can identify the replacement physical object based on the attributes extracted from the captured images and the identification information received in the instructions.
  • the autonomous robot device 110 can also use machine vision to identify the replacement physical object. In response to confirming the replacement physical object is present, the autonomous robot device 110 can pick up a set of replacement physical objects, based on the quantity received in the instructions. The autonomous robot device 110 can carry the set of replacement physical objects and navigate to the designated location of the absent set of like physical objects. The autonomous robot device 110 can deposit the set of replacement physical objects in the designated location of the absent physical objects.
  • the object replacement system 250 can be implemented in a retail store and/or e-commerce environment.
  • the remote systems 240 can be associated with one or more retail stores or e-commerce websites.
  • Each of the remote systems 240 can be associated with a remote system database 242 and each of the aforementioned remote systems 240 can generate events.
  • a Sales/Returns remote system 240 can generate an event of a sale of a product. Data associated with the sale of the product can be committed to the remote system database 242 associated with the Sales/Returns remote system 240 .
  • the event can be transmitted from the Sales/Returns remote system 240 to the central computing system 200.
  • the event can include data associated with the sale of the product including the identification of the product and the quantity of the product that has been sold.
  • the decision engine 225 can determine the remote systems 240 affected by the sale of product.
  • the decision engine 225 can determine the Inventory Adjustment remote system 240 is affected by the sale of the product, as the inventory of the product should be decreased by the quantity of product sold based on the received event.
  • the decision engine 225 can also determine, based on an adjustment to the inventory of the product, that the quantity of the product disposed in the retail store is less than a threshold amount and that a purchase order needs to be generated for more of the product for the facility.
  • the decision engine 225 can determine the PO Create/Update remote system 240 is also affected by the event of the sale of the product.
  • the decision engine 225 can transmit instructions to the Inventory Adjustment remote system 240 and the PO create/update remote system 240 to update the respective remote system databases 242 .
  • the Inventory Adjustment remote system 240 can adjust the inventory of the sold product in the remote system database 242 associated with the Inventory Adjustment remote system 240 , in response to receiving instructions from the decision engine 225 .
  • the PO create/update remote system 240 can generate and store a new purchase order in the remote system database 242 associated with the PO create/update remote system 240 , in response to receiving instructions from the decision engine 225 .
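The event-routing walkthrough above (a sale decrements inventory, and a purchase order is generated when stock falls below a threshold) could be sketched roughly as follows. All names, the `Event` structure, and the reorder threshold are hypothetical illustrations, not part of the claimed embodiments:

```python
from dataclasses import dataclass

REORDER_THRESHOLD = 10  # assumed facility-level reorder point


@dataclass
class Event:
    source: str       # remote system that generated the event, e.g. "Sales/Returns"
    product_id: str
    quantity: int     # units sold


def route_event(event: Event, inventory: dict) -> list:
    """Determine which remote systems are affected by a sale event.
    The Inventory Adjustment system always decrements stock; the
    PO Create/Update system is affected only when the adjusted
    quantity falls below the reorder threshold."""
    instructions = []
    inventory[event.product_id] -= event.quantity
    instructions.append(("Inventory Adjustment", event.product_id, -event.quantity))
    if inventory[event.product_id] < REORDER_THRESHOLD:
        instructions.append(("PO Create/Update", event.product_id, "generate purchase order"))
    return instructions
```

In this sketch the returned tuples stand in for the instructions the decision engine 225 would transmit to each affected remote system 240.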
  • the updated data can trigger a change in a sales forecast and demand forecast associated with the product.
  • the control engine 220 can determine that products like the sold product are now absent from the facility based on the Inventory Adjustment remote system 240 adjusting the inventory of the product in the remote system database 242 .
  • the control engine 220 can query the physical objects database 230 to retrieve identification information of the product along with the designated location of the product in the retail store.
  • the control engine 220 can instruct an autonomous robotic device 110 disposed in the facility to verify the product is now absent from the retail store.
  • the autonomous robot device 110 can navigate to the designated location of the product.
  • the autonomous robot device 110 can determine from the captured image that the set of like physical objects 102 is absent from the designated location.
  • the autonomous robot device 110 can transmit a verification that the product is absent from the designated location.
  • the autonomous robot device 110 can also transmit the captured images and/or the detected attributes associated with the designated location of the absent product to the central computing system 200 .
  • the attributes can include size, dimensions, proximity to other physical objects, demand forecast, sales forecast, and vendor pack size rules.
  • the control engine 220 can receive the verification that the product is absent from the designated location.
  • the control engine 220 can also receive the captured images and/or detected attributes of the designated location of the product. In the event the control engine 220 receives images of the designated location, the control engine 220 can extract attributes from the captured images of the designated location.
  • the control engine 220 can query the physical objects database 230 to retrieve information associated with the absent product. The information can include name, type, size, dimensions, color, and other information associated with the product.
  • the control engine 220 can also query the facilities database 235 to determine rules associated with the products disposed in the retail store. For example, the rules can control the display of age-restricted products such as alcohol and cigarettes.
  • the control engine 220 can determine a replacement product based on the information associated with the absent product, the attributes associated with the designated location, and the rules of the facility associated with physical objects disposed in the facility. For example, in the event the absent product is a 12-pack of Coca-Cola, the control engine 220 can determine that two 6-packs of Pepsi can be a replacement product, as two 6-packs of Pepsi are a similar product to a 12-pack of Coca-Cola and have the same size, shape, and dimensions as a 12-pack of Coca-Cola. The control engine 220 can query the physical objects database 230 to determine a quantity of the replacement product disposed at the facility, the identification information of the replacement product, and the location of the replacement product disposed at the retail store.
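The replacement-selection logic just described (match product category, match the footprint of the absent product, and respect facility rules such as age-restricted display rules) could be sketched as below. The field names, the tuple form of the dimensions, and the rule-as-predicate representation are illustrative assumptions only:

```python
def find_replacement(absent: dict, candidates: list, facility_rules: list):
    """Pick a replacement product whose category matches the absent
    product, whose dimensions match the vacated space, and which no
    facility rule excludes. Each rule is a predicate returning True
    when a candidate must be excluded (e.g. age-restricted items)."""
    for candidate in candidates:
        if candidate["category"] != absent["category"]:
            continue  # not a similar product
        if candidate["dimensions"] != absent["dimensions"]:
            continue  # would not fit the designated location
        if any(rule(candidate) for rule in facility_rules):
            continue  # excluded by a facility rule
        return candidate
    return None  # no suitable replacement found
```

A dimensional tolerance, rather than strict equality, would likely be used in practice; exact matching keeps the sketch short.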
  • the control engine 220 can instruct the autonomous robotic device to retrieve a specified quantity of the replacement product from the location of the replacement product in the retail store, and deposit the specified quantity of the replacement product at the designated location of the absent product.
  • the autonomous robotic device 110 can receive the instructions from the control engine 220 .
  • the instructions can include identification information associated with the replacement product, location of the replacement product in the retail store and a quantity of the replacement product to be picked up by the autonomous robot device 110 .
  • the identification information can include an identifier associated with the replacement product or other attributes (e.g., name, size, type, or color) associated with the replacement product.
  • the autonomous robot device 110 can navigate to the location of the retail store where the replacement product is disposed. In response to confirming the replacement product is present, the autonomous robot device 110 can pick up a set of replacement products, based on the quantity received in the instructions.
  • the autonomous robot device 110 can carry the set of replacement products and navigate to the designated location of the absent product.
  • the autonomous robot device 110 can deposit the set of replacement products in the designated location of the absent product.
  • FIG. 3 is a block diagram of an exemplary computing device suitable for implementing embodiments of the object replacement system.
  • the computing device may be, but is not limited to, a smartphone, laptop, tablet, desktop computer, server or network appliance.
  • the computing device 300 can be embodied as the central computing system, remote system, and/or autonomous robot device.
  • the computing device 300 includes one or more non-transitory computer-readable media for storing one or more computer-executable instructions or software for implementing exemplary embodiments.
  • the non-transitory computer-readable media may include, but are not limited to, one or more types of hardware memory, non-transitory tangible media (for example, one or more magnetic storage disks, one or more optical disks, one or more flash drives, one or more solid state disks), and the like.
  • memory 306 included in the computing device 300 may store computer-readable and computer-executable instructions or software (e.g., applications 330 such as the decision engine 225 and the control engine 220 ) for implementing exemplary operations of the computing device 300 .
  • the computing device 300 also includes configurable and/or programmable processor 302 and associated core(s) 304 , and optionally, one or more additional configurable and/or programmable processor(s) 302 ′ and associated core(s) 304 ′ (for example, in the case of computer systems having multiple processors/cores), for executing computer-readable and computer-executable instructions or software stored in the memory 306 and other programs for implementing exemplary embodiments of the present disclosure.
  • Processor 302 and processor(s) 302 ′ may each be a single core processor or multiple core ( 304 and 304 ′) processor. Either or both of processor 302 and processor(s) 302 ′ may be configured to execute one or more of the instructions described in connection with computing device 300 .
  • Virtualization may be employed in the computing device 300 so that infrastructure and resources in the computing device 300 may be shared dynamically.
  • a virtual machine 312 may be provided to handle a process running on multiple processors so that the process appears to be using only one computing resource rather than multiple computing resources. Multiple virtual machines may also be used with one processor.
  • Memory 306 may include a computer system memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like. Memory 306 may include other types of memory as well, or combinations thereof.
  • the computing device 300 can receive data from input/output devices such as, a reader 334 and an image capturing device 332 .
  • a user may interact with the computing device 300 through a visual display device 314 , such as a computer monitor, which may display one or more graphical user interfaces 316 , and through a multi-touch interface 320 and a pointing device 318 .
  • the computing device 300 may also include one or more storage devices 326 , such as a hard-drive, CD-ROM, or other computer readable media, for storing data and computer-readable instructions and/or software that implement exemplary embodiments of the present disclosure (e.g., applications such as the decision engine 225 and the control engine 220 ).
  • exemplary storage device 326 can include one or more databases 328 for storing information regarding the physical objects, facilities and events.
  • the databases 328 may be updated manually or automatically at any suitable time to add, delete, and/or update one or more data items in the databases.
  • the computing device 300 can include a network interface 308 configured to interface via one or more network devices 324 with one or more networks, for example, Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56 kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above.
  • the computing system can include one or more antennas 322 to facilitate wireless communication (e.g., via the network interface) between the computing device 300 and a network and/or between the computing device 300 and other computing devices.
  • the network interface 308 may include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the computing device 300 to any type of network capable of communication and performing the operations described herein.
  • the computing device 300 may run any operating system 310 , such as any of the versions of the Microsoft® Windows® operating systems, the different releases of the Unix and Linux operating systems, any version of the MacOS® for Macintosh computers, any embedded operating system, any real-time operating system, any open source operating system, any proprietary operating system, or any other operating system capable of running on the computing device 300 and performing the operations described herein.
  • the operating system 310 may be run in native mode or emulated mode.
  • the operating system 310 may be run on one or more cloud machine instances.
  • FIG. 4 is a flowchart illustrating a process implemented by an object replacement system according to an exemplary embodiment.
  • a central computing system (e.g. central computing system 200 as shown in FIG. 2 ), which can include a data storage facility (e.g. events data storage facility 232 as shown in FIGS. 2-3 and physical objects database 230 and facilities database 235 as shown in FIG. 3 ) and can be operatively coupled to remote systems (e.g. remote systems 240 as shown in FIGS. 2-3 ), receives data associated with quantities of like physical objects (e.g. physical objects 102 A-B as shown in FIGS. 1A-B ).
  • the central computing system can determine a first quantity of the like physical objects stored in a first one of the remote systems fails to correspond to a second quantity of the like physical objects stored in at least another one of the remote systems.
  • the central computing system can adjust data corresponding to the first quantity of the like physical objects stored in the first one of the remote systems based on data corresponding to the second quantity of the like physical objects stored in the at least another one of the remote systems.
  • the central computing system can generate an expected quantity of the like physical objects at a facility associated with the first one of the remote systems, in response to the execution of the reconciliation of the plurality of quantities of the like physical objects.
  • the central computing system can trigger an alert in response to determining the expected quantity is less than a threshold amount.
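The reconciliation-and-alert steps above can be sketched roughly as follows. This is a simplified illustration under assumed representations (a mapping from remote-system name to recorded quantity, with the first entry taken as the facility's own record); it is not the claimed method itself:

```python
def reconcile_and_alert(quantities: dict, threshold: int):
    """Reconcile a product's quantity across remote systems: where the
    first system's record fails to correspond to another system's
    record, adjust the first record based on the other record. Then
    return the expected quantity and whether an alert should be
    triggered because it is below the threshold."""
    systems = list(quantities)
    first = systems[0]
    for other in systems[1:]:
        if quantities[first] != quantities[other]:
            quantities[first] = quantities[other]  # adjust toward the other record
    expected = quantities[first]
    return expected, expected < threshold
```

Real reconciliation would presumably weigh which record is authoritative; here the later record simply wins, to keep the control flow visible.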
  • an autonomous robot device (e.g. autonomous robot device 110 as shown in FIGS. 1A-B and 3 ) can receive the alert indicating the expected quantity of the like physical objects is less than a threshold amount.
  • the autonomous robot device can include an image capturing device (e.g. image capturing device 112 as shown in FIGS. 1A and 3 )
  • the autonomous robot device can determine the designated location (e.g. designated location 109 as shown in FIG. 1A ) of the like physical objects within the facility.
  • the autonomous robot device can autonomously navigate to the designated location of the like physical objects.
  • the autonomous robot device can detect, via the image capturing device, a vacant space at the designated location at which the like physical objects are supposed to be disposed.
  • the autonomous robot device using the image capturing device can capture an image of the vacant space.
  • the autonomous robot device can transmit the image to the central computing system.
  • the central computing system can receive the image.
  • the central computing system can extract the physical attributes of the vacant space.
  • the central computing system can query the database to retrieve attributes associated with the like physical objects.
  • the central computing system can determine a set of like replacement physical objects based on the physical attributes of the vacant space and the attributes associated with the physical object.
  • FIG. 5 is a flowchart illustrating a process implemented by an object replacement system according to an exemplary embodiment.
  • a central computing system (e.g. central computing system 200 as shown in FIG. 3 ) can transmit instructions to an autonomous robot device (e.g. autonomous robot device 110 as shown in FIGS. 1A-1B and 3 ) to retrieve a set of like replacement objects (e.g. physical objects 102 as shown in FIGS. 1A-1B ) from a location (e.g. second location 150 as shown in FIG. 1B ) in the facility.
  • the autonomous robot device can receive the instructions from the central computing system.
  • the autonomous robot device can navigate to the location in the facility of the set of like replacement objects.
  • the autonomous robot device can pick up the set of like replacement objects.
  • the autonomous robot device can navigate to the designated location of the like physical objects.
  • the autonomous robot device can deposit the set of like replacement objects in the vacant space at the designated location of the like physical objects.
  • Exemplary flowcharts are provided herein for illustrative purposes and are non-limiting examples of methods.
  • One of ordinary skill in the art will recognize that exemplary methods may include more or fewer steps than those illustrated in the exemplary flowcharts, and that the steps in the exemplary flowcharts may be performed in a different order than the order shown in the illustrative flowcharts.

Abstract

Some embodiments provide systems and methods to enable object replacement. A central computing system can receive data associated with quantities of like physical objects from remote systems. The central computing system can adjust the first quantity of the like physical objects stored in the first one of the remote systems based on the second quantity of the like physical objects stored in the at least another one of the remote systems. The central computing system can determine the like physical objects are absent from the facility. An autonomous robot device can detect a vacant space at the designated location at which the like physical objects are supposed to be disposed. The autonomous robot device using the image capturing device can capture an image of the vacant space. The central computing system can determine a set of like replacement physical objects to be disposed in the vacant space.

Description

    CROSS-REFERENCE TO RELATED PATENT APPLICATION
  • This application is a continuation of U.S. application Ser. No. 16/013,469 filed Jun. 20, 2018, which claims the benefit of U.S. Provisional Application No. 62/522,883 filed on Jun. 21, 2017, all of which are incorporated herein by reference in their entirety.
  • BACKGROUND
  • Object replacement can be a slow and error prone process causing delays in filling vacant spaces in facilities, in which physical objects are designated to be disposed.
  • BRIEF DESCRIPTION OF DRAWINGS
  • Illustrative embodiments are shown by way of example in the accompanying drawings and should not be considered as a limitation of the present disclosure. The accompanying figures, which are incorporated in and constitute a part of this specification, illustrate one or more embodiments of the disclosure and, together with the description, help to explain the inventive aspects of the present disclosure. In the figures:
  • FIGS. 1A-B are block diagrams illustrating an autonomous robot device navigating in a facility according to exemplary embodiments of the present disclosure;
  • FIG. 2 illustrates a network diagram of an object replacement system in accordance with an exemplary embodiment;
  • FIG. 3 illustrates a block diagram of an exemplary computing device in accordance with an exemplary embodiment;
  • FIG. 4 is a flowchart illustrating a process implemented by the object replacement system according to an exemplary embodiment; and
  • FIG. 5 is a flowchart illustrating a process implemented by the object replacement system according to an exemplary embodiment.
  • DETAILED DESCRIPTION
  • Described in detail herein are systems and methods for an object replacement system. A central computing system, which can include a data storage facility and can be operatively coupled to remote systems, can be configured to receive data associated with quantities of like physical objects from the remote systems. The central computing system can determine that data corresponding to a first quantity of the like physical objects stored in a first one of the remote systems fails to correspond to data corresponding to a second quantity of the like physical objects stored in at least another one of the remote systems. The central computing system can adjust the data corresponding to the first quantity of the like physical objects stored in the first one of the remote systems based on the data corresponding to the second quantity of the like physical objects stored in the at least another one of the remote systems. The central computing system can generate an expected value for the quantity of the like physical objects at a facility associated with the first one of the remote systems, in response to the execution of the reconciliation of the plurality of quantities of the like physical objects. The central computing system can trigger an alert in response to determining the expected quantity is less than a threshold amount.
  • An autonomous robot device can receive the alert indicating the expected value for the quantity of the like physical objects is less than a threshold amount. The autonomous robot device can determine the designated location of the like physical objects within the facility. The autonomous robot device can autonomously navigate to the designated location of the like physical objects, and can detect, via an image capturing device, a vacant space at the designated location at which the like physical objects are supposed to be disposed. The autonomous robot device, using the image capturing device, can capture an image of the vacant space. The autonomous robot device can transmit the image to the central computing system, which can extract the physical attributes of the vacant space. The central computing system can query a database to retrieve attributes associated with the like physical objects for the designated location. The central computing system can determine a set of replacement physical objects based on the physical attributes of the vacant space and the attributes associated with the physical object that are supposed to be in the vacant space. The physical attributes of the vacant space include the shape, size and dimensions of the vacant space.
  • The central computing system can generate a replaceability score for each of the plurality of replacement physical objects in the set based on a calculated probability that the like physical objects are replaceable by the replacement physical objects in the set. As one example, replacement physical objects that have similar dimensions to the vacant space and/or the absent physical objects can result in a higher replaceability score. As another example, replacement physical objects having similar ingredients, functions, or uses can result in a higher replaceability score. The central computing system can rank the replacement physical objects based on the replaceability score.
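The replaceability scoring and ranking described above could be sketched as below. The equal weighting of dimensional similarity and shared uses, and all field names, are illustrative assumptions; the disclosure leaves the exact probability calculation open:

```python
def replaceability_score(absent: dict, candidate: dict) -> float:
    """Score a candidate replacement in [0, 1]: replacement objects with
    dimensions matching the vacant space, and with overlapping
    ingredients/functions/uses, score higher."""
    dim_score = 1.0 if candidate["dimensions"] == absent["dimensions"] else 0.0
    shared = set(candidate["uses"]) & set(absent["uses"])
    use_score = len(shared) / max(len(absent["uses"]), 1)
    return 0.5 * dim_score + 0.5 * use_score


def rank_replacements(absent: dict, candidates: list) -> list:
    """Rank candidate replacement objects by descending score."""
    return sorted(candidates, key=lambda c: replaceability_score(absent, c), reverse=True)
```

The top-ranked candidate would then be the set of like replacement physical objects named in the instructions sent to the robot device.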
  • The central computing system can transmit instructions to the autonomous robot device to retrieve a set of like replacement physical objects from a location in the facility based on the ranking and to deposit the set of like replacement physical objects in the vacant space at the designated location. The autonomous robot device can receive the instructions from the central computing system, can navigate to the location in the facility of the set of like replacement objects, can pick up the set of like replacement objects, and can navigate to the designated location. The autonomous robot device can deposit the set of like replacement objects in the vacant space at the designated location to fill the vacant space.
  • FIGS. 1A-B are a block diagrams illustrating an autonomous robot device navigating in the facility according to exemplary embodiments of the present disclosure. With reference to FIG. 1A, physical objects 102A can be disposed in a first area 100 of a facility. The physical objects 102A can be disposed on a shelving unit 104. A label 106 can be disposed below the physical objects 102A. The label 106 can include a string of alphanumeric characters and/or a machine-readable element 108 encoded with an identifier associated with the physical object disposed above the corresponding label 106.
An autonomous robot device 110 can navigate autonomously to the first area 100 of the facility. The autonomous robot device 110 can be a driverless vehicle, an unmanned aerial craft, and/or the like. The autonomous robot device 110 can include an image capturing device 112, motive assemblies 114, a picking unit 115, a controller 116, an optical scanner 118, a drive motor 120, a GPS receiver 122, an accelerometer 124 and a gyroscope 126, and can be configured to roam autonomously through a facility. The picking unit 115 can be an articulated arm. The autonomous robot device 110 can be an intelligent device capable of performing tasks without human control. The controller 116 can be programmed to control an operation of the image capturing device 112 and the motive assemblies 114 (e.g., via the drive motor 120) in response to various inputs, including inputs from the GPS receiver 122, the accelerometer 124, and the gyroscope 126. The drive motor 120 can control the operation of the motive assemblies 114 directly and/or through one or more drive trains (e.g., gear assemblies and/or belts). In this non-limiting example, the motive assemblies 114 are wheels affixed to the bottom end of the autonomous robot device 110. The motive assemblies 114 can be, but are not limited to, wheels, tracks, rotors, rotors with blades, and propellers. The motive assemblies 114 can facilitate 360-degree movement for the autonomous robot device 110. The image capturing device 112 can be a still image camera or a moving image camera.
The controller 116 of the autonomous robot device 110 can be configured to control the drive motor 120 to drive the motive assemblies 114 so that the autonomous robot device 110 can autonomously navigate through the facility based on inputs from the GPS receiver 122, accelerometer 124 and gyroscope 126. The GPS receiver 122 can be an L-band radio processor capable of solving the navigation equations to determine a position, velocity, and precise time (PVT) of the autonomous robot device 110 by processing the signals broadcast by GPS satellites. The accelerometer 124 and gyroscope 126 can determine the direction, orientation, position, acceleration, velocity, tilt, pitch, yaw, and roll of the autonomous robot device 110. In exemplary embodiments, the controller 116 can implement one or more algorithms, such as a Kalman filter, for determining a position of the autonomous robot device 110.
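As a rough illustration of the Kalman filtering mentioned above, a minimal one-dimensional predict/update cycle is sketched below: odometry advances the position estimate, and a GPS fix corrects it. This scalar sketch is purely illustrative; a real controller would run a multi-dimensional filter fusing all of the sensors listed:

```python
def kalman_predict(x: float, p: float, u: float, q: float):
    """Prediction step: move the position estimate x by the commanded
    displacement u (e.g. from drive odometry) and grow the estimate
    variance p by the process noise q."""
    return x + u, p + q


def kalman_update(x: float, p: float, z: float, r: float):
    """Measurement update: fuse the predicted position x (variance p)
    with a GPS measurement z (variance r)."""
    k = p / (p + r)              # Kalman gain: trust in the measurement
    x_new = x + k * (z - x)      # corrected position estimate
    p_new = (1 - k) * p          # reduced uncertainty after the update
    return x_new, p_new
```

With equal prediction and measurement variances the gain is 0.5, so the corrected estimate lands halfway between the odometry prediction and the GPS fix.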
As noted above, physical objects 102A can be disposed on a shelving unit 104 in a facility. The autonomous robot device 110 can roam to the first area 100 in the facility using the motive assemblies 114 and the controller 116 can control the image capturing device 112 to capture images of the set of physical objects 102A in a designated location 109 and the respective labels 106 including the string and/or machine-readable elements 108. The autonomous robot device 110 can be programmed with a map of the facility and/or can generate a map of the facility using simultaneous localization and mapping (SLAM). The autonomous robot device 110 can navigate around the facility based on inputs from the GPS receiver 122, the accelerometer 124, and/or the gyroscope 126. The autonomous robot device 110 can be configured to capture images after an amount of time elapses between captures, after a distance is traveled within the facility, continuously, and/or the like. The autonomous robot device 110 can determine from the captured image of the designated location 109 that the set of like physical objects 102A is absent from the shelving unit 104 at the designated location 109, i.e., there is a vacant space at the designated location 109. The autonomous robot device 110 can use machine vision to determine the set of like physical objects 102A is absent from the designated location 109 in the shelving unit. Machine vision can be used to provide imaging-based automatic inspection and analysis of the facility. The autonomous robot device 110 can extract the identifier from the machine-readable element 108 disposed adjacent to the vacant space, and associated with the absent set of like physical objects 102A, from the captured image using machine vision. Alternatively, or in addition, the autonomous robot device 110 can extract the identifier by scanning the machine-readable element 108 using the optical scanner 118.
The autonomous robot device 110 can transmit the identifier to a computing system. The autonomous robot device 110 can also transmit the captured images and/or the detected attributes associated with the absent physical objects 102A. The computing system will be discussed in greater detail with reference to FIG. 5.
  • With reference to FIG. 1B, replacement physical objects 102B can be disposed at a second area 150 of the facility. Similar to the physical objects 102A disposed in first area 100, the physical objects 102B can be disposed on a shelving unit 104. A label 106 can be disposed below the physical objects 102B. The label 106 can include a string of alphanumeric characters and/or a machine-readable element 108 encoded with an identifier associated with the physical object disposed above the corresponding label 106.
  • The autonomous robot device 110 can receive instructions from the computing system to navigate to a second area 150 of the facility and pick up a set of replacement physical objects 102B to deposit in the designated location of the set of absent physical objects 102A from the first location of the facility. The instructions can include identification information associated with the replacement physical object 102B, and a quantity of the replacement physical object to be picked up by the autonomous robot device 110. The identification information can include an identifier associated with the replacement physical object 102B or other attributes (e.g., name, size, type, or color) associated with the replacement physical object 102B. The autonomous robot device 110 can navigate to the second location 150 of the facility. The autonomous robot device 110 can capture images of the physical objects 102B disposed on the shelving unit 104. The autonomous robot device 110 can extract attributes of the replacement physical object 102B from the captured images. The autonomous robot device 110 can identify the replacement physical object based on the attributes extracted from the captured images and the identification information received in the instructions. The autonomous robot device 110 can also use machine vision to identify the replacement physical object 102B. The autonomous robot device 110 can extract the identifier from the machine-readable element 108 associated with the set of like replacement physical objects 102B from the captured image using machine vision. Alternatively, or in addition, the autonomous robot device 110 can extract the identifier by scanning the machine-readable element 108 using the optical scanner 118. In response to confirming the replacement physical object 102B is present on the shelving unit 104, the autonomous robot device 110 can pick up a set of replacement physical objects 102B, using the picking unit 115.
The autonomous robot device 110 can carry the set of replacement physical objects 102 and navigate to the first area 100. The autonomous robot device 110 can deposit the set of replacement physical objects 102B in the designated location (e.g. designated area 109 as shown in FIG. 1A) of the absent physical objects 102A.
  • FIG. 2 illustrates a network diagram of an object replacement system in accordance with an exemplary embodiment. The object replacement system 250 can include one or more databases 205, one or more central computing systems 200, one or more autonomous robotic devices 110, and one or more remote systems 240 communicating over communication network 215. The remote systems 240 can include a remote system database 242. The central computing system 200 can execute one or more instances of a control engine 220 and a decision engine 225. The control engine 220 and the decision engine 225 can be executable applications residing on the central computing system 200 to implement the object replacement system 250 as described herein.
  • In an example embodiment, one or more portions of the communications network 215 can be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless wide area network (WWAN), a metropolitan area network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a WiFi network, a WiMax network, any other type of network, or a combination of two or more such networks.
  • The central computing system 200 includes one or more computers or processors configured to communicate with the databases 205, autonomous robotic devices 110 and remote systems 240, via the network 215. The computing system 200 hosts one or more applications configured to interact with one or more components of the object replacement system 250. The databases 205 may store information/data, as described herein. For example, the databases 205 can include an events data storage facility 232, a physical objects database 230, and a facilities database 235. The events data storage facility 232 can store information associated with events. The physical objects database 230 can store information associated with physical objects. The facilities database 235 can store information associated with facilities. The databases 205 can be located at one or more geographically distributed locations from the central computing system 200. Alternatively, the databases 205 can be included within the computing system 200.
  • In one embodiment, multiple remote systems 240 can be in communication with a decision engine 225 residing on a central computing system. Each of the remote systems 240 can be in communication with a remote system database 242 and the decision engine 225 can be in communication with an event data storage facility 232. Each of the remote system databases 242 can store data associated with the respective remote system 240. Various events can occur in different facilities. The events can include actions occurring at one or more remote systems 240. The actions can be database actions associated with data of physical objects disposed at the facility. The actions can be executed by the one or more remote systems 240 on the remote system databases 242 of the respective one or more remote systems 240. The events can be transmitted from the one or more remote systems 240 to the central computing system 200.
  • The central computing system 200 can execute the decision engine 225 in response to receiving the events which occur at remote systems 240. The decision engine 225 can store the received events in the event data storage facility 232. The decision engine 225 can determine whether the actions that occurred at the one or more remote systems 240 from which the events were transmitted prompt any actions which need to be taken on data associated with other remote systems 240. In response to determining actions which need to be taken on data associated with the other remote systems 240, the decision engine 225 can transmit instructions to each of the other remote systems 240 to execute actions on the data stored in the respective databases, based on the actions associated with data which have taken place at the one or more remote systems 240 which transmitted the events. The other remote systems 240 can execute the actions based on the received instructions. The actions can be deleting data, inserting data, merging data, and/or any other database related action. For example, the actions can be adjusting quantities of physical objects disposed at facilities at remote system databases 242 which store data associated with the respective physical objects.
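The decision engine's fan-out behavior — store the received event in the event data storage facility, then instruct each affected remote system to execute a database action — can be sketched minimally in Python. The rule table, class and method names are hypothetical illustrations, and only two of the possible database actions are stubbed in:

```python
class RemoteSystem:
    """Minimal stand-in for a remote system 240 and its database 242."""
    def __init__(self):
        self.db = []

    def execute(self, action, data):
        # Only "insert" and "delete" are sketched; real actions can be any
        # database operation (merging data, adjusting quantities, etc.).
        if action == "insert":
            self.db.append(data)
        elif action == "delete" and data in self.db:
            self.db.remove(data)

def handle_event(event, event_store, impact_rules, remote_systems):
    """Store the event, decide which other remote systems are affected,
    and send each one the corresponding database action."""
    event_store.append(event)  # event data storage facility 232
    for target, action in impact_rules.get(event["type"], []):
        remote_systems[target].execute(action, event["data"])
```

Here `impact_rules` plays the role of the decision engine's determination of which remote systems an event affects, keyed by event type.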
  • The central computing system 200 can determine that a quantity of a set of like physical objects designated to be disposed at a facility is absent from the facility, based on the adjustment of data corresponding to the quantities of physical objects at remote system databases 242. The central computing system 200 can execute the control engine 220 in response to determining that the quantity of the set of like physical objects designated to be disposed at the facility is absent from the facility. The control engine 220 can query the physical objects database 230 to retrieve identification information of the physical object along with the designated location of the set of like physical objects in the facility. The control engine 220 can instruct an autonomous robotic device 110 disposed in the facility to verify the set of like physical objects is absent from the facility. The instructions can include the identification information of the set of like physical objects and the location of the set of like physical objects in the facility.
  • The autonomous robot device 110 can navigate to the designated location of the set of like physical objects in the facility. The controller 116 can control the image capturing device 112 to capture images of the designated location of the set of like physical objects and the respective labels including the string and/or machine-readable elements. The autonomous robot device 110 can determine from the captured image that the set of like physical objects 102 is absent from the designated location. The autonomous robot device 110 can use machine vision to determine the set of like physical objects is absent from the designated location. Machine vision can be used to provide imaging-based automatic inspection and analysis of the facility. The autonomous robot device 110 can extract the identifier from the machine-readable element disposed adjacent to the designated location, and associated with the absent set of like physical objects, from the captured image using machine vision. Alternatively, or in addition, the autonomous robot device can extract the identifier by scanning the machine-readable element using the optical scanner 118. The autonomous robot device 110 can transmit a verification that the set of like physical objects is absent from the designated location. In some embodiments, the autonomous robot device 110 can also transmit the captured images and/or the detected attributes associated with the designated location of the absent set of like physical objects to the central computing system 200. The attributes can include size, dimensions and proximity to other physical objects.
  • The control engine 220 can receive the verification that the set of like physical objects is absent from the designated location. The control engine 220 can also receive the captured images and/or detected attributes of the designated location of the set of like physical objects. In the event the control engine 220 receives images of the designated location, the control engine 220 can extract attributes from the captured images of the designated location. The attributes can include size, dimensions and proximity to other physical objects. The control engine 220 can query the physical objects database 230 to retrieve information associated with the absent set of like physical objects. The information can include the name, type, size, dimensions, color and other information associated with the physical objects. The control engine 220 can also query the facilities database 235 to determine rules associated with physical objects disposed in the facility. The control engine 220 can determine a replacement physical object based on the information associated with the absent set of like physical objects, the attributes associated with the designated location, and the rules of the facility associated with physical objects disposed in the facility. The control engine 220 can query the physical objects database 230 to determine a quantity of the replacement physical object disposed at the facility, the identification information of the replacement physical object, and the location of the replacement physical object disposed at the facility. In response to determining the quantity of the replacement physical object is greater than a threshold amount, the control engine 220 can instruct the autonomous robotic device to retrieve a specified quantity of the replacement physical object from the location of the replacement physical object in the facility, and deposit the specified quantity of the replacement physical object at the designated location of the absent set of like physical objects.
  • In some embodiments, the control engine 220 can determine a group of replacement physical objects. The control engine 220 can generate a replaceability score for each of the group of replacement physical objects based on the similarity to the absent set of physical objects, the attributes associated with the designated location and the rules of the facility associated with physical objects disposed in the facility. The control engine 220 can rank each of the replacement physical objects based on the replaceability score. The control engine 220 can select a replacement physical object from the group of replacement physical objects based on the rank and the quantity of replacement physical objects disposed in the facility.
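One possible scoring-and-ranking scheme consistent with the paragraph above can be sketched in Python. The choice of attribute keys, the treatment of rules as predicates, and the use of an attribute-match fraction as the similarity measure are all assumptions for illustration; the disclosure does not fix a particular formula:

```python
def replaceability_score(candidate, absent, location_attrs, rules):
    """Score a candidate replacement: fraction of matching attributes,
    zeroed out when a facility rule disqualifies the candidate or it
    does not fit the designated location."""
    if any(not rule(candidate) for rule in rules):
        return 0.0
    keys = ("type", "size", "dimensions", "color")
    matches = sum(candidate.get(k) == absent.get(k) for k in keys)
    fits = candidate.get("dimensions", 0) <= location_attrs.get("dimensions", 0)
    return (matches / len(keys)) if fits else 0.0

def select_replacement(candidates, absent, location_attrs, rules, stock):
    """Rank candidates by replaceability score, then pick the
    highest-ranked candidate with quantity on hand in the facility."""
    ranked = sorted(
        candidates,
        key=lambda c: replaceability_score(c, absent, location_attrs, rules),
        reverse=True,
    )
    for c in ranked:
        if stock.get(c["id"], 0) > 0:
            return c
    return None
```

The stock check at the end mirrors selecting "based on rank and quantity of replacement physical objects disposed in the facility."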
  • The autonomous robotic device 110 can receive the instructions from the control engine 220. The instructions can include identification information associated with the replacement physical object, and a quantity of the replacement physical object to be picked up by the autonomous robot device 110. The identification information can include an identifier associated with the replacement physical object or other attributes (e.g., name, size, type, or color) associated with the replacement physical object. The autonomous robot device 110 can navigate to the location of the facility where the replacement object is disposed. The autonomous robot device 110 can capture images of the physical objects at the location where the replacement physical object is disposed. The autonomous robot device 110 can extract attributes of the replacement physical object from the captured images. The autonomous robot device 110 can identify the replacement physical object based on the attributes extracted from the captured images and the identification information received in the instructions. The autonomous robot device 110 can also use machine vision to identify the replacement physical object. In response to confirming the replacement physical object is present, the autonomous robot device 110 can pick up a set of replacement physical objects, based on the quantity received in the instructions. The autonomous robot device 110 can carry the set of replacement physical objects and navigate to the designated location of the absent set of like physical objects. The autonomous robot device 110 can deposit the set of replacement physical objects in the designated location of the absent physical objects.
  • As a non-limiting example, the object replacement system 250 can be implemented in a retail store and/or e-commerce environment. The remote systems 240 can be associated with one or more retail stores or e-commerce websites. Each of the remote systems 240 can be associated with a remote system database 242 and each of the remote systems 240 can generate events. For example, a Sales/Returns remote system 240 can generate an event of a sale of a product. Data associated with the sale of the product can be committed to the remote system database 242 associated with the Sales/Returns remote system 240.
  • The event can be transmitted from the Sales/Returns remote system 240 to the central computing system 200. The event can include data associated with the sale of the product including the identification of the product and the quantity of the product that has been sold. The decision engine 225 can determine the remote systems 240 affected by the sale of the product. The decision engine 225 can determine the Inventory Adjustment remote system 240 is affected by the sale of the product, as the inventory of the product should be decreased by the quantity of product sold based on the received event. Furthermore, the decision engine 225 can also determine, based on an adjustment to the inventory of the product, that the quantity of the product disposed in the retail store is less than a threshold amount and a purchase order needs to be generated for more of the product for the facility. Accordingly, the decision engine 225 can determine the PO Create/Update remote system 240 is also affected by the event of the sale of the product. The decision engine 225 can transmit instructions to the Inventory Adjustment remote system 240 and the PO Create/Update remote system 240 to update the respective remote system databases 242. The Inventory Adjustment remote system 240 can adjust the inventory of the sold product in the remote system database 242 associated with the Inventory Adjustment remote system 240, in response to receiving instructions from the decision engine 225. The PO Create/Update remote system 240 can generate and store a new purchase order in the remote system database 242 associated with the PO Create/Update remote system 240, in response to receiving instructions from the decision engine 225. The updated data can trigger a change in a sales forecast and demand forecast associated with the product.
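The sale-event propagation in the retail example above — decrement inventory, then raise a purchase order when stock falls below the threshold — can be condensed into a short sketch. The dictionary-based "databases" and the reorder-to-threshold quantity are simplifying assumptions:

```python
def process_sale(event, inventory_db, purchase_orders, reorder_threshold):
    """Hypothetical propagation of a Sales/Returns event: the Inventory
    Adjustment system decrements stock, and a purchase order is generated
    when the remaining quantity falls below the threshold amount."""
    product, qty_sold = event["product_id"], event["quantity"]
    inventory_db[product] = inventory_db.get(product, 0) - qty_sold
    if inventory_db[product] < reorder_threshold:
        # PO Create/Update system: order enough to restore the threshold
        purchase_orders.append({
            "product_id": product,
            "quantity": reorder_threshold - inventory_db[product],
        })
```

In a real deployment the two updates would land in separate remote system databases 242; they are collapsed into one function here only to show the data flow.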
  • The control engine 220 can determine that products like the sold product are now absent from the retail store based on the Inventory Adjustment remote system 240 adjusting the inventory of the product in the remote system database 242. The control engine 220 can query the physical objects database 230 to retrieve identification information of the product along with the designated location of the product in the retail store. The control engine 220 can instruct an autonomous robotic device 110 disposed in the facility to verify the product is now absent from the retail store.
  • The autonomous robot device 110 can navigate to the designated location of the product. The autonomous robot device 110 can capture images of the designated location and determine from the captured images that the product is absent from the designated location. The autonomous robot device 110 can transmit a verification that the product is absent from the designated location. In some embodiments, the autonomous robot device 110 can also transmit the captured images and/or the detected attributes associated with the designated location of the absent product to the central computing system 200. The attributes can include size, dimensions, proximity to other physical objects, demand forecast, sales forecast and vendor pack size rules.
  • The control engine 220 can receive the verification that the product is absent from the designated location. The control engine 220 can also receive the captured images and/or detected attributes of the designated location of the product. In the event the control engine 220 receives images of the designated location, the control engine 220 can extract attributes from the captured images of the designated location. The control engine 220 can query the physical objects database 230 to retrieve information associated with the absent product. The information can include the name, type, size, dimensions, color and other information associated with the product. The control engine 220 can also query the facilities database 235 to determine rules associated with the products disposed in the retail store. For example, the rules can control the display of age-restricted products such as alcohol and cigarettes. The control engine 220 can determine a replacement product based on the information associated with the absent product, the attributes associated with the designated location, and the rules of the facility associated with physical objects disposed in the facility. For example, in the event the absent product is a 12-Pack of Coca-Cola, the control engine 220 can determine that two 6-Packs of Pepsi can be a replacement product, as two 6-Packs of Pepsi are a similar product to a 12-Pack of Coca-Cola, and have the same size, shape and dimensions as a 12-Pack of Coca-Cola. The control engine 220 can query the physical objects database 230 to determine a quantity of the replacement product disposed at the facility, the identification information of the replacement product, and the location of the replacement product disposed at the retail store.
In response to determining the quantity of the replacement product is greater than a threshold amount, the control engine 220 can instruct the autonomous robotic device to retrieve a specified quantity of the replacement product from the location of the replacement product in the retail store, and deposit the specified quantity of the replacement product at the designated location of the absent product.
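The threshold check that gates the robot dispatch can be sketched as a small function: only when the facility holds more than the threshold amount of the replacement product does the control engine emit an instruction. The record layout and return shape are assumed for illustration:

```python
def maybe_dispatch_robot(replacement_id, objects_db, threshold, quantity_needed):
    """Build a retrieval instruction for the autonomous robotic device only
    when the replacement product's on-hand quantity exceeds the threshold;
    otherwise return None (no dispatch)."""
    record = objects_db[replacement_id]
    if record["quantity"] > threshold:
        return {
            "identifier": replacement_id,
            "location": record["location"],
            "quantity": quantity_needed,
        }
    return None
```

The returned dictionary corresponds to the instruction contents described in the next paragraph: identification information, location in the retail store, and quantity to pick up.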
  • The autonomous robotic device 110 can receive the instructions from the control engine 220. The instructions can include identification information associated with the replacement product, the location of the replacement product in the retail store, and a quantity of the replacement product to be picked up by the autonomous robot device 110. The identification information can include an identifier associated with the replacement product or other attributes (e.g., name, size, type, or color) associated with the replacement product. The autonomous robot device 110 can navigate to the location of the retail store where the replacement product is disposed. In response to confirming the replacement product is present, the autonomous robot device 110 can pick up a set of replacement products, based on the quantity received in the instructions. The autonomous robot device 110 can carry the set of replacement products and navigate to the designated location of the absent product. The autonomous robot device 110 can deposit the set of replacement products in the designated location of the absent product.
  • FIG. 3 is a block diagram of an exemplary computing device suitable for implementing embodiments of the object replacement system. The computing device may be, but is not limited to, a smartphone, laptop, tablet, desktop computer, server or network appliance. The computing device 300 can be embodied as the central computing system, remote system, and/or autonomous robot device. The computing device 300 includes one or more non-transitory computer-readable media for storing one or more computer-executable instructions or software for implementing exemplary embodiments. The non-transitory computer-readable media may include, but are not limited to, one or more types of hardware memory, non-transitory tangible media (for example, one or more magnetic storage disks, one or more optical disks, one or more flash drives, one or more solid state disks), and the like. For example, memory 306 included in the computing device 300 may store computer-readable and computer-executable instructions or software (e.g., applications 330 such as the decision engine 225 and the control engine 220) for implementing exemplary operations of the computing device 300. The computing device 300 also includes configurable and/or programmable processor 302 and associated core(s) 304, and optionally, one or more additional configurable and/or programmable processor(s) 302′ and associated core(s) 304′ (for example, in the case of computer systems having multiple processors/cores), for executing computer-readable and computer-executable instructions or software stored in the memory 306 and other programs for implementing exemplary embodiments of the present disclosure. Processor 302 and processor(s) 302′ may each be a single core processor or multiple core (304 and 304′) processor. Either or both of processor 302 and processor(s) 302′ may be configured to execute one or more of the instructions described in connection with computing device 300.
  • Virtualization may be employed in the computing device 300 so that infrastructure and resources in the computing device 300 may be shared dynamically. A virtual machine 312 may be provided to handle a process running on multiple processors so that the process appears to be using only one computing resource rather than multiple computing resources. Multiple virtual machines may also be used with one processor.
  • Memory 306 may include a computer system memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like. Memory 306 may include other types of memory as well, or combinations thereof. The computing device 300 can receive data from input/output devices such as, a reader 334 and an image capturing device 332.
  • A user may interact with the computing device 300 through a visual display device 314, such as a computer monitor, which may display one or more graphical user interfaces 316, a multi-touch interface 320 and a pointing device 318.
  • The computing device 300 may also include one or more storage devices 326, such as a hard-drive, CD-ROM, or other computer readable media, for storing data and computer-readable instructions and/or software that implement exemplary embodiments of the present disclosure (e.g., applications such as the decision engine 225 and the control engine 220). For example, exemplary storage device 326 can include one or more databases 328 for storing information regarding the physical objects, facilities and events. The databases 328 may be updated manually or automatically at any suitable time to add, delete, and/or update one or more data items in the databases.
  • The computing device 300 can include a network interface 308 configured to interface via one or more network devices 324 with one or more networks, for example, Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56 kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above. In exemplary embodiments, the computing system can include one or more antennas 322 to facilitate wireless communication (e.g., via the network interface) between the computing device 300 and a network and/or between the computing device 300 and other computing devices. The network interface 308 may include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the computing device 300 to any type of network capable of communication and performing the operations described herein.
  • The computing device 300 may run any operating system 310, such as any of the versions of the Microsoft® Windows® operating systems, the different releases of the Unix and Linux operating systems, any version of the MacOS® for Macintosh computers, any embedded operating system, any real-time operating system, any open source operating system, any proprietary operating system, or any other operating system capable of running on the computing device 300 and performing the operations described herein. In exemplary embodiments, the operating system 310 may be run in native mode or emulated mode. In an exemplary embodiment, the operating system 310 may be run on one or more cloud machine instances.
  • FIG. 4 is a flowchart illustrating a process implemented by an object replacement system according to an exemplary embodiment. In operation 400, a central computing system (e.g. central computing system 200 as shown in FIG. 2) including a data storage facility (e.g. events data storage facility 232, physical objects database 230 and facilities database 235 as shown in FIG. 2) and operatively coupled to remote systems (e.g. remote systems 240 as shown in FIG. 2), receives data associated with quantities of like physical objects (physical objects 102A-B as shown in FIGS. 1A-B) from the remote systems. In operation 402, the central computing system can determine a first quantity of the like physical objects stored in a first one of the remote systems fails to correspond to a second quantity of the like physical objects stored at at least another one of the remote systems. In operation 404, the central computing system can adjust data corresponding to the first quantity of the like physical objects stored in the first one of the remote systems based on data corresponding to the second quantity of the like physical objects stored in the at least another one of the remote systems. In operation 406, the central computing system can generate an expected quantity of the like physical objects at a facility associated with the first one of the remote systems, in response to the execution of the reconciliation of the plurality of quantities of the like physical objects. In operation 408, the central computing system can trigger an alert in response to determining the expected quantity is less than a threshold amount.
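Operations 402 through 408 amount to a reconcile-then-alert step, which can be sketched as follows. Treating the second remote system's quantity as authoritative is an assumption made for the sake of the sketch:

```python
def reconcile_and_alert(first_qty, second_qty, threshold):
    """Sketch of FIG. 4 operations 402-408: when the first remote system's
    quantity disagrees with the second's, adjust the first to match
    (operation 404), treat the reconciled value as the expected quantity
    at the facility (operation 406), and flag an alert when it falls
    below the threshold amount (operation 408)."""
    if first_qty != second_qty:   # operation 402: quantities fail to correspond
        first_qty = second_qty    # operation 404: adjust the first system's data
    expected = first_qty          # operation 406: expected quantity at facility
    return expected, expected < threshold  # operation 408: alert flag
```

The alert returned here is what triggers the autonomous robot device's verification pass in operations 410 onward.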
  • In operation 410, an autonomous robot device (e.g. autonomous robot device 110 as shown in FIGS. 1A-B and 3) can receive the alert indicating the expected quantity of the like physical objects is less than a threshold amount. The autonomous robot device can include an image capturing device (e.g. image capturing device 112 as shown in FIGS. 1A and 3). In operation 412, the autonomous robot device can determine the designated location (e.g. designated location 109 as shown in FIG. 1A) of the like physical objects within the facility. In operation 414, the autonomous robot device can autonomously navigate to the designated location of the like physical objects. In operation 416, the autonomous robot device can detect, via the image capturing device, a vacant space at the designated location at which the like physical objects are supposed to be disposed. In operation 418, the autonomous robot device, using the image capturing device, can capture an image of the vacant space. In operation 420, the autonomous robot device can transmit the image to the central computing system. In operation 422, the central computing system can receive the image. In operation 424, the central computing system can extract the physical attributes of the vacant space. In operation 426, the central computing system can query the database to retrieve attributes associated with the like physical objects. In operation 428, the central computing system can determine a set of like replacement physical objects based on the physical attributes of the vacant space and the attributes associated with the physical object.
  • FIG. 5 is a flowchart illustrating a process implemented by an object replacement system according to an exemplary embodiment. In operation 500, a central computing system (e.g. central computing system 200 as shown in FIG. 3) can transmit instructions to an autonomous robot device (e.g. autonomous robot device 110 as shown in FIGS. 1A-1B and 3) to retrieve a set of like replacement objects (e.g. physical objects 102 as shown in FIGS. 1A-1B) from a location (e.g. second location 150 as shown in FIG. 1B) in the facility and deposit the set of like replacement objects at the vacant space at the designated location (e.g. first location 100 as shown in FIG. 1A) of the like physical objects. In operation 502, the autonomous robot device can receive the instructions from the central computing system. In operation 504, the autonomous robot device can navigate to the location in the facility of the set of like replacement objects. In operation 506, the autonomous robot device can pick up the set of like replacement objects. In operation 508, the autonomous robot device can navigate to the designated location of the like physical objects. In operation 510, the autonomous robot device can deposit the set of like replacement objects in the vacant space at the designated location of the like physical objects.
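The robot-side sequence of operations 504 through 510 can be expressed as ordered calls on a robot interface. The stub class, its method names, and the instruction keys below are hypothetical; a real device would perform navigation, picking and depositing in hardware:

```python
class RobotStub:
    """Minimal stand-in recording the calls an autonomous robot device 110
    would act on; the method names here are illustrative assumptions."""
    def __init__(self):
        self.log = []

    def navigate_to(self, location):
        self.log.append(("navigate", location))

    def pick_up(self, identifier, quantity):
        self.log.append(("pick", identifier, quantity))

    def deposit(self):
        self.log.append(("deposit",))

def execute_replacement(robot, instruction):
    """Sketch of FIG. 5 operations 504-510: navigate to the replacement
    objects, pick them up, navigate to the designated location, deposit."""
    robot.navigate_to(instruction["source"])                   # operation 504
    robot.pick_up(instruction["id"], instruction["quantity"])  # operation 506
    robot.navigate_to(instruction["target"])                   # operation 508
    robot.deposit()                                            # operation 510
```

The recorded log makes the step ordering explicit, which is the essential content of the flowchart.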
  • In describing exemplary embodiments, specific terminology is used for the sake of clarity. For purposes of description, each specific term is intended to at least include all technical and functional equivalents that operate in a similar manner to accomplish a similar purpose. Additionally, in some instances where a particular exemplary embodiment includes multiple system elements, device components or method steps, those elements, components or steps may be replaced with a single element, component or step. Likewise, a single element, component or step may be replaced with multiple elements, components or steps that serve the same purpose. Moreover, while exemplary embodiments have been shown and described with references to particular embodiments thereof, those of ordinary skill in the art will understand that various substitutions and alterations in form and detail may be made therein without departing from the scope of the present disclosure. Further still, other aspects, functions and advantages are also within the scope of the present disclosure.
  • Exemplary flowcharts are provided herein for illustrative purposes and are non-limiting examples of methods. One of ordinary skill in the art will recognize that exemplary methods may include more or fewer steps than those illustrated in the exemplary flowcharts, and that the steps in the exemplary flowcharts may be performed in a different order than the order shown in the illustrative flowcharts.

Claims (20)

1. An object replacement system, the system comprising:
a central computing system including a data storage facility and operatively coupled to a plurality of remote systems, the central computing system configured to:
receive data associated with a plurality of quantities of like physical objects from the plurality of remote systems;
generate an expected quantity value of the like physical objects at a facility associated with a first one of the remote systems;
determine that the expected quantity value of the like physical objects at the facility is less than a threshold amount;
trigger an alert in response to the determination that the expected quantity is less than a threshold amount;
cause, in response to the triggering of the alert, an activation of at least a first image capturing device at the facility to capture an image of a vacant space at a designated location at which the like physical objects are supposed to be disposed;
receive the image and extract physical attributes of the vacant space based on the image;
query a database to retrieve attributes associated with the like physical objects; and
determine a set of like replacement physical objects based on the physical attributes of the vacant space and the attributes associated with the like physical object to be placed into the vacant space.
2. The system of claim 1, further comprising:
a plurality of image capture devices disposed in the facility, wherein each of the plurality of image capture devices is in selective communication with the central computing system, and the plurality of image capture devices comprises the first image capture device, wherein the triggering the alert comprises causing the activation of at least the first image capturing device proximate the designated location to capture the image of the vacant space at the designated location.
3. The system of claim 1, wherein the central computing system is further configured to:
determine a quantity of the set of like physical objects designated to be disposed at the designated location and that are expected to be absent from the facility based on data corresponding to quantities of physical objects associated with the first one of the remote systems; and
retrieve, from a physical objects database, identification information of the like physical objects of the set of like physical objects, and the designated location of the set of like physical objects in the facility;
wherein the triggering the alert comprises causing the activation of at least the first image capturing device proximate the designated location to capture the image of the vacant space at the designated location.
4. The system of claim 3, wherein the central computing system is configured to access rules associated with the designated location and physical objects disposed in the facility associated with the first one of the remote systems, and wherein the central computing system in determining the set of like replacement physical objects is configured to determine the set of like replacement physical objects based on the physical attributes of the vacant space, the attributes associated with the like physical object to be placed into the vacant space, and the set of rules.
5. The system of claim 1, wherein the central computing system is configured to rank each different like replacement physical object of a plurality of different like replacement physical objects, and select the set of like replacement physical objects from the plurality of different like replacement physical objects to be placed into the vacant space.
6. The system of claim 5, wherein the central computing system is configured to generate a replaceability score for each of the like replacement physical objects of the set of like replacement physical objects based on a calculated probability that the like physical objects are replaceable by a respective like replacement physical object of the set of like replacement physical objects, the calculated probability being based on the physical attributes of the vacant space and the attributes associated with the like physical objects.
7. The system of claim 6, wherein the central computing system is configured to rank each of the respective like replacement physical objects of the set of like replacement physical objects based on the replaceability score for each of the respective like replacement physical objects of the set of like replacement physical objects.
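Claims 5 through 7 recite scoring and ranking candidate replacement objects by a replaceability score. The sketch below illustrates one way such a score might be computed; the attribute fields (`width`, `height`, `depth`, `category`) and the fit-times-category model are hypothetical assumptions, not taken from the specification:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A hypothetical replacement physical object with its attributes."""
    name: str
    width: float
    height: float
    depth: float
    category: str

def replaceability_score(candidate, vacant, target_category):
    """Probability-like score that the candidate can replace the missing
    objects: it must fit the vacant space, and a category match raises
    the score."""
    fits = (candidate.width <= vacant["width"]
            and candidate.height <= vacant["height"]
            and candidate.depth <= vacant["depth"])
    if not fits:
        return 0.0
    # Fill ratio: how much of the vacant volume the candidate would use.
    fill = (candidate.width * candidate.height * candidate.depth) / (
        vacant["width"] * vacant["height"] * vacant["depth"])
    category_match = 1.0 if candidate.category == target_category else 0.5
    return fill * category_match

def rank_candidates(candidates, vacant, target_category):
    """Return (score, candidate) pairs, highest replaceability first."""
    scored = [(replaceability_score(c, vacant, target_category), c)
              for c in candidates]
    scored.sort(key=lambda sc: sc[0], reverse=True)
    return scored
```

The set of like replacement objects of claim 5 would then be a top-scoring slice of the ranked list.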
8. The system of claim 1, wherein the central computing system is configured to transmit instructions to implement a retrieval of the set of like replacement objects from a location in the facility and deposit the set of like replacement objects at the vacant space at the designated location at which the like physical objects are supposed to be disposed.
9. The system of claim 8, wherein the central computing system is configured to transmit the instructions to at least one autonomous robot device of a plurality of autonomous robot devices disposed in the facility, wherein the at least one autonomous robot device is in selective communication with the central computing system and is disposed in the facility associated with the first one of the remote systems, and wherein the instructions cause the at least one autonomous robot device to:
navigate, in response to receiving the instructions from the central computing system, to the location in the facility of the set of like replacement objects;
pick up the set of like replacement objects;
navigate to the designated location of the like physical objects; and
deposit the set of like replacement objects in the vacant space at the designated location of the like physical objects.
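The four robot steps recited in claim 9 form a fixed navigate → pick up → navigate → deposit sequence. The sketch below only illustrates that ordering; the `RobotDevice` class, its method names, and the location strings are all hypothetical:

```python
class RobotDevice:
    """Hypothetical autonomous robot that logs each step it performs."""

    def __init__(self):
        self.log = []
        self.carrying = None

    def navigate(self, location):
        self.log.append(("navigate", location))

    def pick_up(self, objects):
        self.carrying = objects
        self.log.append(("pick_up", tuple(objects)))

    def deposit(self):
        deposited, self.carrying = self.carrying, None
        self.log.append(("deposit", tuple(deposited)))
        return deposited

def execute_replacement(robot, source_location, designated_location, objects):
    """Carry out the instruction sequence of claim 9."""
    robot.navigate(source_location)      # to the replacement objects
    robot.pick_up(objects)               # pick up the set
    robot.navigate(designated_location)  # to the vacant space
    return robot.deposit()               # deposit at the designated location
```

Separating the instruction sequence from the device lets the central computing system transmit the same sequence to any available robot in the plurality.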
10. The system of claim 1, wherein the central computing system is configured to access remote databases each associated with a respective one of the plurality of remote systems to retrieve a respective one of the plurality of quantities of the like physical objects associated with the respective one of the remote systems.
11. A physical object replacement method, the method comprising:
receiving, at a central computing system including a data storage facility and operatively coupled to a plurality of remote systems, data associated with a plurality of quantities of like physical objects from the plurality of remote systems;
generating, via the central computing system, an expected quantity value of the like physical objects at a facility associated with a first one of the remote systems;
determining, via the central computing system, that the expected quantity value of the like physical objects at the facility is less than a threshold amount;
triggering, via the central computing system, an alert in response to the determination that the expected quantity value is less than the threshold amount;
causing, via the central computing system and in response to the triggering of the alert, an activation of at least a first image capturing device at the facility to capture an image of a vacant space at a designated location at which the like physical objects are supposed to be disposed;
receiving, via the central computing system, the image and extracting physical attributes of the vacant space based on the image;
querying, via the central computing system, a database to retrieve attributes associated with the like physical objects; and
determining, via the central computing system, a set of like replacement physical objects based on the physical attributes of the vacant space and the attributes associated with the like physical objects to be placed into the vacant space.
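The control flow of claim 11 — compare the expected quantity to the threshold, trigger the alert that activates the image capture device, extract the vacant-space attributes, query the object attributes, and determine the replacements — might be wired together as below. The function names and injected callables are assumptions used to keep the sketch independent of any real camera or database API:

```python
def run_replacement_pipeline(expected_qty, threshold, capture_image,
                             extract_attributes, query_attributes,
                             determine_replacements):
    """Sketch of the claim-11 method as a single pipeline.

    Each step is passed in as a callable so stubs can stand in for the
    image capture device and the physical objects database.
    """
    if expected_qty >= threshold:
        return None  # quantity sufficient: no alert is triggered
    # Alert triggered: activate the image capture device at the facility.
    image = capture_image()
    vacant_attrs = extract_attributes(image)   # attributes of the vacant space
    object_attrs = query_attributes()          # attributes of the like objects
    return determine_replacements(vacant_attrs, object_attrs)
```

A usage example with stub callables:

```python
result = run_replacement_pipeline(
    expected_qty=2, threshold=5,
    capture_image=lambda: "raw-image-bytes",
    extract_attributes=lambda img: {"width": 10.0, "height": 10.0},
    query_attributes=lambda: {"category": "snacks"},
    determine_replacements=lambda vacant, objs: ["replacement-A"],
)
```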
12. The method of claim 11, wherein the triggering the alert comprises causing the activation of at least the first image capturing device proximate the designated location to capture the image of the vacant space at the designated location, wherein the first image capturing device is one of a plurality of image capture devices disposed in the facility and each of the plurality of image capture devices is in selective communication with the central computing system.
13. The method of claim 11, further comprising:
determining, via the central computing system, a quantity of the set of like physical objects designated to be disposed at the designated location and that are expected to be absent from the facility based on data corresponding to quantities of physical objects associated with the first one of the remote systems; and
retrieving, via the central computing system and from a physical objects database, identification information of the like physical objects of the set of like physical objects, and the designated location of the set of like physical objects in the facility;
wherein the triggering the alert comprises causing the activation of at least the first image capturing device proximate the designated location to capture the image of the vacant space at the designated location.
14. The method of claim 13, further comprising:
accessing, via the central computing system, a set of rules associated with the designated location and physical objects disposed in the facility associated with the first one of the remote systems; and
wherein the determining the set of like replacement physical objects comprises determining the set of like replacement physical objects based on the physical attributes of the vacant space, the attributes associated with the like physical objects to be placed into the vacant space, and the set of rules.
15. The method of claim 11, further comprising:
ranking, via the central computing system, each different like replacement physical object of a plurality of different like replacement physical objects; and
selecting the set of like replacement physical objects from the plurality of different like replacement physical objects to be placed into the vacant space.
16. The method of claim 15, further comprising generating, via the central computing system, a replaceability score for each of the like replacement physical objects of the set of like replacement physical objects based on a calculated probability that the like physical objects are replaceable by a respective like replacement physical object of the set of like replacement physical objects, the calculated probability being based on the physical attributes of the vacant space and the attributes associated with the like physical objects.
17. The method of claim 16, further comprising ranking each of the respective like replacement physical objects of the set of like replacement physical objects based on the replaceability score for each of the respective like replacement physical objects of the set of like replacement physical objects.
18. The method of claim 11, further comprising:
transmitting, via the central computing system, instructions to implement a retrieval of the set of like replacement objects from a location in the facility and deposit the set of like replacement objects at the vacant space at the designated location at which the like physical objects are supposed to be disposed.
19. The method of claim 18, wherein the transmitting the instructions comprises transmitting the instructions to at least one autonomous robot device of a plurality of autonomous robot devices disposed in the facility, wherein the at least one autonomous robot device is in selective communication with the central computing system and is disposed in the facility associated with the first one of the remote systems, and wherein the instructions cause the at least one autonomous robot device to:
navigate, in response to receiving the instructions from the central computing system, to the location in the facility of the set of like replacement objects;
pick up the set of like replacement objects;
navigate to the designated location of the like physical objects; and
deposit the set of like replacement objects in the vacant space at the designated location of the like physical objects.
20. The method of claim 11, further comprising:
accessing, via the central computing system, remote databases each associated with a respective one of the plurality of remote systems; and
retrieving, from the remote databases, a respective one of the plurality of quantities of the like physical objects associated with the respective one of the remote systems.
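Claims 10 and 20 describe retrieving per-system quantities of the like physical objects from remote databases. A minimal sketch of the aggregation step, assuming (hypothetically) that each remote database exposes a mapping from object identifier to on-hand quantity:

```python
from collections import Counter

def aggregate_quantities(remote_databases):
    """Sum the quantity of each like physical object across all remote
    systems; each database is modeled as a {object_id: quantity} dict."""
    totals = Counter()
    for db in remote_databases:
        totals.update(db)  # Counter.update adds counts per key
    return dict(totals)
```

The per-facility expected quantity value of claim 11 could then be read from this aggregate for the facility's own identifiers before the threshold comparison.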
US17/366,492 2017-06-21 2021-07-02 Systems and methods for object replacement Abandoned US20210334742A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/366,492 US20210334742A1 (en) 2017-06-21 2021-07-02 Systems and methods for object replacement
US18/243,851 US20230419252A1 (en) 2017-06-21 2023-09-08 Systems and methods for object replacement

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201762522883P 2017-06-21 2017-06-21
US16/013,469 US20180374036A1 (en) 2017-06-21 2018-06-20 Systems and Methods for Object Replacement
US17/366,492 US20210334742A1 (en) 2017-06-21 2021-07-02 Systems and methods for object replacement

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/013,469 Continuation US20180374036A1 (en) 2017-06-21 2018-06-20 Systems and Methods for Object Replacement

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/243,851 Continuation US20230419252A1 (en) 2017-06-21 2023-09-08 Systems and methods for object replacement

Publications (1)

Publication Number Publication Date
US20210334742A1 true US20210334742A1 (en) 2021-10-28

Family

ID=64693349

Family Applications (3)

Application Number Title Priority Date Filing Date
US16/013,469 Abandoned US20180374036A1 (en) 2017-06-21 2018-06-20 Systems and Methods for Object Replacement
US17/366,492 Abandoned US20210334742A1 (en) 2017-06-21 2021-07-02 Systems and methods for object replacement
US18/243,851 Pending US20230419252A1 (en) 2017-06-21 2023-09-08 Systems and methods for object replacement

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/013,469 Abandoned US20180374036A1 (en) 2017-06-21 2018-06-20 Systems and Methods for Object Replacement

Family Applications After (1)

Application Number Title Priority Date Filing Date
US18/243,851 Pending US20230419252A1 (en) 2017-06-21 2023-09-08 Systems and methods for object replacement

Country Status (2)

Country Link
US (3) US20180374036A1 (en)
WO (1) WO2018237013A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2018203338B2 (en) * 2018-05-14 2020-06-25 Deutsche Post Ag Autonomous robot vehicle
US20210349468A1 (en) * 2020-05-11 2021-11-11 Autoguide, LLC Identifying elements in an environment

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6341269B1 (en) * 1999-01-26 2002-01-22 Mercani Technologies, Inc. System, method and article of manufacture to optimize inventory and merchandising shelf space utilization
US20030171979A1 (en) * 2002-03-11 2003-09-11 Jenkins Margalyn Toi System and method for selecting and arranging products on a shelf
US20030182176A1 (en) * 2002-03-25 2003-09-25 Jurgen Monnerjahn Method of computer-supported assortment optimization and computer system
US20050114196A1 (en) * 2003-11-20 2005-05-26 Tor Schoenmeyr Product assortment optimization systems, products and methods
US20110288684A1 (en) * 2010-05-20 2011-11-24 Irobot Corporation Mobile Robot System
US20120029687A1 (en) * 2010-07-28 2012-02-02 Par Systems, Inc. Robotic storage and retrieval systems
US8285584B2 (en) * 2004-03-08 2012-10-09 Sap Ag System and method for performing assortment planning
US8417559B2 (en) * 2008-04-25 2013-04-09 Fair Isaac Corporation Assortment planning based on demand transfer between products
US20140058781A1 (en) * 2012-08-24 2014-02-27 Kishore Padmanabhan Assortment planning and optimization
US20150019391A1 (en) * 2013-06-26 2015-01-15 Amazon Technologies, Inc. Detecting item interaction and movement
US20150324725A1 (en) * 2014-05-12 2015-11-12 Blackhawk Network, Inc. Optimized Planograms
US9205886B1 (en) * 2011-05-06 2015-12-08 Google Inc. Systems and methods for inventorying objects
US20160114488A1 (en) * 2014-10-24 2016-04-28 Fellow Robots, Inc. Customer service robot and related systems and methods
US20160210640A1 (en) * 2015-01-20 2016-07-21 Oracle International Corporation Assortment optimization using incremental swapping with demand transference
US20170323253A1 (en) * 2016-05-04 2017-11-09 Wal-Mart Stores, Inc. Distributed Autonomous Robot Systems and Methods

Also Published As

Publication number Publication date
WO2018237013A1 (en) 2018-12-27
US20180374036A1 (en) 2018-12-27
US20230419252A1 (en) 2023-12-28

Legal Events

Date Code Title Description
AS Assignment

Owner name: WALMART APOLLO, LLC, ARKANSAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WAL-MART STORES, INC.;REEL/FRAME:056756/0894

Effective date: 20180321

Owner name: WAL-MART STORES, INC., ARKANSAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAZARIAN, EHSAN;NEMATI, BEHZAD;REEL/FRAME:056748/0235

Effective date: 20170621

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE