US20200410434A1 - Managing objects with assigned status in an automated tool control system - Google Patents

Managing objects with assigned status in an automated tool control system Download PDF

Info

Publication number
US20200410434A1
Authority
US
United States
Prior art keywords
status
predefined location
deposited
location
predefined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US16/915,750
Inventor
David C. Fly
Mathew J. Lipsey
Preston C. Phillips
Jason Newport
Andrew R. Lobo
Joseph Chwan
Frederick J. Rogers
Sean W. Ryan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Snap On Inc
Original Assignee
Snap On Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Snap On Inc filed Critical Snap On Inc
Priority to US16/915,750 priority Critical patent/US20200410434A1/en
Publication of US20200410434A1 publication Critical patent/US20200410434A1/en
Assigned to SNAP-ON INCORPORATED reassignment SNAP-ON INCORPORATED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: RYAN, SEAN W., CHWAN, JOSEPH, LOBO, ANDREW R., NEWPORT, JASON D., PHILLIPS, PRESTON C., ROGERS, FREDERICK J., FLY, DAVID C., LIPSEY, MATTHEW J.
Assigned to SNAP-ON INCORPORATED reassignment SNAP-ON INCORPORATED CORRECTIVE ASSIGNMENT TO CORRECT THE COMBINED DECLARATION AND ASSIGNMENT AGREEMENT SIGNED BY INVENTOR ANDREW R. LOBO PREVIOUSLY RECORDED AT REEL: 055361 FRAME: 0352. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: LOBO, ANDREW R., RYAN, SEAN W., CHWAN, JOSEPH, NEWPORT, JASON D., PHILLIPS, PRESTON C., ROGERS, FREDERICK J., FLY, DAVID C., LIPSEY, MATTHEW J.
Pending legal-status Critical Current

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/08 Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087 Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25H WORKSHOP EQUIPMENT, e.g. FOR MARKING-OUT WORK; STORAGE MEANS FOR WORKSHOPS
    • B25H3/00 Storage means or arrangements for workshops facilitating access to, or handling of, work tools or instruments
    • B25H3/02 Boxes
    • B25H3/021 Boxes comprising a number of connected storage elements
    • B25H3/023 Boxes comprising a number of connected storage elements movable relative to one another for access to their interiors
    • B25H3/028 Boxes comprising a number of connected storage elements movable relative to one another for access to their interiors by sliding extraction from within a common frame
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K19/00 Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K19/06 Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
    • G06K19/067 Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components
    • G06K19/07 Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components with integrated circuit chips
    • G06K19/0723 Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components with integrated circuit chips the record carrier comprising an arrangement for non-contact communication, e.g. wireless communication circuits on transponder cards, non-contact smart cards or RFIDs
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10009 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation sensing by radiation using wavelengths larger than 0.1 mm, e.g. radio-waves or microwaves
    • G06K7/10366 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation sensing by radiation using wavelengths larger than 0.1 mm, e.g. radio-waves or microwaves the interrogation device being adapted for miscellaneous applications
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content

Definitions

  • the present subject matter relates to automated tool control systems, and to techniques and equipment to manage objects in automated tool control systems.
  • the process of removing tools needing maintenance from the worksite, repairing the tools, and returning the tools to the worksite relies on manual methods. For instance, when a user notices that a tool is broken, the user may order repair of the tool. A repair person may pick up the tool for repair from the worksite. However, if the user is not there to hand the tool to the repair person and fails to share the information about the repair with his or her colleagues, the repair person may pick up the wrong tool, causing a delay in the repair process.
  • the person picking tools for repair from worksites may be different from the person performing the actual repair on the tools. Further, the person returning the repaired tools to the worksites may be different from either of the two persons involved in the repair process.
  • a delay in communication amongst the people involved in the repair process also leads to delay in the repair process. Miscommunication amongst the people involved in the repair process may lead to further delay in the repair process.
  • automated tool control systems have been developed which automatically manage objects, such as tools, according to the status assigned to the object.
  • the various systems and methods disclosed herein relate to automated tool control systems that manage objects with assigned statuses.
  • FIG. 1 illustrates an exemplary automated tool control system 100 according to example aspects of the subject technology.
  • the automated tool control system 100 includes a computing device 102 , a database 104 , tool control storage devices 106 A, 106 B, and 106 C (hereinafter collectively referred to as “tool control storage devices 106 ”), and a network 108 .
  • the automated tool control system 100 can have more or fewer computing devices (e.g., 102 ), databases (e.g., 104 ), and/or tool control storage devices (e.g., 106 A, 106 B, and 106 C) than those shown in FIG. 1 .
  • the computing device 102 can represent various forms of processing devices that have a processor, a memory, and communications capability.
  • the processor may execute computer instructions stored in memory.
  • the computing device 102 is configured to communicate with the database 104 and the tool control storage devices 106 via the network 108 .
  • processing devices can include a desktop computer, a laptop computer, a handheld computer, a personal digital assistant (PDA), or a combination of any of these processing devices or other processing devices.
  • the computing device 102 may have applications installed thereon.
  • the applications may include an administrative client software application for automatically controlling and managing the identifications of tools, the locations of the tools, and status of the tools.
  • the administrative client software application may be used to manipulate and store data and store and display information relative to the data to system users.
  • the administrative client software application may associate an identification (ID) to the tools. For example, when a tool is added to the automated tool control system, the tool may be assigned an ID.
  • the system administrator may assign an ID to the tool or the automated tool control system may auto-generate an ID and assign the ID to the tool.
  • the administrative client software application may track the locations of the tools and status of the tools based on the data from the tool control storage devices 106 .
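  • As an illustrative sketch (not the patent's implementation) of the ID assignment and location/status tracking described above, an administrative application might keep a registry like the following; all class and function names here are hypothetical:

```python
import itertools
from dataclasses import dataclass

@dataclass
class ToolRecord:
    tool_id: str
    description: str
    location: str = "unassigned"
    status: str = "available"

class ToolRegistry:
    """Hypothetical registry mirroring what the administrative client application tracks."""

    def __init__(self):
        self._tools = {}
        self._ids = itertools.count(1)

    def add_tool(self, description, tool_id=None):
        # An administrator may supply an ID, or the system auto-generates one.
        tool_id = tool_id or f"TOOL-{next(self._ids):06d}"
        self._tools[tool_id] = ToolRecord(tool_id, description)
        return tool_id

    def update_from_storage_device(self, tool_id, location, status):
        # Location/status updates arrive from the tool control storage devices 106.
        record = self._tools[tool_id]
        record.location, record.status = location, status

    def get(self, tool_id):
        return self._tools[tool_id]

registry = ToolRegistry()
tid = registry.add_tool("torque wrench, 1/2 in drive")
registry.update_from_storage_device(tid, "Building A/Line 2/ATC3", "checked out")
print(registry.get(tid))
```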
  • the database 104 is a data storage for storing data associated with tools in the automated tool control system and the system users.
  • the database 104 may store data associated with the locations and statuses of the tools.
  • the tool control storage devices 106 each has a processor, a memory, and communications capability.
  • the processor may execute computer instructions stored in memory.
  • the tool control storage device 106 has a data link, such as a wired or wireless link, for exchanging data with the administrative client software application on the computing device 102 and the database 104 .
  • the tool control storage devices 106 transfer and receive data to and from the database 104 via the network.
  • the tool control storage device 106 is a toolbox in some embodiments.
  • the tool control storage devices 106 may more generally be tool lockers or any other secure storage devices or enclosed secure storage areas (e.g., a tool crib or walk-in tool locker).
  • Each of the tool control storage devices 106 is an example of a highly automated inventory control system that utilizes multiple different sensing technologies for identifying inventory conditions of objects in the storage unit.
  • the tool control storage devices 106 use machine imaging or RF sensing methodologies for identifying inventory conditions of objects in the storage unit.
  • Illustrative features include the ability to process complex image data with efficient utilization of system resources, autonomous image and camera calibrations, identification of characteristics of tools from image data, adaptive timing for capturing inventory images, efficient generation of reference data for checking inventory status, autonomous compensation of image quality, etc. Further features include the ability to emit and receive RF sensing signals such as RF identification (RFID) signals, to process the received signals to identify particular tools, and to cross-reference tool information obtained through the multiple different sensing modalities (e.g., camera and RFID based modalities) to provide advanced features.
  • FIGS. 2A and 2B illustrate various exemplary tool control storage devices 106 .
  • the tool control storage device 106 includes a user interface 305 , an access control device 306 , such as a card reader, for verifying identity and authorization levels of a user intending to access tool control storage device 106 , and multiple tool storage drawers 330 for storing tools.
  • the storage system may include shelves, compartments, containers, or other object storage devices from which tools or objects are issued and/or returned, or which contain the storage device from which the objects are issued and/or returned.
  • the storage system includes storage hooks, hangers, toolboxes with drawers, lockers, cabinets with shelves, safes, boxes, closets, vending machines, barrels, crates, and other material storage means.
  • User interface 305 is an input and/or output device of the tool control storage device 106 , configured to display information to a user.
  • Information may include work instructions, tool selection, safety guidelines, torque settings, system and tool status alerts and warnings.
  • the user interface 305 may be configured to display the information in text strings and images in the default language assigned to the user who currently has access to the tool control storage device 106 .
  • the tool control storage device 106 may include speakers as another output device of the tool control storage device 106 for outputting the information.
  • the access control device 306 authenticates a user's authorization for accessing automated tool control system 100 . Specifically, the access control device 306 is used to limit or allow access to the tool storage drawers 330 .
  • the methods and systems used to electronically identify the user requesting access may include any one or more of the following technologies, and others not mentioned, individually or in combination: RFID proximity sensors with cards; magstripe cards and scanners; barcode cards and scanners; common access cards and readers; biometric sensor ID systems, including facial recognition, fingerprint recognition, handwriting analysis, iris recognition, retinal scan, vein matching, voice analysis, and/or multimodal biometric systems.
  • the access control device 306 , through the use of one or more electronically controlled locking devices or mechanisms, keeps some or all of the storage drawers 330 locked in a closed position until the access control device 306 authenticates a user's authorization for accessing the tool control storage device 106 . If the access control device 306 determines that a user is authorized to access the tool control storage device 106 , it unlocks some or all of the storage drawers 330 , depending on the user's authorization level, allowing the user to remove or replace tools. In particular, the access control device 306 may identify predetermined authorized access levels to the system, and allow or deny physical access by the user to the three dimensional space or object storage devices based on those predetermined authorized levels of access.
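  • A minimal sketch of such an authorization check, assuming a hypothetical badge-to-level mapping and drawer permissions (none of these names or values come from the patent):

```python
# Hypothetical access-control tables: credential -> access level, level -> drawers permitted.
AUTHORIZED_USERS = {"badge-1001": "technician", "badge-2002": "supervisor"}
DRAWERS_BY_LEVEL = {"technician": {1, 2, 3}, "supervisor": {1, 2, 3, 4, 5}}

def handle_badge_scan(badge_id, unlock_drawer):
    """Unlock only the drawers permitted for the credential's access level."""
    level = AUTHORIZED_USERS.get(badge_id)
    if level is None:
        return []                      # unknown credential: keep everything locked
    permitted = sorted(DRAWERS_BY_LEVEL[level])
    for drawer in permitted:
        unlock_drawer(drawer)          # stand-in for the electronically controlled lock
    return permitted

print(handle_badge_scan("badge-1001", unlock_drawer=lambda d: None))   # [1, 2, 3]
```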
  • the tool control storage device 106 includes several different sensing subsystems.
  • the tool control storage device 106 includes a first sensing subsystem in the form of an image sensing subsystem configured to capture images of contents or storage locations of the system.
  • the image sensing subsystem may include lens-based cameras, CCD cameras, CMOS cameras, video cameras, or any types of device that captures images.
  • the tool control storage device 106 may further include a second sensing subsystem that, in one example, takes the form of an RFID sensing subsystem including one or more RFID antennas, RFID transceivers, and RFID processors.
  • the RFID sensing subsystem is configured to emit RF sensing signals, receive RFID signals returned from RFID tags mounted on or incorporated in tools or other inventory items in response to the RF sensing signals, and process the received RFID signals to identify individual tools or inventory items.
  • the image sensing subsystem is described in further detail below in relation to FIG. 3B . While FIG. 3B corresponds to the specific embodiment of the tool control storage device 106 shown in FIG. 1 , the teachings illustrated in FIG. 3B can be applied to each of the embodiments of FIG. 1 .
  • the RFID sensing subsystem may be configured to sense RFID tags of tools located in all the storage drawers 330 of the tool control storage device 106 , or configured to sense RFID tags of tools located in a particular subset of the drawers 330 of the tool control storage device 106 .
  • the RFID sensing subsystem is configured to sense RFID tags of tools located only in the top-most and bottom-most drawers 330 of tool control storage device 106 , and the RFID sensing subsystem includes RFID antennas disposed directly above the top-most and bottom-most drawers 330 within tool control storage device 106 to sense RFID tags of tools located in those drawers.
  • Other configurations of RFID antennas can also be used.
  • the tool control storage device 106 further includes a data processing system, such as a computer, for processing images captured by the image sensing device, for processing RFID signals captured by the RFID antennas and transceivers, and/or for processing other sensing signals received by other sensing subsystems.
  • the data processing system includes one or more processors (e.g., micro-processors) and memory storing program instructions for causing the tool control storage device 106 to communicate electronically directly or through a network with sensing devices and obtain data from sensing devices relative to the presence or absence data of objects within the three dimensional space or object storage device.
  • Images, RFID signals, and other sensing signals captured or received by the sensing subsystems are processed by the data processing system for determining an inventory condition of the system or each storage drawer.
  • inventory condition as used throughout this disclosure means information relating to an existence/presence or non-existence/absence condition of objects in the storage system.
  • FIG. 3A shows a detailed view of one drawer 330 of the tool control storage device 106 in an open position.
  • each storage drawer 330 includes a foam base 180 having a plurality of storage locations, such as tool cutouts 181 , for storing tools.
  • Each cutout is specifically contoured and shaped for fittingly receiving a tool with corresponding shapes.
  • Tools may be secured in each storage location by using hooks, Velcro, latches, pressure from the foam, etc.
  • each storage drawer 330 includes multiple storage locations for storing various types of tools.
  • a storage location is a location in a storage system for storing or securing objects.
  • each tool has a specific pre-designated storage location in the tool storage system.
  • one or more tools in the drawer 330 may have an RFID tag mounted or attached thereon.
  • FIG. 3B shows a perspective view of an imaging subsystem in the tool control storage device 106 according to an embodiment.
  • the tool control storage device 106 includes an imaging compartment 315 which houses an image sensing subsystem comprising three cameras 310 and a light directing device, such as a mirror 312 having a reflection surface disposed at about 45 degrees downwardly relative to a vertical surface, for directing light reflected from the drawers 330 to the cameras 310 .
  • the directed light after arriving at the cameras 310 , allows the cameras 310 to form images of the drawers 330 .
  • the shaded area 340 below the mirror 312 represents a viewing field of the imaging sensing subsystem of the tool control storage device 106 .
  • the imaging subsystem scans a portion of an open drawer 336 that passes through the field of view of the imaging sensing subsystem, for example as the drawer 336 is opened and/or closed.
  • the imaging subsystem thereby captures an image of at least that portion of the drawer 336 that was opened. Processing of the captured image is used to determine the inventory conditions of tools and/or storage locations in the portion of the drawer 336 that was opened.
  • the image sensing subsystem captures an image of a particular drawer 330 and performs an inventory of the drawer in response to detecting movement of the particular drawer.
  • the image sensing subsystem may perform an inventory of the drawer in response to detecting that the drawer is closing or has become completely closed.
  • the image sensing subsystem may image the drawer both as it is opening and as it closes.
  • the RF sensing subsystem is generally configured to perform inventory checks of drawers or shelves having RF-based tags associated therewith.
  • the RF-based tags may be RFID tags that are attached to or embedded within the tools.
  • the RF-based tag encodes an identifier unique to the tool, such that both the tool type (e.g., screwdriver, torque wrench, or the like) and the unique tool (e.g., a particular torque wrench from among a plurality of torque wrenches of the same model and type) can be identified from reading the RF-based tag.
  • the information encoded in the RF-based tag is generally unique to the tool such that it can be used to distinguish between two tools that are of a same type, same model, same age, same physical appearance, etc.
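  • As a small illustration of how such an identifier distinguishes otherwise identical tools, the sketch below assumes a hypothetical "TYPE:MODEL:SERIAL" payload layout; the patent only requires that the encoded identifier be unique per tool, not this format:

```python
def parse_tag(payload: str) -> dict:
    """Split a hypothetical tag payload into tool type, model, and serial number."""
    tool_type, model, serial = payload.split(":")
    return {"type": tool_type, "model": model, "serial": serial}

a = parse_tag("TORQUE_WRENCH:TW-250:SN-00017")
b = parse_tag("TORQUE_WRENCH:TW-250:SN-00042")
# Same type and model, yet distinguishable by serial number.
assert a["type"] == b["type"] and a["serial"] != b["serial"]
```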
  • the RF sensing system includes antennas mounted in or around the tool control storage device 106 .
  • the antennas may be mounted inside the tool control storage device 106 and be configured to only detect the presence of RF-based tags that are located within the tool control storage device 106 (or other defined three dimensional space).
  • each antenna may be mounted so as to only detect the presence of RF-based tags that are located within a particular drawer or compartment of the tool control storage device 106 , and different antennas may be associated with and mounted in different drawers or compartments.
  • some antennas may further be configured to detect the presence of RF-based tags in the vicinity of the tool control storage device 106 even if the tags are not located within the tool control storage device 106 .
  • Each antenna is coupled to an RF transceiver that is operative to cause the antenna to emit an RF sensing signal used to excite the RF-based tags located within the vicinity of the antenna, and is operative to sense RF identification signals returned by the RF-based tags in response to the RF sensing signal.
  • One or more RF processors control the operation of the RF transceivers and process the RF identification signals received through the antennas and transceivers.
  • the RF sensing subsystem performs an RF-based scan of the tool control storage device 106 when a drawer or compartment storing tools having RF identification tags is completely closed.
  • the RF-based scan can be performed in response to detecting that the drawer has been completely closed, or performed at any time when the drawer is completely closed.
  • the RF-based scan can also be triggered by a user logging into or logging out of the tool control storage device 106 .
  • an RF-based scan can be performed in response to similar triggers causing a camera-based inventory of the tool control storage device 106 to be performed.
  • the RF processor typically needs to perform multiple sequential scans in order to ensure that all RF-based tags are detected. Specifically, the RF processor generally does not know how many RF tags it needs to detect, since one or more tags may be missing (e.g., if a tool has been checked out). Further, the RF processor cannot generally ensure that all RF tags in its vicinity have been detected in response to a single scan operation (corresponding to the emission of one RF sensing signal, and the processing of any RF identification responses received in response to the one RF sensing signal).
  • the RF processor will generally perform ten, twenty, or more sequential RF-based scans any time an inventory of the tool control storage device 106 is to be performed. Because multiple RF-based scans need to be performed, the RF scanning operation may require 10 or more seconds to be performed, resulting in significant inconvenience to users of the tool control storage device 106 .
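  • As a rough sketch of why standalone RF scanning is slow: with no expected tag count, the processor can only repeat interrogations a fixed number of times. The loop below is illustrative only; the repeat count and function names are assumptions, not the patent's values:

```python
def rf_only_inventory(rf_scan, repeats=20):
    """Naive RF-only inventory: without an expected tag count, every inventory
    pass runs a fixed number of sequential interrogations."""
    detected = set()
    for _ in range(repeats):
        detected |= rf_scan()     # one interrogation; any single pass may miss tags
    return detected
```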
  • imaging-based inventory scans of the tool control storage device 106 have the disadvantage that they cannot distinguish between physically identical tools.
  • RF-based scans of the tool control storage device 106 may suffer from significant delay, and cannot determine if an RF tag alone (instead of an RF tag attached to its associated tool) has been returned to the drawer or storage compartment. Both scanning methodologies, when used alone, are thus susceptible to fraud (by using a tool cut-out, or using an RFID tag removed from the tool) and inconvenience.
  • each technology may not be suitable for inventorying all tools in a particular tool control storage device 106 ; for example, some tools may be too small to have an RF-based tag mounted thereon, or attaching of such a tag to the tool may cause the tool to be unwieldy. The inventory of such tools may thus be better suited to visual-scanning methodologies even in tool control storage devices 106 capable of RF-based sensing.
  • the tool control storage device 106 advantageously uses multiple scanning methodologies in combination in some embodiments.
  • the tool control storage device 106 may first perform a first inventory scan based on an image-based scan to obtain a quick (e.g., near instantaneous) determination of whether any tools are missing from the tool control storage device 106 based on the image-based scan alone.
  • the result of the first inventory scan is additionally used to determine how many RF-based tags are expected to be in the tool control storage device 106 .
  • the first inventory scan is used to determine that ‘n’ tools having associated RF tags are missing from the tool control storage device 106 .
  • the first inventory scan is then used to determine that ‘m-n’ RF-based tags should be searched for using the second inventory scan (e.g., an RF-based scan), where ‘m’ is the total number of RF-tagged tools assigned to the tool control storage device 106 .
  • the second inventory scan (e.g., an RF-based scan) is performed a single time, and only needs to be repeated if less than ‘m-n’ RF-based tags are detected by the first iteration of the second inventory scan (e.g., the RF-based scan).
  • the second inventory scan can be completed very efficiently—notably in situations in which only one or a few secondary scans are needed to detect all of the ‘m-n’ RF-based tags that are expected to be detected in the tool control storage device 106 .
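  • A minimal sketch of this two-stage scan, assuming hypothetical `image_scan` and `rf_scan` callables that stand in for the sensing subsystems (not the patent's APIs):

```python
def hybrid_inventory(image_scan, rf_scan, m_total_tagged, max_repeats=20):
    """Two-stage inventory: the image scan first estimates how many tagged tools
    are missing (n), then RF interrogations repeat only until m - n tags are seen."""
    missing_by_image = image_scan()            # set of tool IDs absent per the camera
    expected_tags = m_total_tagged - len(missing_by_image)

    detected, repeats = set(), 0
    while len(detected) < expected_tags and repeats < max_repeats:
        detected |= rf_scan()                  # one RF interrogation per iteration
        repeats += 1
    return {"missing_by_image": missing_by_image,
            "tags_detected": detected,
            "rf_repeats": repeats}

# Toy demo: 12 tagged tools assigned (m = 12); the camera sees 2 missing; the first
# RF interrogation already returns all 10 remaining tags, so no repeat is needed.
result = hybrid_inventory(
    image_scan=lambda: {"TW-SN-0007", "SCR-SN-0031"},
    rf_scan=lambda: {f"TAG-{i:02d}" for i in range(10)},
    m_total_tagged=12,
)
print(result["rf_repeats"])   # 1
```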
  • an inventory cross-check is performed between the results of the first and second inventory scans to ensure that the results of the two scans are consistent. Specifically, the inventory cross-check is performed to ensure that both inventory scans have identified the same tools as being present in the tool control storage device 106 and have identified the same tools as being absent from the tool control storage device 106 . User alerts are issued if the results of the two inventory scans are not consistent with each other.
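  • The cross-check itself can be as simple as a set comparison; the sketch below is illustrative, with `alert` standing in for whatever user-alert mechanism the system uses:

```python
def cross_check(present_by_image: set, present_by_rf: set, alert) -> bool:
    """Flag any tool that one sensing modality reports present and the other reports absent."""
    only_image = present_by_image - present_by_rf
    only_rf = present_by_rf - present_by_image
    if only_image or only_rf:
        alert(f"Inventory mismatch: image-only={only_image}, rf-only={only_rf}")
        return False
    return True

cross_check({"A", "B", "C"}, {"A", "B"}, alert=print)   # reports tool "C" as inconsistent
```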
  • the sensing technologies and sensing devices used in the tool control storage device 106 can include one or more of:
  • a physically defined, secure three dimensional object storage device is provided.
  • the storage device is the container from which tools and/or objects are issued and/or returned.
  • the physically defined, secure three dimensional object storage device is equipped with a processor and software operative to cause the device to communicate electronically directly or through a network with sensing devices and to obtain data from sensing devices indicating the presence or absence data of objects within the three dimensional object storage device.
  • the sensing devices used within the three dimensional object storage device include machine vision identification devices such as cameras or RFID antennas and decoders.
  • the physically defined, secure three dimensional object storage device is equipped with an electronically controlled locking mechanism, along with an access control device including a processor and software means to electronically identify a user requesting access to the secure area or object storage device in some embodiments.
  • the processor and software identify predetermined authorized access levels to the system, and allow or deny physical access by the user to the three dimensional space or object storage devices based on those predetermined authorized levels of access.
  • the access control device used to electronically identify the user requesting access uses RFID proximity sensors with cards in some embodiments.
  • the physically defined, secure object storage device is equipped with drawers.
  • At least one RFID antenna is attached inside the storage device and is configured for scanning for RFID tags within the storage device. In embodiments with multiple RFID antennas, different RFID antennas may be distributed throughout the storage device.
  • a user scans or approaches an access card to the access control device of the storage device.
  • the processor of the access control device determines an access level of the user based on the access card. If the user is determined to be authorized for access to the storage device, the authorized user gains access to the object storage device.
  • the sensing subsystems and data processing system of the storage device are activated. Light emitting diodes (LEDs) used for providing light to the system are activated, and cameras are activated.
  • the latch of the storage system is unlocked, and the user opens one or more drawers and removes or returns one or more objects.
  • the RFID scanning subsystem need not be activated and the system can use only imaging data.
  • the imaging subsystem is used to optionally image the drawer as it opens and to image the drawer as it is closed (or once it is closed), and object presence and absence is determined using only the captured images.
  • a camera-based scan of the drawer is optionally performed prior to or as the drawer opens. Additionally, the RFID sensing subsystem is activated and an RFID scan may be completed prior to opening the drawer to identify all RFID tags present in the storage system (or all RFID tags present in the drawer being opened). Specifically, an RFID scan is optionally performed prior to opening of the drawer. Additionally, a camera-based scan of the drawer is performed as the drawer closes. In response to the drawer being fully closed, or in response to the user logging out of the storage system, an RFID scan of the drawer or box is performed.
  • the imaging subsystem determines and reports object presence and absence in the drawer, and the RFID subsystem scan confirms presence and absence of the specific objects in the drawer or box using the RFID tag data.
  • imaging data and RFID tag data are combined to report presence and absence of all scanned tools, plus presence or absence of serialized items through use of RFID data.
  • the inventory scan results are depicted on a display.
  • object status is transmitted via network to a primary database and/or to an administrative application. LED lights are turned off, the lock is engaged, and cameras are set in idle state.
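  • A sketch of this access-session flow, where `dev` stands in for a hypothetical device interface; all method names are illustrative, not the patent's API:

```python
def run_access_session(dev):
    """One access session: unlock, scan on drawer events, report, and re-secure."""
    dev.leds_on(); dev.cameras_on(); dev.unlock()
    while dev.session_active():
        event = dev.next_event()            # e.g. "drawer_opened", "drawer_closed"
        if event == "drawer_opened":
            dev.camera_scan()               # optional image scan as the drawer opens
        elif event == "drawer_closed":
            dev.camera_scan()               # image-based presence/absence check
            dev.rfid_scan()                 # confirm specific serialized items
    dev.display_results()                   # depict inventory scan results
    dev.transmit_status()                   # send object status to database / admin app
    dev.leds_off(); dev.lock(); dev.cameras_idle()
```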
  • the storage system can perform other actions. For example, the system can activate or initiate an RFID scan on the contents of the object storage device on a scheduled or timed basis between user accesses and thereby confirm that the contents of the storage device have not changed since the last user access.
  • an automated asset management system, such as a toolbox, may use both camera-based and radio-frequency (RF) based sensing technologies to sense the presence and/or other attributes of a particular tool (or of multiple tools).
  • the camera-based sensing may provide an instantaneous (or near-instantaneous) indication of whether the particular tool is present in or absent from the system.
  • the RF-based sensing may enable the system to differentiate between multiple tools that are identical to the camera-based sensing module (e.g., similar torque wrenches), for example by distinguishing between the tools' serial numbers (or other unique identifiers) or other unique tool identifiers encoded in an RF-based tag.
  • the automated asset management system may be configured to more efficiently perform RF-based sensing by leveraging the combined use of the camera-based and RF-based sensing modalities as described in more detail below.
  • the scan data can be used to identify whether a specific tool (from among multiple similar tools) has been checked out or checked back in to the tool control storage device 106 .
  • the scan data can thus be used to determine how many times a particular tool has been checked out, and/or for how long a duration the particular tool has been checked out.
  • the tool control storage device 106 can thus determine whether the particular tool should be scheduled for a re-calibration or other upkeep, for example.
  • the tool control storage device 106 can thus individually track the usage of different torque wrenches and ensure that each torque wrench is recalibrated after a certain number of uses.
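  • A minimal sketch of such per-tool usage tracking; the threshold and names below are illustrative assumptions, not values from the patent:

```python
from collections import defaultdict

CHECKOUTS_BEFORE_RECAL = 50          # illustrative policy threshold
checkout_counts = defaultdict(int)

def record_checkout(serial: str) -> bool:
    """Count checkouts per individual tool (identified by its unique tag serial)
    and report whether that tool is now due for recalibration."""
    checkout_counts[serial] += 1
    return checkout_counts[serial] >= CHECKOUTS_BEFORE_RECAL

print(record_checkout("TW-250/SN-00017"))   # False until the 50th checkout
```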
  • the inventory performed by the tool control storage device 106 using multiple sensing technologies can be used to identify the individual user who received and/or returned the object/tool, identify the object/tool which is being issued or returned, place a time stamp on each transaction within the system, and store the item and user data in a database.
  • the processor and memory storing executable software program instructions of the tool control storage device 106 can exchange data with the administrative software application (e.g., computing device 102 ).
  • Data may include tool issue and return data, tool statuses, user access, work location, and other data related to tool transactions and usage.
  • sensing technologies allow the usage of the tools to be tracked.
  • the tracked usage of a tool indicates that the tool needs attention (e.g., repair, calibration, replacement, etc.)
  • the tool may be removed from the worksite for repair.
  • managing the repair process is prone to delays, resulting in inefficiencies because of logistical challenges.
  • the repair process is automatically managed.
  • FIG. 4A shows a schematic diagram illustrating an exemplary hierarchy structure 400 A of automated tool control system 100 .
  • the hierarchy structure of automated tool control system 100 includes locations including Root, Cal Lab, Production/Maintenance, Test, Building A, Building B, Line 1, Line 2, Station 1, Station 2, Tool Crib, and multiple automated tool control devices ATC1-9 and AC1-9 (e.g., tool control storage devices 106 ).
  • Each of the locations within the hierarchy is assigned attributes.
  • attributes may be devices in the location, employees/users assigned to the devices in the location, objects/tools stored in the devices in the location, and object statuses assigned to the objects/tools in the location.
  • the locations may be a bin or a storage device having the function of accepting and temporarily storing objects with status, automatically distributing alerts to employees who can fix the issues indicated in the status, and then ensuring the repaired or replacement object is returned to the appropriate storage location.
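  • As a hypothetical sketch of the hierarchy of FIG. 4A, each location node could carry its attributes and sublocations; the data layout below is an assumption for illustration only:

```python
from dataclasses import dataclass, field

@dataclass
class Location:
    """Illustrative node in the location hierarchy: devices, assigned users,
    object statuses, and sublocations are the attributes described above."""
    name: str
    devices: list = field(default_factory=list)
    users: list = field(default_factory=list)
    statuses: dict = field(default_factory=dict)    # object ID -> assigned status
    children: list = field(default_factory=list)

root = Location("Root", children=[
    Location("Production/Maintenance", children=[
        Location("Building A", children=[
            Location("Line 2", devices=["ATC3"], users=["user A"]),
        ]),
    ]),
    Location("Cal Lab", children=[
        Location("Incoming"), Location("In Process"), Location("Outgoing"),
    ]),
])
```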
  • objects may include any item stored and managed in an automated storage device or tool crib that can be issued to a manufacturing, service and maintenance and repair environments.
  • Objects may include tools, measuring tools, torque wrenches, pneumatic and electric power tools.
  • objects may include jigs, fixtures, clamps and other work and tool holding devices.
  • objects include machining tools, such as drill bits, reamers, boring tools, and end and side mills, and abrasive devices, such as grinding wheels, whetstones, files, sanding devices, and the like.
  • objects may be electric and electronic devices, such as meters, oscilloscopes, laptops, tablets, computers, hand held scanners, welding equipment, and the like.
  • Statuses of the objects/tools may be assigned by users via a user interface 305 of a tool control storage device 106 .
  • statuses may include broken, needs inspection, out of calibration, out for repair, replacement requested, and needs sharpening (e.g., for drills and other cutting tools).
  • the statuses are not limited to the foregoing list of statuses.
  • Other statuses may be assigned to describe the status of the tools in the tool control storage devices 106 .
  • a user A logs into a tool control storage device ATC3 in a storage location Building A/Line 2. After the user A logs into the tool control storage device ATC3, the user A checks out a torque wrench from the tool control storage device ATC3.
  • direct camera imaging of the object's characteristics, imaging of colored tags on the object, RFID tags, bar codes, and bar code readers may be used to identify that the torque wrench is checked out.
  • a container may include a tube or a sleeve that accommodates the item to be repaired or replaced.
  • the identifying mark may include visual marks, such as barcodes or QR codes.
  • user A may use the user interface 305 of the tool control storage device 106 (e.g., ATC 3 ) to associate the torque wrench with the identifying mark on the container.
  • the torque wrench in the container is deposited in a locked container corresponding to an Out of Cal location.
  • the torque wrench is automatically assigned an “Out of Calibration” status.
  • the tool control storage device 106 which monitors the usage of the stored tool may assign a status to the stored tool based on the historical data of the stored tool. For example, once the torque wrench is deposited in the locked container or when the status of the torque wrench indicates “Out of Calibration”, an alert is automatically distributed to a runner (e.g., a human or a robot). For example, the alert to the runner may be automatically distributed in response to the locked container sensing the torque wrench being deposited. In some embodiments, the runner may be automatically notified when the torque wrench is assigned the status of “Out of Calibration”.
  • the Out of Cal location is configured with instructions to automatically distribute an alert signal to users associated with the torque wrench, system administrators, and other related locations in the hierarchy.
  • the related locations may include the Cal Lab location.
  • user A and Cal Lab technicians may receive the alert signal from the Out of Cal location.
  • alert signals may contain information regarding the tool identification, the currently assigned status, the Out of Cal location, and process instructions associated with the assigned status. The alert signals may include additional information that may be useful for repair/replace processes.
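  • A minimal sketch of the deposit-to-alert flow described above for the Out of Cal location; the subscriber table, field names, and `notify` callback are hypothetical, not the patent's interfaces:

```python
# Hypothetical subscriber table: which parties a predefined location notifies.
SUBSCRIBERS = {"Out of Cal": ["runner", "user A", "Cal Lab technicians"]}

def on_deposit(location, tool_id, container_mark, notify):
    """Assign a status when the location's sensor detects a deposit and fan out
    alert signals to the recipients configured for that location."""
    status = "Out of Calibration" if location == "Out of Cal" else "Deposited"
    for recipient in SUBSCRIBERS.get(location, []):
        notify(recipient, {
            "tool_id": tool_id,
            "status": status,
            "location": location,
            "container": container_mark,     # identifying mark on the container
            "instructions": f"Process per {status} procedure",
        })
    return status

on_deposit("Out of Cal", "TW-250/SN-00017", "QR-7731",
           notify=lambda recipient, alert: print(recipient, alert))
```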
  • In response to receiving the alert, the runner is dispatched to pick up the torque wrench from the storage location Building A/Line 2.
  • the runner or the user A scans the identifying mark on the container and confirms the content of the container is what was input by the user A and recorded in the system.
  • the runner may issue or check the uncalibrated torque wrench out of the Out of Cal location and check the torque wrench into the Cal Lab location.
  • the uncalibrated torque wrench is then processed by Cal Lab technicians using standard procedures established by the Cal Lab team and tracked in sublocations inside the Cal Lab location.
  • the recalibrated torque wrench is then issued out of the Cal Lab location and returned to the tool control storage device ATC3 in a storage location Building A/Line 2.
  • Each transaction is recorded in a log that captures tool identification data, time stamps, technician/user ID, and the like for each activity.
  • the transactional data are stored in the database 104 .
  • the transactional data may be viewed from the administrative client software application on the computing device 102 or from the user interface 305 of the tool control storage device 106 .
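  • One transaction record of the kind described above might look like the sketch below; the field names and in-memory list (standing in for database 104) are illustrative assumptions:

```python
from datetime import datetime, timezone

transaction_log = []   # stand-in for database 104

def log_transaction(tool_id, user_id, from_loc, to_loc, activity):
    """Append one transaction: tool identification, user ID, locations, activity, time stamp."""
    entry = {
        "tool_id": tool_id,
        "user_id": user_id,
        "from": from_loc,
        "to": to_loc,
        "activity": activity,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    transaction_log.append(entry)
    return entry

log_transaction("TW-250/SN-00017", "runner-07", "Out of Cal", "Cal Lab/Incoming", "check-in")
```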
  • FIG. 4B shows a schematic diagram illustrating an exemplary hierarchy structure 400 B of automated tool control system 100 .
  • the hierarchy structure 400 B includes locations Root, Production/Maintenance, Building A, Line 2, and Aircraft #1.
  • an alert may be automatically distributed to a runner to facilitate a pick-up of the object A from the Broken Bin location.
  • the object A moves through the sublocations Incoming, In Process, and Outgoing within the Cal Lab location.
  • the object A is placed in a Return Bin location.
  • an alert may be automatically distributed to a runner who, in response to the alert, will pick up the object A from the Return Bin location and return the object A to the tool control storage device in the Line 2 location.
  • an object B (not illustrated) in a tool control storage device in the Aircraft #1 location is determined to be broken.
  • the object B is placed in the Broken Bin location as described above.
  • the object B is automatically assigned a status “broken”.
  • an alert may be automatically distributed to a runner to facilitate a pick-up of the object B from the Broken Bin location.
  • the object B moves through the sublocations Incoming, In Process, and Outgoing within the Repair Lab location.
  • the object B is placed in a Return Bin location where the object B placed in the Return Bin location is returned to the tool control storage device in the Aircraft #1 location.
  • alert signals are automatically distributed when the objects (e.g., object A and object B) move from one location to another location.
  • the hierarchy structures 400 A and 400 B may include the “Status Remediation” location, where the status assigned to objects could be cleared by a user with appropriate access rights to that location while the object remains in the “Status Remediation” location.
  • Registering and recording items deposited in the “Status Remediation” location could be a manual data entry by the technician or tool crib attendant.
  • the technicians and tool crib attendants may enter data via the user interface 305 for the objects issued to the technician currently in possession of the item; the status may be assigned, and the electronic inventory transaction from the tool user to the Status Remediation location may be viewed.
  • the item transactions and transfers may be automatically registered and recorded in the database 104 .
  • FIG. 5 conceptually illustrates an exemplary electronic system 500 with which some implementations of the subject technology can be implemented.
  • the computing device 102 and the tool control storage devices 106 may be, or may include all or part of, the electronic system components that are discussed below with respect to the electronic system 500 .
  • the electronic system 500 can be a computer, phone, personal digital assistant (PDA), or any other sort of electronic device.
  • Such an electronic system includes various types of computer readable media and interfaces for various other types of computer readable media.
  • the electronic system 500 includes a bus 508 , processing unit(s) 512 , a system memory 504 , a read-only memory (ROM) 510 , a permanent storage device 502 , an input device interface 514 , an output device interface 506 , and a network interface 516 .
  • the bus 508 collectively represents all system, peripheral, and chipset buses that communicatively connect the numerous internal devices of the electronic system 500 .
  • the bus 508 communicatively connects the processing unit(s) 512 with the ROM 510 , system memory 504 , and permanent storage device 502 .
  • the processing unit(s) 512 retrieves instructions to execute and data to process in order to execute the processes of the subject disclosure.
  • the processing unit(s) can be a single processor or a multi-core processor in different implementations.
  • the ROM 510 stores static data and instructions that are needed by the processing unit(s) 512 and other modules of the electronic system.
  • the permanent storage device 502 is a read-and-write memory device. This device is a non-volatile memory unit that stores instructions and data even when the electronic system 500 is off. Some implementations of the subject disclosure use a mass-storage device (for example, a magnetic or optical disk, or flash memory) as the permanent storage device 502 .
  • the system memory 504 is a read-and-write memory device. However, unlike the storage device 502 , the system memory 504 is a volatile read-and-write memory, such as a random access memory.
  • the system memory 504 stores some of the instructions and data that the processor needs at runtime. In some implementations, the processes of the subject disclosure are stored in the system memory 504 , the permanent storage device 502 , or the ROM 510 .
  • the various memory units include instructions for displaying graphical elements and identifiers associated with respective applications, receiving a predetermined user input to display visual representations of shortcuts associated with respective applications, and displaying the visual representations of shortcuts.
  • the processing unit(s) 512 retrieves instructions to execute and data to process in order to execute the processes of some implementations.
  • the bus 508 also connects to the input and output device interfaces 514 and 506 .
  • the input device interface 514 enables the user to communicate information and select commands to the electronic system.
  • Input devices used with the input device interface 514 include, for example, alphanumeric keyboards and pointing devices (also called “cursor control devices”).
  • the output device interface 506 enables, for example, the display of images generated by the electronic system 500 .
  • Output devices used with the output device interface 506 include, for example, printers and display devices, for example, cathode ray tubes (CRT) or liquid crystal displays (LCD). Some implementations include devices, for example, a touchscreen that functions as both input and output devices.
  • the bus 508 also couples the electronic system 500 to a network (not shown) through a network interface.
  • the computer can be a part of a network of computers (for example, a LAN, a WAN, or an Intranet, or a network of networks, for example, the Internet). Any or all components of the electronic system 500 can be used in conjunction with the subject disclosure.
  • In one or more implementations, some of the processes described above are implemented as software processes specified as a set of instructions recorded on a computer readable storage medium (also referred to as a computer readable medium). When these instructions are executed by one or more processing unit(s) (e.g., one or more processors, cores of processors, or other processing units), they cause the processing unit(s) to perform the actions indicated in the instructions.
  • Examples of computer readable media include, but are not limited to, magnetic media, optical media, electronic media, etc.
  • the computer readable media do not include carrier waves and electronic signals passing wirelessly or over wired connections.
  • the term “software” is meant to include, for example, firmware residing in read-only memory or other form of electronic storage, or applications that may be stored in magnetic storage, optical, solid state, etc., which can be read into memory for processing by a processor.
  • multiple software aspects of the subject disclosure can be implemented as sub-parts of a larger program while remaining distinct software aspects of the subject disclosure.
  • multiple software aspects can also be implemented as separate programs.
  • any combination of separate programs that together implement a software aspect described here is within the scope of the subject disclosure.
  • the software programs when installed to operate on one or more electronic systems, define one or more specific machine implementations that execute and perform the operations of the software programs.
  • a computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a standalone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment.
  • a computer program may, but need not, correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code).
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • Some implementations include electronic components, for example, microprocessors, storage, and memory that store computer program instructions in a machine-readable or computer-readable medium (alternatively referred to as computer-readable storage media, machine-readable media, or machine-readable storage media).
  • computer-readable media include RAM, ROM, read-only compact discs (CD-ROM), recordable compact discs (CD-R), rewritable compact discs (CD-RW), read-only digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), a variety of recordable/rewritable DVDs (e.g., DVD-RAM, DVD-RW, DVD+RW, etc.), flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic or solid state hard drives, read-only and recordable Blu-Ray® discs, ultra-density optical discs, any other optical or magnetic media, and floppy disks.
  • the computer-readable media can store a computer program that is executable by at least one processing unit and includes sets of instructions for performing various operations.
  • Examples of computer programs or computer code include machine code, for example, produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
  • In some implementations, integrated circuits, such as application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs), execute instructions that are stored on the circuit itself.
  • the terms “computer”, “server”, “processor”, and “memory” all refer to electronic or other technological devices. These terms exclude people or groups of people.
  • the terms “display” or “displaying” mean displaying on an electronic device.
  • the terms “computer readable medium” and “computer readable media” are entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.
  • implementations of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT or LCD monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
  • Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components.
  • the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (LAN) and a wide area network (WAN), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
  • the computing system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device).
  • Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
  • any specific order or hierarchy of steps in the processes disclosed is an illustration of example approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged, or that all illustrated steps be performed. Some of the steps may be performed simultaneously. For example, in certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.

Abstract

The present application describes an automated inventory control system that comprises one or more storage devices containing a plurality of storage locations for storing objects, as well as first and second predefined locations for receiving one or more objects. The first and second predefined locations each include a sensing system configured to sense when an object is deposited at the respective predefined location. One or more processors are configured to automatically assign a first status to the object and cause transmission of an alert indicating the first status of the deposited object when an object is deposited at the first predefined location. The one or more processors are further configured to track a plurality of transactions associated with the deposited object after a user checks the deposited object out of the first predefined location.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of U.S. Provisional Application No. 62/868,818, filed on Jun. 28, 2019, in the U.S. Patent and Trademark Office, the disclosure of which is incorporated by reference herein in its entirety.
  • TECHNICAL FIELD
  • The present subject matter relates to automated tool control systems, and to techniques and equipment to manage objects in automated tool control systems.
  • BACKGROUND
  • When tools are used in a manufacturing or service environment, it is important to monitor tool status. In the aerospace industry, for instance, it is important to ensure that tools are properly maintained to perform precise and accurate manufacturing, assembly, or repairing of aircraft or missile parts. However, tools may experience breakage, out of calibration conditions, and other failures requiring inspection, repair, calibration, or replacement of the tools. It is important to efficiently remove the tools experiencing some types of failures from the worksite and take actions needed to return the tools to the worksite.
  • Currently, the process of removing tools needing maintenance from the worksite, repairing the tools, and returning the tools to the worksite relies on manual methods. For instance, when a user notices that a tool is broken, the user may order repair of the tool. A repair person may pick up the tool for repair from the worksite. However, if the user is not present to hand the tool to the repair person and fails to share the information about the repair with his or her colleagues, the wrong tool may be picked up by the repair person, causing a delay in the repair process.
  • Normally, there are multiple entities involved in processing repairs on tools. For example, the person picking up tools for repair from worksites may be different from the person performing the actual repair on the tools. Further, the person returning the repaired tools to the worksites may be different from either of the two persons involved in the repair process. Delays in communication amongst the people involved in the repair process lead to further delays in the repair process, and miscommunication may delay the process even more.
  • As such, manual methods for managing the repair processes are prone to delays, resulting in inefficiencies. Managing the repair processes already presents logistical challenges, and relying on manual methods may further result in higher costs, lost time, and excessive waste.
  • Accordingly, there is a need for an improved system that enables efficiently removing items from worksites, processing necessary steps as required, and returning the items to the worksites.
  • DESCRIPTION
  • In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant teachings. However, it should be apparent to those skilled in the art that the present teachings may be practiced without such details. In other instances, well known methods, procedures, components, and/or circuitry have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present teachings.
  • To address the issues described in the Background, automated tool control systems have been developed which automatically manage objects, such as tools, according to the status assigned to the object. The various systems and methods disclosed herein relate to automated tool control systems that manage objects with assigned statuses.
  • Reference now is made in detail to the examples illustrated in the accompanying drawings and discussed below.
  • FIG. 1 illustrates an exemplary automated tool control system 100 according to example aspects of the subject technology. The automated control system 100 includes a computing device 102, a database 104, tool control storage devices 106A, 106B, and 106C (hereinafter collectively referred to as “tool control storage devices 106”), and a network 108. In some aspects, the automated control system 100 can have more or fewer computing devices (e.g., 102), databases (e.g., 104), and/or tool control storage devices (e.g., 106A, 106B, and 106C) than those shown in FIG. 1.
  • The computing device 102 can represent various forms of processing devices that have a processor, a memory, and communications capability. The processor may execute computer instructions stored in memory. The computing device 102 is configured to communicate with the database 104 and the tool control storage devices 106 via the network 108. By way of non-limiting example, processing devices can include a desktop computer, a laptop computer, a handheld computer, a personal digital assistant (PDA), or a combination of any of these processing devices or other processing devices.
  • The computing device 102 may have applications installed thereon. For example, the applications may include an administrative client software application for automatically controlling and managing the identifications of tools, the locations of the tools, and status of the tools. In some embodiments, the administrative client software application may be used to manipulate and store data and store and display information relative to the data to system users.
  • The administrative client software application may associate an identification (ID) with each tool. For example, when a tool is added to the automated tool control system, the tool may be assigned an ID. The system administrator may assign an ID to the tool, or the automated tool control system may auto-generate an ID and assign the ID to the tool. As will be described below in further detail, the administrative client software application may track the locations of the tools and the statuses of the tools based on the data from the tool control storage devices 106.
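  • As a minimal illustration of this ID assignment (a sketch only; the function name and the UUID-based auto-generated format below are assumptions, since the disclosure does not specify how IDs are generated), a Python fragment might look as follows:

    # Hypothetical ID assignment: use an administrator-supplied ID when one is
    # given, otherwise auto-generate an ID. The UUID-based format is an assumption.
    import uuid
    from typing import Optional

    def assign_tool_id(tool_name: str, admin_supplied_id: Optional[str] = None) -> str:
        if admin_supplied_id:
            return admin_supplied_id
        return f"{tool_name}-{uuid.uuid4().hex[:8]}"

    # e.g. assign_tool_id("torque_wrench") -> "torque_wrench-3f9a1c27" (random suffix)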
  • The database 104 is a data storage for storing data associated with tools in the automated tool control system and the system users. The database 104 may store data associated with the locations and statuses of the tools.
  • The tool control storage devices 106 (i.e., 106A, 106B, and 106C) each have a processor, a memory, and communications capability. The processor may execute computer instructions stored in memory. Each tool control storage device 106 has a data link, such as a wired or wireless link, for exchanging data with the administrative client software application on the computing device 102 and the database 104. The tool control storage devices 106 transfer and receive data to and from the database 104 via the network 108.
  • The tool control storage device 106 is a toolbox in some embodiments. The tool control storage devices 106 may more generally be tool lockers or any other secure storage devices or enclosed secure storage areas (e.g., a tool crib or walk-in tool locker). Each of the tool control storage devices 106 is an example of a highly automated inventory control system that utilizes multiple different sensing technologies for identifying inventory conditions of objects in the storage unit. In one example, the tool control storage devices 106 use machine imaging or RF sensing methodologies for identifying inventory conditions of objects in the storage unit.
  • Illustrative features include the ability to process complex image data with efficient utilization of system resources, autonomous image and camera calibrations, identification of characteristics of tools from image data, adaptive timing for capturing inventory images, efficient generation of reference data for checking inventory status, autonomous compensation of image quality, etc. Further features include the ability to emit and receive RF sensing signals such as RF identification (RFID) signals, to process the received signals to identify particular tools, and to cross-reference tool information obtained through the multiple different sensing modalities (e.g., camera and RFID based modalities) to provide advanced features.
  • FIGS. 2A and 2B illustrate various exemplary tool control storage devices 106. The tool control storage device 106 includes a user interface 305, an access control device 306, such as a card reader, for verifying identity and authorization levels of a user intending to access tool control storage device 106, and multiple tool storage drawers 330 for storing tools. Instead of drawers 330, the storage system may include shelves, compartments, containers, or other object storage devices from which tools or objects are issued and/or returned, or which contain the storage device from which the objects are issued and/or returned. In further examples, the storage system includes storage hooks, hangers, toolboxes with drawers, lockers, cabinets with shelves, safes, boxes, closets, vending machines, barrels, crates, and other material storage means.
  • User interface 305 is an input and/or output device of the tool control storage device 106, configured to display information to a user. Information may include work instructions, tool selection, safety guidelines, torque settings, and system and tool status alerts and warnings. For instance, the user interface 305 may be configured to display the information as text strings and images in the default language assigned to the user who currently has access to the tool control storage device 106. Although not illustrated in FIGS. 2A and 2B, the tool control storage device 106 may include speakers as another output device for outputting the information.
  • The access control device 306 authenticates a user's authorization for accessing automated tool control system 100. Specifically, the access control device 306 is used to limit or allow access to the tool storage drawers 330. The methods and systems used to electronically identify the user requesting access may include any one or more of the following technologies, and others not mentioned, individually or in combination: RFID proximity sensors with cards; magstripe cards and scanners; barcode cards and scanners; common access cards and readers; biometric sensor ID systems, including facial recognition, fingerprint recognition, handwriting analysis, iris recognition, retinal scan, vein matching, voice analysis, and/or multimodal biometric systems.
  • The access control device 306, through the use of one or more electronically controlled locking devices or mechanisms, keeps some or all of the storage drawers 330 locked in a closed position until the access control device 306 authenticates a user's authorization for accessing the tool control storage device 106. If the access control device 306 determines that a user is authorized to access the tool control storage device 106, it unlocks some or all of the storage drawers 330, depending on the user's authorization level, allowing the user to remove or replace tools. In particular, the access control device 306 may identify predetermined authorized access levels to the system, and allow or deny physical access by the user to the three dimensional space or object storage devices based on those predetermined authorized levels of access.
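  • A short Python sketch of this authorization check is given below; the role names, drawer assignments, and lock-controller interface are hypothetical assumptions used only to illustrate the allow-or-deny behavior described above.

    # Hypothetical access-control flow: look up the authorization level for the
    # scanned card and unlock only the drawers that level permits.
    AUTHORIZED_DRAWERS = {
        "technician": {1, 2, 3},
        "supervisor": {1, 2, 3, 4, 5},
        "administrator": {1, 2, 3, 4, 5, 6},
    }

    def unlock_for_user(card_id, badge_registry, lock_controller):
        """Return the drawers unlocked for the user, or an empty list if access is denied."""
        role = badge_registry.get(card_id)          # None if the card is unknown
        if role is None:
            return []                               # access denied; everything stays locked
        allowed = sorted(AUTHORIZED_DRAWERS.get(role, set()))
        for drawer in allowed:
            lock_controller.unlock(drawer)
        return allowed

    # Example with stand-in objects:
    class PrintLockController:
        def unlock(self, drawer):
            print(f"drawer {drawer} unlocked")

    unlock_for_user("card-1234", {"card-1234": "technician"}, PrintLockController())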
  • The tool control storage device 106 includes several different sensing subsystems. In an illustrative example, the tool control storage device 106 includes a first sensing subsystem in the form of an image sensing subsystem configured to capture images of contents or storage locations of the system. The image sensing subsystem may include lens-based cameras, CCD cameras, CMOS cameras, video cameras, or any types of device that captures images. The tool control storage device 106 may further include a second sensing subsystem that, in one example, takes the form of an RFID sensing subsystem including one or more RFID antennas, RFID transceivers, and RFID processors. The RFID sensing subsystem is configured to emit RF sensing signals, receive RFID signals returned from RFID tags mounted on or incorporated in tools or other inventory items in response to the RF sensing signals, and process the received RFID signals to identify individual tools or inventory items.
  • The image sensing subsystem is described in further detail below in relation to FIG. 3B. While FIG. 3B corresponds to the specific embodiment of the tool control storage device 106 shown in FIG. 1, the teachings illustrated in FIG. 3B can be applied to each of the embodiments of FIG. 1. The RFID sensing subsystem may be configured to sense RFID tags of tools located in all the storage drawers 330 of the tool control storage device 106, or configured to sense RFID tags of tools located in a particular subset of the drawers 330 of the tool control storage device 106. In one example, the RFID sensing subsystem is configured to sense RFID tags of tools located only in the top-most and bottom-most drawers 330 of tool control storage device 106, and the RFID sensing subsystem includes RFID antennas disposed directly above the top-most and bottom-most drawers 330 within tool control storage device 106 to sense RFID tags of tools located in those drawers. Other configurations of RFID antennas can also be used.
  • The tool control storage device 106 further includes a data processing system, such as a computer, for processing images captured by the image sensing device, for processing RFID signals captured by the RFID antennas and transceivers, and/or for processing other sensing signals received by other sensing subsystems. The data processing system includes one or more processors (e.g., micro-processors) and memory storing program instructions for causing the tool control storage device 106 to communicate electronically directly or through a network with sensing devices and obtain data from sensing devices relative to the presence or absence data of objects within the three dimensional space or object storage device. Images, RFID signals, and other sensing signals captured or received by the sensing subsystems are processed by the data processing system for determining an inventory condition of the system or each storage drawer. The term inventory condition as used throughout this disclosure means information relating to an existence/presence or non-existence/absence condition of objects in the storage system.
  • FIG. 3A shows a detailed view of one drawer 330 of the tool control storage device 106 in an open position. In some embodiments, each storage drawer 330 includes a foam base 180 having a plurality of storage locations, such as tool cutouts 181, for storing tools. Each cutout is specifically contoured and shaped to fittingly receive a tool with a corresponding shape. Tools may be secured in each storage location by using hooks, Velcro, latches, pressure from the foam, etc.
  • In general, each storage drawer 330 includes multiple storage locations for storing various types of tools. As used throughout this disclosure, a storage location is a location in a storage system for storing or securing objects. In one embodiment, each tool has a specific pre-designated storage location in the tool storage system. Further, one or more tools in the drawer 330 may have an RFID tag mounted or attached thereon.
  • FIG. 3B shows a perspective view of an imaging subsystem in the tool control storage device 106 according to an embodiment. As illustrated in FIG. 3B, the tool control storage device 106 includes an imaging compartment 315 which houses an image sensing subsystem comprising three cameras 310 and a light directing device, such as a mirror 312 having a reflection surface disposed at about 45 degrees downwardly relative to a vertical surface, for directing light reflected from the drawers 330 to the cameras 310. The directed light, after arriving at the cameras 310, allows the cameras 310 to form images of the drawers 330. The shaded area 340 below the mirror 312 represents a viewing field of the imaging sensing subsystem of the tool control storage device 106. As shown at 340, the imaging subsystem scans a portion of an open drawer 336 that passes through the field of view of the imaging sensing subsystem, for example as the drawer 336 is opened and/or closed. The imaging subsystem thereby captures an image of at least that portion of the drawer 336 that was opened. Processing of the captured image is used to determine the inventory conditions of tools and/or storage locations in the portion of the drawer 336 that was opened.
  • In general, the image sensing subsystem captures an image of a particular drawer 330 and performs an inventory of the drawer in response to detecting movement of the particular drawer. For example, the image sensing subsystem may perform an inventory of the drawer in response to detecting that the drawer is closing or has become completely closed. In other examples, the image sensing subsystem may image the drawer both as it is opening and as it closes.
  • The RF sensing subsystem is generally configured to perform inventory checks of drawers or shelves having RF-based tags associated therewith. The RF-based tags may be RFID tags that are attached to or embedded within the tools. In general, the RF-based tag encodes an identifier unique to the tool, such that both the tool type (e.g., screwdriver, torque wrench, or the like) and the unique tool (e.g., a particular torque wrench, from among a plurality of torque wrenches of the same model and type) can be identified from reading the RF-based tag. In particular, the information encoded in the RF-based tag is generally unique to the tool such that it can be used to distinguish between two tools that are of the same type, same model, same age, same physical appearance, etc.
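  • As a minimal sketch of such an identifier (the "<type>-<model>-<serial>" layout below is an illustrative assumption; the disclosure does not specify an encoding), two physically identical tools remain distinguishable by serial number:

    # Hypothetical tag payload layout: "<tool_type>-<model>-<serial>".
    from collections import namedtuple

    ToolIdentity = namedtuple("ToolIdentity", ["tool_type", "model", "serial"])

    def decode_tag(tag_payload):
        tool_type, model, serial = tag_payload.split("-")
        return ToolIdentity(tool_type, model, serial)

    # Two torque wrenches of the same model differ only by serial number.
    a = decode_tag("TORQUE-TW250-00017")
    b = decode_tag("TORQUE-TW250-00018")
    assert a.tool_type == b.tool_type and a.model == b.model and a != b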
  • The RF sensing system includes antennas mounted in or around the tool control storage device 106. In general, the antennas may be mounted inside the tool control storage device 106 and be configured to only detect the presence of RF-based tags that are located within the tool control storage device 106 (or other defined three dimensional space). In some examples, each antenna may be mounted so as to only detect the presence of RF-based tags that are located within a particular drawer or compartment of the tool control storage device 106, and different antennas may be associated with and mounted in different drawers or compartments. In further embodiments, some antennas may further be configured to detect the presence of RF-based tags in the vicinity of the tool control storage device 106 even if the tags are not located within the tool control storage device 106.
  • Each antenna is coupled to an RF transceiver that is operative to cause the antenna to emit an RF sensing signal used to excite the RF-based tags located within the vicinity of the antenna, and is operative to sense RF identification signals returned by the RF-based tags in response to the RF sensing signal. One or more RF processors control the operation of the RF transceivers and process the RF identification signals received through the antennas and transceivers.
  • In some embodiments, the RF sensing subsystem performs an RF-based scan of the tool control storage device 106 when a drawer or compartment storing tools having RF identification tags is completely closed. In particular, the RF-based scan can be performed in response to detecting that the drawer has been completely closed, or performed at any time when the drawer is completely closed. In some examples, the RF-based scan can also be triggered by a user logging into or logging out of the tool control storage device 106. In general, an RF-based scan can be performed in response to similar triggers causing a camera-based inventory of the tool control storage device 106 to be performed.
  • As part of performing an RF-based scan of the tool control storage device 106, the RF processor typically needs to perform multiple sequential scans in order to ensure that all RF-based tags are detected. Specifically, the RF processor generally does not know how many RF tags it needs to detect, since one or more tags may be missing (e.g., if a tool has been checked out). Further, the RF processor cannot generally ensure that all RF tags in its vicinity have been detected in response to a single scan operation (corresponding to the emission of one RF sensing signal, and the processing of any RF identification responses received in response to the one RF sensing signal). As a result, the RF processor will generally perform ten, twenty, or more sequential RF-based scans any time an inventory of the tool control storage device 106 is to be performed. Because multiple RF-based scans need to be performed, the RF scanning operation may require 10 or more seconds to be performed, resulting in significant inconvenience to users of the tool control storage device 106.
  • As noted above, imaging-based inventory scans of the tool control storage device 106 have the disadvantage that they cannot distinguish between physically identical tools. Further, RF-based scans of the tool control storage device 106 may suffer from significant delay, and cannot determine if an RF tag alone (instead of an RF tag attached to its associated tool) has been returned to the drawer or storage compartment. Both scanning methodologies, when used alone, are thus susceptible to fraud (e.g., by using a tool cut-out, or using an RFID tag removed from a tool) and inconvenience. Further, each technology may not be suitable for inventorying all tools in a particular tool control storage device 106; for example, some tools may be too small to have an RF-based tag mounted thereon, or attaching such a tag to the tool may cause the tool to be unwieldy. The inventory of such tools may thus be better suited to visual-scanning methodologies even in tool control storage devices 106 capable of RF-based sensing.
  • In order to address the deficiencies of the scanning methodologies when used individually, the tool control storage device 106 advantageously uses multiple scanning methodologies in combination in some embodiments. For example, the tool control storage device 106 may firstly perform a first inventory scan based on an image-based scan to obtain a quick (e.g., near instantaneous) determination of whether any tools are missing from the tool control storage device 106 based on the image-based scan alone. The result of the first inventory scan is additionally used to determine how many RF-based tags are expected to be in the tool control storage device 106. For example, in a tool control storage device 106 that usually stores ‘m’ tools having associated RF tags, the first inventory scan is used to determine that ‘n’ tools having associated RF tags are missing from the tool control storage device 106. The first inventory scan is then used to determine that the ‘m-n’ RF-based tags should be searched for using the second inventory scan (e.g., an RF-based scan).
  • In turn, the second inventory scan (e.g., an RF-based scan) is performed a single time, and only needs to be repeated if less than ‘m-n’ RF-based tags are detected by the first iteration of the second inventory scan (e.g., the RF-based scan). Thus, the second inventory scan can be completed very efficiently—notably in situations in which only one or a few secondary scans are needed to detect all of the ‘m-n’ RF-based tags that are expected to be detected in the tool control storage device 106.
  • Finally, an inventory cross-check is performed between the results of the first and second inventory scans to ensure that the results of the two scans are consistent. Specifically, the inventory cross-check is performed to ensure that both inventory scans have identified the same tools as being present in the tool control storage device 106 and have identified the same tools as being absent from the tool control storage device 106. User alerts are issued if the results of the two inventory scans are not consistent with each other.
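  • A minimal Python sketch of this two-stage flow appears below; the function names and the set-based bookkeeping are assumptions standing in for the camera and RFID subsystems described above, not a definitive implementation.

    # Illustrative combined image/RF inventory: the image scan fixes the set of
    # expected RF tags ('m - n'), the RF scan is repeated only until those tags
    # have all responded (or a pass limit is hit), and a cross-check compares
    # the two results for consistency.
    def run_inventory(image_scan, rf_scan_once, all_rfid_tools, max_passes=20):
        present_by_image = image_scan()                    # tool IDs the camera sees
        expected_tags = all_rfid_tools & present_by_image  # the 'm - n' tags to find
        present_by_rf = set()
        passes = 0
        while not expected_tags <= present_by_rf and passes < max_passes:
            present_by_rf |= rf_scan_once()
            passes += 1
        consistent = (present_by_rf & all_rfid_tools) == expected_tags
        return present_by_image, present_by_rf, consistent

    # Example with stand-in scan functions:
    run_inventory(
        image_scan=lambda: {"wrench-01", "wrench-02", "driver-07"},
        rf_scan_once=lambda: {"wrench-01", "wrench-02"},
        all_rfid_tools={"wrench-01", "wrench-02", "wrench-03"},
    )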
  • While the above example has focused on an embodiment using camera-based and RF-based sensing technologies, the automated asset management system can use other combinations of multiple-sensing technologies. The sensing technologies and sensing devices used in the tool control storage device 106 can include one or more of:
      • Optical identification sensors, such as: sensors for detecting one dimensional barcodes with line scanner/camera; sensors for detecting two dimensional barcodes with camera/other imaging sensor; machine vision identification sensors with camera/other imaging sensor (using various sensing approaches, including UV, infrared (IR), visible light, or the like); and laser scanning;
      • RF identification sensors, such as: RFID tags affixed to/embedded in tools (active RFID tags and/or passive RFID tags); other RF technologies used in a similar capacity, such as Ruby, Zigbee, WiFi, NFC, Bluetooth, Bluetooth low energy (BLE), or the like;
      • Direct electronic connection to tool, such as: tools that have attached/embedded connectors that plug into identification system (as opposed to wireless);
      • Weight sensor(s), such as: scales to detect weight of objects; multiple scales to detect weight distribution;
      • Contact switches/sensors, such as: single go/no-go sensors; array of sensors to detect shape/outline;
      • Sonic emitter/detector pair; and/or
      • Magnetic induction/sensing, such as ferrous tool locator products.
  • A detailed example of one illustrative embodiment is provided below. In the illustrative embodiment, a physically defined, secure three dimensional object storage device is provided. The storage device is the container from which tools and/or objects are issued and/or returned. The physically defined, secure three dimensional object storage device is equipped with a processor and software operative to cause the device to communicate electronically directly or through a network with sensing devices and to obtain data from sensing devices indicating the presence or absence data of objects within the three dimensional object storage device. In the example, the sensing devices used within the three dimensional object storage device include machine vision identification devices such as cameras or RFID antennas and decoders.
  • The physically defined, secure three dimensional object storage device is equipped with an electronically controlled locking mechanism, along with an access control device including a processor and software means to electronically identify a user requesting access to the secure area or object storage device in some embodiments. The processor and software identify predetermined authorized access levels to the system, and allow or deny physical access by the user to the three dimensional space or object storage devices based on those predetermined authorized levels of access. The access control device used to electronically identify the user requesting access uses RFID proximity sensors with cards in some embodiments.
  • In some embodiments, the physically defined, secure object storage device is equipped with drawers. At least one RFID antenna is attached inside the storage device and is configured for scanning for RFID tags within the storage device. In embodiments with multiple RFID antennas, different RFID antennas may be distributed throughout the storage device.
  • In operation, in some embodiments, a user scans or approaches an access card to the access control device of the storage device. The processor of the access control device determines an access level of the user based on the access card. If the user is determined to be authorized for access to the storage device, the authorized user gains access to the object storage device. In turn, the sensing subsystems and data processing system of the storage device are activated. Light emitting diodes (LEDs) used for providing light to the system are activated, and cameras are activated. In turn, the latch of the storage system is unlocked, and the user opens one or more drawers and removes or returns one or more objects.
  • Note that if the user opens an imaging-only drawer (i.e., a drawer whose inventory condition is determined using imaging only, and not using RFID), then the RFID scanning subsystem need not be activated and the system can use only imaging data. Specifically, the imaging subsystem is used to optionally image the drawer as it opens and to image the drawer as it is closed (or once it is closed), and object presence and absence is determined using only the captured images.
  • However, if the user opens a drawer for which RFID scanning is used to determine inventory conditions, a camera-based scan of the drawer is optionally performed prior to or as the drawer opens. Additionally, the RFID sensing subsystem is activated and an RFID scan may be completed prior to opening the drawer to identify all RFID tags present in the storage system (or all RFID tags present in the drawer being opened). Specifically, an RFID scan is optionally performed prior to opening of the drawer. Additionally, a camera-based scan of the drawer is performed as the drawer closes. In response to the drawer being fully closed, or in response to the user logging out of the storage system an RFID scan of the drawer or box is performed. The imaging subsystem thus determines and reports object presence and absence in the drawer, and the RFID subsystem scan confirms presence and absence of the specific objects in the drawer or box using the RFID tag data. Thus, imaging data and RFID tag data are combined to report presence and absence of all scanned tools, plus presence or absence of serialized items through use of RFID data. The inventory scan results are depicted on a display. As the user logs out, object status is transmitted via network to a primary database and/or to an administrative application. LED lights are turned off, the lock is engaged, and cameras are set in idle state.
  • Additionally, the storage system can perform other actions. For example, the system can activate or initiate an RFID scan on the contents of the object storage device on a scheduled or timed basis between user accesses and thereby confirm that the contents of the storage device have not changed since the last user access.
  • For example, an automated asset management system, such as a toolbox, may use both camera-based and radio-frequency (RF) based sensing technologies to sense the presence and/or other attributes of a particular tool (or of multiple tools). The camera-based sensing may provide an instantaneous (or near-instantaneous) indication of whether the particular tool is present in or absent from the system. The RF-based sensing may enable the system to differentiate between multiple tools that are identical to the camera-based sensing module (e.g., similar torque wrenches), for example by distinguishing between the tools' serial numbers (or other unique identifiers) or other unique tool identifiers encoded in a RF-based tag. Further, the automated asset management system may be configured to more efficiently perform RF-based sensing by leveraging the combined use of the camera-based and RF-based sensing modalities as described in more detail below.
  • As noted above, the scan data can be used to identify whether a specific tool (from among multiple similar tools) has been checked out or checked back in to the tool control storage device 106. The scan data can thus be used to determine how many times a particular tool has been checked out, and/or for how long a duration the particular tool has been checked out. The tool control storage device 106 can thus determine whether the particular tool should be scheduled for a re-calibration or other upkeep, for example. In one example, the tool control storage device 106 can thus individually track the usage of different torque wrenches and ensure that each torque wrench is recalibrated after a certain number of uses.
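  • For illustration, this kind of usage-based recalibration tracking might be sketched as follows (the counter structure, function name, and threshold are assumptions; the disclosure does not fix a recalibration interval):

    # Hypothetical usage tracking: count check-outs per serialized tool and flag
    # the tool for recalibration after an assumed number of uses.
    from collections import Counter

    RECALIBRATION_INTERVAL = 50        # assumed threshold, for illustration only

    checkout_counts = Counter()

    def record_checkout(tool_serial):
        """Record one check-out; return True if the tool is now due for recalibration."""
        checkout_counts[tool_serial] += 1
        return checkout_counts[tool_serial] % RECALIBRATION_INTERVAL == 0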
  • The inventory performed by the tool control storage device 106 using multiple sensing technologies can be used to identify the individual user who received and/or returned the object/tool, identify the object/tool which is being issued or returned, place a time stamp on each transaction within the system, and store the item and user data in a database.
  • The processor and memory storing executable software program instructions of the tool control storage device 106 can exchange data with the administrative software application (e.g., computing device 102). Data may include tool issue and return data, tool statuses, user access, work location, and other data related to tool transactions and usage.
  • As described above, sensing technologies allow the usage of the tools to be tracked. When the tracked usage of a tool indicates that the tool needs attention (e.g., repair, calibration, or replacement), the tool may be removed from the worksite for repair. However, as noted above, managing the repair processes manually is prone to delays and inefficiencies because of logistical challenges. To efficiently remove the tool from the worksite and return the tool to the worksite, the repair process is automatically managed.
  • FIG. 4A shows a schematic diagram illustrating an exemplary hierarchy structure 400A of automated tool control system 100. The hierarchy structure of automated tool control system 100 includes locations such as Root, Cal Lab, Production/Maintenance, Test, Building A, Building B, Line 1, Line 2, Station 1, Station 2, and Tool Crib, as well as multiple automated tool control devices ATC1-9 and AC1-9 (e.g., tool control storage devices 106). Each of the locations within the hierarchy is assigned attributes. For example, attributes may include devices in the location, employees/users assigned to the devices in the location, objects/tools stored in the devices in the location, and object statuses assigned to the objects/tools in the location. A location may be a bin or a storage device that accepts and temporarily stores objects with an assigned status, automatically distributes alerts to employees who can fix the issues indicated in the status, and then ensures the repaired or replacement object is returned to the appropriate storage location.
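  • A minimal data-model sketch of such a hierarchy is given below; the dataclass layout and attribute names are assumptions chosen only to illustrate locations carrying devices, users, and sub-locations as attributes.

    # Hypothetical location hierarchy: each location carries attributes (devices,
    # assigned users) and may contain sub-locations.
    from dataclasses import dataclass, field

    @dataclass
    class Location:
        name: str
        devices: list = field(default_factory=list)    # e.g. tool control storage devices
        users: list = field(default_factory=list)      # employees assigned to the devices
        children: list = field(default_factory=list)   # sub-locations in the hierarchy

        def add_child(self, child):
            self.children.append(child)
            return child

    root = Location("Root")
    building_a = root.add_child(Location("Building A"))
    line_2 = building_a.add_child(Location("Line 2", devices=["ATC3"]))
    cal_lab = root.add_child(Location("Cal Lab"))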
  • For example, objects may include any item stored and managed in an automated storage device or tool crib that can be issued to manufacturing, service, and maintenance and repair environments. Objects may include tools, measuring tools, torque wrenches, and pneumatic and electric power tools. In some embodiments, objects may include jigs, fixtures, clamps, and other work and tool holding devices. In some other embodiments, objects include machining tools such as drill bits, reamers, boring tools, end and side mills, and abrasive devices such as grinding wheels, whetstones, files, sanding devices, and the like. Further, objects may be electric and electronic devices, such as meters, oscilloscopes, laptops, tablets, computers, hand held scanners, welding equipment, and the like.
  • Statuses of the objects/tools may be assigned by users via a user interface 305 of a tool control storage device 106. For example, statuses may include broken, needs inspection, out of calibration, out for repair, replacement requested, and needs sharpening (e.g., for drills and other cutting tools). The statuses are not limited to the foregoing list; other statuses may be assigned to describe the condition of the tools in the tool control storage devices 106.
  • For instance, a user A (not illustrated) logs into a tool control storage device ATC3 in a storage location Building A/Line 2. After the user A logs into the tool control storage device ATC3, the user A checks out a torque wrench from the tool control storage device ATC3. In some embodiments, direct camera imaging of the object's characteristics, imaging of colored tags on the object, RFID tags, bar codes, and bar code readers may be used to identify that the torque wrench is checked out.
  • In an example, it is determined, during use, that the torque wrench was overtightened and is now out of calibration. Rather than returning the torque wrench to the tool control storage device ATC3 in the storage location Building A/Line 2, user A places the torque wrench in a container marked with an identifying mark (not illustrated). For example, a container may include a tube or a sleeve that accommodates the item to be repaired or replaced. The identifying mark may include visual marks, such as barcodes or QR codes.
  • When placing the torque wrench in the container with the identifying mark, user A may use the user interface 305 of the tool control storage device 106 (e.g., ATC 3) to associate the torque wrench with the identifying mark on the container. The torque wrench in the container is deposited in a locked container corresponding to an Out of Cal location.
  • Once the torque wrench is deposited in the locked container corresponding to the Out of Cal location, the torque wrench is automatically assigned an “Out of Calibration” status. In some embodiments, the tool control storage device 106 which monitors the usage of the stored tool may assign a status to the stored tool based on the historical data of the stored tool. For example, once the torque wrench is deposited in the locked container or when the status of the torque wrench indicates “Out of Calibration”, an alert is automatically distributed to a runner (e.g., a human or a robot). For example, the alert to the runner may be automatically distributed in response to the locked container sensing the torque wrench being deposited. In some embodiments, the runner may be automatically notified when the torque wrench is assigned the status of “Out of Calibration”.
  • In some embodiments, the Out of Cal location is configured with instructions to automatically distribute an alert signal to users associated with the torque wrench, system administrators, and other related locations in the hierarchy. For example, the related locations may include the Cal Lab location. For example, user A and Cal Lab technicians may receive the alert signal from the Out of Cal location. In some embodiments, alert signals may contain information regarding the tool identification, the currently assigned status, the Out of Cal location, and process instructions associated with the assigned status. The alert signals may include additional information that may be useful for repair/replace processes.
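  • The automatic status assignment and alert distribution described above might be sketched as follows; the location-to-status mapping, subscriber lists, and notifier interface are illustrative assumptions rather than the disclosed implementation.

    # Hypothetical deposit handler: a status location implies a status, and the
    # location's subscribers (runner, technicians, last user) receive an alert.
    STATUS_BY_LOCATION = {
        "Out of Cal": "Out of Calibration",
        "Broken Bin": "Broken",
    }

    SUBSCRIBERS_BY_LOCATION = {
        "Out of Cal": ["runner", "cal-lab-technicians", "last-user"],
        "Broken Bin": ["runner", "repair-lab-technicians", "last-user"],
    }

    def on_deposit(tool_id, location, notify):
        """Assign the status tied to the location and alert that location's subscribers."""
        status = STATUS_BY_LOCATION.get(location, "Needs Inspection")
        for recipient in SUBSCRIBERS_BY_LOCATION.get(location, []):
            notify(recipient, f"{tool_id} deposited at {location}: status '{status}'")
        return status

    # Example with a print-based notifier:
    on_deposit("TORQUE-TW250-00017", "Out of Cal", lambda r, msg: print(r, "<-", msg))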
  • In response to receiving the alert, the runner is dispatched to pick up the torque wrench from the storage location Building A/Line 2. When the runner picks up the torque wrench from the storage location of Building A/Line 2, the runner or the user A scans the identifying mark on the container and confirms the content of the container is what was input by the user A and recorded in the system.
  • The runner may issue or check the uncalibrated torque wrench out of the Out of Cal location and check the torque wrench into the Cal Lab location. Once the uncalibrated torque wrench is transferred to the Cal Lab location, the uncalibrated torque wrench is then processed by Cal Lab technicians using standard procedures established by the Cal Lab team and tracked in sublocations inside the Cal Lab location. When the uncalibrated torque wrench is recalibrated, the recalibrated torque wrench is then issued out of the Cal Lab location and returned to the tool control storage device ATC3 in the storage location Building A/Line 2.
  • Each transaction is recorded in a log that captures tool identification data, time stamps, technician/user ID, and the like for each activity. In some embodiments, the transactional data are stored in the database 104. For example, the transactional data may be viewed from the administrative client software application on the computing device 102 or from the user interface 305 of the tool control storage device 106.
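  • One possible shape for such a log record is sketched below (a dictionary appended to an in-memory list stands in for the database 104; the field names are assumptions):

    # Hypothetical transaction record capturing tool ID, user ID, action, location,
    # and a UTC time stamp, appended to a stand-in "database" (a list).
    from datetime import datetime, timezone

    def log_transaction(db, tool_id, user_id, action, location):
        db.append({
            "tool_id": tool_id,
            "user_id": user_id,
            "action": action,              # e.g. "check-out", "check-in", "transfer"
            "location": location,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

    transactions = []
    log_transaction(transactions, "TORQUE-TW250-00017", "userA", "check-out", "Building A/Line 2")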
  • These structures are important, for example, from the perspective of aerospace maintenance and repair organizations (MROs). For instance, managing drill bits is a monumental task. In most cases, technicians are issued bits, but the bits are lost to the system. There is a huge amount of waste, and in most cases the MRO businesses have no idea of the extent of the avoidable costs. Thus, these structures would reduce or minimize those avoidable costs.
  • FIG. 4B shows a schematic diagram illustrating an exemplary hierarchy structure 400B of automated tool control system 100. The hierarchy structure 400B includes locations Root, Production/Maintenance, Building A, Line 2, and Aircraft #1.
  • For example, when an object A (not illustrated) in a tool control storage device in the Line 2 location is determined to be out of calibration, the object A is placed in the Out of Cal location as described above. In response to the object A being placed in the Out of Cal location, the object A is automatically assigned a status “Out of Calibration”. At the same time, an alert may be automatically distributed to a runner to facilitate a pick-up of the object A from the Out of Cal location. When the object A is transferred from the Out of Cal location to the Cal Lab location, the object A moves through the sublocations Incoming, In Process, and Outgoing within the Cal Lab location. Once the object A is recalibrated at the Cal Lab location, the object A is placed in a Return Bin location. When the object A is placed in the Return Bin location, an alert may be automatically distributed to a runner who, in response to the alert, will pick up the object A from the Return Bin location and return the object A to the tool control storage device in the Line 2 location.
  • For example, when an object B (not illustrated) in a tool control storage device in the Aircraft #1 location is determined to be broken, the object B is placed in the Broken Bin location as described above. In response to the object B being placed in the Broken Bin, the object B is automatically assigned a status “broken”. At the same time, an alert may be automatically distributed to a runner to facilitate a pick-up of the object B from the Broken Bin location. When the object B is transferred from the Broken Bin location to the Repair Lab location, the object B moves through the sublocations Incoming, In Process, and Outgoing within the Repair Lab location. Once the object B is repaired at the Repair Lab location, the object B is placed in a Return Bin location where the object B placed in the Return Bin location is returned to the tool control storage device in the Aircraft #1 location.
  • In some embodiments, alert signals are automatically distributed when the objects (e.g., object A and object B) move from one location to another location.
  • Although not illustrated, the hierarchy structures 400A and 400B may include a “Status Remediation” location where the status assigned to an object can be cleared, while the object remains in the “Status Remediation” location, by a user with appropriate access rights to that location. Registering and recording items deposited in the “Status Remediation” location could be a manual data entry by the technician or tool crib attendant. The technician or tool crib attendant may enter, via the user interface 305, data identifying the objects issued to the technician currently in possession of the item; the status may be assigned; and the electronic inventory transaction from the tool user to the Status Remediation location may be viewed. In some embodiments, the item transactions and transfers may be automatically registered and recorded in the database 104.
  • FIG. 5 conceptually illustrates an exemplary electronic system 500 with which some implementations of the subject technology can be implemented. In one or more implementations, the computing device 102 and the tool control storage devices 106 may be, or may include all or part of, the electronic system components that are discussed below with respect to the electronic system 500. The electronic system 500 can be a computer, phone, personal digital assistant (PDA), or any other sort of electronic device. Such an electronic system includes various types of computer readable media and interfaces for various other types of computer readable media. The electronic system 500 includes a bus 508, processing unit(s) 512, a system memory 504, a read-only memory (ROM) 510, a permanent storage device 502, an input device interface 514, an output device interface 506, and a network interface 516.
  • The bus 508 collectively represents all system, peripheral, and chipset buses that communicatively connect the numerous internal devices of the electronic system 500. For instance, the bus 508 communicatively connects the processing unit(s) 512 with the ROM 510, system memory 504, and permanent storage device 502.
  • From these various memory units, the processing unit(s) 512 retrieves instructions to execute and data to process in order to execute the processes of the subject disclosure. The processing unit(s) can be a single processor or a multi-core processor in different implementations.
  • The ROM 510 stores static data and instructions that are needed by the processing unit(s) 512 and other modules of the electronic system. The permanent storage device 502, on the other hand, is a read-and-write memory device. This device is a non-volatile memory unit that stores instructions and data even when the electronic system 500 is off. Some implementations of the subject disclosure use a mass-storage device (for example, a magnetic or optical disk, or flash memory) as the permanent storage device 502.
  • Other implementations use a removable storage device (for example, a floppy disk, flash drive) as the permanent storage device 502. Like the permanent storage device 502, the system memory 504 is a read-and-write memory device. However, unlike the storage device 502, the system memory 504 is a volatile read-and-write memory, such as a random access memory. The system memory 504 stores some of the instructions and data that the processor needs at runtime. In some implementations, the processes of the subject disclosure are stored in the system memory 504, the permanent storage device 502, or the ROM 510. For example, the various memory units include instructions for displaying graphical elements and identifiers associated with respective applications, receiving a predetermined user input to display visual representations of shortcuts associated with respective applications, and displaying the visual representations of shortcuts. From these various memory units, the processing unit(s) 512 retrieves instructions to execute and data to process in order to execute the processes of some implementations.
  • The bus 508 also connects to the input and output device interfaces 514 and 506. The input device interface 514 enables the user to communicate information and select commands to the electronic system. Input devices used with the input device interface 514 include, for example, alphanumeric keyboards and pointing devices (also called “cursor control devices”). The output device interface 506 enables, for example, the display of images generated by the electronic system 500. Output devices used with the output device interface 506 include, for example, printers and display devices, for example, cathode ray tubes (CRT) or liquid crystal displays (LCD). Some implementations include devices, for example, a touchscreen that functions as both input and output devices.
  • Finally, as shown in FIG. 5, the bus 508 also couples the electronic system 500 to a network (not shown) through a network interface. In this manner, the computer can be a part of a network of computers (for example, a LAN, a WAN, or an Intranet, or a network of networks, for example, the Internet). Any or all components of the electronic system 500 can be used in conjunction with the subject disclosure.
  • Many of the above-described features and applications are implemented as software processes that are specified as a set of instructions recorded on a computer readable storage medium (also referred to as computer readable medium). When these instructions are executed by one or more processing unit(s) (e.g., one or more processors, cores of processors, or other processing units), they cause the processing unit(s) to perform the actions indicated in the instructions. Examples of computer readable media include, but are not limited to, magnetic media, optical media, electronic media, etc. The computer readable media does not include carrier waves and electronic signals passing wirelessly or over wired connections.
  • Unless otherwise stated, all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification are approximate, not exact. They are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain.
  • Except as stated immediately above, nothing that has been stated or illustrated is intended or should be interpreted to cause a dedication of any component, step, feature, object, benefit, advantage, or equivalent to the public.
  • In this specification, the term “software” is meant to include, for example, firmware residing in read-only memory or other form of electronic storage, or applications that may be stored in magnetic storage, optical, solid state, etc., which can be read into memory for processing by a processor. Also, in some implementations, multiple software aspects of the subject disclosure can be implemented as sub-parts of a larger program while remaining distinct software aspects of the subject disclosure. In some implementations, multiple software aspects can also be implemented as separate programs. Finally, any combination of separate programs that together implement a software aspect described here is within the scope of the subject disclosure. In some implementations, the software programs, when installed to operate on one or more electronic systems, define one or more specific machine implementations that execute and perform the operations of the software programs.
  • A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a standalone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • The functions described above can be implemented in digital electronic circuitry, in computer software, firmware, or hardware. The techniques can be implemented using one or more computer program products. Programmable processors and computers can be included in or packaged as mobile devices. The processes and logic flows can be performed by one or more programmable processors and by one or more programmable logic circuits. General and special purpose computing devices and storage devices can be interconnected through communication networks.
  • Some implementations include electronic components, for example, microprocessors, storage, and memory that store computer program instructions in a machine-readable or computer-readable medium (alternatively referred to as computer-readable storage media, machine-readable media, or machine-readable storage media). Some examples of such computer-readable media include RAM, ROM, read-only compact discs (CD-ROM), recordable compact discs (CD-R), rewritable compact discs (CD-RW), read-only digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), a variety of recordable/rewritable DVDs (e.g., DVD-RAM, DVD-RW, DVD+RW, etc.), flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic or solid state hard drives, read-only and recordable Blu-Ray® discs, ultra-density optical discs, any other optical or magnetic media, and floppy disks. The computer-readable media can store a computer program that is executable by at least one processing unit and includes sets of instructions for performing various operations. Examples of computer programs or computer code include machine code, for example, produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
  • While the above discussion primarily refers to microprocessor or multi-core processors that execute software, some implementations are performed by one or more integrated circuits, for example, application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs). In some implementations, such integrated circuits execute instructions that are stored on the circuit itself.
  • As used in this disclosure, the terms “computer”, “server”, “processor”, and “memory” all refer to electronic or other technological devices. These terms exclude people or groups of people. For the purposes of the specification, the terms display or displaying means displaying on an electronic device. As used in this disclosure, the terms “computer readable medium” and “computer readable media” are entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.
  • To provide for interaction with a user, the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT or LCD monitor, for displaying information to the user, and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
  • Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (LAN) and a wide area network (WAN), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
  • The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In some embodiments, a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
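  By way of illustration only, the client-server exchange described above, in which a server transmits an HTML page to a client device and receives data generated at the client, can be sketched minimally as follows. The handler class, path, and port shown here are hypothetical and are not drawn from the disclosure.

```python
# Minimal, self-contained sketch (assumed names throughout): a server that
# returns an HTML page to a client and receives data posted back from it.
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatusPageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Server transmits data (e.g., an HTML page) to the client device.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Object status: checked out</body></html>")

    def do_POST(self):
        # Data generated at the client device (e.g., a result of user
        # interaction) is received from the client device at the server.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print("Received from client:", body.decode("utf-8", errors="replace"))
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), StatusPageHandler).serve_forever()
```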
  • It is understood that any specific order or hierarchy of steps in the processes disclosed is an illustration of example approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged, or that not all illustrated steps be performed. Some of the steps may be performed simultaneously. For example, in certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
  • The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more”. Unless specifically stated otherwise, the term “some” refers to one or more. Pronouns in the masculine (e.g., his) include the feminine and neuter gender (e.g., her and its) and vice versa. Headings and subheadings, if any, are used for convenience only and do not limit the subject disclosure.
  • As used herein, the phrase “at least one of” preceding a series of items, with the term “and” or “or” to separate any of the items, modifies the list as a whole, rather than each member of the list (i.e., each item). The phrase “at least one of” does not require selection of at least one of each item listed; rather, the phrase allows a meaning that includes at least one of any one of the items, and/or at least one of any combination of the items, and/or at least one of each of the items. By way of example, the phrases “at least one of A, B, and C” or “at least one of A, B, or C” each refer to only A, only B, or only C; any combination of A, B, and C; and/or at least one of each of A, B, and C.
  • Phrases such as an aspect, the aspect, another aspect, some aspects, one or more aspects, an implementation, the implementation, another implementation, some implementations, one or more implementations, an embodiment, the embodiment, another embodiment, some embodiments, one or more embodiments, a configuration, the configuration, another configuration, some configurations, one or more configurations, the subject technology, the disclosure, the present disclosure, other variations thereof, and the like are for convenience and do not imply that a disclosure relating to such phrase(s) is essential to the subject technology or that such disclosure applies to all configurations of the subject technology. A disclosure relating to such phrase(s) may apply to all configurations, or one or more configurations. A disclosure relating to such phrase(s) may provide one or more examples. A phrase such as an aspect or some aspects may refer to one or more aspects and vice versa, and this applies similarly to the other foregoing phrases.
  • To the extent that the systems discussed herein collect usage data associated with users, or may make use of the usage data, the users are provided with opportunities to control whether programs or features collect usage data (e.g., a user's preferences), and to control the user interface (UI) associated with applications based on the collected usage data. The users may also be provided with options to turn on or turn off certain features or functions provided by the systems. In some aspects, the users may elect to disable features and functions (e.g., control the UI associated with applications based on the collected usage data) offered by the systems discussed herein. In addition, users may stipulate that certain data be treated in one or more ways before it is stored or used, so that personally identifiable information is removed. For example, a user's identity may be treated so that no personally identifiable information can be determined for the user, or a user's geographic location may be generalized where location information is obtained (such as to a city, zip code, or state level), so that a particular location of a user cannot be determined. Thus, the user has control over whether and how user information is collected, stored, and used by the disclosed systems.
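  A minimal sketch of the kind of treatment described above, in which personally identifiable information is removed and location information is generalized before usage data is stored, might look like the following. The field names and the hashing choice are assumptions for illustration, not part of the disclosed systems.

```python
# Illustrative sketch only: remove PII and generalize location in a usage
# record before it is stored. All field names here are hypothetical.
import hashlib

def anonymize_usage_record(record: dict) -> dict:
    """Return a copy of a usage record with PII removed or generalized."""
    anonymized = dict(record)
    # Replace the user identity with a one-way hash so that no personally
    # identifiable information can be determined for the user.
    if "user_id" in anonymized:
        anonymized["user_id"] = hashlib.sha256(
            str(anonymized["user_id"]).encode("utf-8")
        ).hexdigest()[:16]
    # Drop fields that directly identify the user.
    for field in ("name", "email", "phone"):
        anonymized.pop(field, None)
    # Generalize a precise location to a coarser level (e.g., zip code).
    if isinstance(anonymized.get("location"), dict):
        anonymized["location"] = {"zip": anonymized["location"].get("zip")}
    return anonymized

# Example: anonymize_usage_record({"user_id": 42, "email": "tech@example.com",
#                                  "location": {"lat": 43.0, "lon": -87.9, "zip": "53001"}})
```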
  • All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and intended to be encompassed by the subject technology. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the above description. Furthermore, to the extent that the term “include”, “have”, or the like is used in the disclosure, such term is intended to be inclusive in a manner similar to the term “comprise” as “comprise” is interpreted.
  • It will be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein. Relational terms such as first and second and the like may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “a” or “an” does not, without further constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
  • In the foregoing Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure.
  • While the foregoing has described what are considered to be the best mode and/or other examples, it is understood that various modifications may be made therein and that the subject matter disclosed herein may be implemented in various forms and examples, and that the teachings may be applied in numerous applications, only some of which have been described herein. It is intended that this disclosure cover any and all applications, modifications and variations that fall within the true scope of the present teachings.

Claims (24)

1. An automated inventory control system, comprising:
one or more storage devices containing a plurality of storage locations for storing objects;
a first predefined location for receiving one or more objects, the first predefined location including a sensing system configured to sense when an object is deposited at the first predefined location;
a second predefined location for receiving one or more objects, the second predefined location including a sensing system configured to sense when an object is deposited at the second predefined location;
one or more processors configured to:
when an object is deposited at the first predefined location, automatically assign a first status to the object and cause transmission of an alert indicating the first status of the deposited object,
track a plurality of transactions associated with the deposited object after a user checks the deposited object out of the first predefined location, and
when an object is deposited at the second predefined location, automatically assign a second status to the object and cause transmission of an alert indicating the second status of the deposited object.
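By way of illustration only, and not as a definition or limitation of claim 1, the following minimal sketch shows one way the recited processor behavior might look in software: a location-to-status mapping, an alert caused on deposit, and transaction tracking after check-out. The class name, the example statuses, and the alert callback are hypothetical.

```python
# Hedged sketch, not the claimed implementation: assign a status and cause
# an alert when a sensing system reports a deposit at a predefined
# location, then track transactions after check-out. Names are assumed.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class InventoryController:
    # Maps each predefined location to the status it confers (example
    # statuses only; the disclosure does not fix these values).
    location_status: Dict[str, str]
    send_alert: Callable[[str], None]
    transactions: Dict[str, List[str]] = field(default_factory=dict)

    def on_object_deposited(self, object_id: str, location: str) -> None:
        # Automatically assign the status associated with the location and
        # cause transmission of an alert indicating that status.
        status = self.location_status[location]
        self.send_alert(f"Object {object_id} deposited at {location}: status '{status}'")
        self.transactions.setdefault(object_id, []).append(f"status={status}")

    def on_checked_out(self, object_id: str, location: str, user: str) -> None:
        # Track transactions associated with the object after a user checks
        # it out of the predefined location.
        self.transactions.setdefault(object_id, []).append(
            f"checked out of {location} by {user}"
        )

controller = InventoryController(
    location_status={"first": "broken", "second": "needs calibration"},
    send_alert=print,
)
controller.on_object_deposited("torque-wrench-7", "first")
controller.on_checked_out("torque-wrench-7", "first", "technician-1")
```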
2. The automated inventory control system of claim 1, wherein the first predefined location is configured to permit the user that received the transmitted alert indicating the first status of the deposited object to check the deposited object out of the first predefined location.
3. The automated inventory control system of claim 1,
wherein the one or more processors are configured to automatically assign the first status based on information sensed by the sensing system of the first predefined location corresponding to the deposited object.
4. The automated inventory control system of claim 3,
wherein the sensed information corresponds to visual markings on a container that contains the deposited object.
5. The automated inventory control system of claim 1, further comprising:
an input device configured to receive user input;
wherein the one or more processors are configured to automatically assign the first status based on user input received by the input device.
6. The automated inventory control system of claim 1, further comprising:
a third predefined location for receiving one or more objects;
wherein the first and third predefined locations each correspond to a respective status, such that an object deposited at the first predefined location is automatically assigned a different first status than when deposited at the third predefined location.
7. The automated inventory control system of claim 1, wherein the sensing systems of the first and second predefined locations each comprise at least one of:
one or more cameras configured to obtain images of the plurality of storage locations,
one or more RF sensors configured to detect RFID tags,
one or more electrical connections configured to connect to respective objects,
one or more scales configured to detect weights of respective objects,
an array of contact sensors configured to detect a shape of an object,
one or more ultrasonic sensors, each comprising an emitter configured to emit sound waves and a detector configured to detect sound waves, or
one or more magnetic inductive sensors configured to detect metallic objects.
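Again purely as an illustration and under assumed names, a sensing system composed of "at least one of" several sensor types, as recited in claim 7, might be modeled in software as follows, with a deposit reported when any one sensor detects an object.

```python
# Illustrative sketch only: a sensing system that reports a deposit when
# any one of several sensor types detects an object. All class and
# function names are hypothetical.
from typing import Callable, Dict, Iterable, Optional, Protocol

class Sensor(Protocol):
    def detect_object(self) -> Optional[str]:
        """Return an object identifier if an object is detected, else None."""

class RfidSensor:
    def __init__(self, read_tag: Callable[[], Optional[str]]):
        self._read_tag = read_tag  # callable wrapping the RF hardware

    def detect_object(self) -> Optional[str]:
        return self._read_tag()

class ScaleSensor:
    def __init__(self, read_weight: Callable[[], float], known_weights: Dict[int, str]):
        self._read_weight = read_weight      # callable returning grams
        self._known_weights = known_weights  # rounded weight (g) -> object id

    def detect_object(self) -> Optional[str]:
        return self._known_weights.get(round(self._read_weight()))

def sense_deposit(sensors: Iterable[Sensor]) -> Optional[str]:
    """Return the first object identifier reported by any sensor, if any."""
    for sensor in sensors:
        object_id = sensor.detect_object()
        if object_id is not None:
            return object_id
    return None

# Example: sense_deposit([RfidSensor(lambda: None),
#                         ScaleSensor(lambda: 250.4, {250: "socket-set-3"})])
```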
8. The automated inventory control system of claim 1, further comprising:
an input device configured to receive user input;
wherein the one or more processors are further configured to, based on user input received by the input device, clear the automatically assigned status of the object and assign a third status to the object.
9. A method for an automated inventory control system, comprising the steps of:
storing an object in a storage location of a storage device;
receiving the object in a first predefined location;
based on receipt of the object in the first predefined location, automatically assigning a first status to the object;
transmitting an alert indicating the first status of the deposited object;
checking, by a user, the deposited object out of the first predefined location;
tracking a plurality of transactions associated with the deposited object after the user checks the deposited object out of the first predefined location;
receiving the object in a second predefined location;
based on receipt of the object in the second predefined location, automatically assigning a second status to the object;
transmitting an alert indicating the second status of the deposited object;
checking the deposited object out of the second predefined location;
receiving the object in the storage location of the storage device.
10. The method of claim 9, wherein a user receives the transmitted alert indicating the first status of the deposited object and, in response to receiving the transmitted alert, checks the deposited object out of the first predefined location.
11. The method of claim 9, further comprising the step of:
sensing, at the first predefined location, information corresponding to the deposited object,
wherein the first status is automatically assigned based on the information sensed at the first predefined location.
12. The method of claim 11,
wherein the sensed information at the first predefined location corresponds to visual markings on a container that contains the deposited object.
13. The method of claim 9, further comprising the step of:
receiving user input at an input device,
wherein the first status is automatically assigned based on user input received by the input device.
14. The method of claim 9, further comprising the step of:
storing a second object in a second storage location of the storage device;
receiving the second object in a third predefined location;
receiving, in a fourth predefined location, a replacement object corresponding to the second object;
based on receipt of the replacement object in the fourth predefined location, automatically assigning a status to the replacement object;
transmitting an alert indicating the status of the replacement object;
checking the replacement object out of the fourth predefined location;
receiving the replacement object in the second storage location of the storage device.
15. The method of claim 9,
wherein receipt of the object at the first and second predefined locations is detected by sensing systems that each comprise at least one of:
one or more cameras configured to obtain images of the plurality of storage locations,
one or more RF sensors configured to detect RFID tags,
one or more electrical connections configured to connect to respective objects,
one or more scales configured to detect weights of respective objects,
an array of contact sensors configured to detect a shape of an object,
one or more ultrasonic sensors, each comprising an emitter configured to emit sound waves and a detector configured to detect sound waves, or
one or more magnetic inductive sensors configured to detect metallic objects.
16. The method of claim 9, further comprising the step of:
receiving user input at an input device;
based on the received user input, clearing the automatically assigned status of the object and assigning a third status to the object.
17. A non-transitory computer-readable medium storing executable instructions for carrying out a process comprising the steps of:
determining that an object is present in a storage location of a storage device;
determining that the object is present in a first predefined location;
based on the determination that the object is present in the first predefined location, automatically assigning a first status to the object;
causing the transmission of an alert indicating the first status of the deposited object;
tracking a plurality of transactions associated with the deposited object after the deposited object is checked out of the first predefined location;
determining that the object is present in a second predefined location;
based on the determination that the object is present in the second predefined location, automatically assigning a second status to the object;
causing the transmission of an alert indicating the second status of the deposited object;
determining that the object is present in the storage location of the storage device after the deposited object is checked out of the second predefined location.
18. The medium of claim 17, the process further comprising the step of:
allowing a user who received the transmitted alert indicating the first status of the deposited object to check the deposited object out of the first predefined location.
19. The medium of claim 17, the process further comprising the steps of:
receiving sensed information corresponding to the deposited object at the first predefined location,
wherein the first status is assigned based on the sensed information.
20. The medium of claim 19,
wherein the sensed information at the first predefined location corresponds to visual markings on a container that contains the deposited object.
21. The medium of claim 17,
wherein the first status is automatically assigned to the object based on user input received by an input device.
22. The medium of claim 17, the process further comprising the steps of:
determining that a second object is present in a second storage location of the storage device;
determining that the second object is present in a third predefined location;
determining that a replacement object corresponding to the second object is present in a fourth predefined location;
based on the determination that the replacement object is present in the fourth predefined location, automatically assigning a status to the replacement object;
causing the transmission of an alert indicating the status of the replacement object;
determining that the replacement object is present in the second storage location of the storage device after the replacement object is checked out of the fourth predefined location.
23. The medium of claim 17,
wherein determination that the object is present at the first and second predefined locations is based on information received from sensing systems that each comprise at least one of:
one or more cameras configured to obtain images of the plurality of storage locations,
one or more RF sensors configured to detect RFID tags,
one or more electrical connections configured to connect to respective objects,
one or more scales configured to detect weights of respective objects,
an array of contact sensors configured to detect a shape of an object,
one or more ultrasonic sensors, each comprising an emitter configured to emit sound waves and a detector configured to detect sound waves, or
one or more magnetic inductive sensors configured to detect metallic objects.
24. The medium of claim 17, the process further comprising the step of:
receiving user input from an input device;
based on the received user input, clearing the automatically assigned status of the object and assigning a third status to the object.
US16/915,750 2019-06-28 2020-06-29 Managing objects with assigned status in an automated tool control system Pending US20200410434A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/915,750 US20200410434A1 (en) 2019-06-28 2020-06-29 Managing objects with assigned status in an automated tool control system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962868818P 2019-06-28 2019-06-28
US16/915,750 US20200410434A1 (en) 2019-06-28 2020-06-29 Managing objects with assigned status in an automated tool control system

Publications (1)

Publication Number Publication Date
US20200410434A1 true US20200410434A1 (en) 2020-12-31

Family

ID=71728943

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/915,750 Pending US20200410434A1 (en) 2019-06-28 2020-06-29 Managing objects with assigned status in an automated tool control system

Country Status (6)

Country Link
US (1) US20200410434A1 (en)
EP (1) EP3991114A1 (en)
JP (1) JP2022539192A (en)
CN (1) CN114341904A (en)
AU (1) AU2020304677A1 (en)
WO (1) WO2020264506A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200407162A1 (en) * 2019-06-28 2020-12-31 Snap-On Incorporated Automated tool control device managed in a tool crib management system
US20210263890A1 (en) * 2020-02-20 2021-08-26 Hsiu-Jen Lin Intelligent Storage System and an Intelligent Storage Method Thereof
US20220215333A1 (en) * 2021-01-06 2022-07-07 Kuo-Chin Chiang APP Management System for Identifying Storage Boxes and Method Using the APP Management System
US11495348B2 (en) * 2019-05-28 2022-11-08 Candice E. Lowry Artificial intelligence storage and tracking system for emergency departments and trauma centers

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8020768B2 (en) * 2009-04-01 2011-09-20 RFID Mexico, S.A. DE C.V. Portable container inventory control system
US20130332323A1 (en) * 2012-06-12 2013-12-12 Snap-On Incorporated Enabling communication between an inventory control system and a remote system over a network
US20140358740A1 (en) * 2008-08-08 2014-12-04 Snap-On Incorporated Image-based inventory control system with automatic calibration and image correction
US20180039807A1 (en) * 2012-03-01 2018-02-08 Proper Digital LLC Tooling system
US20180197140A1 (en) * 2017-01-12 2018-07-12 United Parcel Service Of America, Inc. Drop box item deposit sensor system and methods of using the same

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110291547B (en) * 2017-02-13 2024-03-08 实耐宝公司 Automated tool data generation in an automated asset management system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140358740A1 (en) * 2008-08-08 2014-12-04 Snap-On Incorporated Image-based inventory control system with automatic calibration and image correction
US8020768B2 (en) * 2009-04-01 2011-09-20 RFID Mexico, S.A. DE C.V. Portable container inventory control system
US20180039807A1 (en) * 2012-03-01 2018-02-08 Proper Digital LLC Tooling system
US20130332323A1 (en) * 2012-06-12 2013-12-12 Snap-On Incorporated Enabling communication between an inventory control system and a remote system over a network
US20180025565A1 (en) * 2012-06-12 2018-01-25 Snap-On Incorporated Tool training for automated tool control systems
US20180197140A1 (en) * 2017-01-12 2018-07-12 United Parcel Service Of America, Inc. Drop box item deposit sensor system and methods of using the same

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11495348B2 (en) * 2019-05-28 2022-11-08 Candice E. Lowry Artificial intelligence storage and tracking system for emergency departments and trauma centers
US11842418B2 (en) 2019-05-28 2023-12-12 Candice E. Lowry Artificial intelligence inventory tracking and procurement system for healthcare facilities
US20200407162A1 (en) * 2019-06-28 2020-12-31 Snap-On Incorporated Automated tool control device managed in a tool crib management system
US11655102B2 (en) * 2019-06-28 2023-05-23 Snap-On Incorporated Automated tool control device managed in a tool crib management system
US20210263890A1 (en) * 2020-02-20 2021-08-26 Hsiu-Jen Lin Intelligent Storage System and an Intelligent Storage Method Thereof
US20220215333A1 (en) * 2021-01-06 2022-07-07 Kuo-Chin Chiang APP Management System for Identifying Storage Boxes and Method Using the APP Management System

Also Published As

Publication number Publication date
CN114341904A (en) 2022-04-12
JP2022539192A (en) 2022-09-07
AU2020304677A1 (en) 2022-01-06
WO2020264506A1 (en) 2020-12-30
EP3991114A1 (en) 2022-05-04

Similar Documents

Publication Publication Date Title
US20200410434A1 (en) Managing objects with assigned status in an automated tool control system
US10922648B2 (en) Automated asset management system with multiple sensing technologies
US20210125141A1 (en) Data acquisition using machine-readable optical symbols
CN107283374B (en) Inventory control system with advanced functionality
US20190101463A1 (en) Monitoring of tool calibration status in automated tool control systems
US11562566B2 (en) Use of on-screen content identifiers in automated tool control systems
AU2018219356A1 (en) Automated tool data generation in automated asset management systems
US11655102B2 (en) Automated tool control device managed in a tool crib management system
US20200410447A1 (en) Language management in automated tool control systems
WO2014041567A2 (en) A system for monitoring objects

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: SNAP-ON INCORPORATED, WISCONSIN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FLY, DAVID C.;LIPSEY, MATTHEW J.;PHILLIPS, PRESTON C.;AND OTHERS;SIGNING DATES FROM 20200625 TO 20200702;REEL/FRAME:055361/0352

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: SNAP-ON INCORPORATED, WISCONSIN

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE COMBIND DECLARATION AND ASSIGNMENT AGREEMENT SIGNED BY INVENTOR ANDREW R. LOBO PREVIOUSLY RECORDED AT REEL: 055361 FRAME: 0352. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:FLY, DAVID C.;LIPSEY, MATTHEW J.;PHILLIPS, PRESTON C.;AND OTHERS;SIGNING DATES FROM 20200625 TO 20220127;REEL/FRAME:058979/0005

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED