US20200137527A9 - Asset floor map - Google Patents

Asset floor map

Info

Publication number
US20200137527A9
Authority
US
United States
Prior art keywords
asset
floor map
map
client device
floor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US15/476,573
Other versions
US20180249298A1 (en
US10798538B2 (en
Inventor
Priyanka Jain
Sameer MARDHEKAR
Anand BHAGWAT
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BMC Software Inc
Original Assignee
BMC Software Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BMC Software Inc filed Critical BMC Software Inc
Priority to US15/476,573 priority Critical patent/US10798538B2/en
Assigned to BMC SOFTWARE, INC. reassignment BMC SOFTWARE, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BHAGWAT, Anand, JAIN, PRIYANKA, MARDHEKAR, SAMEER
Assigned to CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH, AS COLLATERAL AGENT reassignment CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH, AS COLLATERAL AGENT SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BLADELOGIC, INC., BMC SOFTWARE, INC.
Priority to EP18716387.8A priority patent/EP3571856A1/en
Priority to PCT/US2018/014460 priority patent/WO2018136764A1/en
Publication of US20180249298A1 publication Critical patent/US20180249298A1/en
Assigned to CREDIT SUISSE, AG, CAYMAN ISLANDS BRANCH, AS COLLATERAL AGENT reassignment CREDIT SUISSE, AG, CAYMAN ISLANDS BRANCH, AS COLLATERAL AGENT SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BLADELOGIC, INC., BMC SOFTWARE, INC.
Assigned to BLADELOGIC, INC., BMC ACQUISITION L.L.C., BMC SOFTWARE, INC. reassignment BLADELOGIC, INC. RELEASE OF PATENTS Assignors: CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH
Publication of US20200137527A9 publication Critical patent/US20200137527A9/en
Assigned to THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT reassignment THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BLADELOGIC, INC., BMC SOFTWARE, INC.
Publication of US10798538B2 publication Critical patent/US10798538B2/en
Application granted granted Critical
Assigned to ALTER DOMUS (US) LLC reassignment ALTER DOMUS (US) LLC GRANT OF SECOND LIEN SECURITY INTEREST IN PATENT RIGHTS Assignors: BLADELOGIC, INC., BMC SOFTWARE, INC.
Assigned to BMC SOFTWARE, INC., BLADELOGIC, INC. reassignment BMC SOFTWARE, INC. TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS Assignors: ALTER DOMUS (US) LLC
Assigned to GOLDMAN SACHS BANK USA, AS SUCCESSOR COLLATERAL AGENT reassignment GOLDMAN SACHS BANK USA, AS SUCCESSOR COLLATERAL AGENT OMNIBUS ASSIGNMENT OF SECURITY INTERESTS IN PATENT COLLATERAL Assignors: CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH, AS RESIGNING COLLATERAL AGENT
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H04W4/043
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • G06F17/30241
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06K9/00362
    • G06K9/6267
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/35Categorising the entire scene, e.g. birthday party or wedding scene
    • G06V20/36Indoor scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/02Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H04L67/18
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/52Network services specially adapted for the location of the user terminal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/023Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/029Location-based management or tracking services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/33Services specially adapted for particular environments, situations or purposes for indoor environments, e.g. buildings
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/40Document-oriented image-based pattern recognition
    • G06V30/42Document-oriented image-based pattern recognition based on the type of document
    • G06V30/422Technical drawings; Geographical maps
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W84/00Network topologies
    • H04W84/02Hierarchically pre-organised networks, e.g. paging networks, cellular networks, WLAN [Wireless Local Area Network] or WLL [Wireless Local Loop]
    • H04W84/10Small scale networks; Flat hierarchical networks
    • H04W84/12WLAN [Wireless Local Area Networks]

Definitions

  • Embodiments relate to asset management and, more specifically, to the tagging and use of physical objects or locations.
  • those shared resources may include things or physical objects, such as, for example, copiers, printers, fax machines, traveler's workstations or computers (e.g., an unassigned computer available for use by travelling or transient workers), refrigerators, coffee makers, and the like.
  • those shared resources may include locations, such as, for example, conference rooms, traveler's workstations or hot desks (i.e., an unassigned office space available to travelling or transient workers), break rooms, and the like.
  • a worker might need instructions on using the video or presentation equipment.
  • the printer paper supply might be low or empty and the worker may need to locate more paper.
  • For IT (Information Technology) Field Support personnel (e.g., Technicians, Asset Managers, and the like), the process of building a floor map by discovering, identifying and pinning each asset at an appropriate location on the floor map is very time consuming and requires many manual steps.
  • because floor map building is a manual process, the location can be based on approximations and/or be error prone.
  • the number of assets on a floor has grown substantially (e.g., with bring your own device (BYOD) organizations).
  • the location of devices can change on a regular or random basis (e.g., daily, weekly, as a user moves hot desks, and the like). Therefore, keeping the visual floor map up-to-date is a significant challenge for IT personnel.
  • locating a required asset (e.g., a specific printer), finding all useful assets within a periphery (e.g., a video conferencing room or board room), and finding live statuses of assets can be a fairly non-intuitive process without an up-to-date visual floor map.
  • Example embodiments describe systems and methods to identify and locate assets and to generate a visual asset floor map.
  • a non-transitory computer readable storage medium including executable code that, when executed by a processor, is configured to cause the processor to perform steps and a method to perform the steps including receiving, from a remote computing device, a floor map indicating a layout of a location, displaying, via a display interface of a client device, at least a portion of the floor map, capturing, using an application of the client device, signal strength data representing a signal field for at least one position on the floor map, identifying an asset within the layout of the location, determining at least one property that identifies the asset using one of a discovery process using a wireless protocol and an image processing application programming interface (API) configured to classify an image and detect individual objects within the image, updating the floor map with the asset and the at least one property, and communicating the asset and the at least one property to the remote computing device.
  • Implementations can include one or more of the following features.
  • the location can be a floor of a building.
  • the signal strength data representing the signal field can be based on a magnetic field footprint captured using a Magnetic Indoor Positioning protocol.
  • the signal strength data representing the signal field can be based on a WIFI signal footprint captured using a WIFI Indoor Positioning protocol.
  • the asset can be a smart device, and the identifying of the asset can include detecting a communications protocol signal transmitted from the asset.
  • the asset can be a smart device, and the determining of the at least one property that identifies the asset can include determining a position of the asset on the floor map, including determining a position of the client device on the floor map using an Indoor Positioning System (IPS) protocol, measuring a channel frequency and a signal strength using the wireless protocol, and using a formula based on a free-space path loss (FSPL), the channel frequency, and the signal strength to determine a distance between the client device and the asset.
  • the asset may not be a smart device, and the identifying of the asset includes capturing an image of the asset, using the image processing API to communicate the image to an external tool configured to identify an object using the image, and receiving an asset class associated with the asset from the external tool.
  • the asset may not be a smart device, and the determining of the at least one property that identifies the asset includes determining a position of the asset on the floor map, including determining a position of the client device on the floor map using an Indoor Positioning System (IPS) protocol, measuring an inclination between at least two heights associated with the asset, and using a trigonometric function and the inclination to determine a distance between the client device and the asset.
  • in another general aspect, a method includes receiving, from a client device, a request for a floor map based on a floor of a building, the floor map indicating a layout of the floor of the building, in response to receiving the request for the floor map, selecting a floor map from a database configured to store a plurality of maps, communicating the floor map to the client device, receiving, from the client device, information related to an asset, the information including at least one property that identifies the asset and a position of the asset on the floor map, in response to receiving the information related to the asset, updating a database configured to store data related to a plurality of assets, generating an annotated floor map based on the asset and the information related to the asset, and communicating the annotated floor map to the client device.
  • Implementations can include one or more of the following features.
  • the annotated floor map includes an icon representing the asset and an indicator, the icon representing the asset is located on the floor map at the position of the asset, and the indicator is located on the floor map at the position of the asset and indicates at least one of a type of the asset and a status of the asset.
  • the update of the database configured to store data related to the plurality of assets includes determining whether a record associated with the asset exists, upon determining that a record associated with the asset exists, updating the record using the information related to the asset, and upon determining that a record associated with the asset does not exist, generating a new record using the information related to the asset.
  • FIG. 1 illustrates a floor map according to at least one example embodiment.
  • FIGS. 2A, 2B, and 2C pictorially illustrate a route to an asset according to at least one example embodiment.
  • FIG. 3 pictorially illustrates a status and location of an asset according to at least one example embodiment.
  • FIG. 4 illustrates a block diagram of a flowchart according to at least one example embodiment.
  • FIG. 5 illustrates another block diagram of a flowchart according to at least one example embodiment.
  • FIG. 6 illustrates another block diagram of a flowchart according to at least one example embodiment.
  • FIGS. 7 and 8 illustrate a magnetic fingerprint of a floor map according to at least one example embodiment.
  • FIG. 9 illustrates trilateration of a device or asset according to at least one example embodiment.
  • FIGS. 10 and 11 illustrate locating a position of a device or asset according to at least one example embodiment.
  • FIG. 12 illustrates a block diagram of a computing device according to at least one example embodiment.
  • FIG. 13 illustrates a block diagram of another computing device according to at least one example embodiment.
  • the digital workplace is expected to be a live and dynamic workplace where both Information Technology (IT) Field Support personnel and end users can be expected to work in a smart and intuitive environment.
  • Example embodiments describe an automated technique for generating a visual asset floor map by (1) automatically identifying locations of smart (e.g., Internet of Things (IoT) enabled) devices on a floor map, (2) providing an intuitive technique for identifying location of non-Smart devices on the floor map, (3) identifying and classifying an asset type of non-Smart devices using techniques based on image and/or pattern recognition, and (4) using a crowd-sourced model to keep the floor map live and up-to-date in terms of device location and/or status.
  • a PoI can be a device (e.g., printer, computer, television, and/or the like) and/or a location (e.g., conference room, break room, rest room and/or the like).
  • FIG. 1 illustrates a floor map according to at least one example embodiment.
  • the floor map 100 can include at least one PoI or asset (hereinafter referred to as an asset) 105 , 110 , 115 , 120 , 125 .
  • asset 105 and 110 are illustrated as a conference room
  • asset 120 and 125 are illustrated as a display (e.g., a television, a computer monitor and the like)
  • asset 115 is illustrated as a printer.
  • Each asset 105 , 110 , 115 , 120 , 125 is illustrated as having an associated status indicator 130 .
  • Status indicator 130 can be configured to illustrate a status of the associated asset 105 , 110 , 115 , 120 , 125 .
  • the status indicator 130 can be color coded. For example, a red status indicator can indicate the asset 105 , 110 , 115 , 120 , 125 is not available and/or not operational. A yellow status indicator can indicate the asset 105 , 110 , 115 , 120 , 125 is available, but not operating properly. A green status indicator can indicate the asset 105 , 110 , 115 , 120 , 125 is available and operating properly.
  • the status indicator 130 can also be a callout box including text with status and other information about the asset 105, 110, 115, 120, 125. Example implementations can include other colors and/or other mechanisms for displaying text to show the status of an asset 105, 110, 115, 120, 125.
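  • As a minimal illustration of the color-coded status convention described above (the enum, mapping, and function names below are hypothetical, not part of the disclosure), a status-to-color mapping might look like the following sketch.

```python
from enum import Enum

class AssetStatus(Enum):
    UNAVAILABLE = "unavailable"    # not available and/or not operational
    DEGRADED = "degraded"          # available, but not operating properly
    OPERATIONAL = "operational"    # available and operating properly

# Hypothetical mapping; the disclosure only specifies the red/yellow/green convention.
STATUS_COLORS = {
    AssetStatus.UNAVAILABLE: "red",
    AssetStatus.DEGRADED: "yellow",
    AssetStatus.OPERATIONAL: "green",
}

def indicator_color(status: AssetStatus) -> str:
    """Return the indicator color to render for an asset status."""
    return STATUS_COLORS[status]
```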
  • the floor map 100 can be generated and populated with any number of assets (e.g., hardware, network devices, equipment, rooms, and the like).
  • the floor map 100 can be dynamic (e.g., assets can be added, removed and/or relocated). Therefore, the floor map 100 can be updated in real time (e.g., a live map) as the floor map 100 updates itself after every usage (e.g., each time an end user interacts with the map and/or as IT personnel perform a build/rebuild operation).
  • an application operating on a computing device can refresh a display showing the floor map 100 regularly (e.g., on a configured time interval) or as the floor map 100 (e.g., data associated with the floor map 100 ) is updated or changes.
  • An example method uses various combinations of techniques to auto-generate the floor map 100.
  • the techniques can include at least one of (1) after discovering a smart device using IoT protocols, a position can be found on the map based on a combination of IPS (Indoor Positioning System), which may work within buildings where GPS does not, and trilateration of the device's position based on signal strength (e.g., WIFI, near field technologies, and the like), (2) position detection of non-smart devices (e.g., devices that are not IoT) using a combination of IPS and camera measurement techniques (e.g., to locate the distance and angle of the device), (3) use of a machine learning technique to determine an asset class of the device, which then helps build linkages within a configuration management database (CMDB) by recommending a mapping to a configuration item (CI) within the CMDB, (4) use of crowd-sourcing to build exact positions more accurately, to keep the map up-to-date in real-time and to keep the floor maps more accurate with respect to asset positions and status, and (5)
  • example embodiments can solve and/or help solve many use cases related to floor maps.
  • at least one example implementation can (1) discover an asset in real-time and pin the asset on the floor map 100 , (2) keep a CMDB up-to-date using smart device techniques (e.g., IoT techniques), (3) keep the floor map up-to-date using crowdsourcing or keep the floor map up-to-date without using expensive (e.g., in human resources) and/or time consuming data entry effort on part of an IT organization, (4) search for an asset and find directions (e.g., routes within a building) to the asset on the floor map 100 , and (5) provide an up-to-date status and location of the asset to the end users.
  • FIGS. 2A, 2B, and 2C pictorially illustrate a route to an asset according to at least one example embodiment.
  • Referring to FIGS. 2A, 2B, and 2C, a floor map 205 (e.g., the floor map 100) can be displayed by an application operating on a computing device (e.g., mobile device, client device, smart phone, and/or the like). Using the application, a user can discover an asset 210, which is then shown on the floor map 205 (or a portion of the floor map 205). The user can then interact with the application to find a route to the asset 210, with the user shown as an avatar 215 on the floor map 205.
  • FIGS. 2A, 2B, and 2C pictorially illustrate the route to the asset as three screenshots. However, a single screenshot could be used.
  • FIG. 3 pictorially illustrates a status and location of an asset according to at least one example embodiment.
  • a user (shown as an avatar 315) and an asset 310 are shown on the floor map, and the asset 310 is illustrated as having an associated status indicator 320.
  • the status indicator 320 can be configured to illustrate a status of the associated asset (e.g., red, yellow, green as discussed above).
  • the status indicator 320 is shown with a callout box 325 including text with status and other information about the asset 310 .
  • the status indicator 320 and the callout box 325 can be implemented together as an indicator.
  • the callout box 325 can be shown automatically when the user zooms (pans) in on the asset 310 .
  • the callout box 325 can disappear automatically when the user zooms (pans) out from the asset 310 .
  • the callout box 325 can appear/disappear when the user performs some action on the asset 310 via the application.
  • FIGS. 4-6 are flowcharts of methods according to example embodiments.
  • the steps described with regard to FIGS. 4-6 may be performed due to the execution of software code stored in a memory and/or a non-transitory computer readable medium (e.g., memory 1214 ) associated with an apparatus (e.g., as shown in FIGS. 12 and 13 (described below)) and executed by at least one processor (e.g., processor 1212 ) associated with the apparatus.
  • alternative embodiments are contemplated such as a system embodied as a special purpose processor.
  • although the steps described below are described as being executed by a processor, the steps are not necessarily executed by the same processor. In other words, at least one processor may execute the steps described below with regard to FIGS. 4-6 .
  • FIG. 4 illustrates a block diagram of a flowchart according to at least one example embodiment.
  • a building and floor level are selected.
  • a user can be interacting with a computing device executing an application configured to discover, identify and pin an asset as being located at a location on a floor map.
  • the user can select a building and a floor in the building.
  • the user can select the building based on a location of the device and/or through use of, for example, a dropdown list in the application.
  • the floor can be selected, for example, using a dropdown list in the application.
  • the dropdown list may include a floor(s) (e.g., a floor number) within the building.
  • a floor map based on selected building and floor level is loaded.
  • the building and floor level (or a combination thereof) can have a unique value representing the building and floor level.
  • the value could be the address of the building, a geo-positional value, an ID previously assigned, and the like, together with a number for the floor.
  • the unique value can be communicated from the computing device executing the application to a computing device including a map management system (e.g., map or asset management computing system 1350 ).
  • the map management system can then select a map as the floor map from a datastore (e.g., map storage 1310 ) including a plurality of maps, at least a portion of which each represent a floor map.
  • the map management system can then communicate the floor map to the computing device executing the application in response to the user of the application loading the floor map.
  • the application can then display the floor map on a display (e.g., within an active window) of the computing device executing the application.
  • Metadata representing at least one asset could be communicated from the map management system to the computing device executing the application with the floor map (e.g., in a same data packet or in a sequential (close in time) data packet).
  • the metadata could include information about or related to the asset (e.g., status, type, serial number, and the like) and a location (e.g., coordinates) on the map representing the floor map.
  • the floor map can then be annotated (e.g., overlaid) with the asset at the corresponding location on the floor map.
  • the map management system can generate an annotated floor map based on the asset and the information (e.g., status, type, serial number, location and the like) related to the asset
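  • As a minimal sketch of the map-loading exchange described above, the following illustrates a server-side lookup that resolves a unique building/floor value to a stored map and its asset annotations. The function and field names (e.g., `map_storage`, `floor_key`) are illustrative assumptions, not the patented implementation.

```python
def load_floor_map(map_storage: dict, building_id: str, floor: int) -> dict:
    """Select a floor map and its asset metadata for a unique building/floor key.

    map_storage is assumed to be a datastore keyed by a unique (building, floor)
    value, each entry holding map image data and a list of asset records.
    """
    floor_key = f"{building_id}:{floor}"   # unique value for the building and floor level
    entry = map_storage[floor_key]
    # Annotate the floor map with each asset at its stored coordinates.
    annotations = [
        {"id": a["id"], "type": a["type"], "status": a["status"], "xy": a["coordinates"]}
        for a in entry["assets"]
    ]
    return {"map": entry["map_image"], "assets": annotations, "calibrated": entry["calibrated"]}
```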
  • if the floor map is calibrated (yes in step S415), processing continues to step S425. If the floor map is not calibrated (no in step S415), processing continues to step S420.
  • the floor map can be identified as not calibrated the first time the floorplan is loaded into an application (by the user or by any other user).
  • the floor map includes metadata (e.g., communicated from the map management system to the computing device executing the application with the floor map) indicating the map has or has not been calibrated.
  • the floor map is calibrated.
  • the calibration operation can include walking a location (e.g., a floor of a building) with the application operating on the computing device.
  • the application can be in a calibration mode configured to capture data.
  • the captured data can be signal strength data or information representing a signal field for at least one position (e.g., coordinate) on the floor map.
  • the application can generate a table of coordinates on the map and a corresponding unique identifier (or signature or footprint or fingerprint of that position).
  • the identifier can be a magnetic field footprint (sometimes called a fingerprint) or a WIFI signal footprint (sometimes called a fingerprint) captured using a WIFI Indoor Positioning protocol.
  • Magnetic indoor positioning is based on the fact that buildings or structures each have a unique magnetic signature or fingerprint.
  • the steel structure within the building distorts the Earth's magnetic field in a unique way.
  • the user (e.g., of the aforementioned application) can calibrate the map by marking a path on the map and then walking the same path in a calibration mode selected in the application.
  • the application then captures the magnetic field fingerprint (e.g., using a Magnetic Indoor Positioning protocol) readings along the path and builds the database of fingerprints as related to coordinates within the location.
  • the more training paths and rescans (e.g., using crowd-sourcing), the better the accuracy of this technique.
  • An example implementation can use extrapolation logic to generate missing fingerprints for the remaining (or unmeasured) coordinates.
  • FIG. 7 and FIG. 8 illustrate examples of magnetic fingerprints as related to coordinates within the location.
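  • A minimal sketch of building the coordinate-to-fingerprint table during a calibration walk, assuming a `read_fingerprint()` sensor helper and filling unmeasured coordinates by averaging nearby measured fingerprints; both the helper and the fill-in strategy are assumptions for illustration, not the disclosed extrapolation logic.

```python
import numpy as np

def calibrate_path(path_coords, read_fingerprint):
    """Walk a marked path and record a fingerprint (e.g., magnetic or WIFI signal
    footprint) for each coordinate on the floor map."""
    table = {}
    for (x, y) in path_coords:
        table[(x, y)] = np.asarray(read_fingerprint())   # sensor reading at this position
    return table

def extrapolate(table, missing_coords):
    """Fill in unmeasured coordinates by averaging the two nearest measured fingerprints."""
    measured = list(table.items())
    for (x, y) in missing_coords:
        nearest = sorted(measured, key=lambda kv: (kv[0][0] - x) ** 2 + (kv[0][1] - y) ** 2)[:2]
        table[(x, y)] = np.mean([fp for _, fp in nearest], axis=0)
    return table
```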
  • the floor map is calibrated for use in determining or calculating a distance.
  • signals (e.g., associated with NFC, Bluetooth, BLE, WIFI, and/or the like) can be attenuated (e.g., power loss) and/or distorted by structures (e.g., walls) and objects (e.g., desks, assets, and the like). Therefore, calibration allows the determining or calculating of a distance to take into account the attenuation and/or distortion of the signals.
  • the calibrated footprint can have accuracy in the range of 2 to 4 meters.
  • a location of the client device is determined. For example, an Indoor Positioning System (IPS) technique (e.g., Magnetic Indoor Positioning, WIFI Indoor Positioning, and/or the like) can be used to determine the location of the client device, and the user (e.g., as an avatar 215 , 315 ) can be shown at that location on the floor map.
  • the user can be shown in the application in different positions on the floor map as the user moves around the building floor. Showing the user on the floor map may give the user of the application a sense of where the user is relative to other elements (e.g., assets, hallways, office rooms, and the like) displayed in association with the map.
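  • Locating the client device against the calibrated table can be sketched as a nearest-fingerprint lookup. This assumes fingerprints are fixed-length numeric vectors and uses Euclidean distance, which is one common matching choice rather than the specific matching used in the disclosure.

```python
import numpy as np

def locate_device(fingerprint_table, observed_fingerprint):
    """Return the floor-map coordinate whose stored fingerprint best matches the
    fingerprint currently observed by the client device."""
    observed = np.asarray(observed_fingerprint)
    best_coord, best_dist = None, float("inf")
    for coord, stored in fingerprint_table.items():
        dist = float(np.linalg.norm(stored - observed))
        if dist < best_dist:
            best_coord, best_dist = coord, dist
    return best_coord   # accuracy on the order of 2 to 4 meters per the calibration discussion
```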
  • an asset is located. For example, as the user moves around the building floor, the user can visually identify an asset that is not shown (or a different asset than what is shown). In another example, an asset can be detected based on a communications protocol signal (e.g., NFC, Bluetooth, BLE, WIFI, and/or the like) transmitted from a device (e.g., a smart device and/or an IoT enabled device).
  • in step S435, if the asset is IoT enabled, also referred to as a smart device (yes in step S435), processing continues to node A, which continues with FIG. 5 described below. If the asset is not IoT enabled, also referred to as not a smart device (no in step S435), processing continues to node B, which continues with FIG. 6 described below. Processing returns from node A and node B at node C.
  • in step S440, the asset is pinned at the location on the floor map.
  • metadata including information associated with the IoT enabled device (or smart device) identified as an asset (see the discussion of FIG. 5 below) or the not IoT enabled device (or non-smart device/asset) identified as an asset (see the discussion of FIG. 6 below) that has been placed or pinned to a position on the floor map corresponding to the auto-located position in the application can be communicated from the computing device executing the application to the computing device including the map management system.
  • the metadata can include information that identifies the asset (e.g., Name, ID, MAC Address, IP Address, and the like) ascertained during discovery of the IoT enabled device (or smart device) and/or asset identification of the not IoT enabled device (or non-smart device/asset).
  • the metadata can include information related to the location of the asset (e.g., coordinates on the floorplan or corresponding map) ascertained during the auto-locate process implemented for the asset.
  • in step S445, data is updated based on the asset.
  • the metadata can be processed by the map management system and stored in the datastore (e.g., map storage 1310 ) as an asset (e.g., asset 1314 ).
  • the map management system can be a configuration management database (CMDB) including assets identified as configuration items (CI).
  • the map management system can determine if matching asset(s) (as a CI) exist in the datastore. Should a new asset be discovered, the map management system can create a new record (e.g., new CI record).
  • the map management system can update an existing record (e.g., CI record) using a reconciliation process in the map management system (or CMDB).
  • the received metadata for the asset can be used to update the record for the asset (e.g., asset 1314 and/or asset state 1320 ) with regard to a location, a status, a MAC Address, an IP Address, and/or the like for the asset in the datastore (e.g., map storage 1310 and/or asset state storage 1318 ).
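  • The create-or-update reconciliation described for the map management system (CMDB) can be sketched as a simple upsert; the record fields mirror the properties named above, and the in-memory dict stands in for the datastore as an assumption for illustration.

```python
def reconcile_asset(cmdb: dict, asset_metadata: dict) -> dict:
    """Update the CI record for an asset if it exists, otherwise create a new record.

    cmdb is assumed to be keyed by a stable identifier such as MAC address or asset ID.
    """
    key = asset_metadata.get("mac_address") or asset_metadata["asset_id"]
    record = cmdb.get(key)
    if record is None:
        record = {"asset_id": asset_metadata.get("asset_id")}   # new CI record
        cmdb[key] = record
    # Reconcile location, status, and addressing information reported by the client device.
    for field in ("name", "location", "status", "mac_address", "ip_address"):
        if field in asset_metadata:
            record[field] = asset_metadata[field]
    return record
```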
  • the map management system and/or other IT systems can include information about an asset and/or a type of asset. Characteristics of the asset (e.g., possible asset states, possible asset actions, operating procedures, error logs, maintenance logs, and/or the like) can be associated with the asset record using linkages (e.g., a joined table). For example, the ID for the asset discovered as existing in another system can be added (in an appropriate field) in the record for the asset (e.g., asset 1314 ) to create the linkage via a joined table.
  • the system will be able to auto-generate an asset floor plan much more intuitively than any other existing/known application. End users can also follow the same flow to discover more assets or detect changes in asset locations and keep the asset floor plan as real-time as possible.
  • Example embodiments can use two techniques for identifying and pinning various assets on the map.
  • if the asset is an IoT enabled device, the asset can be discovered using IoT protocols (e.g., NFC, Bluetooth, BLE, WIFI, and/or the like).
  • IoT enabled devices or assets can include printers, laptops, mobile phones, monitors, smart TVs, projectors, hard disks, network access points, and/or the like.
  • IoT enabled devices or assets can be configured to communicate wirelessly using wireless protocols.
  • at least one property (or properties) that identifies the device (e.g., Name, ID, MAC Address, IP Address, and the like), as well as the channel frequency and signal strength, can be measured and/or determined and stored by the computing device executing the application.
  • FIG. 5 illustrates another block diagram of a flowchart according to at least one example embodiment.
  • the IoT enabled asset is discovered using a wireless protocol.
  • the computing device executing the application can receive the signal and determine that the asset is an IoT enabled device.
  • the computing device executing the application can then request the properties that identify the device (e.g., Name, ID, MAC Address, IP Address, and the like) from the IoT enabled device.
  • the IoT enabled device communicates the properties that identify the IoT enabled device to the computing device executing the application.
  • the application stores (e.g., in a memory of the computing device) the properties that identify the IoT enabled device.
  • in step S510, the asset is auto-located.
  • the IoT enabled device is scanned using various protocols including (but not limited to) WIFI, Bluetooth, and the like, and the channel frequency (e.g., 2.4 GHz or 5 GHz) is captured along with the signal strength, for example, a Received Signal Strength Indicator (RSSI) in dBm (decibels relative to one milliwatt).
  • Dist(m) = 10^((27.55 − (20 × log10(F (in MHz))) + |Signal Strength (in dB)|) / 20)   (Equation 1)
  • The constants used in Equation 1 depend on the free space path (e.g., obstacles) and can be tuned (e.g., varied) depending on the environment (e.g., as determined in the initial calibration described above). Also, a dB value should be calculated from the dBm value. This gives the approximate circular range in which the device would be located. For example, a device transmitting a WIFI signal at a 2.4 GHz frequency with an RSSI of −27 dB would be located in a circle of approximately 7 meter radius.
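  • A sketch of the free-space path loss relationship behind Equation 1; the constant 27.55 assumes distance in meters and frequency in MHz, and, as noted above, the dB path-loss value must first be derived from the measured RSSI in dBm (e.g., transmit power minus RSSI), with the constants tuned to the calibrated environment. The numbers in the example are illustrative assumptions.

```python
import math

def fspl_distance_m(frequency_mhz: float, path_loss_db: float) -> float:
    """Approximate distance (meters) from frequency (MHz) and free-space path loss (dB).

    Inverts FSPL(dB) = 20*log10(d_m) + 20*log10(f_MHz) - 27.55.
    """
    exponent = (27.55 - 20.0 * math.log10(frequency_mhz) + path_loss_db) / 20.0
    return 10.0 ** exponent

# Illustrative numbers: a 2.4 GHz signal with roughly 57 dB of measured path loss
# resolves to approximately a 7 meter radius around the transmitting device.
print(round(fspl_distance_m(2400.0, 57.0), 1))
```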
  • readings can be measured as the user walks around the area, and the application can store the changed location of the computing device executing the application (e.g., using the IPS technique above) as well as the new distance calculated based on the changed RSSI reading.
  • using trilateration, as illustrated in FIG. 9 , the exact (or a more precise) position of the IoT enabled device can be determined or calculated.
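  • A minimal least-squares trilateration sketch: given three or more (position, distance) readings collected as the user walks around, it solves the linearized circle-intersection system for the device position. This is a standard formulation offered as an illustration, not the specific computation in FIG. 9.

```python
import numpy as np

def trilaterate(points, distances):
    """Estimate (x, y) from >= 3 reference positions and measured distances.

    points: (x, y) coordinates where RSSI-based distances were measured.
    distances: corresponding distance estimates (e.g., from Equation 1).
    """
    (x0, y0), d0 = points[0], distances[0]
    a_rows, b_rows = [], []
    for (xi, yi), di in zip(points[1:], distances[1:]):
        # Subtracting the first circle's equation linearizes the system.
        a_rows.append([2.0 * (xi - x0), 2.0 * (yi - y0)])
        b_rows.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    solution, *_ = np.linalg.lstsq(np.asarray(a_rows), np.asarray(b_rows), rcond=None)
    return tuple(solution)   # estimated (x, y) of the IoT enabled device

# Example: readings taken at three positions on the floor map; result is near (5, 5).
print(trilaterate([(0, 0), (10, 0), (0, 10)], [7.07, 7.07, 7.07]))
```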
  • in step S515, the asset is mapped.
  • the IoT enabled device can be identified as an asset and the IoT enabled device can be placed or pinned to a position on the floor map corresponding to the auto-located position.
  • example embodiments use a technique based on camera measurement and image processing APIs (e.g., vision API) to identify the asset.
  • FIG. 6 illustrates another block diagram of a flowchart according to at least one example embodiment.
  • an image of the asset is captured.
  • the computing device executing the application can include a camera (e.g., camera 1211 ).
  • the camera can be used to take a picture of the asset.
  • an asset class associated with the asset can be identified using machine learning (ML) and an image processing application programming interface (API).
  • the image processing API can enable developers to understand the content of an image by encapsulating machine learning models in an easy to use representational state transfer (REST) API.
  • the image processing API can classify images into thousands of categories (e.g., as a type of asset) and detect individual objects (e.g., text) within an image.
  • Machine learning classification can be used to identify an asset class from a set of given classes. Over time (e.g., through learning iterations), a set of features can be identified for given asset classes. Then, using sufficient training data, an ML model can be built. The ML model can then be used to classify objects in a captured image into one of the asset classes.
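  • A sketch of calling a generic image-classification REST endpoint and mapping the returned labels onto asset classes. The endpoint URL, request shape, and label names are placeholders, since the disclosure only requires an image processing API that classifies images and detects objects within them.

```python
import base64
import requests

# Hypothetical mapping from classifier labels to CMDB asset classes.
LABEL_TO_ASSET_CLASS = {"printer": "Printer", "monitor": "Display", "television": "Display"}

def classify_asset(image_path: str, api_url: str, api_key: str) -> str:
    """Send a captured image to an external vision API and return an asset class."""
    with open(image_path, "rb") as f:
        payload = {"image": base64.b64encode(f.read()).decode("ascii")}
    response = requests.post(api_url, json=payload,
                             headers={"Authorization": f"Bearer {api_key}"})
    response.raise_for_status()
    labels = [label["name"].lower() for label in response.json().get("labels", [])]
    for label in labels:
        if label in LABEL_TO_ASSET_CLASS:
            return LABEL_TO_ASSET_CLASS[label]
    return "Unknown"
```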
  • in step S615, the asset is auto-located.
  • an angle of the computing device executing the application can be used to estimate the distance to a point on the ground.
  • Other measurements, such as height and width, can be used to improve accuracy.
  • the computing device executing the application can be held in front of the user, the point in the camera can be aligned toward the asset, and, using the application, a direct reading of the distance can be obtained.
  • the height at which the computing device executing the application is held (e.g., eye-level) can be determined, and then the user can point the camera to the point where the asset touches the ground.
  • the computing device executing the application can measure an inclination (e.g., based on the aforementioned angle) between at least two heights associated with the asset and use a trigonometric function and the inclination to determine a distance between the client device and the asset as illustrated in FIG. 10 and FIG. 11 .
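  • As a sketch of this camera-measurement technique, assuming the device is held at a known height (e.g., eye level) and the downward inclination to the point where the asset touches the ground is read from the device's sensors, simple trigonometry gives the horizontal distance. The variable names and sample values are illustrative assumptions.

```python
import math

def distance_from_inclination(device_height_m: float, inclination_deg: float) -> float:
    """Horizontal distance to the asset's base from the downward inclination angle.

    device_height_m: height at which the client device is held (e.g., ~1.5 m at eye level).
    inclination_deg: angle below horizontal when the camera points at the asset's base.
    """
    return device_height_m / math.tan(math.radians(inclination_deg))

# Example: device held at 1.5 m, camera tilted 15 degrees down toward the asset's base.
print(round(distance_from_inclination(1.5, 15.0), 2))   # approximately 5.6 m
```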
  • in step S620, the asset is mapped.
  • the device can be identified as an asset and can be placed or pinned to a position on the floor map corresponding to the auto-located position.
  • FIG. 12 illustrates a block diagram of a computer device according to at least one example embodiment.
  • the computer device is a client device 1200 .
  • the client device 1200 can be a desktop, laptop, mobile phone, mobile device, workstation, personal digital assistant, smartphone, tablet, a virtual machine, a virtual computing device and/or the like.
  • the client device 1200 can also be referred to as a user device, an agent device, a client computing system, a user computing system, a device, and/or the like.
  • the client device 1200 may be used by a user including IT personnel and/or general population (e.g., employees, managers and/or the like) authorized to have access to or use assets associated with an organization.
  • the client device 1200 may include a processor 1212 configured to execute one or more machine executable instructions or pieces of software, firmware, or a combination thereof.
  • the client device 1200 may include, in some embodiments, a memory 1214 configured to store one or more pieces of data, either temporarily, permanently, semi-permanently, or a combination thereof. Further, the memory 1214 may include volatile memory, non-volatile memory, or a combination thereof.
  • the client device 1200 may include a storage medium 1215 configured to store data in a semi-permanent or substantially permanent form. In various embodiments, the storage medium 1215 may be included by the memory 1214 .
  • the memory 1214 and/or the storage medium 1215 may be referred to as and/or implemented as a non-transitory computer readable storage medium.
  • the client device 1200 may include one or more network interfaces 1216 configured to allow the client device 1200 to be part of and communicate via a communications network.
  • Examples of a Wi-Fi protocol may include, but are not limited to: Institute of Electrical and Electronics Engineers (IEEE) 802.11g, IEEE 802.11n, etc.
  • Examples of a cellular protocol may include, but are not limited to: IEEE 802.16m (a.k.a. Wireless-MAN (Metropolitan Area Network) Advanced), Long Term Evolution (LTE) Advanced, Enhanced Data rates for GSM (Global System for Mobile Communications) Evolution (EDGE), Evolved High-Speed Packet Access (HSPA+), etc.
  • Examples of a wired protocol may include, but are not limited to: IEEE 802.3 (a.k.a. Ethernet), Fibre Channel, Power Line communication (e.g., HomePlug, IEEE 1901, etc.), etc. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
  • the client device 1200 may include one or more other hardware components 1213 (e.g., a display or monitor, a keyboard, a mouse, a camera, a fingerprint reader, a video processor, etc.). It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
  • the client device 1200 may include one or more location services 1219 .
  • the location services 1219 may be configured to indicate where the client device 1200 is physically located within a certain amount of precision (often determined by the technology used for detecting the location).
  • this location service 1219 may include a Global Positioning System (GPS) receiver or detector.
  • the location service 1219 may include a control plane locator, such as a device configured to determine the distance of the client device 1200 from one or more cell-phone (or other radio signal) towers or broadcasters.
  • the location service 1219 may be configured to estimate the client device's 1200 location based upon a time difference of arrival or other time-based technique.
  • the location service 1219 may be configured to estimate the user device's 102 location based upon local-range signals (e.g., ~30 meters, Bluetooth, wireless local area network (WLAN) signals, near field communication (NFC), radio-frequency identification (RFID) tags, etc.) or another form of a local position system (LPS).
  • the location service 1219 may be configured. to make use of triangulation, trilateration, multilateration, or a combination thereof.
  • location service 1219 may be configured to make use of one or more of these examples either in combination or alone. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
  • the client device 1200 may include an operating system (OS) 1217 configured to provide one or more services to an application 1230 and manage or act as an intermediary between the application 1230 and the various hardware components (e.g., the processor 1212 , a network interface 1216 , etc.) of the client device 1200 .
  • the client device 1200 may include one or more native applications, which may be installed locally (e.g., within the storage medium 1215 , etc.) and configured to be executed directly by the processor 1212 and directly interact with the OS 1217 .
  • the native applications may include pre-compiled machine executable code.
  • the native applications may include a script interpreter (e.g., C shell (csh), AppleScript, AutoHotkey, etc.) or a virtual execution machine (VM) (e.g., the Java Virtual Machine, the Microsoft Common Language Runtime, etc.) that are configured to translate source or object code into executable code which is then executed by the processor 1212 .
  • the user may be an Information Technology (IT) Field Support personnel (e.g., Technicians, Asset Managers, and the like), using the application 1230 to build a floor map by discovering, identifying and pinning an asset at appropriate location on a floor map.
  • the user may be travelling to a new environment or work place, although the illustrated embodiment would be just as valid for a location that the user frequents. It is understood that the below is merely one illustrative example to which the disclosed subject matter is not limited. In such an embodiment, the user may wish to see or be made aware of the various assets, physical resources, or points of interests (POIs) around the user in this location.
  • a floor plan, floor map, and/or map includes a map or data structure that may be interpreted as a geographic diagram of a given or associated location or route.
  • the floor plan, floor map, and/or map can include a layout of a location (e.g., a floor of a building).
  • an asset is a term used to describe both physical objects, such as, for example, a copier, printer, fax machine, traveler's workstation or computer, etc., and/or locations, such as, for example, a conference room, desk, etc.
  • the term asset may be used both to describe the object/location itself and to describe a data structure that represents or is associated with the physical object/location and is used to represent that physical object/location to a computing device (e.g., client device 1200 ) or a software application (e.g., application 1230 ).
  • the floor map may include a diagram of a rack of servers in a data center.
  • the asset may include various server racks or a particular server in a given rack.
  • the floor map may include a diagram of a computer network, and the asset may include various computing devices, access points, gateways, servers, and/or routers on the network. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
  • the application 1230 may be configured to display an annotated map 1228 to the user on a display or display interface of the client device 1200 .
  • the annotated map 1228 may include a floor map (e.g., map 1312 ) and be annotated with one or more assets (e.g., asset 1314 ) retrieved or received from a remote computing device (e.g., map or asset management computing system 1350 ).
  • the annotated map 1228 may include the floor map 100 .
  • the floor map may show or describe the location of various structural features of a given location (e.g., a floor of an office building, etc.).
  • the structural features may include, but are not limited to, walls, doors, desks, furniture, sinks, toilets, elevators, plants, etc.
  • the floor map may be stored as images (e.g., a Joint Photographic Experts Group (jpeg) image, bitmap, scalable vector graphic, etc.) or as an array or other data structure that the displaying or manipulating application may read and display to the user as a human readable floor map.
  • the annotated map 1228 (e.g., as floor map 100 ) may include one or more assets (e.g., printer 115 , etc.).
  • the assets may include physical objects (e.g., printer 115 , etc.), locations (e.g., conference room 105 , etc.), or assets that are a combination of both (e.g., conference room 105 that includes a computer 125 , etc.).
  • these assets may be received by the displaying or manipulating application as a data structure that is then interpreted and displayed to the user as a human readable indicator (e.g., icon, rectangle, etc.).
  • the application 1230 may include a map annotator 1222 .
  • the map annotator 1222 may be configured to take a selected map and annotate it with the selected assets and the asset metadata (e.g., type, state, actions, etc.).
  • the map annotator 1222 may generate or produce the annotated map 1228 . In various embodiments, this annotated map 1228 may be similar to floor map 100 .
  • the application 1230 may include a map viewer 1224 .
  • the map viewer 1224 may be configured to display the annotated map 1228 to the user.
  • the map viewer 1224 may be configured to allow the user to select various assets, view the state information or metadata associated with the assets, zoom in or out of the annotated map 1228 , display a route between two or more locations, select an action, etc. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
  • the map viewer 1224 may include a filter or search mechanism 1225 .
  • the user may be able to limit the assets displayed by the map viewer 1224 or included within the annotated map 1228 using a set of criteria supplied or selected by the user. For example, in one embodiment, the user may only wish to see assets of type printer. In such an embodiment, any assets not of type printer may be removed from the annotated map 1228 or simply not displayed by the map viewer 1224 .
  • the filter 1225 may select or filter assets based on other properties of or associated with an asset (e.g., free conference rooms, working copiers, assets associated with the Finance department, assets with a red state, etc.). It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
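  • A sketch of the filter/search mechanism over asset metadata; the property names are examples consistent with those discussed above (type, state), not a prescribed schema.

```python
def filter_assets(assets, **criteria):
    """Return only the assets whose metadata matches every supplied criterion.

    Example: filter_assets(assets, type="printer", state="green").
    """
    return [
        asset for asset in assets
        if all(asset.get(key) == value for key, value in criteria.items())
    ]

# Example usage with illustrative asset records.
assets = [
    {"id": "p1", "type": "printer", "state": "green"},
    {"id": "c1", "type": "conference_room", "state": "red"},
]
print(filter_assets(assets, type="printer"))   # only printer assets remain displayed
```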
  • the map viewer 1224 may include a router or path generating mechanism or component 1223 .
  • the router 1223 may be configured to generate or determine a route between two or more locations.
  • the router 1223 may determine a path between the current location of the client device 1200 and a selected or desired asset (e.g., asset 210 ).
  • this route or path may be graphical and displayed on the annotated map 1228 (as shown in FIGS. 2A, 2B, and 2C ).
  • the path may be described in text, graphics, audio directions, a combination thereof, or other forms. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
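  • A minimal route sketch for the router/path component, assuming the floor map has been reduced to a walkable occupancy grid (an assumption; the disclosure does not specify the routing algorithm). Breadth-first search returns a shortest grid path from the device's current location to the selected asset.

```python
from collections import deque

def find_route(grid, start, goal):
    """Shortest path on a walkable grid (0 = free, 1 = wall) via breadth-first search."""
    rows, cols = len(grid), len(grid[0])
    queue, came_from = deque([start]), {start: None}
    while queue:
        current = queue.popleft()
        if current == goal:
            break
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = current
                queue.append((nr, nc))
    if goal not in came_from:
        return None          # no walkable route between the two locations
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = came_from[node]
    return path[::-1]        # route from current device location to the asset
```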
  • the application 1230 may include an asset action responder 1226 .
  • the asset action responder 1226 may be configured to execute or request the execution of the steps or process defined by the selected action 167 .
  • the asset action responder 1226 may determine if the action may be executed locally (by the client device 1200 ). For example, a user may wish to view a file, or place a telephone call, send an email, etc. If the information needed to execute the action is available locally or may be obtained via local resources (hardware or software), the asset action responder 1226 may execute or perform the requested action.
  • For example, when viewing a file, the requested file may be included in the metadata or may be obtainable via an HTTP request; when placing a telephone call, the client device 1200 may include a phone and the desired number may be included in the metadata; likewise when sending an email, etc.
  • the client device 1200 or application 1230 may have received one or more signals triggering location 1221 .
  • the application 1230 or client device 1200 may transmit its location information or a map request that includes the location information. It is understood that the above is merely one illustrative example to which the disclosed subject matter is not limited.
  • the client device 1200 or application 1230 includes a floor map calibration module 1232 .
  • the floor map calibration module 1232 can be configured to store (or cause to be stored) calibration information.
  • the calibration information can include information as to whether or not the floor map has been calibrated, calibration measurement data and/or calibration result data.
  • the floor map can be identified as not calibrated the first time the floorplan is loaded into the application 1230 (by the user or by any other user).
  • the floor map includes metadata (e.g., communicated from the map management system to the computing device executing the application with the floor map) indicating the map has or has not been calibrated.
  • a calibration operation can include walking a location (e.g., a floor of a building) with the application 1230 operating on the client device 1200 .
  • the application 1230 can be in a calibration mode configured to capture data.
  • the floor map calibration module 1232 can generate a table of coordinates on the map and a corresponding unique identifier (or signature or footprint of that position).
  • the identifier can be a magnetic footprint or a WIFI signal footprint.
  • Magnetic indoor positioning is based on the fact that buildings or structures each have a unique magnetic signature or fingerprint.
  • the steel structure within the building distorts the Earth's magnetic field in a unique way.
  • the user (e.g., of the aforementioned application) can calibrate the map by marking a path on the map and then walking the same path in a calibration mode selected in the application.
  • the application then captures the magnetic fingerprint readings along the path and builds the database of fingerprints as related to coordinates within the location.
  • the more training paths and rescans (e.g., using crowd-sourcing), the better the accuracy of this technique.
  • FIG. 7 and FIG. 8 illustrate examples of magnetic fingerprints as related to coordinates within the location.
  • the floor map is calibrated for use in determining or calculating a distance.
  • signals (e.g., associated with NFC, Bluetooth, BLE, WIFI, and/or the like) can be attenuated (e.g., power loss) and/or distorted by objects (e.g., desks, assets, and the like). Therefore, calibration allows the determining or calculating of a distance to take into account the attenuation and/or distortion of the signals.
  • the calibrated footprint can have accuracy in the range of 2 to 4 meters.
  • the client device 1200 or application 1230 includes a device location module 1234 .
  • the device location module 1234 can be configured to identify the location of the client device 1200 , for example, using an Indoor Positioning System (IPS) technique.
  • the client device 1200 or application 1230 includes an asset discovery module 1236 .
  • the asset discovery module 1236 can be configured to discover or detect an asset based on a signal (e.g., NFC, Bluetooth, BLE, WIFI, and/or the like) from a device (e.g., a smart device and/or an IoT enabled device). For example, as the client device 1200 comes in range (e.g., within 5 meters) of an asset, the asset discovery module 1236 can trigger an event indicating an asset (e.g., smart device or IoT enabled device) is close by.
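  • As an illustrative sketch, such a proximity event can be modeled as a callback fired when a scanned signal exceeds a strength threshold; the threshold value, asset identifier, and callback below are assumptions chosen for illustration rather than values defined by this disclosure.

```python
# Illustrative sketch of a proximity-triggered discovery event. The -60 dBm
# threshold (assumed to correspond to roughly a 5 meter range) and the asset
# identifier are hypothetical values, not values specified by this disclosure.
from typing import Callable

NEARBY_RSSI_THRESHOLD_DBM = -60.0


def handle_scan_result(asset_id: str, rssi_dbm: float,
                       on_asset_nearby: Callable[[str], None]) -> None:
    """Fire an 'asset nearby' event when a scanned signal is strong enough."""
    if rssi_dbm >= NEARBY_RSSI_THRESHOLD_DBM:
        on_asset_nearby(asset_id)


# Example usage with a hypothetical scan reading:
handle_scan_result("printer-3f-01", -52.0,
                   on_asset_nearby=lambda aid: print(f"asset {aid} is close by"))
```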
  • the client device 1200 or application 1230 includes an asset location module 1240 .
  • the asset location module 1240 can be configured to determine a location of the asset. For example, an IoT enabled device (or smart device) can be scanned using various protocols including (but not limited to) WIFI, Bluetooth, and the like, and the channel frequency (e.g., 2.4 GHz or 5 GHz) is captured along with the signal strength, for example, a Received Signal Strength Indicator (RSSI) in dBm (decibels relative to one milliwatt).
  • the signal strength is related to the distance and frequency. Therefore, it is possible to find the distance based on the above two readings using Equation 1, described below.
  • the asset location module 1240 can be configured to determine an angle of the client device 1200 to estimate the distance to a point on the ground. Other measurements, such as height and width, can be used to improve accuracy.
  • the computing device executing the application can be held in front of the user, the camera 1211 can be aimed toward the asset, and the application can provide a direct reading of the distance. For example, the height at which the computing device executing the application is held (e.g., eye-level) can be determined, and then the user can point the camera 1211 to the point where the asset touches the ground. The asset location module 1240 can then measure an inclination and, with simple trigonometry, determine or calculate the distance as illustrated in FIG. 10 and FIG. 11.
  • the client device 1200 or application 1230 includes an IoT enabled asset discovery module 1238 .
  • the IoT enabled asset discovery module 1238 can be configured to discover properties that identify the IoT enabled device or asset.
  • the IoT enabled asset discovery module 1238 can use a wireless protocol to discover the IoT enabled device or asset. For example, as the user is in range of a signal communicated from the IoT enabled device or asset, the client device 1200 can receive the signal and determine that the asset is an IoT enabled device.
  • the IoT enabled asset discovery module 1238 can then request the properties that identify the IoT enabled device or asset (e.g., Name, ID, MAC Address, IP Address, and the like) from the IoT enabled device.
  • the IoT enabled device communicates the properties that identify the IoT enabled device to the client device 1200 .
  • the IoT enabled asset discovery module 1238 then stores (e.g., in memory 1214 ) the properties that identify the IoT enabled device.
  • the client device 1200 or application 1230 includes an image processing API 1218 .
  • the image processing API 1218 can be configured to utilize external tools to identify an object using a picture (or image) of the object. For example, an asset class associated with the object can be identified using machine learning (ML) implemented through the image processing API 1218.
  • the external tools implemented through the image processing API 1218 can enable developers to understand the content of an image by encapsulating machine learning models in a representational state transfer (REST) API.
  • the external tools implemented through the image processing API 1218 can classify images into thousands of categories and detect individual objects (e.g., text) within an image.
  • the image processing API 1218 can be configured to communicate with the external tools using an internet (e.g., HTTP) protocol.
  • Machine learning (ML) classification can be used to identify an asset class from a set of given classes. Over time (e.g., through learning iterations), a set of features can be identified for given asset classes. Then using sufficient training data a ML model can be built. The ML model can then be used to classify objects in a captured image to one of the asset class.
  • FIG. 13 illustrates a block diagram of another computing device according to at least one example embodiment.
  • the computing device may include a map or asset management computing system 1350, one or more storage computing devices or systems 1305 and an administrator device 1330.
  • the map or asset management computing system 1350 , the one or more storage computing devices or systems 1305 and the administrator device 1330 can operate in a single computing device (e.g., workstation, a server, a blade server, and other appropriate computers, etc. or a virtual machine or virtual computing device thereof), on separate computing devices, or any combination thereof.
  • the map or asset management computing system 1350 may include a map selector 1352 .
  • the map selector 1352 may be configured to receive location information from the client device 1200 .
  • the client device 1200 may supply or transmit the current location of the client device 1200 periodically or when a triggering event occurs (e.g., in response to a user request for a map or floor map, entering a predefined location, such as, one of the company's offices, etc.).
  • the client device 1200 may supply or transmit a request for a map or floor map of a specific location (e.g., a building and floor).
  • this location information may include a list of GPS coordinates or other location coordinates or information.
  • the map selector 1352 may be configured to select at least one map or floor map that is deemed relevant to the provided location information. In one embodiment, the map selector 1352 may be configured to pick or select a map or floor map that includes or bounds the provided location information. For example, if the client device 1200 is on the third floor of a building, the map selector 1352 may select the floor map of the third floor of that building. In another embodiment, the map selector 1352 may be configured to select at least one map or floor map near (as defined by a predefined set of criteria or rules) to the supplied location information.
  • for example, if the client device 1200 is on the third floor of a building, the map selector 1352 may select the floor maps of the second, third, and fourth floors of that building.
  • the map selector 1352 may be configured to remember a history of what map or floor map, etc. have previously been presented to the client device 1200 .
  • the map selector 1352 may be configured to take into account user actions or predicted user actions when selecting a map or floor map. For example, if the client device 1200 is on the third floor of a building, and moving towards the elevators, the map selector 1352 may select the floor map of the second and fourth floors of that building. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
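  • As an illustrative sketch, the selection logic described above can be approximated as follows; the FloorMap fields, bounding-box test, and adjacent-floor rule are assumptions for illustration rather than the exact data model of the map selector 1352.

```python
# Illustrative sketch of selecting floor maps relevant to a reported location.
# The FloorMap fields and the adjacent-floor rule are assumptions, not the
# exact data model used by the map selector 1352.
from dataclasses import dataclass
from typing import List


@dataclass
class FloorMap:
    building: str
    floor: int
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float


def select_floor_maps(maps: List[FloorMap], lat: float, lon: float,
                      floor: int, include_adjacent: bool = False) -> List[FloorMap]:
    """Return maps whose geographic bounds contain (lat, lon) on relevant floors."""
    in_building = [m for m in maps
                   if m.min_lat <= lat <= m.max_lat and m.min_lon <= lon <= m.max_lon]
    wanted = {floor} | ({floor - 1, floor + 1} if include_adjacent else set())
    return [m for m in in_building if m.floor in wanted]
```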
  • the map selector 1352 may be configured to retrieve any asset associated with the selected map or floor map. In some embodiments, the map selector 1352 may be configured to filter or only select a portion of the assets associated with the selected map or floor map. In one embodiment, the map selector 1352 may be configured to retrieve any metadata or properties associated with the selected map or floor map and the selected assets. In the illustrated embodiment, this metadata includes asset actions and asset states. It is understood that the above is merely one illustrative example to which the disclosed subject matter is not limited.
  • the map selector 1352 may be configured to transmit the selected map or floor map, the associated or selected assets, and the associated asset metadata to the client device 1200 .
  • this information and other communications may be transmitted via Hypertext Transfer Protocol (HTTP), Hypertext Transfer Protocol Secure (HTTPS), or another communications protocol.
  • the map or asset management computing system 1350 may include an asset state manager 1354 .
  • the asset state manager 1354 may be configured to maintain state information associated with each asset.
  • the asset state manager 1354 may receive state information from a plurality of sources, such as, for example, the assets illustrated in FIG. 1, various client devices 1200, or administrator devices 1330, etc.
  • when a printer detects a paper jam, the printer may be configured to send a message (e.g., email, tweet, HTTP message, etc.) to the asset state manager 1354 or a server to which the asset state manager 1354 subscribes (e.g., a Rich Site Summary or Really Simple Syndication (RSS) feed, etc.).
  • the asset state manager 1354 may then edit or update the asset state 1320 associated with the printer to reflect the paper jam (e.g., a state of paper jam, unavailable, etc.).
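  • As an illustrative sketch, such a state update can be reduced to maintaining a state table keyed by asset identifier; the state names and the in-memory dictionary below stand in for the asset state storage 1318 and are assumptions for illustration.

```python
# Illustrative sketch of an asset state manager recording reported state
# changes. The in-memory dict stands in for asset state storage 1318 and the
# asset identifier and state strings are assumptions for illustration.
from typing import Dict

asset_states: Dict[str, str] = {"printer-3f-01": "available"}


def update_asset_state(asset_id: str, new_state: str) -> None:
    """Record a state change reported by an asset, client device, or admin device."""
    asset_states[asset_id] = new_state


# A printer reporting a paper jam might result in:
update_asset_state("printer-3f-01", "paper jam")
```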
  • the client device 1200 may inform the application 1230 of the new or current state.
  • the map or asset management computing system 1350 may include an asset action manager 1356 .
  • the asset action manager 1356 may be configured to execute or process an asset action request from a client device 1200 .
  • the asset action manager 1356 may be configured to perform the requested action (or a portion thereof) itself, or to request that another device perform the action or part thereof.
  • the asset action manager 1356 may be configured to change the state of the asset associated with the action.
  • the action may indicate that the user has cleared the paper jam in the printer, and the requested action may be to change the state or status of the printer to reflect that this manual portion of the action has been performed.
  • the asset action manager 1356 may work with or communicate with the asset state manager 1354 to perform such an action.
  • map and asset information is transmitted from the administrator device 1330 to the map or asset management computing system 1350, and more specifically to the map and asset manager 1358.
  • the map and asset manager 1358 may be configured to enter the map or PoI information supplied by the administrator device 1330 into the map storage 1310 . In various embodiments, this may include re-formatting the map or asset information for storage as the maps 1312 and assets 1314 .
  • the map and asset manager 1358 may be configured to retrieve maps 1312 and assets 1314 requested by the administrator device 1330 from the storage system 1305 and supply the resultant map or asset information to the administrator device 1330 .
  • an administrator may edit, delete, or update various aspects of existing maps 1312 and assets 1314 .
  • this map or asset information may be communicated directly between the storage system 1305 and the administrator device 1330 .
  • map and asset information is transmitted from the client device 1200 to the map and asset manager 1358 .
  • the map and asset information can include asset locations and information that identifies the asset.
  • the map or asset management computing system 1350 may include hardware and/or software components 1360 analogous to those described above in reference to client device 1200 . In some embodiments, the map or asset management computing system 1350 may include a plurality of computing devices.
  • the storage system 1305 may include a computing device, such as, for example, a desktop, workstation, a server, a blade server, and other appropriate computers, etc. or a virtual machine or virtual computing device thereof.
  • the storage system 1305 may include hardware and/or software components 1324 analogous to those described above in reference to client device 1200 .
  • the storage system 1305 may include a plurality of computing devices.
  • the storage system 1305 may include one or more storage systems or databases 1310 and 1318.
  • the storage system 1305 may include a map and asset storage or database 1310 .
  • the map storage 1310 may store one or more maps or floor maps 1312 and one or more assets 1314 .
  • the storage system 1305 may include an asset state storage or database 1318 .
  • the asset state storage or database 1318 may include one or more asset states 1320 .
  • each stored asset state 1320 may be associated with a respective asset 1314 .
  • the data structure associated with the asset 1314 may be associated with or include an asset state 1320 property or field that indicates the status or usability of the associated asset 1314 .
  • the asset 1314 may inherit one or more acceptable states based on the asset type.
  • the administrator may set or define a list of possible states the asset 1314 may be in.
  • the asset states 1320 include the actual state of the asset 1314 at a given moment.
  • the application 1230 may display the current state of a given asset 1314 on the annotated map 1228 , as described below.
  • the administrator may use the administrator user interface (UI) or application 1332 to import (and then edit or maintain, etc.) graphic images or data structures that represent floor maps into the map and asset storage or database 1310.
  • the floor maps 1312 may include data that includes a description of the floor map (e.g., “Building H, Floor 2”, “Winnipeg Office, Ground Floor”, etc.), and a geographical location or coordinates where the associated physical floor exists.
  • other information may be included. In some embodiments, such information may not be stored within the floor map 1312 itself, but in a separate format as floor map metadata 1316 .
  • the information may be stored in a variety of formats (e.g., as part of the floor map's 1312 filename, as part of a metadata tag included with the floor map, as a separate file, etc.).
  • the floor map metadata 1316 and the floor map 1312 may be stored in a variety of formats, such as for example a text-based file (e.g., Extensible Markup Language (XML), JavaScript Object Notation (JSON), Comma-separated values (CSV), etc.), a binary-based format (e.g., zip compression format, JPEG, a serialized object-oriented data structure or object, etc.), or a combination thereof.
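  • As an illustrative sketch, floor map metadata 1316 stored as a text-based file might look like the following; the field names and coordinate values are assumptions based on the description above rather than a schema defined by this disclosure.

```python
# Illustrative sketch of writing floor map metadata 1316 as a JSON file. The
# field names, file name, and coordinate values are hypothetical.
import json

floor_map_metadata = {
    "description": "Winnipeg Office, Ground Floor",
    "latitude": 49.8951,        # hypothetical geographic location of the floor
    "longitude": -97.1384,
    "calibrated": True,         # whether a calibration walk has been performed
}

with open("winnipeg_ground_floor.metadata.json", "w") as f:
    json.dump(floor_map_metadata, f, indent=2)
```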
  • the administrator may use the administrator user interface (UI) or application 1332 to import (and then edit or maintain, etc.) one or more assets 1314 to the map and asset storage or database 1310.
  • the administrator UI or application 1332 may be configured to allow or facilitate the ability for an administrator to place assets 1314 on the map 1312 via a graphical paradigm, similar to placing items via a drawing program.
  • map and asset information is transmitted from the client device 1200 to the map and asset manager 1358 .
  • the map and asset information can include asset locations and information that identifies the asset.
  • Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
  • Various implementations of the systems and techniques described here can be realized as and/or generally be referred to herein as a circuit, a module, a block, or a system that can combine software and hardware aspects.
  • a module may include the functions/acts/computer program instructions executing on a processor (e.g., a processor formed on a silicon substrate, a GaAs substrate, and the like) or some other programmable data processing apparatus.
  • Methods discussed above may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof.
  • the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium such as a storage medium.
  • a processor(s) may perform the necessary tasks.
  • references to acts and symbolic representations of operations that may be implemented as program modules or functional processes include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types and may be described and/or implemented using existing hardware at existing structural elements.
  • Such existing hardware may include one or more Central Processing Units (CPUs), digital signal processors (DSPs), application-specific integrated circuits, field programmable gate arrays (FPGAs), computers, or the like.
  • the software implemented aspects of the example embodiments are typically encoded on some form of non-transitory computer readable storage medium or implemented over some type of transmission medium.
  • the program storage medium may be magnetic (e.g., a floppy disk or a hard drive) or optical (e.g., a compact disk read only memory, or CD ROM), and may be read only or random access.
  • the transmission medium may be twisted wire pairs, coaxial cable, optical fiber, or some other suitable transmission medium known to the art. The example embodiments are not limited by these aspects of any given implementation.

Abstract

A method includes receiving a floor map indicating a layout of a location, displaying at least a portion of the floor map, capturing signal strength data representing a signal field for at least one position on the floor map, identifying an asset within the layout of the location, determining at least one property that identifies the asset using one of a discovery process using a wireless protocol and an image processing application programming interface (API) configured to classify an image and detect individual objects within the image, updating the floor map with the asset and the at least one property, and communicating the asset and the at least one property to a remote computing device.

Description

    RELATED APPLICATION
  • This application claims priority to application no. TEMP/E-1/2254/2017-CHE, filed in India on Jan. 20, 2017, the contents of which are incorporated herein by reference.
  • FIELD
  • Embodiments relate to asset management and, more specifically, to the tagging and use of physical objects or locations.
  • BACKGROUND
  • Typically a company or sufficiently large organization has shared resources or assets that various people use. In some instances those shared resources may include things or physical objects, such as, for example copiers, printers, fax machines, traveler's workstations or computers (e.g., an unassigned computer available for use by travelling or transient workers), refrigerators, coffee makers, and the like. In some instances those shared resources may include locations, such as, for example, conference rooms, traveler's workstations or hot desks (i.e., an unassigned office space available to travelling or transient workers), break rooms, and the like.
  • Oftentimes, it may be difficult for someone to locate these shared resources, particularly if one is visiting a corporate site that one does not often visit or has never been to. For example, a worker might work at a company's Austin site but when that worker visits the company's San Jose site, they may find it difficult to locate usable office space, or printers, etc. Frequently, even if such a resource is found (e.g., an empty desk), such a worker may not even know if they are allowed to use the resource. It may be embarrassing for the worker to sit down at and begin to use an empty desk only to find out that the desk is reserved for someone else. Alternately, it may be frustrating to attempt to use a printer only to find out that the printer is out of order and that the search for a new printer must begin again. Other irritations and issues may arise when attempting to use shared resources.
  • In some cases, even when a desired shared resource is located, there might be additional steps or actions that need to be performed, or additional resources may need to be located, in order to use the resource. For example, a worker might need instructions on using the video or presentation equipment. In another example, even though a working printer has been found, the printer paper supply might be low or empty and the worker may need to locate more paper.
  • Likewise, local or non-traveling employees, members of the organization, or guests often have a similar need to know whether a desired resource is available or functional. Traditionally, a worker would have to physically go to the resource or location and find out whether it is available or functional. For example, to see if a conference room is available, one needs to travel to the actual conference room and look to see if anyone is using it. Such a traditional scheme costs valuable time and has the disadvantage of not always being accurate (e.g., a conference room may be reserved but the reserver may simply be late, leading the worker to incorrectly view the empty conference room as available when it is not, etc.).
  • For Information Technology (IT) Field Support personnel (e.g., Technicians, Asset Managers, and the like), the process of building a floor map by discovering, identifying, and pinning an asset at an appropriate location on the floor map is very time consuming and requires many manual steps. Also, because floor map building is a manual process, the location can be based on approximations and/or be error prone. The number of assets on a floor has grown substantially (e.g., with bring your own device (BYOD) organizations). In addition, the location of devices can change on a regular or random basis (e.g., daily, weekly, as a user moves hot desks, and the like). Therefore, keeping the visual floor map up-to-date is a significant challenge for IT personnel.
  • Similarly, for an end user, locating a required asset (e.g., a specific printer), finding all useful assets within a periphery (e.g., a video conferencing room or board room), and finding live statuses of assets can be a fairly non-intuitive process without an up-to-date visual floor map.
  • SUMMARY
  • Example embodiments describe systems and methods to identify and locate assets and to generate a visual asset floor map.
  • In a general aspect, a non-transitory computer readable storage medium including executable code that, when executed by a processor, is configured to cause the processor to perform steps and a method to perform the steps including receiving, from a remote computing device, a floor map indicating a layout of a location, displaying, via a display interface of a client device, at least a portion of the floor map, capturing, using an application of the client device, signal strength data representing a signal field for at least one position on the floor map, identifying an asset within the layout of the location, determining at least one property that identifies the asset using one of a discovery process using a wireless protocol and an image processing application programming interface (API) configured to classify an image and detect individual objects within the image, updating the floor map with the asset and the at least one property, and communicating the asset and the at least one property to the remote computing device.
  • Implementations can include one or more of the following features. For example, the location can be a floor of a building. The signal strength data representing the signal field can be based on a magnetic field footprint captured using a Magnetic Indoor Positioning protocol. The signal strength data representing the signal field can be based on a WIFI signal footprint captured using a WIFI Indoor Positioning protocol. The asset can be a smart device, and the identifying of the asset can include detecting a communications protocol signal transmitted from the asset. The asset can be a smart device, and the determining of at least one property that identifies the asset can include determining a position of the asset on the floor map, which can include determining a position of the client device on the floor map using an Indoor Positioning System (IPS) protocol, measuring a channel frequency and a signal strength using the wireless protocol, and using a formula based on a free-space path loss (FSPL), the channel frequency and the signal strength to determine a distance between the client device and the asset.
  • The asset may not be a smart device, and the identifying of the asset can include capturing an image of the asset, using the image processing API to communicate the image to an external tool configured to identify an object using the image, and receiving an asset class associated with the asset from the external tool. The asset may not be a smart device, and the determining of at least one property that identifies the asset can include determining a position of the asset on the floor map, including determining a position of the client device on the floor map using an Indoor Positioning System (IPS) protocol, and measuring an inclination between at least two heights associated with the asset and using a trigonometric function and the inclination to determine a distance between the client device and the asset.
  • In another general aspect, a method includes receiving, from a client device, a request for a floor map based on a floor of a building, the floor map indicating a layout of the floor of the building, in response to receiving the request for the floor map, selecting a floor map from a database configured to store a plurality of maps, communicating the floor map to the client device, receiving, from the client device, information related to an asset, the information including at least one property that identifies the asset and a position of the asset on the floor map, in response to receiving the information related to the asset, updating a database configured to store data related to a plurality of assets, generating an annotated floor map based on the asset and the information related to the asset, and communicating the annotated floor map to the client device.
  • Implementations can include one or more of the following features. For example, the annotated floor map includes an icon representing the asset and an indicator, the icon representing the asset is located on the floor map at the position of the asset, and the indicator is located on the floor map at the position of the asset and indicates at least one of a type of the asset and a status of the asset. The method can further include discovering linkages to characteristics of the asset, and adding the linkages for the asset to the database configured to store data related to the plurality of assets. The update of the database configured to store data related to the plurality of assets can include determining whether a record associated with the asset exists and, upon determining a record associated with the asset exists, updating the record using the information related to the asset, and upon determining a record associated with the asset does not exist, generating a new record using the information related to the asset.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Example embodiments will become more fully understood from the detailed description given herein below and the accompanying drawings, wherein like elements are represented by like reference numerals, which are given by way of illustration only and thus are not limiting of the example embodiments and wherein:
  • FIG. 1 illustrates a floor map according to at least one example embodiment.
  • FIGS. 2A, 2B, and 2C pictorially illustrate a route to an asset according to at least one example embodiment.
  • FIG. 3 pictorially illustrates a status and location of an asset according to at least one example embodiment.
  • FIG. 4 illustrates a block diagram of a flowchart according to at least one example embodiment.
  • FIG. 5 illustrates another block diagram of a flowchart according to at least one example embodiment.
  • FIG. 6 illustrates another block diagram of a flowchart according to at least one example embodiment.
  • FIGS. 7 and 8 illustrate a magnetic fingerprint of a floor map according to at least one example embodiment.
  • FIG. 9 illustrates trilateration of a device or asset according to at least one example embodiment.
  • FIGS. 10 and 11 illustrate locating a position of a device or asset according to at least one example embodiment.
  • FIG. 12 illustrates a block diagram of a computing device according to at least one example embodiment.
  • FIG. 13 illustrates a block diagram of another computing device according to at least one example embodiment.
  • It should be noted that these Figures are intended to illustrate the general characteristics of methods, structure and/or materials utilized in certain example embodiments and to supplement the written description provided below. These drawings are not, however, to scale and may not reflect the precise structural or performance characteristics of any given embodiment, and should not be interpreted as defining or limiting the range of values or properties encompassed by example embodiments. For example, the positioning of structural elements may be reduced or exaggerated for clarity. The use of similar or identical reference numbers in the various drawings is intended to indicate the presence of a similar or identical element or feature.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • While example embodiments may include various modifications and alternative forms, embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that there is no intent to limit example embodiments to the particular forms disclosed, but on the contrary, example embodiments are to cover all modifications, equivalents, and alternatives.
  • The digital workplace is expected to be a live and dynamic workplace where both Information Technology (IT) Field Support personnel and end users can be expected to work in a smart and intuitive environment. Example embodiments describe an automated technique for generating a visual asset floor map by (1) automatically identifying locations of smart (e.g., Internet of Things (IoT) enabled) devices on a floor map, (2) providing an intuitive technique for identifying location of non-Smart devices on the floor map, (3) identifying and classifying an asset type of non-Smart devices using techniques based on image and/or pattern recognition, and (4) using a crowd-sourced model to keep the floor map live and up-to-date in terms of device location and/or status. Smart devices, IoT enabled devices, non-Smart devices, assets and the like can also be referred to as points of interest (PoI) as well. A PoI can be a device (e.g., printer, computer, television, and/or the like) and/or a location (e.g., conference room, break room, rest room and/or the like).
  • FIG. 1 illustrates a floor map according to at least one example embodiment. As shown in FIG. 1, the floor map 100 can include at least one PoI or asset (hereinafter referred to as an asset) 105, 110, 115, 120, 125. For example, assets 105 and 110 are illustrated as conference rooms, assets 120 and 125 are illustrated as displays (e.g., a television, a computer monitor, and the like), and asset 115 is illustrated as a printer. Each asset 105, 110, 115, 120, 125 is illustrated as having an associated status indicator 130. Status indicator 130 can be configured to illustrate a status of the associated asset 105, 110, 115, 120, 125. The status indicator 130 can be color coded. For example, a red status indicator can indicate the asset 105, 110, 115, 120, 125 is not available and/or not operational. A yellow status indicator can indicate the asset 105, 110, 115, 120, 125 is available, but not operating properly. A green status indicator can indicate the asset 105, 110, 115, 120, 125 is available and operating properly. The status indicator 130 can also be a callout box including text with status and other information about the asset 105, 110, 115, 120, 125. Example implementations can include other colors and/or other mechanisms for displaying text to show the status of an asset 105, 110, 115, 120, 125.
  • According to example embodiments, the floor map 100 can be generated and populated with any number of assets (e.g., hardware, network devices, equipment, rooms, and the like). The floor map 100 can be dynamic (e.g., assets can be added, removed and/or relocated). Therefore, the floor map 100 can be updated in real time (e.g., a live map) as the floor map 100 updates itself after every usage (e.g., each time an end user interacts with the map and/or as IT personnel perform a build/rebuild operation). For example, an application operating on a computing device (e.g., desktop, laptop, and/or mobile device) can refresh a display showing the floor map 100 regularly (e.g., on a configured time interval) or as the floor map 100 (e.g., data associated with the floor map 100) is updated or changes.
  • An example method uses various combinations of techniques to auto-generate the floor map 100. The techniques can include at least one of (1) after discovering a smart device using IoT protocols, a position can be found on the map based on a combination of IPS (Indoor Positioning System), which may work within buildings where GPS does not, and trilateration of the device's position based on signal strength (e.g., WIFI, near field technologies, and the like), (2) position detection of non-smart devices (e.g., devices that are not IoT) using a combination of IPS and camera measurement techniques (e.g., to locate the distance and angle of the device), (3) use of a machine learning technique to determine an asset class of the device, which then helps build linkages within a configuration management database (CMDB) by recommending a mapping to a configuration item (CI) within the CMDB, (4) use of crowd-sourcing to build exact positions more accurately, to keep the map up-to-date in real-time and to keep the floor maps more accurate with respect to asset positions and status, and (5) end users can view assets on the map dynamically based on the assets' current location, thus improving the user experience.
  • Accordingly, example embodiments can solve and/or help solve many use cases related to floor maps. For example, at least one example implementation can (1) discover an asset in real-time and pin the asset on the floor map 100, (2) keep a CMDB up-to-date using smart device techniques (e.g., IoT techniques), (3) keep the floor map up-to-date using crowdsourcing or keep the floor map up-to-date without using expensive (e.g., in human resources) and/or time consuming data entry effort on part of an IT organization, (4) search for an asset and find directions (e.g., routes within a building) to the asset on the floor map 100, and (5) provide an up-to-date status and location of the asset to the end users.
  • FIGS. 2A, 2B, and 2C pictorially illustrate a route to an asset according to at least one example embodiment. In an implementation, a floor map (e.g., the floor map 100) can be displayed on a computing device (e.g., mobile device, client device, smart phone, and/or the like) using an application. Using the application, a user can discover an asset 210 which is then shown on the floor map 205 (or a portion of the floor map 205). The user can then interact with the application to find a route to the asset 210. The user (shown as an avatar 215) can be at a first position as shown in FIG. 2A. A direct path from the first position to the asset 210 (shown as a straight line) is not possible. Therefore, the route directs the user to a second position (illustrated as a dotted line) as shown in FIG. 2B. A direct path from the second position to the asset 210 (shown as a straight line) is not possible. Therefore, the route directs the user to a third position (illustrated as a dotted line) as shown in FIG. 2C where the user has access to the asset 210. FIGS. 2A, 2B, and 2C pictorially illustrate the route to the asset as three screenshots. However, a single screenshot could be used.
  • FIG. 3 pictorially illustrates a status and location of an asset according to at least one example embodiment. As shown in FIG. 3, a user (shown as an avatar 315) is interacting with a computing device 305 executing an application showing an asset 310. The asset 310 is illustrated as having an associated status indicator 320. The status indicator 320 can be configured to illustrate a status of the associated asset (e.g., red, yellow, green, as discussed above). The status indicator 320 is shown with a callout box 325 including text with status and other information about the asset 310. The status indicator 320 and the callout box 325 can be implemented together as an indicator. The callout box 325 can be shown automatically when the user zooms (pans) in on the asset 310. The callout box 325 can disappear automatically when the user zooms (pans) out from the asset 310. The callout box 325 can appear/disappear when the user performs some action on the asset 310 via the application.
  • FIGS. 4-6 are flowcharts of methods according to example embodiments. The steps described with regard to FIGS. 4-6 may be performed due to the execution of software code stored in a memory and/or a non-transitory computer readable medium (e.g., memory 1214) associated with an apparatus (e.g., as shown in FIGS. 12 and 13 (described below)) and executed by at least one processor (e.g., processor 1212) associated with the apparatus. However, alternative embodiments are contemplated such as a system embodied as a special purpose processor. Although the steps described below are described as being executed by a processor, the steps are not necessarily executed by a same processor. In other words, at least one processor may execute the steps described below with regard to FIGS. 4-6.
  • FIG. 4 illustrates a block diagram of a flowchart according to at least one example embodiment. As shown in FIG. 4, in step S405 a building and floor level are selected. For example, a user can be interacting with a computing device executing an application configured to discover, identify and pin an asset as being located at a location on a floor map. In an initial operation on the application, the user can select a building and a floor in the building. In one implementation, the user can select the building based on a location of the device and/or through use of, for example, a dropdown list in the application. The floor can be selected, for example, using a dropdown list in the application. The dropdown list may include a floor(s) (e.g., a floor number) within the building.
  • In step S410 a floor map based on selected building and floor level is loaded. For example, the building and floor level (or a combination thereof) can have a unique value representing the building and floor level. The value could be the address of the building, a geo-positional value, an ID previously assigned, and the like and a number for the floor. The unique value can be communicated from the computing device executing the application to a computing device including a map management system (e.g., map or asset management computing system 1350). The map management system can then select a map as the floor map from a datastore (e.g., map storage 1310) including a plurality of maps at least a portion thereof each representing a floor map. The map management system can then communicate the floor map to the computing device executing the application in response to the user of the application loading the floor map. The application can then display the floor map on a display (e.g., within an active window) of the computing device executing the application.
  • Further, metadata representing at least one asset could be communicated from the map management system to the computing device executing the application with the floor map (e.g., in a same data packet or in a sequential (close in time) data packet). The metadata could include information about or related to the asset (e.g., status, type, serial number, and the like) and a location (e.g., coordinates) on the map representing the floor map. The floor map can then be annotated (e.g., overlaid) with the asset at the corresponding location on the floor map. In other words, the map management system can generate an annotated floor map based on the asset and the information (e.g., status, type, serial number, location, and the like) related to the asset.
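  • As an illustrative sketch, such an annotated floor map can be represented as the base map plus one overlay entry per asset; the Asset fields and the overlay structure below are assumptions for illustration rather than the data structures used by the map management system.

```python
# Illustrative sketch of annotating a floor map with asset metadata. The Asset
# fields and the overlay representation are assumptions for illustration.
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class Asset:
    asset_id: str
    asset_type: str                 # e.g., "printer"
    status: str                     # e.g., "available"
    position: Tuple[float, float]   # (x, y) coordinates on the floor map


def annotate_floor_map(floor_map_id: str, assets: List[Asset]) -> Dict:
    """Build an annotated map: the base map plus one overlay entry per asset."""
    return {
        "floor_map": floor_map_id,
        "overlays": [
            {"id": a.asset_id, "type": a.asset_type, "status": a.status,
             "x": a.position[0], "y": a.position[1]}
            for a in assets
        ],
    }
```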
  • If the floor map is calibrated (yes in step S415), processing continues to step S425. If the floor map is not calibrated (no in step S415), processing continues to step S420. In an example implementation, the floor map can be identified as not calibrated the first time the floorplan is loaded into an application (by the user or by any other user). In another implementation, the floor map includes metadata (e.g., communicated from the map management system to the computing device executing the application with the floor map) indicating the map has or has not been calibrated.
  • In step S420 the floor map is calibrated. For example, the calibration operation can include walking a location (e.g., a floor of a building) with the application operating on the computing device. The application can be in a calibration mode configured to capture data. The captured data can be signal strength data or information representing a signal field for at least one position (e.g., coordinate) on the floor map. Using the captured data, the application can generate a table of coordinates on the map and a corresponding unique identifier (or signature or footprint or fingerprint of that position). For example, depending on the technique used, the identifier can be a magnetic field footprint (sometimes called a fingerprint) or a WIFI signal footprint (sometimes called a fingerprint) captured using a WIFI Indoor Positioning protocol.
  • Magnetic indoor positioning is based on the fact that buildings or structures each have a unique magnetic signature or fingerprint. For example, the steel structure within the building distorts the Earth's magnetic field in a unique way. In an example implementation, the user (e.g., of the aforementioned application) can calibrate the map by marking a path on the map and then walking the same path in a calibration mode selected in the application. The application then captures the magnetic field fingerprint readings (e.g., using a Magnetic Indoor Positioning protocol) along the path and builds the database of fingerprints as related to coordinates within the location. The more training paths and rescans (e.g., using crowd-sourcing), the better the accuracy of this technique. An example implementation can use extrapolation logic to generate missing fingerprints for the remaining (or unmeasured) coordinates. FIG. 7 and FIG. 8 illustrate examples of magnetic fingerprints as related to coordinates within the location.
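  • As an illustrative sketch, the calibration walk can be modeled as building a table from map coordinates to magnetic readings; the magnetometer callback and data types below are assumptions, since this disclosure does not prescribe a particular sensor API.

```python
# Illustrative sketch of recording a magnetic fingerprint table during a
# calibration walk. The magnetometer callback is a hypothetical sensor source.
from typing import Callable, Dict, List, Tuple

Coordinate = Tuple[float, float]          # (x, y) position on the floor map
Fingerprint = Tuple[float, float, float]  # magnetic field vector, in microtesla


def calibrate_path(path: List[Coordinate],
                   read_magnetometer: Callable[[], Fingerprint]
                   ) -> Dict[Coordinate, Fingerprint]:
    """Walk the marked path and record one fingerprint per coordinate."""
    table: Dict[Coordinate, Fingerprint] = {}
    for coordinate in path:
        table[coordinate] = read_magnetometer()
    return table
```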
  • In an example implementation, the floor map is calibrated for use in determining or calculating a distance. For example, signals (e.g., associated with NFC, Bluetooth, BLE, WIFI, and/or the like) can be attenuated (e.g., power loss) or otherwise distorted by structures (e.g., walls) or objects (e.g., desks, assets, and the like). Therefore, calibration allows the determining or calculating of a distance to take into account the attenuation and/or distortion of the signals. In an example implementation, the calibrated footprint can have accuracy in the range of 2 to 4 meters.
  • In step S425 a location of the client device is determined. In an example implementation, to identify the location of a user (e.g., IT personnel, an application user, and the like) of the computing device executing the application on the floor map, an Indoor Positioning System (IPS) technique (e.g., Magnetic Indoor Positioning, WIFI Indoor Positioning, and/or the like) can be used. The user (e.g., as an avatar 215, 315) can be shown in the application on the floor map. The user can be shown in the application in different positions on the floor map as the user moves around the building floor. Showing the user on the floor map may give the user of the application a sense of where the user is relative to other elements (e.g., assets, hallways, office rooms, and the like) displayed in association with the map.
  • In step S430 an asset is located. For example, as the user moves around the building floor, the user can visually identify an asset that is not shown (or a different asset than what is shown). In another example, an asset can be detected based on a communications protocol signal (e.g., NFC, Bluetooth, BLE, WIFI, and/or the like) transmitted from a device (e.g., a smart device and/or an IoT enabled device).
  • If the asset is IoT enabled, also referred to as a smart device, (yes in step S435), processing continues to node B which continues with FIG. 5 described below. If the asset is not IoT enabled, also referred to as not a smart device, (no in step S435), processing continues to node A which continues with FIG. 6 described below. Processing returns from node A and node B at node C.
  • In step S440 the asset is pinned at the location on the floor map. For example, metadata including information associated with the IoT enabled device (or smart device) identified as an asset (see the discussion of FIG. 5 below) or the not IoT enabled device (or non-smart device/asset) identified as an asset (see the discussion of FIG. 6 below) that has been placed or pinned to a position on the floor map corresponding to the auto-located position in the application can be communicated from the computing device executing the application to the computing device including the map management system. The metadata can include information that identifies the asset (e.g., Name, ID, MAC Address, IP Address, and the like) ascertained during discovery of the IoT enabled device (or smart device) and/or asset identification of the not IoT enabled device (or non-smart device/asset). The metadata can include information related to the location of the asset (e.g., coordinates on the floorplan or corresponding map) ascertained during the auto-locate process implemented for the asset.
  • In step S445 data is updated based on the asset. For example, the metadata can be processed by the map management system and stored in the datastore (e.g., map storage 1310) as an asset (e.g., asset 1314). The map management system can be a configuration management database (CMDB) including assets identified as configuration items (CI). Using the information that identifies the asset, the map management system (or CMDB) can determine if matching asset(s) (as a CI) exist in the datastore. Should a new asset be discovered, the map management system can create a new record (e.g., new CI record). Should the asset exist (e.g., a matching CI, ID, serial number, and/or the like is found), the map management system can update an existing record (e.g., CI record) using a reconciliation process in the map management system (or CMDB). In other words, the received metadata for the asset can be used to update the record for the asset (e.g., asset 1314 and/or asset state 1320) with regard to a location, a status, a MAC Address, an IP Address, and/or the like for the asset in the datastore (e.g., map storage 1310 and/or asset state storage 1318).
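  • As an illustrative sketch, the create-or-update reconciliation described above can be expressed as follows; the record layout and the in-memory dictionary standing in for the CMDB are assumptions for illustration.

```python
# Illustrative sketch of reconciling a discovered asset against existing CI
# records. The dict stands in for the CMDB/map storage; the record fields and
# MAC address are hypothetical.
from typing import Dict

cmdb: Dict[str, Dict] = {}  # keyed by a unique asset identifier (e.g., MAC address)


def reconcile_asset(asset_id: str, metadata: Dict) -> None:
    """Update the matching CI record if one exists, otherwise create a new one."""
    if asset_id in cmdb:
        cmdb[asset_id].update(metadata)   # existing record: merge new location/status
    else:
        cmdb[asset_id] = dict(metadata)   # new asset: create a new CI record


reconcile_asset("00:1B:44:11:3A:B7",
                {"name": "Printer 3F-01", "ip": "10.0.3.17",
                 "location": (12.5, 40.2), "status": "available"})
```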
  • In step S450 asset characteristics and linkages are discovered. For example, the map management system and/or other IT systems can include information about an asset and/or a type of asset. Characteristics of the asset (e.g., possible asset states, possible asset actions, operating procedures, error logs, maintenance logs, and/or the like) can be retrieved from or linked to the map management system and/or other IT systems and stored in relation to the asset, for example, in the record for the asset (e.g., asset 1314). Further, linkages (e.g., a joined table) can be discovered (e.g., an ID for the asset in another datastore or the joined table) and stored. For example, the ID for the asset discovered as existing in another datastore can be added (in an appropriate field) in the record for the asset (e.g., asset 1314) to create the linkage via a joined table.
  • Using this set of steps, the system will be able to auto-generate an asset floor plan much more intuitively than any other existing/known application. End users can also follow the same flow to discover more assets or detect changes in asset locations and keep the asset floor plan as real-time as possible.
  • Example embodiments can use two techniques for identifying and pinning various assets on the map. In a first technique, where the asset is an IoT enabled device, the asset can be discovered using IoT protocols (e.g., NFC, Bluetooth, BLE, WIFI, and/or the like). IoT enabled devices or assets can include printers, laptops, mobile phones, monitors, smart TVs, projectors, hard disks, network access points, and/or the like. IoT enabled devices or assets can be configured to communicate wirelessly using wireless protocols. During a discovery process, at least one property (or properties) that identify the device (e.g., Name, ID, MAC Address, IP Address, and the like) can be communicated from the IoT enabled devices or assets to the computing device executing the application. In addition, channel frequency and signal strengths can be measured and/or determined and stored by the computing device executing the application.
  • FIG. 5 illustrates another block diagram of a flowchart according to at least one example embodiment. As shown in FIG. 5, in step S505 the IoT enabled asset is discovered using a wireless protocol. For example, as the user is in range of a signal communicated from the IoT enabled asset, the computing device executing the application can receive the signal and determine that the asset is an IoT enabled device. The computing device executing the application can then request the properties that identify the device (e.g., Name, ID, MAC Address, IP Address, and the like) from the IoT enabled device. The IoT enabled device communicates the properties that identify the IoT enabled device to the computing device executing the application. The application then stores (e.g., in a memory of the computing device) the properties that identify the IoT enabled device.
  • In step S510 the asset is auto-located. For example, initially, the IoT enabled device is scanned using various protocols including (but not limited to) WIFI, Bluetooth, and the like, and the channel frequency (e.g., 2.4 GHz or 5 GHz) is captured along with the signal strength, for example, a Received Signal Strength Indicator (RSSI) in dBm (decibels relative to one milliwatt). The signal strength is related to the distance and frequency. Therefore, it is possible to find the distance based on the above two readings using a formula derived from free-space path loss (FSPL):

  • Dist(m)=10^((27.55−20 log10 F(in MHz)+|Signal strength (in dB)|)/20)  (1)
  • The constants used in Equation 1 depend on the free space path (e.g., obstacles) and can be tuned (e.g., varied) depending on the environment (e.g., as determined in the initial calibration described above). Also, a dB value should be calculated from a dBm value. This gives the approximate circular range in which the device would be located. For example, a device transmitting a WIFI signal at a 2.4 GHz frequency with an RSSI of −27 dB would be located in a circle of approximately 7 meter radius.
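  • As an illustrative sketch, Equation 1 can be implemented as follows; the example frequency and signal strength values are assumptions, and the 27.55 constant would be tuned to the environment as noted above.

```python
# Illustrative sketch of the free-space path loss distance estimate in
# Equation 1 (frequency in MHz, distance in meters). The constant would be
# tuned per the calibration described above; the inputs below are hypothetical.
import math


def estimate_distance_m(frequency_mhz: float, signal_strength_db: float) -> float:
    """Approximate distance from channel frequency and received signal strength."""
    return 10 ** ((27.55 - 20 * math.log10(frequency_mhz)
                   + abs(signal_strength_db)) / 20)


# Example: a 2.4 GHz (2400 MHz) WIFI reading of -57 dB with untuned constants.
print(round(estimate_distance_m(2400.0, -57.0), 1))  # ~7.0 meters
```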
  • In order to find a more precise location, readings can be measured as the user walks around the area. The application can store the changed location of the computing device executing the application (e.g., using the IPS technique above) as well as the new distance calculated based on the changed RSSI reading. Using trilateration as illustrated in FIG. 9, the exact (or a more precise) position of the IoT enabled device can be determined or calculated.
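  • As an illustrative sketch, the trilateration step can be solved by linearizing the three circle equations; the client device positions and distances below are hypothetical readings.

```python
# Illustrative sketch of trilateration: given three client device positions on
# the floor map and the distance estimated at each, solve the linearized
# circle equations for the asset's (x, y). The readings below are hypothetical.
from typing import Tuple

Point = Tuple[float, float]


def trilaterate(p1: Point, d1: float, p2: Point, d2: float,
                p3: Point, d3: float) -> Point:
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise gives two linear equations in x, y.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1 ** 2 - d2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2 ** 2 - d3 ** 2 + x3 ** 2 - x2 ** 2 + y3 ** 2 - y2 ** 2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det


# Three readings taken as the user walks around (map coordinates and meters):
print(trilaterate((0.0, 0.0), 5.0, (8.0, 0.0), 5.0, (4.0, 6.0), 3.0))  # (4.0, 3.0)
```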
  • In step S515 the asset is mapped. For example, the IoT enabled device can be identified as an asset and the IoT enabled device can be placed or pinned to a position on the floor map corresponding to the auto-located position.
  • In a second technique for identifying and pinning various assets on the map, the assets are not IoT enabled. For assets that are not IoT enabled (e.g., non-smart devices), example embodiments use a technique based on camera measurement and image processing APIs (e.g., a vision API) to identify the asset.
  • FIG. 6 illustrates another block diagram of a flowchart according to at least one example embodiment. As shown in FIG. 6, in step S605 an image of the asset is captured. For example, the computing device executing the application can include a camera (e.g., camera 1211). The camera can be used to take a picture of the asset.
  • In step S610 the asset is identified. For example, an asset class associated with the asset can be identified using machine learning (ML) and an image processing application programming interface (API). The image processing API can enable developers to understand the content of an image by encapsulating machine learning models in an easy to use representational state transfer (REST) API. The image processing API can classify images into thousands of categories (e.g., as a type of asset) and detect individual objects (e.g., text) within an image.
  • Machine learning classification can be used to identify an asset class from a set of given classes. Over time (e.g., through learning iterations), a set of features can be identified for given asset classes. Then, using sufficient training data, an ML model can be built. The ML model can then be used to classify objects in a captured image to one of the asset classes.
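  • As an illustrative sketch, the call to an external classification tool can be made over HTTP as follows; the endpoint URL, request body, and response fields are hypothetical placeholders rather than a specific vendor's API.

```python
# Illustrative sketch of submitting a captured image to an external
# classification service over a REST API. The endpoint and the response
# structure ("classes"/"label") are hypothetical placeholders.
import base64
import json
import urllib.request

CLASSIFY_URL = "https://vision.example.com/v1/classify"  # hypothetical endpoint


def classify_asset_image(image_path: str) -> str:
    """POST a captured image and return the top asset class from the response."""
    with open(image_path, "rb") as f:
        payload = {"image": base64.b64encode(f.read()).decode("ascii")}
    request = urllib.request.Request(
        CLASSIFY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        result = json.load(response)
    return result["classes"][0]["label"]  # e.g., "printer"
```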
  • In step S615 the asset is auto-located. According to an example embodiment, in order to locate a position of an asset that is not IoT enabled (or a non-smart device), an angle of the computing device executing the application can be used to estimate the distance to a point on the ground. Other measurements, such as height and width, can be used to improve accuracy. The computing device executing the application can be held in front of the user, the camera can be aimed toward the asset, and the application can provide a direct reading of the distance. For example, the height at which the computing device executing the application is held (e.g., eye-level) can be determined, and then the user can point the camera to the point where the asset touches the ground. The computing device executing the application can then measure an inclination (e.g., based on the aforementioned angle) between at least two heights associated with the asset and use a trigonometric function and the inclination to determine a distance between the client device and the asset, as illustrated in FIG. 10 and FIG. 11.
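  • As an illustrative sketch, the trigonometric step can be computed as follows; the device height and tilt angle are hypothetical example readings, and the inclination source (the device's orientation sensor) is assumed.

```python
# Illustrative sketch of the camera-based distance estimate: with the device
# held at a known height and tilted down toward the point where the asset
# meets the ground, the horizontal distance follows from simple trigonometry.
import math


def distance_from_inclination(device_height_m: float,
                              tilt_below_horizontal_deg: float) -> float:
    """Horizontal distance to the ground point the camera is aimed at."""
    return device_height_m / math.tan(math.radians(tilt_below_horizontal_deg))


# Example: device held at 1.5 m (roughly eye level), tilted 20 degrees downward.
print(round(distance_from_inclination(1.5, 20.0), 2))  # ~4.12 meters
```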
  • In step S620 the asset is mapped. For example, the asset can be identified as an asset and can be placed or pinned to a position on the floor map corresponding to the auto-located position.
  • FIG. 12 illustrates a block diagram of a computing device according to at least one example embodiment. As shown in FIG. 12, the computing device is a client device 1200. The client device 1200 can be a desktop, laptop, mobile phone, mobile device, workstation, personal digital assistant, smartphone, tablet, a virtual machine, a virtual computing device and/or the like. The client device 1200 can also be referred to as a user device, an agent device, a client computing system, a user computing system, a device, and/or the like. In various embodiments, the client device 1200 may be used by a user including IT personnel and/or general population (e.g., employees, managers and/or the like) authorized to have access to or use assets associated with an organization.
  • In various embodiments, the client device 1200 may include a processor 1212 configured to execute one or more machine executable instructions or pieces of software, firmware, or a combination thereof. The client device 1200 may include, in some embodiments, a memory 1214 configured to store one or more pieces of data, either temporarily, permanently, semi-permanently, or a combination thereof. Further, the memory 1214 may include volatile memory, non-volatile memory or a combination thereof. In various embodiments, the client device 1200 may include a storage medium 1215 configured to store data in a semi-permanent or substantially permanent form. In various embodiments, the storage medium 1215 may be included by the memory 1214. The memory 1214 and/or the storage medium 1215 may be referred to as and/or implemented as a non-transitory computer readable storage medium.
  • In various embodiments, the client device 1200 may include one or more network interfaces 1216 configured to allow the client device 1200 to be part of and communicate via a communications network. The communications network may employ Wi-Fi, cellular, and/or wired protocols. Examples of a Wi-Fi protocol may include, but are not limited to: Institute of Electrical and Electronics Engineers (IEEE) 802.11g, IEEE 802.11n, etc. Examples of a cellular protocol may include, but are not limited to: IEEE 802.16m (a.k.a. Wireless-MAN (Metropolitan Area Network) Advanced), Long Term Evolution (LTE) Advanced, Enhanced Data rates for GSM (Global System for Mobile Communications) Evolution (EDGE), Evolved High-Speed Packet Access (HSPA+), etc. Examples of a wired protocol may include, but are not limited to: IEEE 802.3 (a.k.a. Ethernet), Fibre Channel, Power Line communication (e.g., HomePlug, IEEE 1901, etc.), etc. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
  • In various embodiments, the client device 1200 may include one or more other hardware components 1213 (e.g., a display or monitor, a keyboard, a mouse, a camera, a fingerprint reader, a video processor, etc.). It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
  • In the illustrated embodiment, the client device 1200 may include one or more location services 1219. In one such embodiment, the location services 1219 may be configured to indicate where the client device 1200 is physically located within a certain amount of precision (often determined by the technology used for detecting the location). In various embodiments, this location service 1219 may include a Global Positioning System (GPS) receiver or detector. In another embodiment, the location service 1219 may include a control plane locator, such as, a device configured to determine the distance of the client device 1200 from one or more cell-phone (or other radio signal) towers or broadcasters. In another embodiment, the location service 1219 may be configured to estimate the client device's 1200 location based upon a time difference of arrival or other time-based technique. In yet another embodiment, the location service 1219 may be configured to estimate the client device's 1200 location based upon local-range signals (e.g., <30 meters, Bluetooth, wireless local area network (WLAN) signals, near field communication (NFC), radio-frequency identification (RFID) tags, etc.) or another form of a local positioning system (LPS). In various embodiments, the location service 1219 may be configured to make use of triangulation, trilateration, multilateration, or a combination thereof. In various embodiments, location service 1219 may be configured to make use of one or more of these examples either in combination or alone. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
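  • As one illustration of the trilateration mentioned above, a position can be estimated from measured distances to three transmitters at known coordinates. The following is a minimal two-dimensional sketch; the beacon coordinates and distances are illustrative assumptions, not the location service's 1219 actual implementation.

```python
def trilaterate(beacons, distances):
    """Estimate (x, y) from distances to three beacons at known (x, y) positions."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    # Subtracting the circle equations pairwise yields a 2x2 linear system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Beacons at known floor coordinates (meters) and measured distances to the device.
print(trilaterate([(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)], [5.0, 8.06, 6.71]))
# -> approximately (3.0, 4.0)
```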
  • In various embodiments, the client device 1200 may include an operating system (OS) 1217 configured to provide one or more services to an application 1230 and manage or act as an intermediary between the application 1230 and the various hardware components (e.g., the processor 1212, a network interface 1216, etc.) of the client device 1200. In such an embodiment, the client device 1200 may include one or more native applications, which may be installed locally (e.g., within the storage medium 1215, etc.) and configured to be executed directly by the processor 1212 and directly interact with the OS 1217. In such an embodiment, the native applications may include pre-compiled machine executable code. In some embodiments, the native applications may include a script interpreter (e.g., C shell (csh), AppleScript, AutoHotkey, etc.) or a virtual execution machine (VM) (e.g., the Java Virtual Machine, the Microsoft Common Language Runtime, etc.) that are configured to translate source or object code into executable code which is then executed by the processor 1212.
  • In various embodiments, the user may be an Information Technology (IT) Field Support personnel (e.g., Technicians, Asset Managers, and the like), using the application 1230 to build a floor map by discovering, identifying and pinning an asset at an appropriate location on a floor map. In various embodiments, the user may be travelling to a new environment or work place, although the illustrated embodiment would be just as valid for a location that the user frequents. It is understood that the below is merely one illustrative example to which the disclosed subject matter is not limited. In such an embodiment, the user may wish to see or be made aware of the various assets, physical resources, or points of interest (POIs) around the user in this location.
  • In this context, a floor plan, floor map, and/or map includes a map or data structure that may be interpreted as a geographic diagram of a given or associated location or route. The floor plan, floor map, and/or map can include a layout of a location (e.g., a floor of a building). In this context, an asset is a term used to describe both physical objects, such as, for example a copier, printer, fax machine, traveler's workstation or computer, etc. and/or locations, such as, for example, a conference room, desk, etc. In this context, the term asset may be used to both describe the object/location itself or a data structure that represents or is associated with the physical object/location itself and used to represent that physical object/location to a computing device (e.g., client device 1200) or a software application (e.g., application 1230).
  • However, while the examples described herein show and describe a floor of an office building, and assets that are typical of an office environment (e.g., printers, coffee machines, conference rooms, etc.), it is understood that such are merely a few illustrative examples to which the disclosed subject matter is not limited. In another embodiment, the floor map may include a diagram of a rack of servers in a data center. In such an embodiment, the asset may include various server racks or a particular server in a given rack. In another embodiment, the floor map may include a diagram of a computer network, and the asset may include various computing devices, access points, gateways, servers, and/or routers on the network. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
  • In various embodiments, the application 1230 may be configured to display an annotated map 1228 to the user on a display or display interface of the client device 1200. The annotated map 1228 may include a floor map (e.g., map 1312) and be annotated with one or more assets (e.g., asset 1314) retrieved or received from a remote computing device (e.g., map or asset management computing system 1350). In an example implementation, the annotated map 1228 may include the floor map 100. In various embodiments, the floor map may show or describe the location of various structural features of a given location (e.g., a floor of an office building, etc.). In some embodiments, the structural features may include, but are not limited to, walls, doors, desks, furniture, sinks, toilets, elevators, plants, etc. In some embodiments, the floor map may be stored as images (e.g., a Joint Photographic Experts Group (JPEG) image, bitmap, scalable vector graphic, etc.) or as an array or other data structure that the displaying or manipulating application may read and display to the user as a human readable floor map. As described above, in the illustrated embodiment, the annotated map 1228 (e.g., as floor map 100) may include one or more assets (e.g., printer 115, etc.). As described above, the assets may include physical objects (e.g., printer 115, etc.), locations (e.g., conference room 105, etc.), or assets that are a combination of both (e.g., conference room 105 that includes a computer 125, etc.). In various embodiments, these assets may be received by the displaying or manipulating application as a data structure that is then interpreted and displayed to the user as a human readable indicator (e.g., icon, rectangle, etc.).
  • In the illustrated embodiment, the application 1230 may include a map annotator 1222. In one such embodiment, the map annotator 1222 may be configured to take a selected map and annotate it with the selected assets and the asset metadata (e.g., type, state, actions, etc.). In one embodiment, the map annotator 1222 may generate or produce the annotated map 1228. In various embodiments, this annotated map 1228 may be similar to floor map 100.
  • In the illustrated embodiment, the application 1230 may include a map viewer 1224. In such an embodiment, the map viewer 1224 may be configured to display the annotated map 1228 to the user. In various embodiments, the map viewer 1224 may be configured to allow the user to select various assets, view the state information or metadata associated with the assets, zoom in or out of the annotated map 1228, display a route between two or more locations, select an action, etc. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
  • In the illustrated embodiment, the map viewer 1224 may include a filter or search mechanism 1225. In such an embodiment, the user may be able to limit the assets displayed by the map viewer 1224 or included within the annotated map 1228 using a set of criteria supplied or selected by the user. For example, in one embodiment, the user may only wish to see assets of type printer. In such an embodiment, any assets not of type printer may be removed from the annotated map 1228 or simply not displayed by the map viewer 1224. In another embodiment, the filter 1225 may select or filter assets based on other properties associated with an asset (e.g., free conference rooms, working copiers, assets associated with the Finance department, assets with a red state, etc.). It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
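  • A minimal sketch of the kind of criteria-based filtering the filter 1225 is described as performing is shown below; the asset records and criteria are illustrative assumptions.

```python
# Illustrative asset records as they might be held by the map viewer.
assets = [
    {"name": "Printer-2F-01", "type": "printer", "state": "working", "department": "Finance"},
    {"name": "Conf-Room-A",   "type": "conference_room", "state": "free", "department": "Shared"},
    {"name": "Copier-2F-03",  "type": "copier", "state": "paper_jam", "department": "Finance"},
]

def filter_assets(assets, **criteria):
    """Keep only assets whose properties match every supplied criterion."""
    return [a for a in assets if all(a.get(k) == v for k, v in criteria.items())]

print(filter_assets(assets, type="printer"))                          # only printers
print(filter_assets(assets, department="Finance", state="working"))   # working Finance assets
```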
  • In the illustrated embodiment, the map viewer 1224 may include a router or path generating mechanism or component 1223. In such an embodiment, the router 1223 may be configured to generate or determine a route between two or more locations. In one embodiment, the router 1223 may determine a path between the current location of the client device 1200 and a selected or desired asset (e.g., asset 210). In some embodiments, this route or path may be graphical and displayed on the annotated map 1228 (as shown in FIGS. 2A, 2B and 2C). In another embodiment, the path may be described in text, graphics, audio directions, a combination thereof, or other forms. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
  • In one embodiment, the application 1230 may include an asset action responder 1226. In various embodiments, the asset action responder 1226 may be configured to execute or request the execution of the steps or process defined by the selected action 167. In one embodiment, once the user 190 selects or takes an action, the asset action responder 1226 may determine if the action may be executed locally (by the client device 1200). For example, a user may wish to view a file, place a telephone call, send an email, etc. If the information needed to execute the action is available locally or may be obtained via local resources (hardware or software), the asset action responder 1226 may execute or perform the requested action. For example, the requested file may be included in the metadata or may be obtainable via an HTTP request; the client device 1200 may include a phone and the desired number may be included in the metadata; and likewise when sending an email, etc.
  • In some embodiments, the client device 1200 or application 1230 may have received one or more triggering locations 1221. In such an embodiment, when the client device 1200 comes within a predefined range (e.g., 500 meters, 10 feet, 2 meters, etc.) or within an area defined by the triggering location 1221, the application 1230 or client device 1200 may transmit its location information or a map request that includes the location information. It is understood that the above is merely one illustrative example to which the disclosed subject matter is not limited.
  • In some embodiments, the client device 1200 or application 1230 includes a floor map calibration module 1232. The floor map calibration module 1232 can be configured to store (or cause to be stored) calibration information. The calibration information can include information as to whether or not the floor map has been calibrated, calibration measurement data and/or calibration result data. The floor map can be identified as not calibrated the first time the floor plan is loaded into the application 1230 (by the user or by any other user). In another implementation, the floor map includes metadata (e.g., communicated from the map management system to the computing device executing the application with the floor map) indicating the map has or has not been calibrated.
  • For example, a calibration operation can include walking a location (e.g., a floor of a building) with the application 1230 operating on the client device 1200. The application 1230 can be in a calibration mode configured to capture data. Using the captured data, the floor map calibration module 1232 can generate a table of coordinates on the map and a corresponding unique identifier (or signature or footprint of that position). For example, depending on the technique used, the identifier can be a magnetic footprint or a WIFI signal footprint.
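  • A minimal sketch of the calibration table described above is shown below, mapping floor-map coordinates walked in calibration mode to the footprint captured at each point; the coordinate and footprint values are illustrative assumptions.

```python
# Fingerprint database built in calibration mode: floor-map coordinate -> footprint.
# Here a footprint is a tuple of signal readings (e.g., magnetic field components
# in microtesla, or per-access-point WIFI RSSI values); the numbers are made up.
calibration_table = {}

def record_calibration_point(x, y, footprint):
    """Store the unique signature captured at floor-map coordinate (x, y)."""
    calibration_table[(x, y)] = footprint

# Readings captured while walking a marked path in calibration mode.
record_calibration_point(1.0, 1.0, (22.4, -3.1, 40.8))
record_calibration_point(2.0, 1.0, (21.9, -2.7, 41.5))
record_calibration_point(3.0, 1.0, (23.6, -3.4, 39.9))

print(calibration_table)
```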
  • Magnetic indoor positioning is based on the observation that buildings or structures each have a unique magnetic signature or fingerprint. For example, the steel structure within a building distorts the Earth's magnetic field in a unique way. In an example implementation, the user (e.g., of the aforementioned application) can calibrate the map by marking a path on the map and then walking the same path in a calibration mode selected in the application. The application then captures the magnetic fingerprint readings along the path and builds the database of fingerprints as related to coordinates within the location. The more training paths and rescans (e.g., using crowd-sourcing), the better the accuracy of this technique. Extrapolation logic is also needed to generate missing fingerprints for the remaining coordinates. FIG. 7 and FIG. 8 illustrate examples of magnetic fingerprints as related to coordinates within the location.
  • In an example implementation, the floor map is calibrated for use in determining or calculating a distance. For example, signals (e.g., associated with NFC, Bluetooth, BLE, WIFI, and/or the like) can be attenuated (e.g., power loss) or otherwise distorted by structures (e.g., walls) or objects (e.g., desks, assets, and the like). Therefore, calibration allows the determining or calculating of a distance to take into account the attenuation and/or distortion of the signals. In an example implementation, the calibrated footprint can have accuracy in the range of 2 to 4 meters.
  • In some embodiments, the client device 1200 or application 1230 includes a device location module 1234. The device location module 1234 can be configured to identify the location of the client device 1200. For example, an Indoor Positioning System (IPS) technique (e.g., Magnetic Indoor Positioning, WIFI Indoor Positioning, and/or the like) can be used.
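  • A minimal sketch of how the device location module 1234 might resolve a live footprint against a calibrated fingerprint table using a nearest-neighbor match is shown below; the table contents and readings are illustrative assumptions, not a specified algorithm of any embodiment.

```python
import math

# Calibrated fingerprint table: floor-map coordinate -> previously captured footprint.
FINGERPRINTS = {
    (1.0, 1.0): (22.4, -3.1, 40.8),
    (2.0, 1.0): (21.9, -2.7, 41.5),
    (3.0, 1.0): (23.6, -3.4, 39.9),
}

def locate_device(live_footprint):
    """Return the calibrated coordinate whose footprint is closest to the live reading."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(FINGERPRINTS, key=lambda coord: distance(FINGERPRINTS[coord], live_footprint))

print(locate_device((22.0, -2.9, 41.2)))  # -> (2.0, 1.0)
```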
  • In some embodiments, the client device 1200 or application 1230 includes an asset discovery module 1236. The asset discovery module 1236 can be configured to discover or detect an asset based on a signal (e.g., NFC, Bluetooth, BLE, WIFI, and/or the like) from a device (e.g., a smart device and/or an IoT enabled device). For example, as the client device 1200 comes in range (e.g., within 5 meters) of an asset, the asset discovery module 1236 can trigger an event indicating an asset (e.g., smart device or IoT enabled device) is close by.
  • In some embodiments, the client device 1200 or application 1230 includes an asset location module 1240. The asset location module 1240 can be configured to determine a location of the asset. For example, an IoT enabled device (or smart device) can be scanned using various protocols including (but not limited to) WIFI, Bluetooth, and the like, and the channel frequency (e.g., 2.4 GHz or 5 GHz) is captured along with the signal strength, for example, a Received Signal Strength Indicator (RSSI) in dBm (decibels relative to one milliwatt). The signal strength is related to the distance and frequency. Therefore, it is possible to find the distance based on the above two readings using Equation 1 as described above.
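  • Equation 1 is described earlier in the specification and is not reproduced in this section, so the sketch below uses the standard free-space path loss relation as an assumed stand-in: with frequency in MHz and distance in meters, FSPL(dB) ≈ 20·log10(d) + 20·log10(f) − 27.55, and FSPL is taken as the difference between an assumed transmit power and the measured RSSI.

```python
import math

def estimate_distance_m(rssi_dbm, freq_mhz, tx_power_dbm=0.0):
    """Invert an assumed free-space path loss model to estimate distance in meters.

    FSPL(dB) = 20*log10(d_m) + 20*log10(f_MHz) - 27.55, with
    FSPL(dB) = tx_power_dbm - rssi_dbm (assumed transmit power, no antenna gains).
    """
    fspl_db = tx_power_dbm - rssi_dbm
    exponent = (fspl_db - 20.0 * math.log10(freq_mhz) + 27.55) / 20.0
    return 10.0 ** exponent

# A -67 dBm reading on the 2.4 GHz band with an assumed 0 dBm transmit power.
print(round(estimate_distance_m(rssi_dbm=-67.0, freq_mhz=2400.0), 2))  # free-space estimate in meters
```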
  • For example, to locate an asset that is not an IoT enabled device (or is a non-smart device), the asset location module 1240 can be configured to determine an angle of the client device 1200 to estimate the distance to a point on the ground. Other measurements, such as height and width, can be used to improve accuracy. The computing device executing the application can be held in front of the user, the point in a camera 1211 can be aligned toward the asset, and the application can provide a direct reading of the distance. For example, the height at which the computing device executing the application is held (e.g., eye level) can be determined, and the user can then point the camera 1211 to the point where the asset touches the ground. The asset location module 1240 can then measure an inclination and, using simple trigonometry, determine or calculate the distance, as illustrated in FIG. 10 and FIG. 11.
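  • A minimal sketch of the inclination-based estimate described in step S615 and used by the asset location module 1240 is shown below: with the camera held at a known height and tilted down toward the point where the asset meets the floor, the horizontal distance follows from the tangent of the measured angle. The height and angle values are illustrative assumptions.

```python
import math

def distance_from_inclination(device_height_m, depression_angle_deg):
    """Horizontal distance to the point where the asset touches the ground.

    The device is held at device_height_m (e.g., eye level) and tilted down by
    depression_angle_deg toward the asset's base, so distance = height / tan(angle).
    """
    return device_height_m / math.tan(math.radians(depression_angle_deg))

# Device held at 1.5 m, camera tilted down 20 degrees toward the asset's base.
print(round(distance_from_inclination(1.5, 20.0), 2))  # about 4.12 m
```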
  • In some embodiments, the client device 1200 or application 1230 includes an IoT enabled asset discovery module 1238. The IoT enabled asset discovery module 1238 can be configured to discover properties that identify the IoT enabled device or asset. The IoT enabled asset discovery module 1238 can use a wireless protocol to discover the IoT enabled device or asset. For example, as the user is in range of a signal communicated from the IoT enabled device or asset, the client device 1200 can receive the signal and determine that the asset is an IoT enabled device. The IoT enabled asset discovery module 1238 can then request the properties that identify the IoT enabled device or asset (e.g., Name, ID, MAC Address, IP Address, and the like) from the IoT enabled device. The IoT enabled device communicates the properties that identify the IoT enabled device to the client device 1200. The IoT enabled asset discovery module 1238 then stores (e.g., in memory 1214) the properties that identify the IoT enabled device.
  • In some embodiments, the client device 1200 or application 1230 includes an image processing API 1218. The image processing API 1218 can be configured to utilize external tools to identify an object using a picture (or image) of the object. For example, an object can be identified using machine learning (ML) implemented through the image processing API 1218. The external tools implemented through the image processing API 1218 can enable developers to understand the content of an image by encapsulating machine learning models in a representational state transfer (REST) API. The external tools implemented through the image processing API 1218 can classify images into thousands of categories and detect individual objects (e.g., text) within an image. The image processing API 1218 can be configured to communicate with the external tools using an internet (e.g., HTTP) protocol.
  • Machine learning (ML) classification can be used to identify an asset class from a set of given classes. Over time (e.g., through learning iterations), a set of features can be identified for given asset classes. Then, using sufficient training data, an ML model can be built. The ML model can then be used to classify objects in a captured image into one of the asset classes.
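  • A minimal sketch of how an application might call an external image-classification tool through a REST interface, as the image processing API 1218 is described as doing, is shown below. The endpoint URL, request fields, and response shape are hypothetical assumptions and do not correspond to any particular real service.

```python
import base64
import requests  # third-party HTTP client

# Hypothetical endpoint of an external image-classification tool.
CLASSIFY_URL = "https://example.com/v1/images:classify"

def classify_image(image_path):
    """POST an image to the external tool and return the predicted asset labels."""
    with open(image_path, "rb") as f:
        payload = {"image": base64.b64encode(f.read()).decode("ascii")}
    response = requests.post(CLASSIFY_URL, json=payload, timeout=10)
    response.raise_for_status()
    # Assumed response shape: {"labels": [{"name": "printer", "score": 0.93}, ...]}
    return response.json().get("labels", [])

# labels = classify_image("captured_asset.jpg")
```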
  • FIG. 13 illustrates a block diagram of another computing device according to at least one example embodiment. In various embodiments, the computing device may include a map or asset management computing system 1350, one or more storage computing devices or systems 1305 and an administrator device 1330. In one or more implementations, the map or asset management computing system 1350, the one or more storage computing devices or systems 1305 and the administrator device 1330 can operate in a single computing device (e.g., a workstation, a server, a blade server, and other appropriate computers, etc., or a virtual machine or virtual computing device thereof), on separate computing devices, or any combination thereof.
  • In one embodiment, the map or asset management computing system 1350 may include a map selector 1352. In such an embodiment, the map selector 1352 may be configured to receive location information from the client device 1200. In one embodiment, the client device 1200 may supply or transmit the current location of the client device 1200 periodically or when a triggering event occurs (e.g., in response to a user request for a map or floor map, entering a predefined location, such as, one of the company's offices, etc.). In another embodiment, the client device 1200 may supply or transmit a request for a map or floor map of a specific location (e.g., a building and floor). In such an embodiment, the user may wish to pre-load the client device 1200 with one or more maps or floor maps of places the user is expecting to travel to; although, it is understood that the above is merely one illustrative example to which the disclosed subject matter is not limited. As described above, in various embodiments, this location information may include a list of GPS coordinates or other location coordinates or information.
  • In some embodiments, the map selector 1352 may be configured to select at least one map or floor map that is deemed relevant to the provided location information. In one embodiment, the map selector 1352 may be configured to pick or select a map or floor map that includes or bounds the provided location information. For example, if the client device 1200 is on the third floor of a building, the map selector 1352 may select the floor map of the third floor of that building. In another embodiment, the map selector 1352 may be configured to select at least one map or floor map near (as defined by a predefined set of criteria or rules) to the supplied location information. For example, if the client device 1200 is on the third floor of a building, the map selector 1352 may select the floor maps of the second, third, and fourth floors of that building. In yet another embodiment, the map selector 1352 may be configured to remember a history of what map or floor map, etc. have previously been presented to the client device 1200. In various embodiments, the map selector 1352 may be configured to take into account user actions or predicted user actions when selecting a map or floor map. For example, if the client device 1200 is on the third floor of a building, and moving towards the elevators, the map selector 1352 may select the floor map of the second and fourth floors of that building. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
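  • A minimal sketch of the bounding check described above for the map selector 1352 is shown below: the floor maps whose geographic bounds contain the reported device location are selected. The map records and coordinates are illustrative assumptions.

```python
# Illustrative floor-map records with geographic bounding boxes (min/max lat, lon).
FLOOR_MAPS = [
    {"name": "Building H, Floor 2", "bounds": (18.559, 18.561, 73.779, 73.781)},
    {"name": "Building H, Floor 3", "bounds": (18.559, 18.561, 73.779, 73.781)},
    {"name": "Winnipeg Office, Ground Floor", "bounds": (49.894, 49.896, -97.139, -97.137)},
]

def select_floor_maps(lat, lon):
    """Return every floor map whose bounding box contains the supplied location."""
    selected = []
    for floor_map in FLOOR_MAPS:
        lat_min, lat_max, lon_min, lon_max = floor_map["bounds"]
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            selected.append(floor_map["name"])
    return selected

print(select_floor_maps(18.560, 73.780))  # both Building H floors bound this point
```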
  • In various embodiments, the map selector 1352 may be configured to retrieve any asset associated with the selected map or floor map. In some embodiments, the map selector 1352 may be configured to filter or only select a portion of the assets associated with the selected map or floor map. In one embodiment, the map selector 1352 may be configured to retrieve any metadata or properties associated with the selected map or floor map and the selected assets. In the illustrated embodiment, this metadata includes asset actions and asset states. It is understood that the above is merely one illustrative example to which the disclosed subject matter is not limited.
  • In such an embodiment, the map selector 1352 may be configured to transmit the selected map or floor map, the associated or selected assets, and the associated asset metadata to the client device 1200. In various embodiments, this information and other communications may be transmitted via Hypertext Transfer Protocol (HTTP), Hypertext Transfer Protocol Secure (HTTPS), or another communications protocol.
  • In various embodiments, the map or asset management computing system 1350 may include an asset state manager 1354. In one embodiment, the asset state manager 1354 may be configured to maintain state information associated with each asset. In such an embodiment, the asset state manager 1354 may receive state information from a plurality of sources, such as, for example, the assets illustrated in FIG. 1, various client devices 1200, or administrator devices 1330, etc. In one embodiment, when a printer detects a paper jam, the printer may be configured to send a message (e.g., email, tweet, HTTP message, etc.) to the asset state manager 1354 or a server to which the asset state manager 1354 subscribes (e.g., a Rich Site Summary or Really Simple Syndication (RSS) feed, etc.). The asset state manager 1354 may then edit or update the asset state 1320 associated with the printer to reflect the paper jam (e.g., a state of paper jam, unavailable, etc.).
  • In such an embodiment, as the asset state changes, at predefined periodic intervals, or upon a request from the client device 1200, the asset state manager 1354 may inform the client device 1200 and the application 1230 of the new or current state.
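  • A minimal sketch of the state update described above is shown below: a status message from an asset (for example, a printer reporting a paper jam) is applied to the stored asset state. The in-memory store and message shape are illustrative assumptions.

```python
# In-memory stand-in for the asset state storage 1318: asset id -> asset state.
asset_states = {"printer-115": "available"}

def handle_asset_message(message):
    """Update the stored state for the asset identified in an incoming status message."""
    asset_id = message["asset_id"]
    if asset_id in asset_states:
        asset_states[asset_id] = message["state"]

# A printer reports a paper jam (message shape is assumed for illustration).
handle_asset_message({"asset_id": "printer-115", "state": "paper jam"})
print(asset_states)  # {'printer-115': 'paper jam'}
```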
  • In the illustrated embodiment, the map or asset management computing system 1350 may include an asset action manager 1356. In such an embodiment, the asset action manager 1356 may be configured to execute or process an asset action request from a client device 1200. In various embodiments, the asset action manager 1356 may be configured to perform the request action (or portion thereof) itself, or to request that another device perform the action or part thereof.
  • In one embodiment, the asset action manager 1356 may be configured to change the state of the asset associated with the action. For example, the action may include that the user has cleared the paper jam in the printer, and the requested action may be to change the state or status of the printer to reflect that this manual portion of the action has been performed. In some embodiments, the asset action manager 1356 may work with or communicate with the asset state manager 1354 to perform such an action.
  • In the illustrated embodiment, map and asset information is transmitted from the administrator device 1330 to the map or asset management computing system 1350, and more specifically to the map and asset manager 1358. In such an embodiment, the map and asset manager 1358 may be configured to enter the map or POI information supplied by the administrator device 1330 into the map storage 1310. In various embodiments, this may include re-formatting the map or asset information for storage as the maps 1312 and assets 1314. Likewise, in the illustrated embodiment, the map and asset manager 1358 may be configured to retrieve maps 1312 and assets 1314 requested by the administrator device 1330 from the storage system 1305 and supply the resultant map or asset information to the administrator device 1330. In such an embodiment, an administrator may edit, delete, or update various aspects of existing maps 1312 and assets 1314. However, in another embodiment, this map or asset information may be communicated directly between the storage system 1305 and the administrator device 1330.
  • In another example embodiment, map and asset information is transmitted from the client device 1200 to the map and asset manager 1358. In this way the steps described above with regard to FIGS. 4, 5 and 6 can be implemented to discover a new asset and to update an existing asset. The map and asset information can include asset locations and information that identifies the asset.
  • In various embodiments, the map or asset management computing system 1350 may include hardware and/or software components 1360 analogous to those described above in reference to client device 1200. In some embodiments, the map or asset management computing system 1350 may include a plurality of computing devices.
  • In various embodiments, the storage system 1305 may include a computing device, such as, for example, a desktop, workstation, a server, a blade server, and other appropriate computers, etc. or a virtual machine or virtual computing device thereof. In various embodiments, the storage system 1305 may include hardware and/or software components 1324 analogous to those described above in reference to client device 1200. In some embodiments, the storage system 1305 may include a plurality of computing devices.
  • In various embodiments, the storage system 1305 may include one or more storage systems or databases 1310 and 1318. In some embodiments, the storage system 1305 may include a map and asset storage or database 1310. In such an embodiment, the map storage 1310 may store one or more maps or floor maps 1312 and one or more assets 1314.
  • In some embodiments, the storage system 1305 may include an asset state storage or database 1318. In such an embodiment, the asset state storage or database 1318 may include one or more asset states 1320. In various embodiments, each stored asset state 1320 may be associated with a respective asset 1314. In one embodiment, the data structure associated with the asset 1314 may be associated with or include an asset state 1320 property or field that indicates the status or usability of the associated asset 1314. In one embodiment, the asset 1314 may inherit one or more acceptable states based on the asset type. In another embodiment, the administrator may set or define a list of possible states the asset 1314 may be in. In the illustrated embodiment, the asset states 1320 include the actual state of the asset 1314 at a given moment. In such an embodiment, the application 1230 may display the current state of a given asset 1314 on the annotated map 1228, as described below.
  • In the illustrated embodiment, the administrator may use the administrator user interface (UI) or application 1332 to import (and then edit or maintain, etc.) graphic images or data structures that represent floor maps into the map and asset storage or database 1310. In various embodiments, the floor maps 1312 may include data that includes a description of the floor map (e.g., "Building H, Floor 2", "Winnipeg Office, Ground Floor", etc.), and a geographical location or coordinates where the associated physical floor exists. In various embodiments, other information may be included. In some embodiments, such information may not be stored within the floor map 1312 itself, but in a separate format as floor map metadata 1316. In one embodiment, the information may be stored in a variety of formats (e.g., as part of the floor map's 1312 filename, as part of a metadata tag included with the floor map, as a separate file, etc.). In various embodiments, the floor map metadata 1316 and the floor map 1312 may be stored in a variety of formats, such as for example a text-based file (e.g., Extensible Markup Language (XML), JavaScript Object Notation (JSON), Comma-separated values (CSV), etc.), a binary-based format (e.g., zip compression format, JPEG, a serialized object-oriented data structure or object, etc.), or a combination thereof. It is understood that the above are merely a few illustrative examples to which the disclosed subject matter is not limited.
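  • A minimal sketch of the kind of text-based floor map metadata 1316 described above is shown below, serialized as JSON from a Python dictionary; the field names and values are illustrative assumptions.

```python
import json

# Illustrative metadata record kept alongside a stored floor map image.
floor_map_metadata = {
    "description": "Building H, Floor 2",
    "location": {"latitude": 18.560, "longitude": 73.780},
    "calibrated": True,
    "image_file": "building_h_floor_2.jpeg",
}

print(json.dumps(floor_map_metadata, indent=2))
```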
  • In the illustrated embodiment, the administrator may use the administrator user interface (UI) or application 1332 to import (and then edit or maintain, etc.) one or more assets 1314 to the map and asset storage or database 1310. In some embodiments, the administrator UI or application 1332 may be configured to allow or facilitate the ability for an administrator to place assets 1314 on the map 1312 via a graphical paradigm, similar to placing items via a drawing program.
  • Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device. Various implementations of the systems and techniques described here can be realized as and/or generally be referred to herein as a circuit, a module, a block, or a system that can combine software and hardware aspects. For example, a module may include the functions/acts/computer program instructions executing on a processor (e.g., a processor formed on a silicon substrate, a GaAs substrate, and the like) or some other programmable data processing apparatus.
  • Some of the above example embodiments are described as processes or methods depicted as flowcharts. Although the flowcharts describe the operations as sequential processes, many of the operations may be performed in parallel, concurrently or simultaneously. In addition, the order of operations may be re-arranged. The processes may be terminated when their operations are completed, but may also have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, subprograms, etc.
  • Methods discussed above, some of which are illustrated by the flow charts, may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium such as a storage medium. A processor(s) may perform the necessary tasks.
  • Specific structural and functional details disclosed herein are merely representative for purposes of describing example embodiments. Example embodiments may, however, be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.
  • It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments. As used herein, the term and/or includes any and all combinations of one or more of the associated listed items.
  • It will be understood that when an element is referred to as being connected or coupled to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being directly connected or directly coupled to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., between versus directly between, adjacent versus directly adjacent, etc.).
  • The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms a, an and the are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms comprises, comprising, includes and/or including, when used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
  • It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may in fact be executed concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
  • Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
  • Portions of the above example embodiments and corresponding detailed description are presented in terms of software, or algorithms and symbolic representations of operation on data bits within a computer memory. These descriptions and representations are the ones by which those of ordinary skill in the art effectively convey the substance of their work to others of ordinary skill in the art. An algorithm, as the term is used here, and as it is used generally, is conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of optical, electrical, or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
  • In the above illustrative embodiments, reference to acts and symbolic representations of operations (e.g., in the form of flowcharts) that may be implemented as program modules or functional processes include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types and may be described and/or implemented using existing hardware at existing structural elements. Such existing hardware may include one or more Central Processing Units (CPUs), digital signal processors (DSPs), application-specific integrated circuits, field programmable gate arrays (FPGAs), computers or the like.
  • It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, or as is apparent from the discussion, terms such as processing or computing or calculating or determining or displaying or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical, electronic quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
  • Note also that the software implemented aspects of the example embodiments are typically encoded on some form of non-transitory computer readable storage medium or implemented over some type of transmission medium. The program storage medium may be magnetic (e.g., a floppy disk or a hard drive) or optical (e.g., a compact disk read only memory, or CD ROM), and may be read only or random access. Similarly, the transmission medium may be twisted wire pairs, coaxial cable, optical fiber, or some other suitable transmission medium known to the art. The example embodiments are not limited by these aspects of any given implementation.
  • Lastly, it should also be noted that whilst the accompanying claims set out particular combinations of features described herein, the scope of the present disclosure is not limited to the particular combinations hereafter claimed, but instead extends to encompass any combination of features or embodiments herein disclosed irrespective of whether or not that particular combination has been specifically enumerated in the accompanying claims at this time.

Claims (20)

What is claimed is:
1. A method comprising:
receiving, from a remote computing device, a floor map indicating a layout of a location;
displaying, via a display interface of a client device, at least a portion of the floor map;
capturing, using an application of the client device, signal strength data representing a signal field for at least one position on the floor map;
identifying an asset within the layout of the location;
determining at least one property that identifies the asset using one of:
a discovery process using a wireless protocol, and
an image processing application programming interface (API) configured to classify an image and detect individual objects within the image;
updating the floor map with the asset and the at least one property; and
communicating the asset and the at least one property to the remote computing device.
2. The method of claim 1, wherein the location is a floor of a building.
3. The method of claim 1, wherein the signal strength data representing the signal field is based on a magnetic field footprint captured using a Magnetic Indoor Positioning protocol.
4. The method of claim 1, wherein the signal strength data representing the signal field is based on a WIFI signal footprint captured using a WIFI Indoor Positioning protocol.
5. The method of claim 1, wherein
the asset is a smart device, and
the identifying of the asset includes detecting a communications protocol signal transmitted from the asset.
6. The method of claim 1, wherein
the asset is a smart device, and
the determining of at least one property that identifies the asset includes determining a position of the asset on the floor map including:
determining a position of the client device on the floor map using an Indoor Positioning System (IPS) protocol,
measuring a channel frequency and a signal strength using the wireless protocol, and
using a formula based on a free-space path loss (FSPL), the channel frequency and the signal strength to determine a distance between the client device and the asset.
7. The method of claim 1, wherein
the asset is not a smart device, and
the identifying of the asset includes:
capturing an image of the asset,
using the image processing API to communicate the image to an external tool configured to identify an object using the image, and
receiving an asset class associated with the asset from the external tool.
8. The method of claim 1, wherein
the asset is not a smart device, and
the determining of at least one property that identifies the asset includes determining a position of the asset on the floor map including:
determining a position of the client device on the floor map using an Indoor Positioning System (IPS) protocol, and
measuring an inclination between at least two heights associated with the asset and using a trigonometric function and the inclination to determine a distance between the client device and the asset.
9. A non-transitory computer readable storage medium including executable code that, when executed by a processor, is configured to cause the processor to:
receive, from a remote computing device, a floor map indicating a layout of a location;
display, via a display interface of a client device, at least a portion of the floor map;
capture, using an application of the client device, signal strength data representing a signal field for at least one position on the floor map;
identify an asset within the layout of the location;
determine at least one property that identifies the asset using one of:
a discovery process using a wireless protocol, and
an image processing application programming interface (API) configured to classify an image and detect individual objects within the image;
update the floor map with the asset and the at least one property; and
communicate the asset and the at least one property to the remote computing device.
10. The non-transitory computer readable storage medium of claim 9, wherein the location is a floor of a building.
11. The non-transitory computer readable storage medium of claim 9, wherein the signal strength data representing the signal field is based on a magnetic field footprint captured using a Magnetic Indoor Positioning protocol.
12. The non-transitory computer readable storage medium of claim 9, wherein the signal strength data representing the signal field is based on a WIFI signal footprint captured using a WIFI Indoor Positioning protocol.
13. The non-transitory computer readable storage medium of claim 9, wherein
the asset is a smart device, and
the identifying of the asset includes detecting a communications protocol signal transmitted from the asset.
14. The non-transitory computer readable storage medium of claim 9, wherein
the asset is a smart device, and
the determining of at least one property that identifies the asset includes determining a position of the asset on the floor map including:
determining a position of the client device on the floor map using an Indoor Positioning System (IPS) protocol,
measuring a channel frequency and a signal strength using the wireless protocol, and
using a formula based on a free-space path loss (FSPL), the channel frequency and the signal strength to determine a distance between the client device and the asset.
15. The non-transitory computer readable storage medium of claim 9, wherein
the asset is not a smart device, and
the identifying of the asset includes:
capturing an image of the asset,
using the image processing API to communicate the image to an external tool configured to identify an object using the image, and
receiving an asset class associated with the asset from the external tool.
16. The non-transitory computer readable storage medium of claim 9, wherein
the asset is not a smart device, and
the determining of at least one property that identifies the asset includes determining a position of the asset on the floor map including:
determining a position of the client device on the floor map using an Indoor Positioning System (IPS) protocol, and
measuring an inclination between at least two heights associated with the asset and using a trigonometric function and the inclination to determine a distance between the client device and the asset.
17. A method comprising:
receiving, from a client device, a request for a floor map based on a floor of a building, the floor map indicating a layout of the floor of the building;
in response to receiving the request for the floor map, selecting a floor map from a database configured to store a plurality of maps;
communicating the floor map to the client device;
receiving, from the client device, information related to an asset, the information including at least one property that identifies the asset and a position of the asset on the floor map;
in response to receiving the information related to the asset, updating a database configured to store data related to a plurality of assets;
generating an annotated floor map based on the asset and the information related to the asset; and
communicating the annotated floor map to the client device.
18. The method of claim 17, wherein
the annotated floor map includes an icon representing the asset and an indicator,
the icon representing the asset is located on the floor map at the position of the asset; and
the indicator is located on the floor map at the position of the asset and indicates at least one of a type of the asset and a status of the asset.
19. The method of claim 17, further comprising:
discovering linkages to characteristics of the asset, and
adding the linkages for the asset to the database configured to store data related to the plurality of assets.
20. The method of claim 17, wherein the update of the database configured to store data related to the plurality of assets includes one of:
determining whether a record associated with the asset exists;
upon determining a record associated with the asset exists, updating the record using the information related to the asset; and
upon determining a record associated with the asset does not exist, generating a new record using the information related to the asset.
US15/476,573 2017-01-20 2017-03-31 Asset floor map Active US10798538B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US15/476,573 US10798538B2 (en) 2017-01-20 2017-03-31 Asset floor map
EP18716387.8A EP3571856A1 (en) 2017-01-20 2018-01-19 Asset floor map
PCT/US2018/014460 WO2018136764A1 (en) 2017-01-20 2018-01-19 Asset floor map

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
INTEMPE122542017CHE 2017-01-20
IN201741002224 2017-01-20
IN201741002224 2017-01-20
US15/476,573 US10798538B2 (en) 2017-01-20 2017-03-31 Asset floor map

Publications (3)

Publication Number Publication Date
US20180249298A1 US20180249298A1 (en) 2018-08-30
US20200137527A9 true US20200137527A9 (en) 2020-04-30
US10798538B2 US10798538B2 (en) 2020-10-06

Family

ID=61911662

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/476,573 Active US10798538B2 (en) 2017-01-20 2017-03-31 Asset floor map

Country Status (3)

Country Link
US (1) US10798538B2 (en)
EP (1) EP3571856A1 (en)
WO (1) WO2018136764A1 (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10447394B2 (en) * 2017-09-15 2019-10-15 Qualcomm Incorporated Connection with remote internet of things (IoT) device based on field of view of camera
US10970876B2 (en) * 2017-12-08 2021-04-06 Panton, Inc. Methods and apparatus for image locating relative to the global structure
US10824689B2 (en) * 2018-01-12 2020-11-03 Verizon Patent And Licensing Inc. Sharing point of interest data
JP6804481B2 (en) * 2018-01-29 2020-12-23 三菱電機ビルテクノサービス株式会社 Equipment map generator and program
US11217087B2 (en) * 2018-11-14 2022-01-04 Johnson Controls Tyco IP Holdings LLP Assurance services system and method
JP7338964B2 (en) * 2018-12-05 2023-09-05 Juki株式会社 Monitoring system
JP2020091641A (en) * 2018-12-05 2020-06-11 Juki株式会社 Monitoring system
US10694053B1 (en) 2019-01-22 2020-06-23 Xerox Corporation Wireless location tracking tag for monitoring real time location-tracking apparatus for an electronic device
CN109902239B (en) * 2019-03-04 2020-06-02 上海拉扎斯信息科技有限公司 Information interaction method and device, readable storage medium and electronic equipment
JP6939839B2 (en) * 2019-04-04 2021-09-22 セイコーエプソン株式会社 Information processing equipment, machine learning equipment and information processing methods
US11248914B2 (en) * 2019-06-20 2022-02-15 Lyft, Inc. Systems and methods for progressive semantic mapping
EP3771229A1 (en) * 2019-07-23 2021-01-27 HERE Global B.V. Positioning based on calendar information
CN110691116B (en) * 2019-08-18 2023-04-14 朗德万斯公司 Method, positioning device and system for managing network device
US11361265B2 (en) * 2019-08-21 2022-06-14 Kyndryl, Inc. Data center impact assessment post disaster
US11575682B2 (en) * 2019-09-26 2023-02-07 Amazon Technologies, Inc. Assigning contextual identity to a device based on proximity of other devices
US10921131B1 (en) * 2019-12-05 2021-02-16 Capital One Services, Llc Systems and methods for interactive digital maps
US20220136836A1 (en) * 2020-11-04 2022-05-05 Xerox Corporation System and method for indoor navigation
US11244470B2 (en) 2020-03-05 2022-02-08 Xerox Corporation Methods and systems for sensing obstacles in an indoor environment
US11395232B2 (en) * 2020-05-13 2022-07-19 Roku, Inc. Providing safety and environmental features using human presence detection
US11356800B2 (en) 2020-08-27 2022-06-07 Xerox Corporation Method of estimating indoor location of a device
CN112417135A (en) * 2020-11-10 2021-02-26 广东顺畅科技有限公司 Asset position monitoring method and device, client and storage medium
US20220326340A1 (en) * 2021-04-13 2022-10-13 ECSite, Inc. Systems and methods of radio frequency data mapping and collection for environments
US11153720B1 (en) * 2021-04-16 2021-10-19 Relay, Inc. Positioning techniques for dead zones using beacons

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6353445B1 (en) 1998-11-25 2002-03-05 Ge Medical Systems Global Technology Company, Llc Medical imaging system with integrated service interface
US7293067B1 (en) 1999-07-16 2007-11-06 Canon Kabushiki Kaisha System for searching device on network
US6278940B1 (en) 2000-03-09 2001-08-21 Alpine Electronics, Inc. Input method for selecting destination, navigation system using the same, and information storage medium for use therewith
US7373244B2 (en) 2004-04-20 2008-05-13 Keith Kreft Information mapping approaches
US7769409B2 (en) 2004-06-23 2010-08-03 Sony Computer Entertainment America Inc. Network participant status evaluation
US8836580B2 (en) 2005-05-09 2014-09-16 Ehud Mendelson RF proximity tags providing indoor and outdoor navigation and method of use
US8516087B2 (en) 2006-02-14 2013-08-20 At&T Intellectual Property I, L.P. Home automation system and method
US8358976B2 (en) 2006-03-24 2013-01-22 The Invention Science Fund I, Llc Wireless device with an aggregate user interface for controlling other devices
US7673248B2 (en) 2006-11-06 2010-03-02 International Business Machines Corporation Combining calendar entries with map views
US8838477B2 (en) 2011-06-09 2014-09-16 Golba Llc Method and system for communicating location of a mobile device for hands-free payment
SG183690A1 (en) * 2007-08-06 2012-09-27 Trx Systems Inc Locating, tracking, and/or monitoring personnel and/or assets both indoors and outdoors
US20090174546A1 (en) * 2008-01-04 2009-07-09 Sensormatic Electronics Corporation System and method for determining location of objects
US8700301B2 (en) 2008-06-19 2014-04-15 Microsoft Corporation Mobile computing devices, architecture and user interfaces based on dynamic direction information
EP2169484B1 (en) 2008-09-18 2013-09-18 Tac AB Control of concept zones
US20100157848A1 (en) 2008-12-22 2010-06-24 Qualcomm Incorporated Method and apparatus for providing and utilizing local maps and annotations in location determination
EP2373073B1 (en) 2008-12-26 2016-11-09 Panasonic Intellectual Property Corporation of America Communication device
US9178768B2 (en) 2009-01-07 2015-11-03 Ixia Methods, systems, and computer readable media for combining voice over internet protocol (VoIP) call data with geographical information
US8307299B2 (en) * 2009-03-04 2012-11-06 Bayerische Motoren Werke Aktiengesellschaft Virtual office management system
US9571625B2 (en) 2009-08-11 2017-02-14 Lg Electronics Inc. Electronic device and control method thereof
US8335989B2 (en) 2009-10-26 2012-12-18 Nokia Corporation Method and apparatus for presenting polymorphic notes in a graphical user interface
US8589069B1 (en) 2009-11-12 2013-11-19 Google Inc. Enhanced identification of interesting points-of-interest
US8775065B2 (en) 2010-04-05 2014-07-08 Qualcomm Incorporated Radio model updating
KR20120015560A (en) 2010-08-12 2012-02-22 삼성전자주식회사 Method of generating map and method of measuring position of mobile terminal using the map
US8669844B2 (en) 2010-09-23 2014-03-11 Blackberry Limited Radio frequency identification (RFID) system providing meeting room reservation and scheduling features and related methods
JP2012088846A (en) 2010-10-18 2012-05-10 Canon Inc Management device, control method of management device, and program
KR101611964B1 (en) 2011-04-28 2016-04-12 엘지전자 주식회사 Mobile terminal and method for controlling same
US8773467B2 (en) 2011-06-13 2014-07-08 International Business Machines Corporation Enhanced asset management and planning system
US8706137B2 (en) 2011-08-02 2014-04-22 Qualcomm Incorporated Likelihood of mobile device portal transition
US20130145293A1 (en) 2011-12-01 2013-06-06 Avaya Inc. Methods, apparatuses, and computer-readable media for providing availability metaphor(s) representing communications availability in an interactive map
US20130262223A1 (en) 2012-03-28 2013-10-03 Informat Innovative Technologies Ltd. Indoor navigation system
US20140111520A1 (en) 2012-10-23 2014-04-24 Bmc Software, Inc. User-centric annotated location aware asset mapping
KR101868444B1 (en) 2014-07-01 2018-07-23 HP Printing Korea Co., Ltd. Image forming apparatus, position guiding method thereof and image forming system
AU2015321430A1 (en) * 2014-09-26 2017-05-18 Asset Owl Pty Ltd A management platform for a distribution network
US10021529B2 (en) * 2015-04-21 2018-07-10 Hewlett Packard Enterprise Development Lp Calibration of wireless network's signal strength map database for indoor locating techniques
US10187743B2 (en) * 2015-08-31 2019-01-22 Oracle International Corporation System and method for providing situation awareness via a mobile device
CN106899930B (en) * 2015-12-17 2020-07-28 Alibaba Group Holding Ltd. Fingerprint database construction method, positioning method and device

Also Published As

Publication number Publication date
US20180249298A1 (en) 2018-08-30
US10798538B2 (en) 2020-10-06
WO2018136764A1 (en) 2018-07-26
EP3571856A1 (en) 2019-11-27

Similar Documents

Publication Publication Date Title
US10798538B2 (en) Asset floor map
US10788326B2 (en) Management of annotated location aware assets
US20080225779A1 (en) Location-based networking system and method
US10747634B2 (en) System and method for utilizing machine-readable codes for testing a communication network
BR112016025128B1 (en) Computer implemented method of determining a calculated position of a mobile processing device, computer storage media, and mobile processing device
US20180332557A1 (en) New access point setup
KR20150094665A (en) Providing and utilizing maps in location determination based on rssi and rtt data
Bobek et al. Indoor microlocation with BLE beacons and incremental rule learning
di Flora et al. A practical implementation of indoor location-based services using simple WiFi positioning
Keng et al. Spatial standards for Internet of Things
Bbosale et al. Indoor navigation system using BLE beacons
EP3526547B1 (en) Reporting locations being associated with a problem

Legal Events

Date Code Title Description
AS Assignment

Owner name: BMC SOFTWARE, INC., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JAIN, PRIYANKA;MARDHEKAR, SAMEER;BHAGWAT, ANAND;REEL/FRAME:042550/0677

Effective date: 20170120

AS Assignment

Owner name: CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH, AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: SECURITY INTEREST;ASSIGNORS:BMC SOFTWARE, INC.;BLADELOGIC, INC.;REEL/FRAME:043514/0845

Effective date: 20170727

AS Assignment

Owner name: CREDIT SUISSE, AG, CAYMAN ISLANDS BRANCH, AS COLLATERAL AGENT, NEW YORK

Free format text: SECURITY INTEREST;ASSIGNORS:BMC SOFTWARE, INC.;BLADELOGIC, INC.;REEL/FRAME:047185/0744

Effective date: 20181002

AS Assignment

Owner name: BMC SOFTWARE, INC., TEXAS

Free format text: RELEASE OF PATENTS;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:047198/0468

Effective date: 20181002

Owner name: BMC ACQUISITION L.L.C., TEXAS

Free format text: RELEASE OF PATENTS;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:047198/0468

Effective date: 20181002

Owner name: BLADELOGIC, INC., TEXAS

Free format text: RELEASE OF PATENTS;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:047198/0468

Effective date: 20181002

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

FEPP Fee payment procedure

Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PTGR); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

AS Assignment

Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT, TEXAS

Free format text: SECURITY INTEREST;ASSIGNORS:BMC SOFTWARE, INC.;BLADELOGIC, INC.;REEL/FRAME:052844/0646

Effective date: 20200601

Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., AS COLLATERAL AGENT, TEXAS

Free format text: SECURITY INTEREST;ASSIGNORS:BMC SOFTWARE, INC.;BLADELOGIC, INC.;REEL/FRAME:052854/0139

Effective date: 20200601

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: ALTER DOMUS (US) LLC, ILLINOIS

Free format text: GRANT OF SECOND LIEN SECURITY INTEREST IN PATENT RIGHTS;ASSIGNORS:BMC SOFTWARE, INC.;BLADELOGIC, INC.;REEL/FRAME:057683/0582

Effective date: 20210930

AS Assignment

Owner name: BLADELOGIC, INC., TEXAS

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:ALTER DOMUS (US) LLC;REEL/FRAME:066567/0283

Effective date: 20240131

Owner name: BMC SOFTWARE, INC., TEXAS

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:ALTER DOMUS (US) LLC;REEL/FRAME:066567/0283

Effective date: 20240131

AS Assignment

Owner name: GOLDMAN SACHS BANK USA, AS SUCCESSOR COLLATERAL AGENT, NEW YORK

Free format text: OMNIBUS ASSIGNMENT OF SECURITY INTERESTS IN PATENT COLLATERAL;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH, AS RESIGNING COLLATERAL AGENT;REEL/FRAME:066729/0889

Effective date: 20240229

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4