WO2019012310A1 - Facility media access safeguard systems - Google Patents

Facility media access safeguard systems Download PDF

Info

Publication number
WO2019012310A1
Authority
WO
WIPO (PCT)
Prior art keywords
visitor
identifier
computing device
private
facility
Prior art date
Application number
PCT/IB2017/054163
Other languages
French (fr)
Inventor
Hon Man Honmy YUEN
Original Assignee
Yuen Hon Man Honmy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yuen Hon Man Honmy
Priority to PCT/IB2017/054163
Publication of WO2019012310A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L 9/32 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
    • H04L 9/3226 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using a predetermined code, e.g. password, passphrase or PIN
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 9/00 Individual registration on entry or exit
    • G07C 9/20 Individual registration on entry or exit involving the use of a pass
    • G07C 9/22 Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 9/00 Individual registration on entry or exit
    • G07C 9/20 Individual registration on entry or exit involving the use of a pass
    • G07C 9/28 Individual registration on entry or exit involving the use of a pass the pass enabling tracking or indicating presence
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L 9/32 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
    • H04L 9/3234 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials involving additional secure or trusted devices, e.g. TPM, smartcard, USB or software token
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/06 Authentication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/08 Access security
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 2209/00 Additional information or applications relating to cryptographic mechanisms or cryptographic arrangements for secret or secure communication H04L 9/00
    • H04L 2209/80 Wireless
    • H04L 2209/805 Lightweight hardware, e.g. radio-frequency identification [RFID] or sensor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/40 Security arrangements using identity modules
    • H04W 12/47 Security arrangements using identity modules using near field communication [NFC] or radio frequency identification [RFID] modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/60 Context-dependent security
    • H04W 12/69 Identity-dependent
    • H04W 12/77 Graphical identity

Definitions

  • The present disclosure is generally related to systems, devices, and/or methods for safeguarding the display of digital content. Specifically, this disclosure relates to authorization of a device upon unblocking of the device.
  • Facilities should review risk analysis data on persons or workforce members that need access to facilities and equipment. This includes staff, patients, visitors, and business partners.
  • Some common controls to prevent unauthorized physical access, tampering, and theft that covered facilities may want to consider include: 1) locked doors, signs warning of restricted areas, surveillance cameras, and alarms; 2) property controls such as property control tags and engraving on equipment; 3) personnel controls such as identification badges, visitor badges, and/or escorts for large offices; or 4) a private security service or patrol for the facility.
  • A system for blocking and unblocking the display of a mobile device, comprising: a user interface; and application software adapted to display on the mobile device, wherein the application software is adapted to display on the screen of the mobile device upon receiving an authorization indicating that the display has been unblocked.
  • Certain embodiments may facilitate retrieval of an authorization from the visitor badges and display upon unblocking of the mobile device (or more generally, a user device) by the user. Certain embodiments may facilitate retrieval of an authorization from the connection hub and display upon unblocking of the user device by the connection hub account.
  • FIG. 1 is a drawing of an NFC ring according to an embodiment of the present disclosure.
  • FIG. 2 is a drawing of a networked environment according to various embodiments of the present disclosure.
  • FIGS. 3A and 3B are flowcharts illustrating one example of functionality implemented as portions of a visitor tracking application executed in a computing environment in the networked environment of FIG. 2 according to various embodiments of the present disclosure.
  • FIG. 3C is a flowchart illustrating one example of functionality implemented as portions of a visitor verification system executed in a computing environment in the networked environment of FIG. 2 according to various embodiments of the present disclosure.
  • FIG. 4 is a state diagram corresponding to one example of a lifecycle of a visitor in the networked environment of FIG. 2 according to various embodiments of the present disclosure.
  • FIG. 5 is a schematic block diagram that provides one example illustration of a computing environment employed in the networked environment of FIG. 2 according to various embodiments of the present disclosure.

DETAILED DESCRIPTION OF EMBODIMENTS OF THE DISCLOSURE
  • The present disclosure relates to tracking and verifying authorization for visitors.
  • There is an increasing demand by facilities for information about the visitors that they invite.
  • The compliance movement, in particular, has raised awareness about the ultimate source of the visitors that facilities invite.
  • On-site visitors are becoming more desirable than visitors who access remotely from far away, at least for visitors escorted by employees.
  • Facilities have become more fickle about how media access is provided to visitors.
  • Visitors with good credit ratings may be more desirable than visitors that have bad credit ratings.
  • Facilities are also more conscious of privacy and security; data breach victims, identity theft victims, and so on, are perceived negatively.
  • Facilities may have relied upon representations as to pre-registered authorization that were not tied to identities of specific visitors. For instance, a visitor may be pre-registered, but there may be no way for facilities to be assured that the specific visitors they are registering have not been adulterated or have an acceptable source. Visitors are typically marked with name tags, but these identify the visitor generally, not a specific instance of the visit. Even assuming that a specific instance of a visit were marked with a unique code, the possibility exists for the code to be moved to different visitors or be replicated by fraudsters.
  • Various embodiments of the present disclosure facilitate tracking and authorization verification for a visitor through the use of a pair of public and private identifiers.
  • Each specific instance of a visit may be associated with a name tag comprising a unique public identifier and a unique private identifier.
  • The name tag may, for example, comprise multiple physical factors.
  • An externally visible factor bears the public identifier.
  • A hidden factor bears the private identifier.
  • The hidden factor may be an NFC ring, or an NFC tag on a name tag. While the visitor is being registered and escorted, only the externally visible factor bearing the public identifier may be visible.
  • The public identifier may be scanned upon the occurrence of various events, thereby creating a history record uniquely associated with the visitor.
  • When a visitor is registered to a facility, the visitor may be associated with a connection hub account. Upon arrival, the facility may perform verification.
  • The facility may scan the private identifier via a client device, or manually enter the private identifier via a website.
  • The authorization of the visitor may be determined, and the authorization verification and history of the visitor may be presented to the facility. This visitor-level tracking may be used to manage various operational processes for the visitors as well as to ensure quality and accuracy. In particular, visit-specific expirations may be monitored, and revocation of specific authorizations may be performed.
  • The multi-factor identifier name tag here includes an externally visible factor 103 and a hidden factor 106.
  • The externally visible factor 103 may be opaque and constructed of laminated paper with foil in one embodiment.
  • The hidden factor 106 may be constructed of an NFC ring 121 in one embodiment.
  • The hidden factor 106 may be an NFC tag 115 on an NFC ring 121, while the externally visible factor 103 may be a sticker affixed to the name tag 112.
  • The externally visible factor 103 bears a public identifier 109.
  • The public identifier 109 may comprise a barcode, a two-dimensional barcode (data matrix), a quick response (QR) code, an alphanumeric string, or another form of identifier.
  • The externally visible factor 103 may include a hologram. The presence of a hologram may make it more difficult to create a knock-off of the name tag. The hologram may also serve as a tamper-evident seal to indicate whether the externally visible factor 103 has been tampered with and/or peeled off.
  • The hidden factor 106 bears a private identifier 118.
  • The private identifier 118 may be contained in an NFC ring 121. Because of the design of the multi-factor identifier name tag, the private identifier 118 is non-visible.
  • The hidden factor 106 may also contain instructions 124 for the facility. For example, instructions 124 may instruct the facility to scan the private identifier 118 with a specific application.
  • The externally visible factor 103 may contain a transparent window above the area 127 in order for the public identifier 109 to be visible via the externally visible factor 103 such that the externally visible factor 103 bears the public identifier 109.
  • The public identifier 109 may adhere to the externally visible factor 103 and peel off with the externally visible factor 103.
  • The area 127 may be specially coated with silicone or another substance such that the ink adheres to the underside of the externally visible factor 103.
  • The networked environment 200 includes a computing environment 203, one or more facility authentication devices 206, and one or more facility display devices 209, which are in data communication with each other via a network 215.
  • The network 215 includes, for example, the Internet, intranets, extranets, wide area networks (WANs), local area networks (LANs), wired networks, wireless networks, cable networks, satellite networks, or other suitable networks, etc., or any combination of two or more such networks.
  • The computing environment 203 may comprise, for example, a server computer or any other system providing computing capability.
  • The computing environment 203 may employ a plurality of computing devices that may be arranged, for example, in one or more server banks or computer banks or other arrangements. Such computing devices may be located in a single installation or may be distributed among many different geographical locations.
  • The computing environment 203 may include a plurality of computing devices that together may comprise a hosted or "cloud" computing resource, a grid computing resource, and/or any other distributed computing arrangement.
  • The computing environment 203 may correspond to an elastic computing resource where the allotted capacity of processing, network, storage, or other computing-related resources may vary over time.
  • Various applications and/or other functionality may be executed in the computing environment 203 according to various embodiments.
  • Various data is stored in a data store 218 that is accessible to the computing environment 203.
  • The data store 218 may be representative of a plurality of data stores 218 as can be appreciated.
  • The data stored in the data store 218, for example, is associated with the operation of the various applications and/or functional facilities described below.
  • The components executed on the computing environment 203, for example, include a visitor tracking application 221, a name tag printing service 224, a name tag verification system 225 in communication with sensors 226, and other applications, services, processes, systems, engines, or functionality not discussed in detail herein.
  • The visitor tracking application 221 is executed to perform visitor tracking and authorization verification functions.
  • The various functions performed by the visitor tracking application 221 may include generating public identifiers 109 and private identifiers 118, recording events 227 relating to visitor history, and other functions.
  • The data stored in the data store 218 includes, for example, visitor data 230, connection hub data 233, name tag data 236, and potentially other data.
  • The visitor data 230 includes various data corresponding to visitors. Each specific instance of a visit may be associated with a public identifier 109 and a private identifier 118.
  • The private identifier 118 may be encrypted or otherwise maintained in a secure way.
  • In one embodiment, the private identifier 118 may be encrypted using a reversible form of encryption. In another embodiment, the private identifier 118 may be encrypted using a non-reversible form of encryption (e.g., a hash). A reversibly encrypted form of the private identifier 118 may be maintained in order to perform rotations of a hashing function used to generate the non-reversibly encrypted form of the private identifier 118.
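The non-reversible storage of the private identifier 118 might look like the following sketch. SHA-256, the per-identifier salt, and the function name are illustrative assumptions (the disclosure specifies only "a hash"); the reversibly encrypted copy used for hash rotation is noted in a comment rather than implemented.

```python
import hashlib
import os

# Illustrative sketch: keep only a salted, non-reversible hash of the
# private identifier. A reversibly encrypted copy could be stored
# alongside it so the hashing function can later be rotated (not shown).
def hash_private_identifier(private_id: str, salt: bytes) -> str:
    return hashlib.sha256(salt + private_id.encode("utf-8")).hexdigest()

salt = os.urandom(16)  # per-identifier salt, stored with the hash
stored_hash = hash_private_identifier("PRV-example", salt)

# Later, a presented private identifier is hashed the same way and compared.
assert hash_private_identifier("PRV-example", salt) == stored_hash
```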
  • The visitor data 230 may include a visitor history record 239 that records a plurality of events 227 associated with processing of the visitor. Each event 227 may be associated with the visitor by way of scanning or entering the public identifier 109 in connection with generating the respective event 227.
  • An event 227 may be used to tie specific information to a visitor, such as visit date, location, data breach history, list of compromised online accounts, expiration date, source credit, and so on.
  • The events 227 may also relate to the chain of custody for the name tag, including describing facilities who have had possession of the name tag and the times they gained or lost custody.
  • The visitor data 230 may also record the authorization requests 242 associated with the visitor.
  • An authorization request 242 may correspond to a specific instance in which a private identifier 118 for a visitor is presented for authorization of the visitor.
  • The authorization requests 242 may be recorded for the purpose of limiting the number of authorization requests 242 for the visitor to a maximum threshold. Although it may be desirable to allow for multiple authorization requests 242 for re-verification, limiting the total number of authorization requests 242 may ensure that a private identifier 118 is not reused in a fraudulent way.
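Capping the number of authorization requests per visitor can be sketched as below. The in-memory counter, the threshold value of five, and the function name are illustrative assumptions; a deployment would persist the counts with the visitor data.

```python
# Hypothetical cap on authorization requests per visitor.
MAX_AUTH_REQUESTS = 5

auth_requests: dict[str, int] = {}  # visitor_id -> recorded request count

def record_authorization_request(visitor_id: str) -> bool:
    """Record a request; return False once the cap is reached."""
    count = auth_requests.get(visitor_id, 0)
    if count >= MAX_AUTH_REQUESTS:
        return False  # reject: possible fraudulent reuse of the identifier
    auth_requests[visitor_id] = count + 1
    return True

for _ in range(5):
    assert record_authorization_request("V-001")
assert not record_authorization_request("V-001")  # sixth attempt rejected
```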
  • The connection hub data 233 may include various data associated with a visitor, such as credit rating data 245, security credentials 248, and/or other data.
  • The connection hub data 233 may record information relating to a specific facility, including a list of visitors, whether the visitor has been registered, whether the name tag has been returned, and so on.
  • The credit rating data 245 may be associated with the specific visitor sent or to be sent to the facility.
  • An individual connection hub account may be associated with the public identifier 109, the private identifier 118, and/or other information in the visitor data 230.
  • The security credentials 248 may include usernames, passwords, and/or other credentials used in authorizing a user at a facility display device 209.
  • The name tag data 236 may indicate the respective public identifier 109 and private identifier 118 of the various name tags as well as the current status for each.
  • The name tag data 236 may identify name tags that have been shipped to visitors but are not yet associated with visitors.
  • The facility authentication devices 206 and the facility display devices 209 are representative of a plurality of client devices that may be coupled to the network 215.
  • Each of the facility authentication devices 206 and the facility display devices 209 may include a display 263.
  • The display 263 may comprise, for example, one or more devices, such as liquid crystal display (LCD) displays, gas plasma-based flat panel displays, organic light emitting diode (OLED) displays, electrophoretic ink (E ink) displays, LCD projectors, or other types of display devices, etc.
  • Each of the facility authentication devices 206 and the facility display devices 209 may be configured to execute various applications such as a sensitive data application 266 and/or other applications.
  • The sensitive data application 266 may be executed, for example, to access network content served up by the computing environment 203 and/or other servers, thereby rendering a user interface 269 on the display 263.
  • The sensitive data application 266 may comprise, for example, a browser, a dedicated application, etc.
  • The user interface 269 may comprise a network page, an application screen, etc.
  • Each of the facility authentication devices 206 and the facility display devices 209 may be configured to execute applications beyond the sensitive data application 266 such as, for example, email applications, social networking applications, word processors, spreadsheets, and/or other applications.
  • The visitor tracking application 221 generates visitor data 230 including a public identifier 109 and a private identifier 118. These identifiers are each unique for a particular visitor. The visitor tracking application 221 then initiates the printing of a name tag.
  • One example of such a name tag is shown in FIG. 1, but this example is not intended to be limiting. A characteristic of the example name tag is that the public identifier 109 is initially visible, and the private identifier 118 is initially non-visible, where the name tag is designed so that the private identifier 118 is to be accessible only to the facility. Tamper-evident features of the name tag are present so that any attempt to access the private identifier 118 may be seen based on changes to the name tag.
  • The printed name tag is then transferred to a reception office.
  • The name tag may be affixed to the visitor by that reception office.
  • The reception office or other source may upload detailed information about the visitor, including expiration date, a list of data breach paste sites, a source credit, and/or other information. This information may be recorded in an event 227 in the visitor history record 239 of the visitor data 230.
  • The visitor name tag may be scanned to obtain the public identifier 109 via the facility authentication device 206.
  • Events 227 may be created, and the visitor history record 239 may be updated based upon the time, status, and/or other information relating to the chain of custody for the visitor.
  • The visitor may be associated with a specific facility user via a connection in the credit rating data 245.
  • The public identifier 109 may be visible and the private identifier 118 may be non-visible. If the private identifier 118 is visible when the visitor arrives, the escort may understand that the name tag has been tampered with. If the name tag is intact, the facility user may perform a tamper-evident action in order to expose the private identifier 118. The facility user may then scan the private identifier 118 via the facility display device 209.
  • The visitor tracking application 221 may perform various checks to ensure that the visitor is authorized.
  • The visitor tracking application 221 may return an indication of whether the visitor is authorized to the facility display device 209 for rendering in a user interface 269. Additionally, information from the visitor history record 239 may be sent to the facility display device 209 for rendering in a user interface 269.
  • Referring to FIG. 3A, shown is a flowchart that provides one example of the operation of a portion of the visitor tracking application 221 according to various embodiments. Beginning with box 303, the visitor tracking application 221 generates a public identifier 109 (FIG. 2) and a private identifier 118 (FIG. 2).
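Generation of the identifier pair might use cryptographically strong randomness, for example as follows. The prefixes and lengths are assumptions for illustration, not formats specified by the disclosure.

```python
import secrets

def generate_identifier_pair() -> tuple[str, str]:
    # Public identifier: printed visibly, e.g. as a barcode or QR code.
    public_id = "PUB-" + secrets.token_hex(8)
    # Private identifier: hidden, e.g. written to the NFC tag.
    private_id = "PRV-" + secrets.token_urlsafe(16)
    return public_id, private_id

public_id, private_id = generate_identifier_pair()
print(public_id)
print(private_id)
```

`secrets` is used rather than `random` so that identifiers cannot be predicted by a fraudster who observes earlier ones.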
  • The visitor tracking application 221 stores the public identifier 109 and a hashed value of the private identifier 118 in the data store 218 (FIG. 2). In addition, a reversibly encrypted version of the private identifier 118 may also be stored in some embodiments to facilitate recovery and/or rotation of the hashing function.
  • The visitor tracking application 221 initiates printing of a multi-factor identifier name tag (FIG. 1) via the name tag printing service 224 (FIG. 2).
  • The public identifier 109 and the private identifier 118 may be transferred to the name tag printing service 224.
  • The name tag printing service 224 may be operated by a third-party vendor, and the public identifier 109 and private identifier 118 may be securely transferred to the name tag printing service 224 via the network 215 (FIG. 2).
  • The visitor tracking application 221 may receive a confirmation from the name tag printing service 224 that printing has completed.
  • The visitor tracking application 221 may initiate a transfer of the name tag to a receptionist in order for the name tag to be assigned to a visitor.
  • The visitor tracking application 221 receives an event 227 (FIG. 2) corresponding to the public identifier 109.
  • The event 227 may be generated by a sensitive data application 266 (FIG. 2) executed in a facility authentication device 206 (FIG. 2), a reception device, or another device.
  • Information about a visitor may be uploaded to the visitor tracking application 221 via a spreadsheet, comma-delimited file, and/or other file.
  • The visitor tracking application 221 may provide an indication of validity to the facility authentication device 206 from which the public identifier 109 was received. In box 318, the visitor tracking application 221 records the event 227 in the visitor history record 239 (FIG. 2) in the visitor data 230 (FIG. 2) for the visitor.
  • The visitor tracking application 221 determines whether another event 227 is received. If another event 227 is received, the visitor tracking application 221 returns to box 315. In this way, the visitor tracking application 221 may build up a visitor history record 239 for the visitor that includes multiple events 227 corresponding to a complete chain of custody for the visitor.
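The accumulation of events into a visitor history record can be sketched as follows. The event fields and the example statuses are illustrative assumptions keyed loosely to the lifecycle states described later; a deployment would write to the data store rather than an in-memory dictionary.

```python
from datetime import datetime, timezone

visitor_history: dict[str, list[dict]] = {}  # public_id -> list of events

def record_event(public_id: str, status: str, location: str) -> dict:
    """Append a chain-of-custody event for the given public identifier."""
    event = {
        "time": datetime.now(timezone.utc).isoformat(),
        "status": status,
        "location": location,
    }
    visitor_history.setdefault(public_id, []).append(event)
    return event

record_event("PUB-001", "printed", "print shop")
record_event("PUB-001", "received", "reception center")
record_event("PUB-001", "verified", "facility lobby")
print([e["status"] for e in visitor_history["PUB-001"]])
# ['printed', 'received', 'verified']
```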
  • The visitor tracking application 221 receives one or more security credentials 248 (FIG. 2) from a facility display device 209 (FIG. 2). In box 333, the visitor tracking application 221 authorizes the facility display device 209 based at least in part on the provided security credentials 248. In box 336, the visitor tracking application 221 receives a private identifier 118 (FIG. 2) from the facility display device 209 in an authorization request 242 (FIG. 2). For example, a user may use the sensitive data application 266 (FIG. 2) to scan the private identifier 118.
  • The visitor tracking application 221 assesses the authorization of the visitor.
  • The visitor tracking application 221 may compute a hashed value of the received private identifier 118 and compare that value with a stored hashed value of a private identifier 118.
  • The visitor tracking application 221 may reconcile the visitor history record 239 for the assigned visitor to ensure that there are no irregularities that may be associated with fraud. Based at least in part on the events 227 (FIG. 2) in the visitor history record 239, the visitor tracking application 221 is able to determine whether the visitor is to be considered authorized.
  • The visitor tracking application 221 determines whether the visitor is considered authorized. If so, the visitor tracking application 221 moves to box 345 and sends an indication of authorization to the facility display device 209. Otherwise, a system administrator or other user may be informed of the irregularity or potential fraud relating to the visitor and/or the name tag. The visitor tracking application 221 then continues to box 354.
  • The visitor tracking application 221 sends at least a portion of the visitor history information contained in the visitor history record 239 to the facility display device 209.
  • This information may relate to the credit source, publicly acknowledged data breaches, chain of custody, and/or other information about the visitor that may be gleaned from the events 227.
  • The sensitive data application 266 may render when and where the name tag was manufactured, when the name tag was shipped and where it was shipped from, when and where the name tag was delivered, and/or other information. In one embodiment, this information may be rendered in a user interface 269 (FIG. 2) including an interactive map.
  • The visitor tracking application 221 may record information about the authorization request 242. This information may be used in future authorization requests 242 to ensure that a maximum number of authorization requests 242 is not exceeded for the visitor. Thereafter, the portion of the visitor tracking application 221 ends.
  • Referring to FIG. 3C, shown is a flowchart that provides an example of the operation of a portion of the name tag verification system 225 according to various embodiments. It is understood that the flowchart of FIG. 3C provides merely an example of the many different types of functional arrangements that may be employed to implement the operation of the portion of the name tag verification system 225 as described herein. As an alternative, the flowchart of FIG. 3C may be viewed as depicting an example of elements of a method implemented in the computing environment 203 (FIG. 2) according to one or more embodiments.
  • The name tag verification system 225 uses the sensor 226 (FIG. 2) to capture an image of the externally visible factor 103.
  • The externally visible factor 103 may be captured immediately after the NFC tag 115 is affixed on top of the name tag 112.
  • The name tag verification system 225 recognizes a public identifier 109 (FIG. 1) in the image of the externally visible factor 103. In doing so, the name tag verification system 225 may determine whether the public identifier 109 is a valid identifier. In box 369, the name tag verification system 225 recognizes a private identifier 118 (FIG. 1) of the hidden factor 106. In doing so, the name tag verification system 225 may determine whether the private identifier 118 is a valid identifier. For example, the name tag verification system 225 may query the visitor data 230 (FIG. 2) or the visitor tracking application 221 (FIG. 2) to determine identifier validity.
  • The name tag verification system 225 may confirm whether the identifiers conform to a predefined format.
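A predefined-format check might be implemented with regular expressions, for example as below. The patterns are hypothetical formats chosen for illustration; the disclosure does not specify what the predefined format is.

```python
import re

# Hypothetical identifier formats (assumptions for this sketch only).
PUBLIC_ID_PATTERN = re.compile(r"^PUB-[0-9a-f]{16}$")
PRIVATE_ID_PATTERN = re.compile(r"^PRV-[A-Za-z0-9_-]{22}$")

def conforms_to_format(identifier: str, pattern: re.Pattern) -> bool:
    """Return True when the identifier matches the predefined format."""
    return pattern.fullmatch(identifier) is not None

print(conforms_to_format("PUB-0123456789abcdef", PUBLIC_ID_PATTERN))  # True
print(conforms_to_format("PUB-XYZ", PUBLIC_ID_PATTERN))               # False
```

Rejecting malformed identifiers early filters out scanning errors and crude forgeries before any data store lookup.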
  • The name tag verification system 225 determines whether the private identifier 118 is associated with the public identifier 109. For example, the name tag verification system 225 may query the visitor data 230 or the visitor tracking application 221 to determine whether the identifiers are associated with each other. As the private identifier 118 may be stored in the data store 218 as a hashed value, the name tag verification system 225 or other logic may compute a hashed value of the recognized private identifier 118 in order to compare the hashed value with the stored hashed value.
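Comparing the recognized private identifier against the stored hashed value can be sketched as follows. The data layout and names are assumptions; a constant-time comparison is used here so an attacker cannot learn the stored hash through timing differences.

```python
import hashlib
import hmac

# Illustrative store: public identifier -> hash of its private identifier.
stored_hashes = {
    "PUB-001": hashlib.sha256(b"PRV-secret").hexdigest(),
}

def identifiers_associated(public_id: str, private_id: str) -> bool:
    """Check that the scanned private identifier matches the stored hash."""
    stored = stored_hashes.get(public_id)
    if stored is None:
        return False
    candidate = hashlib.sha256(private_id.encode("utf-8")).hexdigest()
    # Constant-time comparison of the two hex digests.
    return hmac.compare_digest(candidate, stored)

assert identifiers_associated("PUB-001", "PRV-secret")
assert not identifiers_associated("PUB-001", "PRV-wrong")
```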
  • The public identifier 109 and the private identifier 118 may both be included on the name tag 112.
  • The state diagram 400 corresponds to a lifecycle of a visitor and its associated public identifier 109 (FIG. 2) and private identifier 118 (FIG. 2). Each of the following state transitions may be memorialized by events 227 (FIG. 2) in the visitor history record 239 (FIG. 2).
  • In box 409, when the name tag has been affixed to the visitor, the visitor is in the "labeled" state.
  • In box 412, when the visitor has been received by a reception center, the visitor is in the "received" state. In box 415, when the name tag has been shipped, the visitor is then in the "shipped" state. In box 418, when the authorization of the visitor has been verified by a facility, the visitor is in the "verified" state.
  • The visitor may not be in the "received" state, as the name tag may be shipped directly. Box 409 may instead transition directly to box 415. If various scenarios occur, the visitor may transition from any other state to box 421 in the "revoked" state. For example, if it is determined that private identifiers 118 have been compromised prior to name tag printing, the visitor may transition from "born" to "revoked." Likewise, if name tags from the printer are lost, the name tag may transition from "printed" to "revoked."


Abstract

The present disclosure is related to systems, methods, and/or processor-readable media for unblocking display. In certain embodiments, the disclosure relates to systems and/or methods for presenting display access on devices, where authorizations are obtained on the device upon unblocking of the device. A wireless location system/method is disclosed having one or more location centers for authenticating visitors. Visitor location-based authentication requests can utilize Internet communication among a network of location centers. A plurality of visitor locating technologies may be used, including locations of other visitors in proximity of the visitor being located, indoor wireless signal detection units, and reduced-coverage base stations. The system/method is useful for tracking, routing, and visitor and employee location, including applications for excluding access to certain areas. Public and private unique identifiers are generated for a visitor, and an identifier name tag is printed and affixed to the visitor. The public identifier is visible, but the private identifier is not visible unless a tamper-evident action is performed. Events involving scans of the public identifier are recorded. The private identifier is scanned by a facility, and in response, authorization information for the visitor is sent to the facility. The authorization information is determined based at least in part on the events relative to the public identifier.

Description

FACILITY MEDIA ACCESS SAFEGUARD SYSTEMS
FIELD OF THE INVENTION
The present disclosure is generally related to systems, devices and/or methods for safeguarding the display of digital content. Specifically, this disclosure relates to authorization of the device upon unblocking of the device.
BACKGROUND OF THE DISCLOSED TECHNOLOGY
To establish the facility security plan, facilities should review risk analysis data on persons or workforce members that need access to facilities and equipment. This includes staff, patients, visitors and business partners. Some common controls to prevent unauthorized physical access, tampering, and theft that covered facilities may want to consider include: 1) Locked doors, signs warning of restricted areas, surveillance cameras, alarms, 2) Property controls such as property control tags, engraving on equipment, 3) Personnel controls such as identification badges, visitor badges and/or escorts for large offices, or 4) Private security service or patrol for the facility.
In a large organization, because of the number of visitors and employees, this practice may be required for every visit. In a small office, once someone's identity has been verified, it may not be necessary to check identity every time he or she visits, because the identity would already be known. In a large organization, it may be impossible for a facility to verify the authorization of all the visitors. For example, the locations of the visitors may not correspond to what the facility was expecting. The facility will typically have no way of verifying the locations of the visitors and their movements. Further, the facility will typically have no knowledge of what contents the visitors are authorized to access, or whether they have no access authorization at all. Likewise, it may be difficult if not impossible for the facility to determine which visitors have access authorization in order to block digital content access.
SUMMARY OF THE INVENTION
According to certain embodiments there may be provided a system for blocking and unblocking display of a mobile device comprising: a user interface of the system; and an application software adapted to display on the mobile device, wherein the application software is adapted to display on the screen of the mobile device upon receiving an authorization indicating that the display has been unblocked.
Certain embodiments may facilitate retrieval of an authorization from the visitor badges and display upon unblocking of the mobile device (or more generally, a user device) by the user. Certain embodiments may facilitate retrieval of an authorization from the connection hub and display upon unblocking of the user device by the connection hub account.
BRIEF DESCRIPTION OF THE DRAWINGS
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, with emphasis instead being placed upon clearly illustrating the principles of the disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
FIG. 1 is a drawing of an NFC ring according to an embodiment of the present disclosure.
FIG. 2 is a drawing of a networked environment according to various embodiments of the present disclosure.
FIGS. 3A and 3B are flowcharts illustrating one example of functionality implemented as portions of a visitor tracking application executed in a computing environment in the networked environment of FIG. 2 according to various embodiments of the present disclosure.
FIG. 3C is a flowchart illustrating one example of functionality implemented as portions of a visitor verification system executed in a computing environment in the networked environment of FIG. 2 according to various embodiments of the present disclosure.
FIG. 4 is a state diagram corresponding to one example of a lifecycle of a visitor in the networked environment of FIG. 2 according to various embodiments of the present disclosure.
FIG. 5 is a schematic block diagram that provides one example illustration of a computing environment employed in the networked environment of FIG. 2 according to various embodiments of the present disclosure.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE DISCLOSED TECHNOLOGY
Reference will now be made in detail to the present exemplary embodiments, examples of which are illustrated in the accompanying drawings. Certain examples are shown in the above-identified figures and described in detail below. In describing these examples, like or identical reference numbers are used to identify common or similar elements. The figures are not necessarily to scale, and certain features and certain views of the figures may be shown exaggerated in scale or in schematic form for clarity and/or conciseness.
The present disclosure relates to tracking and verifying authorization for visitors. There is an increasing demand by facilities for information about the visitors that they invite. The compliance movement, in particular, has raised awareness about the ultimate source of the visitors that they invite. On-site visiting visitors are becoming more desirable than visitors accessed remotely from far away, at least for the visitors escorted by employees. Further, facilities have become more particular about how media access is provided to visitors. Visitors with good credit ratings may be more desirable than visitors that have bad credit ratings. Facilities are also more conscious of privacy and security, while data breach victims, identity theft victims, and so on, are perceived negatively. Also, there is a rising awareness of trustworthiness and authorization levels, making it important for facilities to identify the credentials used by their visitors. Visitors whose connection accounts have been compromised may be disfavored. Issues relating to facility preferences apply to trade secrets and other categories of confidential data as well.
Facilities may have relied upon representations as to pre-registered authorization that were not tied to the identities of specific visitors. For instance, a visitor may be pre-registered, but there may be no way for facilities to be assured that the specific visitors they are registering have not been adulterated and have an acceptable source. Visitors are typically marked with name tags, but these identify the visitor generally, not a specific instance of the visit. Even assuming that a specific instance of a visit were marked with a unique code, the possibility exists for the code to be moved to different visitors or be replicated by fraudsters.
Various embodiments of the present disclosure facilitate tracking and authorization verification for a visitor through the use of a pair of public and private identifiers. Each specific instance of a visit may be associated with a name tag comprising a unique public identifier and a unique private identifier. The name tag may, for example, comprise multiple physical factors. In one embodiment, an externally visible factor bears the public identifier, and a hidden factor bears the private identifier. The hidden factor may be an NFC ring, or an NFC tag on a name tag. While the visitor is being registered and escorted, only the externally visible factor bearing the public identifier may be visible. The public identifier may be scanned upon the occurrence of various events, thereby creating a history record uniquely associated with the visitor.
When a visitor is registered to a facility, the visitor may be associated with the connection hub account. Upon arrival, the facility may perform the tamper-evident action with respect to the externally visible factor of the name tag, as well as verifying the private identifier on the name tag. The facility may scan the private identifier via a client device, or manually enter the private identifier via a web site. The authorization of the visitor may be determined, and the authorization verification and history of the visitor may be presented to the facility. This visitor-level tracking may be used to manage various operational processes for the visitors as well as to ensure quality and accuracy. In particular, visit-specific expirations may be monitored, and revocation of specific authorizations may be performed.
With reference to FIG. 1, shown is an example of a multi-factor identifier name tag according to one embodiment. The multi-factor identifier name tag here includes an externally visible factor 103 and a hidden factor 106. The externally visible factor 103 may be opaque and constructed of laminated paper with foil in one embodiment. The hidden factor 106 may be constructed of an NFC ring 121 in one embodiment. The hidden factor 106 may be an NFC tag 115 on an NFC ring 121, while the externally visible factor 103 may be a sticker affixed to the name tag 112. The externally visible factor 103 bears a public identifier 109. The public identifier 109 may comprise a barcode, a two-dimensional barcode (data matrix), a quick response (QR) code, an alphanumeric string, or another form of identifier. The externally visible factor 103 may include a hologram. The presence of a hologram may make it more difficult to create a knock-off of the name tag. The hologram may also serve as a tamper-evident seal to indicate whether the externally visible factor 103 has been tampered with and/or peeled off.
Returning to FIG. 1, the hidden factor 106 bears a private identifier 118. The private identifier 118 may be contained in an NFC ring 121. Because of the design of the multi-factor identifier name tag, the private identifier 118 is non-visible. The hidden factor 106 may also contain instructions 124 for the facility. For example, instructions 124 may instruct the facility to scan the private identifier 118 with a specific application.
The externally visible factor 103 may contain a transparent window above the area 127 in order for the public identifier 109 to be visible via the externally visible factor 103 such that the externally visible factor 103 bears the public identifier 109. In one embodiment, the public identifier 109 may adhere to the externally visible factor 103 and peel off with the externally visible factor 103. The area 127 may be specially coated with silicone or another substance such that the ink adheres to the underside of the externally visible factor 103.
Moving on to FIG. 2, shown is a networked environment 200 according to various embodiments. The networked environment 200 includes a computing environment 203, one or more facility authentication devices 206, and one or more facility display devices 209, which are in data communication with each other via a network 215. The network 215 includes, for example, the Internet, intranets, extranets, wide area networks (WANs), local area networks (LANs), wired networks, wireless networks, cable networks, satellite networks, or other suitable networks, etc., or any combination of two or more such networks.
The computing environment 203 may comprise, for example, a server computer or any other system providing computing capability. Alternatively, the computing environment 203 may employ a plurality of computing devices that may be arranged, for example, in one or more server banks or computer banks or other arrangements. Such computing devices may be located in a single installation or may be distributed among many different geographical locations. For example, the computing environment 203 may include a plurality of computing devices that together may comprise a hosted or "cloud" computing resource, a grid computing resource, and/or any other distributed computing arrangement. In some cases, the computing environment 203 may correspond to an elastic computing resource where the allotted capacity of processing, network, storage, or other computing-related resources may vary over time.
Various applications and/or other functionality may be executed in the computing environment 203 according to various embodiments. Also, various data is stored in a data store 218 that is accessible to the computing environment 203. The data store 218 may be representative of a plurality of data stores 218 as can be appreciated. The data stored in the data store 218, for example, is associated with the operation of the various applications and/or functional facilities described below. The components executed on the computing environment 203, for example, include a visitor tracking application 221, a name tag printing service 224, a name tag verification system 225 in communication with sensors 226, and other applications, services, processes, systems, engines, or functionality not discussed in detail herein. The visitor tracking application 221 is executed to perform visitor tracking and authorization verification functions. The various functions performed by the visitor tracking application 221 may include generating public identifiers 109 and private identifiers 118, recording events 227 relating to visitor history, and performing verification of authorization for a given private identifier 118 corresponding to a visitor.
The data stored in the data store 218 includes, for example, visitor data 230, connection hub data 233, name tag data 236, and potentially other data. The visitor data 230 includes various data corresponding to visitors. Each specific instance of a visit may be associated with a public identifier 109 and a private identifier 118. The private identifier 118 may be encrypted or otherwise maintained in a secure way. In one embodiment, the private identifier 118 may be encrypted using a reversible form of encryption. In another embodiment, the private identifier 118 may be encrypted using a non-reversible form of encryption (e.g., a hash). A reversibly encrypted form of the private identifier 118 may be maintained in order to perform rotations of a hashing function used to generate the non-reversibly encrypted form of the private identifier 118. The visitor data 230 may include a visitor history record 239 that records a plurality of events 227 associated with processing of the visitor. Each event 227 may be associated with the visitor by way of scanning or entering the public identifier 109 in connection with generating the respective event 227. An event 227 may be used to tie specific information to a visitor, such as visit date, location, data breach history, list of compromised online accounts, expiration date, source credit, and so on. The events 227 may also relate to the chain of custody for the name tag, including describing facilities who have had possession of the name tag and the times they gained or lost custody.
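The hashed storage of the private identifier 118 and the accumulation of events 227 in the visitor history record 239 can be sketched as follows. This is a minimal in-memory illustration, not the disclosed implementation; the class, field, and method names are assumptions, and SHA-256 stands in for whatever non-reversible hashing function an embodiment might use.

```python
import hashlib
import time

class VisitorDataStore:
    """Minimal in-memory sketch of the data store 218 (names are illustrative)."""

    def __init__(self):
        self.visitors = {}  # public identifier 109 -> visitor record

    def register_visitor(self, public_id, private_id):
        # Only a non-reversible hash of the private identifier 118 is kept,
        # so the plaintext private identifier never rests in the store.
        private_hash = hashlib.sha256(private_id.encode()).hexdigest()
        self.visitors[public_id] = {"private_hash": private_hash, "history": []}

    def record_event(self, public_id, description):
        # Each scan of the public identifier 109 appends an event 227
        # to the visitor history record 239.
        self.visitors[public_id]["history"].append(
            {"time": time.time(), "event": description}
        )

store = VisitorDataStore()
store.register_visitor("PUB-0001", "secret-private-id")
store.record_event("PUB-0001", "received by reception center")
store.record_event("PUB-0001", "name tag shipped")
print(len(store.visitors["PUB-0001"]["history"]))  # 2
```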
The visitor data 230 may also record the authorization requests 242 associated with the visitor. An authorization request 242 may correspond to a specific instance in which a private identifier 118 for a visitor is presented for authorization of the visitor. The authorization requests 242 may be recorded for the purpose of limiting the number of authorization requests 242 for the visitor to a maximum threshold. Although it may be desirable to allow for multiple authorization requests 242 for re-verification, limiting the total number of authorization requests 242 may ensure that a private identifier 118 is not reused in a fraudulent way.
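The cap on authorization requests 242 described above amounts to a simple per-visitor counter. A sketch of that idea, under the assumption of an in-memory counter and an arbitrary illustrative threshold:

```python
class AuthorizationLimiter:
    """Sketch of capping authorization requests 242 per visitor.

    The threshold value and storage are assumptions for illustration.
    """

    def __init__(self, max_requests=3):
        self.max_requests = max_requests
        self.counts = {}  # public identifier -> number of requests seen

    def allow(self, public_id):
        count = self.counts.get(public_id, 0)
        if count >= self.max_requests:
            # Exceeding the cap may indicate a private identifier 118
            # being reused in a fraudulent way.
            return False
        self.counts[public_id] = count + 1
        return True

limiter = AuthorizationLimiter(max_requests=2)
print(limiter.allow("PUB-0001"))  # True  (first re-verification allowed)
print(limiter.allow("PUB-0001"))  # True  (second allowed)
print(limiter.allow("PUB-0001"))  # False (cap reached)
```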
The connection hub data 233 may include various data associated with a visitor, such as credit rating data 245, security credentials 248, and/or other data. The connection hub data 233 may record information relating to a specific facility, including a list of visitors, whether the visitor has been registered, whether the name tag has been returned, and so on. The credit rating data 245 may be associated with the specific visitor sent or to be sent to the facility. Thus, an individual connection hub account may be associated with the public identifier 109, the private identifier 1 18, and/or other information in the visitor data 230. The security credentials 248 may include usernames, passwords, and/or other credentials used in authorizing a user at a facility display device 209.
The name tag data 236 may indicate the respective public identifier 109 and private identifier 1 18 of the various name tags as well as the current status for each. The name tag data 236 may identify name tags that have been shipped to visitors but are not yet associated with visitors.
The facility authentication devices 206 and the facility display devices 209 are representative of a plurality of client devices that may be coupled to the network 215. Each of the facility authentication devices 206 and the facility display devices 209 may include a display 263. The display 263 may comprise, for example, one or more devices, such as liquid crystal display (LCD) displays, gas plasma-based flat panel displays, organic light emitting diode (OLED) displays, electrophoretic ink (E ink) displays, LCD projectors, or other types of display devices, etc.
Each of the facility authentication devices 206 and the facility display devices 209 may be configured to execute various applications such as a sensitive data application 266 and/or other applications. The sensitive data application 266 may be executed, for example, to access network content served up by the computing environment 203 and/or other servers, thereby rendering a user interface 269 on the display 263. To this end, the sensitive data application 266 may comprise, for example, a browser, a dedicated application, etc., and the user interface 269 may comprise a network page, an application screen, etc. Each of the facility authentication devices 206 and the facility display devices 209 may be configured to execute applications beyond the sensitive data application 266 such as, for example, email applications, social networking applications, word processors, spreadsheets, and/or other applications.
Next, a general description of the operation of the various components of the networked environment 200 is provided. To begin, the visitor tracking application 221 creates visitor data 230, including a public identifier 109 and a private identifier 118. These identifiers are each unique for a particular visitor. The visitor tracking application 221 then initiates the printing of a name tag corresponding to the visitor via the name tag printing service 224.
One example of such a name tag is shown in FIG. 1, but this example is not intended to be limiting. Characteristic of the example name tag is that the public identifier 109 is initially visible, and the private identifier 118 is initially non-visible, where the name tag is designed so that the private identifier 118 is to be accessible only to the facility. Tamper-evident features of the name tag are present so that any attempt to access the private identifier 118 may be seen based on changes to the name tag.
The printed name tag is then transferred to a reception office. The name tag may be affixed to the visitor by that reception office. Moreover, the reception office or other source may upload detailed information about the visitor, to include expiration date, a list of data breach paste sites, a source credit, and/or other information. This information may be recorded in an event 227 in the visitor history record 239 of the visitor data 230.
As the visitor is received by a reception center, the visitor name tag may be scanned to obtain the public identifier 109 via the facility authentication device 206. Events 227 may be created, and the visitor history record 239 may be updated based upon the time, status, and/or other information relating to the chain of custody for the visitor. The visitor may be associated with a specific facility user via a connection in the credit rating data 245.
The public identifier 109 may be visible and the private identifier 118 may be non-visible. If the private identifier 118 is visible when the visitor arrives, the escort may understand that the name tag has been tampered with. If the name tag is intact, the facility user may perform a tamper-evident action in order to expose the private identifier 118. The facility user may then scan the private identifier 118 via the facility display device 209.
When the private identifier 118 is scanned, the visitor tracking application 221 may perform various checks to ensure that the visitor is authorized. The visitor tracking application 221 may return an indication of whether the visitor is authorized to the facility display device 209 for rendering in a user interface 269. Additionally, information from the visitor history record 239 may be sent to the facility display device 209 for rendering in a user interface 269.
Referring next to FIG. 3A, shown is a flowchart that provides one example of the operation of a portion of the visitor tracking application 221 according to various embodiments. Beginning with box 303, the visitor tracking application 221 generates a public identifier 109 (FIG. 2) and a private identifier 118 (FIG. 2). In box 306, the visitor tracking application 221 stores the public identifier 109 and a hashed value of the private identifier 118 in the data store 218 (FIG. 2). In addition, a reversibly encrypted version of the private identifier 118 may also be stored in some embodiments to facilitate recovery and/or rotation of the hashing function.
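The generation step of box 303 and the hashed-storage step of box 306 can be sketched as below. The identifier formats, the `PUB-` prefix, and the use of SHA-256 are all illustrative assumptions; the disclosure only requires that the identifiers be unique and that the private identifier be stored non-reversibly.

```python
import hashlib
import secrets

def generate_identifiers():
    """Box 303 sketch: produce a unique public/private identifier pair.

    secrets provides cryptographically strong randomness, making
    collisions and guessing attacks impractical for this illustration.
    """
    public_id = "PUB-" + secrets.token_hex(8)
    private_id = secrets.token_urlsafe(16)
    return public_id, private_id

def store_identifiers(data_store, public_id, private_id):
    # Box 306 sketch: persist the public identifier alongside only a
    # hashed value of the private identifier.
    data_store[public_id] = hashlib.sha256(private_id.encode()).hexdigest()

data_store = {}
pub, priv = generate_identifiers()
store_identifiers(data_store, pub, priv)
print(pub.startswith("PUB-"))  # True
```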
In box 309, the visitor tracking application 221 initiates printing of a multi-factor identifier name tag (FIG. 1) via the name tag printing service 224 (FIG. 2). In this regard, the public identifier 109 and the private identifier 118 may be transferred to the name tag printing service 224. In some cases, the name tag printing service 224 may be operated by a third-party vendor, and the public identifier 109 and private identifier 118 may be securely transferred to the name tag printing service 224 via the network 215 (FIG. 2). In box 310, the visitor tracking application 221 may receive a confirmation from the name tag printing service 224 that printing has completed.
In box 312, the visitor tracking application 221 may initiate a transfer of the name tag to a receptionist in order for the name tag to be assigned to a visitor. In box 315, the visitor tracking application 221 receives an event 227 (FIG. 2) corresponding to the public identifier 109. The event 227 may be generated by a sensitive data application 266 (FIG. 2) executed in a facility authentication device 206 (FIG. 2), a reception device, or another device. In one embodiment, information about a visitor (or potentially multiple visitors) may be uploaded to the visitor tracking application 221 via a spreadsheet, comma-delimited file, and/or other file.
The visitor tracking application 221 may provide an indication of validity to the facility authentication device 206 from which the public identifier 109 was received. In box 318, the visitor tracking application 221 records the event 227 in the visitor history record 239 (FIG. 2) in the visitor data 230 (FIG. 2) for the visitor.
In box 321 , the visitor tracking application 221 determines whether another event 227 is received. If another event 227 is received, the visitor tracking application 221 returns to box 315. In this way, the visitor tracking application 221 may build up a visitor history record 239 for the visitor that includes multiple events 227 corresponding to a complete chain of custody for the visitor.
Turning now to FIG. 3B, shown is a flowchart that provides an example of the operation of another portion of the visitor tracking application 221 according to various embodiments. Beginning with box 330, the visitor tracking application 221 receives one or more security credentials 248 (FIG. 2) from a facility display device 209 (FIG. 2). In box 333, the visitor tracking application 221 authorizes the facility display device 209 based at least in part on the provided security credentials 248. In box 336, the visitor tracking application 221 receives a private identifier 118 (FIG. 2) from the facility display device 209 in an authorization request 242 (FIG. 2). For example, a user may use the sensitive data application 266 (FIG. 2) to scan a QR code visible on a multi-factor identifier name tag (FIG. 1). In box 339, the visitor tracking application 221 assesses the authorization of the visitor. The visitor tracking application 221 may compute a hashed value of the received private identifier 118 and compare that value with a stored hashed value of a private identifier 118. The visitor tracking application 221 may reconcile the visitor history record 239 for the assigned visitor to ensure that there are no irregularities that may be associated with fraud. Based at least in part on the events 227 (FIG. 2) in the visitor history record 239, the visitor tracking application 221 is able to determine whether the visitor is to be considered authorized.
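The hash comparison of box 339 can be sketched as follows. SHA-256 and the constant-time comparison are illustrative choices, not stated in the disclosure; `hmac.compare_digest` is used here because a naive string comparison could leak timing information about the stored hash.

```python
import hashlib
import hmac

def verify_private_identifier(stored_hash, presented_private_id):
    """Box 339 sketch: hash the presented private identifier 118 and
    compare it against the stored hashed value."""
    presented_hash = hashlib.sha256(presented_private_id.encode()).hexdigest()
    # compare_digest performs a constant-time comparison.
    return hmac.compare_digest(stored_hash, presented_hash)

stored = hashlib.sha256(b"secret-private-id").hexdigest()
print(verify_private_identifier(stored, "secret-private-id"))  # True
print(verify_private_identifier(stored, "wrong-id"))           # False
```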
In box 342, the visitor tracking application 221 determines whether the visitor is considered authorized. If so, the visitor tracking application 221 moves to box 345 and sends an indication of authorization to the facility display device 209. If not, the visitor tracking application 221 sends an indication that the visitor is not authorized to the facility display device 209. In some cases, a system administrator or other user may be informed of the irregularity or potential fraud relating to the visitor and/or the name tag. The visitor tracking application 221 then continues to box 354.
In box 348, the visitor tracking application 221 sends at least a portion of the visitor history information contained in the visitor history record 239 to the facility display device 209. This information may relate to the credit source, publicly acknowledged data breaches, chain of custody, and/or other information about the visitor that may be gleaned from the events 227. Ultimately, the sensitive data application 266 may render when and where the name tag was manufactured, when the name tag was shipped and where it was shipped from, when and where the name tag was delivered, and/or other information. In one embodiment, this information may be rendered in a user interface 269 (FIG. 2) including an interactive map.
In box 354, the visitor tracking application 221 may record information about the authorization request 242. This information may be used in future authorization requests 242 to ensure that a maximum number of authorization requests 242 is not exceeded for the visitor. Thereafter, the portion of the visitor tracking application 221 ends.
Moving now to FIG. 3C, shown is a flowchart that provides an example of the operation of a portion of the name tag verification system 225 according to various embodiments. It is understood that the flowchart of FIG. 3C provides merely an example of the many different types of functional arrangements that may be employed to implement the operation of the portion of the name tag verification system 225 as described herein. As an alternative, the flowchart of FIG. 3C may be viewed as depicting an example of elements of a method implemented in the computing environment 203 (FIG. 2) according to one or more embodiments.
In box 363, the name tag verification system 225 uses the sensor 226 (FIG. 2) to capture an image of the externally visible factor 103. The externally visible factor 103 may be captured immediately after the NFC tag 115 is affixed on top of the name tag 112.
In box 366, the name tag verification system 225 recognizes a public identifier 109 (FIG. 1) in the image of the externally visible factor 103. In doing so, the name tag verification system 225 may determine whether the public identifier 109 is a valid identifier. In box 369, the name tag verification system 225 recognizes a private identifier 118 (FIG. 1) of the hidden factor 106. In doing so, the name tag verification system 225 may determine whether the private identifier 118 is a valid identifier. For example, the name tag verification system 225 may query the visitor data 230 (FIG. 2) or the visitor tracking application 221 (FIG. 2) to determine identifier validity. Alternatively, the name tag verification system 225 may confirm whether the identifiers conform to a predefined format. In box 372, in order to verify a correct production of the multi-factor identifier name tag, the name tag verification system 225 determines whether the private identifier 118 is associated with the public identifier 109. For example, the name tag verification system 225 may query the visitor data 230 or the visitor tracking application 221 to determine whether the identifiers are associated with each other. As the private identifier 118 may be stored in the data store 218 as a hashed value, the name tag verification system 225 or other logic may compute a hashed value of the recognized private identifier 118 in order to compare the hashed value with the stored hashed value.
Although the flowchart of FIG. 3C relates to the use of multiple sensors 226 to capture a QR code and an NFC ring, in some embodiments, the public identifier 109 and the private identifier 118 may both be included on the name tag 112.
Continuing now to FIG. 4, shown is an example of a state diagram 400 according to one embodiment. The state diagram 400 corresponds to a lifecycle of a visitor and its associated public identifier 109 (FIG. 2) and private identifier 118 (FIG. 2). Each of the following state transitions may be memorialized by events 227 (FIG. 2) in the visitor history record 239 (FIG. 2).
In box 409, when the name tag has been affixed to the visitor, the visitor is in the "labeled" state. In box 412, when the visitor has been received by a reception center, the visitor is in the "received" state. In box 415, when the name tag has been shipped, the visitor is then in the "shipped" state. In box 418, when the authorization of the visitor has been verified by a facility, the visitor is in the "verified" state.
In some cases, the visitor may not be in the "received" state, as the name tag may be shipped directly. Box 409 may instead transition directly to box 415. In various scenarios, the visitor may transition from any other state to box 421, the "revoked" state. For example, if it is determined that private identifiers 118 have been compromised, the visitor may transition from "labeled" to "revoked." Likewise, if name tags are lost, the visitor may transition from "labeled" to "revoked."
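The lifecycle of FIG. 4 can be modeled as a small state machine. The sketch below assumes a particular transition table (every state may move to "revoked," and box 409 may skip box 412 and go straight to "shipped"); this table is an illustration under those assumptions, not the claimed design.

```python
# Hypothetical sketch of the FIG. 4 visitor lifecycle. State names follow
# the description; the exact transition table is assumed for illustration.
ALLOWED_TRANSITIONS = {
    "labeled":  {"received", "shipped", "revoked"},  # box 409 may skip "received"
    "received": {"shipped", "revoked"},
    "shipped":  {"verified", "revoked"},
    "verified": {"revoked"},
    "revoked":  set(),  # terminal state
}

def transition(current: str, nxt: str) -> str:
    """Return the new state, or raise if the transition is not allowed."""
    if nxt not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition: {current} -> {nxt}")
    return nxt
```

Recording each successful call to `transition` as an event 227 would reproduce the memorialization described above.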

Claims

WHAT IS CLAIMED IS:
1. A physical facility safeguarding system for tracking and authenticating visitors by using multiple physical factors including an NFC tag and a QR code, wherein media access is automatically granted and revoked according to authentication of the factors, the system comprising:
at least one computing device;
a memory storing instructions configured to be executed by the computing device to implement a digital media display method, wherein the computing device further
generates, by at least one computing device, a public identifier and a private identifier,
receives, by the at least one computing device, the public identifier in association with individual ones of a plurality of events, the individual ones of the plurality of events representing a respective scan of the public identifier affixed to an item, the private identifier being also affixed to the item but not capable of being scanned, and
records, by the at least one computing device, the individual ones of the plurality of events in association with both the public identifier and the private identifier; and
a visitor tracking application executable in the at least one computing device, the visitor tracking application comprising: logic that receives an authentication request for a visitor, the authentication request specifying a private identifier for the visitor;
logic that, in response to receiving the authentication request, determines whether the visitor is authentic based at least in part on at least one history event in a visit history record, the visit history record corresponding to the private identifier, the at least one history event being recorded in association with a public identifier for the visitor; and
logic that sends information identifying whether the visitor is authentic to a facility display device.
2. The system of claim 1, wherein both the public identifier and the private identifier, received by the at least one computing device, are unique to the visitor.
3. The system of claim 1, wherein the computing device that implements the digital media display method further causes an identifier name tag to be printed, causes the public identifier to be initially visible on the identifier name tag, and causes the private identifier to be initially non-visible on the identifier name tag.
4. The system of claim 1, wherein the computing device that implements the digital media display method further stores an encrypted version of the private identifier.
5. The system of claim 1, wherein the computing device that implements the digital media display method further determines that the visitor has been registered by a facility; and associates the public identifier and the private identifier with the visitor.
6. The system of claim 1, wherein the computing device that implements the digital media display method further receives the private identifier from a facility display device, determines whether the visitor is authentic based at least in part on the recorded plurality of events, and sends user interface data to indicate whether the visitor is authentic to the facility display device.
7. The system of claim 1, wherein the computing device that implements the digital media display method further determines that the private identifier has not been received beyond a maximum threshold number of times.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2017/054163 WO2019012310A1 (en) 2017-07-11 2017-07-11 Facility media access safeguard systems


Publications (1)

Publication Number Publication Date
WO2019012310A1 true WO2019012310A1 (en) 2019-01-17

Family

ID=65001580

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2017/054163 WO2019012310A1 (en) 2017-07-11 2017-07-11 Facility media access safeguard systems

Country Status (1)

Country Link
WO (1) WO2019012310A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1808972A (en) * 2005-01-19 2006-07-26 International Business Machines Corporation Recording device and recording method of generating information flow
CN101165701A (en) * 2006-10-17 2008-04-23 International Business Machines Corporation Methods and systems for providing radio frequency identification (RFID) security mutual authentication
US20120143769A1 (en) * 2010-12-02 2012-06-07 Microsoft Corporation Commerce card
CN105024824A (en) * 2014-11-05 2015-11-04 Zhu Guolong Method and system for generating and verifying a trusted label based on an asymmetric encryption algorithm


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113746846A (en) * 2021-09-06 2021-12-03 Binzhou University Computer network security access processing system based on big data
CN113746846B (en) * 2021-09-06 2023-08-08 Binzhou University Computer network security access processing system based on big data


Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 17917319

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 17917319

Country of ref document: EP

Kind code of ref document: A1