US12307550B2 - Systems and methods for providing security system information through smart glasses - Google Patents
- Publication number
- US12307550B2 (application US17/814,797)
- Authority
- US
- United States
- Prior art keywords
- security
- smart glasses
- user interface
- system information
- security system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0141—Head-up displays characterised by optical features characterised by the informative content of the display
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
Definitions
- the described aspects relate to smart wearable devices and security systems.
- Smart glasses display information to wearers through private user interfaces that seem integrated with the environment that the wearers are viewing. Although smart glasses hold great potential, they are limited in processing power due to their physical size, and this confines their usability. Because glasses are a prominent part of fashion and need to be comfortable to wear for long periods of time, chunky smart glasses with large hardware components may be impractical.
- aspects of the present disclosure relate generally to smart wearable devices and security systems, and more particularly, to providing security system information through smart glasses.
- An example aspect includes a method at a computing device for providing security system information using smart glasses, comprising receiving location information of the smart glasses in an environment comprising a plurality of security devices, wherein each security device outputs unique security system information about the environment.
- the method further includes identifying at least one security device that is associated with a location with the smart glasses based on the location information. Additionally, the method further includes retrieving the security system information from the at least one security device. Additionally, the method further includes generating a user interface element based on the security system information. Additionally, the method further includes transmitting the user interface element for display on a user interface of the smart glasses.
- Another example aspect includes an apparatus at a computing device for providing security system information using smart glasses, comprising a memory and a processor coupled with the memory.
- the processor is configured to receive location information of the smart glasses in an environment comprising a plurality of security devices, wherein each security device outputs unique security system information about the environment.
- the processor is further configured to identify at least one security device that is associated with a location with the smart glasses based on the location information. Additionally, the processor is further configured to retrieve the security system information from the at least one security device. Additionally, the processor is further configured to generate a user interface element based on the security system information. Additionally, the processor is further configured to transmit the user interface element for display on a user interface of the smart glasses.
- Another example aspect includes an apparatus at a computing device for providing security system information using smart glasses, comprising means for receiving location information of the smart glasses in an environment comprising a plurality of security devices, wherein each security device outputs unique security system information about the environment.
- the apparatus further includes means for identifying at least one security device that is associated with a location with the smart glasses based on the location information. Additionally, the apparatus further includes means for retrieving the security system information from the at least one security device. Additionally, the apparatus further includes means for generating a user interface element based on the security system information. Additionally, the apparatus further includes means for transmitting the user interface element for display on a user interface of the smart glasses.
- Another example aspect includes a computer-readable medium having instructions stored thereon for a computing device providing security system information using smart glasses, wherein the instructions are executable by a processor to receive location information of the smart glasses in an environment comprising a plurality of security devices, wherein each security device outputs unique security system information about the environment.
- the instructions are further executable to identify at least one security device that is associated with a location with the smart glasses based on the location information. Additionally, the instructions are further executable to retrieve the security system information from the at least one security device. Additionally, the instructions are further executable to generate a user interface element based on the security system information. Additionally, the instructions are further executable to transmit the user interface element for display on a user interface of the smart glasses.
- the one or more aspects comprise the features hereinafter fully described and particularly pointed out in the claims.
- the following description and the annexed drawings set forth in detail certain illustrative features of the one or more aspects. These features are indicative, however, of but a few of the various ways in which the principles of various aspects may be employed, and this description is intended to include all such aspects and their equivalents.
- FIG. 1 is a diagram of an exemplary scenario in which smart glasses are used to view security system information, in accordance with exemplary aspects of the present disclosure.
- FIG. 2 is a block diagram of a computing device executing a smart glass user interface (UI) component, in accordance with exemplary aspects of the present disclosure.
- FIG. 3 is a flowchart illustrating a method for providing security system information through smart glasses, in accordance with exemplary aspects of the present disclosure.
- FIG. 4 is a flowchart illustrating a method for generating a user interface element, in accordance with exemplary aspects of the present disclosure.
- FIG. 5 is a flowchart illustrating a method for providing updated security system information through smart glasses, in accordance with exemplary aspects of the present disclosure.
- FIG. 6 is a flowchart illustrating a method for providing security system information on the smart glasses based on a location change, in accordance with exemplary aspects of the present disclosure.
- the present disclosure describes systems and methods for providing security system information through smart glasses.
- smart glasses have a limited amount of processing capabilities as compared to full-fledged computers.
- security system information generally requires a relatively large amount of storage and processing resources, as compared to other, simpler types of information.
- the present disclosure describes utilizing the location and perspective view of the smart glasses to determine which subset of the security system information should be provided and displayed on the smart glasses.
- FIG. 1 is a diagram of exemplary scenario 100 in which smart glasses are used to view security system information, in accordance with exemplary aspects of the present disclosure.
- Scenario 100 depicts smart glasses 102 being used in environment 104 .
- environment 104 may be an office building with a plurality of spaces including employee offices, hallways, bathrooms, cafeterias, etc.
- Glasses view 106 depicts what a wearer sees when using smart glasses 102 .
- the wearer is outside of conference room A18, which is accessed using security device 108 .
- security device 108 may be a card reader that locks and unlocks the door of conference room A18 based on the access rights associated with a scanned employee identity (ID) card.
- When wearing smart glasses 102 , the wearer sees user interface (UI) element 110 hovering over and/or within a vicinity of security device 108 .
- UI element 110 may be an image or an augmented reality effect that depicts security system information specifically pertaining to security device 108 and/or an area associated with security device 108 .
- UI element 110 may include a status of security device 108 (e.g., “locked” or “unlocked”).
- UI element 110 may include information about how the security device 108 is used and has been historically interacted with.
- UI element 110 includes protection information (i.e., interact with security device 108 with an employee ID card), the last known interaction (e.g., “3:02 pm”), and the number of times security device 108 has been interacted with in a given period of time (e.g., “2” occupants suggests that at least two people interacted with security device 108 in the past hour).
- UI element 110 may further include images of the occupants (e.g., photos from the employee ID cards).
- UI element 110 is only visible in glasses view 106 .
- UI element 110 may be displayed on smart glasses 102 with a projector of smart glasses 102 that casts images (such as UI element 110 ) on the lens of smart glasses 102 .
- the projector may adjust the position on the lens where the image is displayed. This gives the illusion of the image being overlaid on environment 104 .
- computing device 200 may perform a method 300 of wireless communication, such as via execution of smart glass UI component 215 by processor 205 and/or memory 210 .
- computing device 200 may be a server connected to at least one pair of smart glasses and a plurality of security devices in an environment being monitored.
- the method 300 includes receiving location information of the smart glasses in an environment comprising a plurality of security devices, wherein each security device outputs unique security system information about the environment.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or receiving component 220 may be configured to or may comprise means for receiving location information of smart glasses 102 in environment 104 comprising a plurality of security devices (including security device 108 ), wherein each security device outputs unique security system information about environment 104 .
- Environment 104 may be any space such as a school, a campus, an office building, an airport, etc., where security devices are installed. These security devices may include, but are not limited to, surveillance cameras, thermometers, moisture detectors, carbon monoxide detectors, card readers, biometrics readers (e.g., implementing fingerprint, retina, facial detection), and door locks. In some cases, environment 104 may be a large space divided into a plurality of smaller spaces. For example, an office building may be divided up into employee offices, a cafeteria, bathrooms, etc. Each space may have its own set of security devices. For example, there may be a security camera installed in the hallway that faces a conference room and the conference room may have a door that is linked to a card reader.
- This example is showcased in FIG. 1 .
- the present disclosure describes identifying a subset of security system information that is most relevant to the wearer of smart glasses 102 at a given time. This subset may include information received from a select few security devices that are physically closest to smart glasses 102 .
- receiving component 220 , which may be installed on a computing device connected to smart glasses 102 via a local area network (LAN) or a wide area network (WAN), may receive location information from smart glasses 102 .
- the location information includes global positioning system (GPS) coordinates of smart glasses 102 .
- the location information may be a qualitative value describing a space in environment 104 .
- the location information may be “Hallway 1 .” While the former can be determined using a GPS receiver installed in smart glasses 102 , the latter may be determined using a localization algorithm.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or executing component 245 may be configured to or may comprise means for executing a localization algorithm that assigns the location of the smart glasses to the known location based on identifying the at least one security device in an input image.
- smart glasses 102 may be equipped with a camera that periodically captures images. An image may include glasses view 106 . Smart glasses 102 may transmit the image(s) to a computing device executing smart glass UI component 215 . Smart glass UI component 215 may detect a security device in the image and attempt to identify the security device.
- Identifying the security device may include determining a unique identifier of the security device (e.g., a serial code, a color scheme, a physical layout) relative to identifiers of other security devices in environment 104 .
- identifying the security device first involves detecting, in an image captured by the smart glasses, an identifier of the location and subsequently determining which security devices are located in the location.
- the identifier of the location is a number (e.g., a room number), an object (e.g., a window unique to the location), a visual sign (e.g., a pattern unique to the location), or a unique code (e.g., a QR code on the door).
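The identifier-to-location lookup described above can be sketched minimally as follows. All identifier strings (QR payloads, room labels) and location names below are hypothetical examples, not values taken from the patent.

```python
# Minimal sketch of the localization step: a location identifier detected
# in a glasses-camera image (e.g., a decoded QR payload or an OCR'd room
# label) is mapped to a known location.
KNOWN_LOCATION_IDS = {
    "conference room A18": "Conference Room A18",  # detected room label
    "QR:LOC-A18-DOOR": "Conference Room A18",      # decoded QR payload
    "Hallway 1": "Hallway 1",
}

def localize(detected_identifiers):
    """Assign the glasses to the first known location whose identifier
    appears among the identifiers detected in the image."""
    for ident in detected_identifiers:
        if ident in KNOWN_LOCATION_IDS:
            return KNOWN_LOCATION_IDS[ident]
    return None  # no known identifier in this image
```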
- the at least one security device is a card reader of an access system that enables and disables access through a security door, wherein the security system information includes personnel identification information and/or card identification information associated with one or more cards scanned at the card reader over a period of time.
- Smart glass UI component 215 may further maintain a security system database that includes location (e.g., GPS coordinates, qualitative values), status, historical usage, and other security system information associated with each security device in environment 104 .
- A portion of an example security system database is shown in Table 1 below, where column headers identify a type of security information and rows include corresponding values of the respective types of security information:
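Table 1 itself is not reproduced in this extract. As a rough illustration of the kind of records such a database might hold, the sketch below stores one row per security device; every field name and value is an assumption for demonstration only.

```python
# Hypothetical security system database rows: device identity, location,
# status, and historical usage, as described in the text above.
SECURITY_DB = [
    {
        "device_id": "AF23KJ78",
        "device_type": "card reader",
        "location": "Conference Room A18",
        "gps": (40.7128, -74.0060),  # assumed coordinates
        "status": "locked",
        "last_interaction": "3:02 pm",
        "occupants_last_hour": 2,
    },
    {
        "device_id": "CAM-0042",
        "device_type": "camera",
        "location": "Hallway 1",
        "gps": (40.7129, -74.0061),
        "status": "recording",
        "last_interaction": None,
        "occupants_last_hour": None,
    },
]

def devices_at(location):
    """Return all security devices recorded at a given location."""
    return [d for d in SECURITY_DB if d["location"] == location]
```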
- smart glass UI component 215 detects the term “conference room A18” in the image received from smart glasses 102 .
- Smart glass UI component 215 may refer to the security system database above and determine that the image includes a card reader outside of conference room A18.
- security device 108 includes a serial code “AF23KJ78” that is visible in the image.
- Smart glass UI component 215 may detect the serial code and also determine that the image includes a card reader outside of conference room A18.
- In this case, the location information received is in the form of an image, and using the security system database, smart glass UI component 215 determines the location of smart glasses 102 in environment 104 .
- the method 300 includes identifying at least one security device that is associated with a location with the smart glasses based on the location information.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or identifying component 225 may be configured to or may comprise means for identifying at least one security device that is associated with a location with the smart glasses based on the location information.
- identifying component 225 may determine, using the security system database, that a card reader and a camera are present outside of conference room A18 (i.e., the locations of the security devices match “conference room A18”).
- the location information may include GPS coordinates of smart glasses 102 and the security system database may include GPS coordinates of each security device in environment 104 .
- Identifying component 225 may determine that a security device is associated with, and in particular shares, a location with the smart glasses if the GPS coordinates of the security device are within a threshold distance from the GPS coordinates of the smart glasses. It should be noted that a user does not need to be in the same room as the security device so long as the security device is in a line of sight of the glasses.
- a user outside a room could look at a door to a room (or a security device associated with the door) and see information from security devices in the room (e.g., occupants, temperature, smoke, threat level, etc.) using the smart glasses.
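The threshold-distance check can be sketched as follows, using the haversine great-circle distance between the two GPS points. The 25-meter threshold and the test coordinates are assumptions; the patent does not specify a particular distance or formula.

```python
import math

THRESHOLD_M = 25.0  # assumed threshold distance in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def shares_location(glasses_gps, device_gps, threshold_m=THRESHOLD_M):
    """True when the device is within the threshold distance of the glasses."""
    return haversine_m(*glasses_gps, *device_gps) <= threshold_m
```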
- the method 300 includes retrieving the security system information from the at least one security device.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or retrieving component 230 may be configured to or may comprise means for retrieving the security system information from the at least one security device.
- each security device in environment 104 may periodically transmit security system information to computing device 200 associated with environment 104 .
- security device 108 may periodically transmit information to the computing device 200 whenever there is an interaction with the card reader. For example, whenever a person successfully enters or is rejected from conference room A18 due to a card scan at the card reader, security device 108 may transmit the information about the interaction and a timestamp to computing device 200 .
- a device such as a camera detects motion, the camera may transmit a set of frames capturing the motion and a timestamp to computing device 200 .
- a thermostat may transmit temperature settings to computing device 200 whenever the temperature is adjusted (either manually by a person or automatically).
- Computing device 200 is configured, using smart glass UI component 215 , to populate the received information in the security system database.
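One minimal way the event-driven population of the database could look is sketched below; the in-memory dict and the field names are assumptions for illustration, not the patent's implementation.

```python
# Sketch: computing device 200 updates its security system database as
# interaction events arrive from a card reader.
db = {"AF23KJ78": {"occupants": 0, "last_entry": None, "events": []}}

def on_card_event(device_id, timestamp, accepted):
    """Record a card-scan interaction reported by a security device."""
    record = db[device_id]
    record["events"].append({"time": timestamp, "accepted": accepted})
    if accepted:  # only successful entries change occupancy
        record["occupants"] += 1
        record["last_entry"] = timestamp

on_card_event("AF23KJ78", "3:02 pm", accepted=True)
```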
- smart glass UI component 215 may retrieve the latest security system information directly from the security devices identified as sharing the location with smart glasses 102 .
- smart glass UI component 215 may retrieve security system information from security device 108 by querying for a status update.
- the method 300 includes generating a UI element based on the security system information.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or generating component 235 may be configured to or may comprise means for generating UI element 110 based on the security system information.
- UI element 110 is a visual element that may be any combination of an image, a video, text, and an augmented reality effect. As shown in FIG. 1 , for example, UI element 110 includes text detailing a first portion of the security system information and images detailing a second portion of the security system information.
- the first portion includes status information (e.g., locked/unlocked), historical interaction information (e.g., number of entries, number of occupants, etc., within a given period of time), and device information (e.g., a method of interaction such as scanning an employee ID card).
- the second portion includes a visual representation of the first portion. For example, the occupants indicated in the first portion may be represented using images, augmented reality information, and videos.
- UI element 110 includes metadata indicating visual features of UI element 110 when displayed on the user interface of smart glasses 102 , and wherein the visual features include at least one augmented reality effect.
- the metadata may include a layout of the information.
- Smart glass UI component 215 may refer to an element database that includes information about available UI templates associated with different types of security system information. A portion of an example element database is shown in Table 2 below, where column headers identify a type of element and element parameters, and rows include corresponding values of the respective element type and parameters:
- Using information in the element database, such as the dimensions of the UI element, the location, the text values, and the visual attributes, generating component 235 generates a UI element such as UI element 110 .
- the different values associated with the historical security system information are filled into variables such as [temp_val] for the thermostat.
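The variable substitution step might be sketched like this, assuming placeholders use the bracketed [name] form mentioned above; the template text and variable names are assumptions.

```python
import re

def fill_template(template, values):
    """Replace each [name] placeholder with its value from `values`."""
    return re.sub(r"\[(\w+)\]", lambda m: str(values[m.group(1)]), template)

# Fill a hypothetical thermostat template with the latest readings.
ui_text = fill_template(
    "Temp: [temp_val] F, status: [status]",
    {"temp_val": 72, "status": "normal"},
)
```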
- smart glass UI component 215 may deliver important and relevant information to a user.
- a user may be in an evacuation situation and may desire to exit environment 104 .
- smart glass UI component 215 may display security system information about locked doors, card readers, room occupants, etc.
- smart glasses 102 may be provided location-relative information that can help a user decide whether to attempt going through a certain path (e.g., if a door is locked and a UI element is generated via the smart glasses that indicates as such, a user may not waste time attempting to access the door).
- For example, an alert may be generated in UI element 110 indicating that an unauthorized user entered the room.
- thermostat and a carbon monoxide detector may be security devices that indicate that a particular room has a breakout fire (e.g., the room temperature according to the thermostat is above a threshold temperature and the carbon monoxide detector detects carbon monoxide).
- a UI element generated by smart glass UI component 215 may include information from each of these detectors.
- the security system information includes one or more of: a device status, event information, historical device information, or an access recommendation.
- the at least one security device is a smart lock that enables and disables access through a security door
- the security system information comprises recommendation information about whether a user should access the security door based on a security event.
- the security event is one of: an evacuation, a fire, flooding, or an unauthorized entry in the environment.
- generating UI element 110 may further comprise providing access recommendations during security events.
- Generating a recommendation may include utilizing a plurality of rules.
- a rule may indicate that a fire alert should be generated in glasses view 106 if a carbon monoxide detector and a thermostat in the same location as the user report the detection of carbon monoxide and a room temperature above a threshold temperature, respectively.
- the fire alert may be shown in a UI element that is an AR effect depicting an animation of a fire or a fire fighter.
- Another rule may indicate that a new evacuation path alert should be generated in glasses view 106 if there is an evacuation alarm in a loudspeaker system (i.e., an audio-based security device) and the user is looking at a door that is locked.
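The two rules above can be sketched as a small rule-evaluation function. The 120-degree temperature threshold and the reading field names are assumptions; the patent specifies only that the thermostat reading must exceed some threshold.

```python
FIRE_TEMP_THRESHOLD = 120.0  # assumed threshold temperature

def evaluate_alerts(readings):
    """Return the alerts triggered by co-located security device readings."""
    alerts = []
    # Rule 1: carbon monoxide detected AND temperature above threshold.
    if readings.get("co_detected") and readings.get("temperature", 0) > FIRE_TEMP_THRESHOLD:
        alerts.append("fire")
    # Rule 2: evacuation alarm sounding AND the viewed door is locked.
    if readings.get("evacuation_alarm") and readings.get("door_locked"):
        alerts.append("new evacuation path")
    return alerts
```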
- the method 300 includes transmitting the user interface element for display on a user interface of the smart glasses.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or transmitting component 240 may be configured to or may comprise means for transmitting the user interface element for display on a user interface of the smart glasses.
- computing device 200 may transmit, over a LAN or a WAN, information about the UI element to smart glasses 102 .
- the messages comprising the UI element in the payload may be reconstructed by smart glasses 102 and subsequently displayed to create glasses view 106 shown in FIG. 1 .
- the generating at block 308 of the user interface element comprises receiving, from the smart glasses, an image of the environment from a perspective of a wearer of the smart glasses.
- the camera may be located near a lens of smart glasses 102 such that the image captured by the camera includes glasses view 106 .
- the generating at block 308 of the user interface element comprises identifying the at least one security device in the image.
- smart glass UI component 215 may detect security device 108 .
- security device 108 may have a visible serial code or identifier that smart glass UI component 215 detects and associates with a particular security device from the security system database.
- the generating at block 308 of the user interface element comprises generating the user interface element such that the security system information is overlaid on an area of the user interface to appear within a distance threshold of where the at least one security device is visually located.
- smart glass UI component 215 may identify the area covered by the security device in the image. In reference to FIG. 1 , if the rectangular area of glasses view 106 is the boundary of the image captured, smart glass UI component 215 may determine that security device 108 covers an area bounded by the coordinates (x,y): (800, 100), (850, 100), (800, 300), (850, 300). Smart glass UI component 215 subsequently positions the UI element in the vicinity of that region based on the visual attributes listed in the elements database.
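The placement step might look like the following sketch, reusing the bounding-box coordinates from the example above. The fixed right-hand offset, the element size, and the image dimensions are assumptions; the patent only requires the element to appear within a distance threshold of the device.

```python
IMAGE_W, IMAGE_H = 1920, 1080  # assumed captured-image dimensions
OFFSET_PX = 20                 # assumed gap between device and UI element

def place_ui_element(bbox, elem_w, elem_h):
    """Return a top-left (x, y) for a UI element anchored beside `bbox`,
    where bbox = (x_min, y_min, x_max, y_max) in image coordinates."""
    x_min, y_min, x_max, y_max = bbox
    x = min(x_max + OFFSET_PX, IMAGE_W - elem_w)  # clamp to right edge
    y = max(min(y_min, IMAGE_H - elem_h), 0)      # align with device top
    return x, y

# Device occupies (800, 100)-(850, 300), as in the example above.
pos = place_ui_element((800, 100, 850, 300), elem_w=300, elem_h=200)
```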
- the method 300 may further include receiving updated security system information from the at least one security device.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or receiving component 220 may be configured to or may comprise means for receiving updated security system information from the at least one security device.
- computing device 200 may receive information from security device 108 that another person has entered conference room A18. Accordingly, the occupancy count is incremented by 1 and the last entry time is updated as well.
- the method 300 may further include determining that the location of the smart glasses has not changed.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or determining component 250 may be configured to or may comprise means for determining that the location of the smart glasses has not changed.
- the method 300 may further include transmitting the updated security system information to the smart glasses in response to determining that the location of the smart glasses has not changed.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or transmitting component 240 may be configured to or may comprise means for transmitting the updated security system information to the smart glasses in response to determining that the location of the smart glasses has not changed.
- the occupancy count and last entry time may be updated in UI element 110 .
- this may simply involve transmitting new values of the security system information to smart glasses 102 , which updates the UI element locally. In other aspects, this may involve generating a new UI element and transmitting the new UI element to smart glasses 102 .
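The lighter-weight update path (sending only the new values, so the glasses can update the UI element locally) could be sketched as a field diff; the snapshot field names are assumptions.

```python
def changed_fields(old, new):
    """Return only the key/value pairs that differ between two snapshots
    of a device's security system information."""
    return {k: v for k, v in new.items() if old.get(k) != v}

# Only the occupancy count and last entry time changed, so only those
# two fields would be transmitted to the glasses.
delta = changed_fields(
    {"occupants": 2, "last_entry": "3:02 pm", "status": "locked"},
    {"occupants": 3, "last_entry": "3:40 pm", "status": "locked"},
)
```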
- Smart glass UI component 215 may thus receive a request to view changes in historical security system information, wherein the request includes a fast forward or rewind request.
- the smart glasses receive this request via a voice command or a button press.
- the smart glasses may be equipped with retina scanners that may receive and interpret eye movements as gestures for rewinding/forwarding.
- smart glass UI component 215 may transmit a new user interface element for display on the user interface of the smart glasses, wherein the new user interface element executes the request.
- the new user interface element may iteratively show data from time t1 to time t2 (in a forward/backward motion). The user may then select a particular set of data to view on the new user interface element, which pauses at the set of data until the user makes another selection or looks away from the security device.
- the method 300 may further include determining that the location of the smart glasses has changed to a new location.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or determining component 250 may be configured to or may comprise means for determining that the location of the smart glasses has changed to a new location.
- smart glass UI component 215 may determine that the GPS coordinates of the smart glasses have changed by a threshold amount or that an image received from the smart glasses depicts a security device in a different location in the environment.
- the method 300 may further include identifying at least one other security device located in the new location with the smart glasses based on updated location information associated with the new location.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or identifying component 225 may be configured to or may comprise means for identifying at least one other security device located in the new location with the smart glasses based on updated location information associated with the new location.
- smart glass UI component 215 may determine from the security system database at least one other security device in the new location of the smart glasses.
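The database lookup can be sketched with an in-memory stand-in for the security system database (using the device records from Table 1); the structure and field names are assumptions:

```python
# Hypothetical in-memory stand-in for the security system database.
DEVICES = [
    {"id": "H12FJIDF", "type": "Thermostat", "location": "Hallway 1"},
    {"id": "AF23KJ78", "type": "Card Reader", "location": "Conference Room A18"},
    {"id": "AFZFS156", "type": "Camera", "location": "Conference Room A18"},
]

def devices_at(location: str):
    """Return every security device registered at the given location."""
    return [d for d in DEVICES if d["location"] == location]

ids = [d["id"] for d in devices_at("Conference Room A18")]
print(ids)  # → ['AF23KJ78', 'AFZFS156']
```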
- the method 300 may further include retrieving updated security system information from the at least one other security device.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or retrieving component 230 may be configured to or may comprise means for retrieving updated security system information from the at least one other security device.
- the method 300 may further include generating an updated user interface element comprising the updated security system information.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or generating component 235 may be configured to or may comprise means for generating an updated user interface element comprising the updated security system information.
- the method 300 may further include transmitting the updated user interface element for display on the user interface of the smart glasses.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or transmitting component 240 may be configured to or may comprise means for transmitting the updated user interface element for display on the user interface of the smart glasses.
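The identify/retrieve/generate/transmit sequence above can be sketched end to end; `on_location_change`, the record layout, and the `transmit` callback are illustrative assumptions, not names from the patent:

```python
def on_location_change(new_location, database, transmit):
    """Refresh flow after a location change: identify devices at the new
    location, retrieve their current information, build an updated UI
    element, and transmit it for display on the glasses."""
    devices = [d for d in database if d["location"] == new_location]
    info = {d["id"]: d["status"] for d in devices}
    updated_element = {"element": "SECURITY_INFO",
                       "location": new_location,
                       "fields": info}
    transmit(updated_element)  # stand-in for the link to the smart glasses
    return updated_element

DATABASE = [
    {"id": "AF23KJ78", "location": "Conference Room A18", "status": "Locked"},
    {"id": "AFZFS156", "location": "Conference Room A18", "status": "Active"},
    {"id": "H12FJIDF", "location": "Hallway 1", "status": "78 F"},
]
sent = []
element = on_location_change("Conference Room A18", DATABASE, sent.append)
print(element["fields"])  # → {'AF23KJ78': 'Locked', 'AFZFS156': 'Active'}
```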
- the method 300 may further include transmitting, after a threshold period of time, a command to the smart glasses to delete the security system information from the at least one security device.
- computing device 200 , processor 205 , memory 210 , smart glass UI component 215 , and/or transmitting component 240 may be configured to or may comprise means for transmitting, after a threshold period of time, a command to the smart glasses to delete the security system information from the at least one security device.
- computing device 200 may transmit a command to delete data that was received and displayed more than a threshold period of time ago (e.g., 2 hours).
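The age-based deletion can be sketched as a scan over a map of record IDs to display times; the names and the 2-hour value (taken from the example above) are illustrative:

```python
from datetime import datetime, timedelta

THRESHOLD = timedelta(hours=2)  # example threshold from the description

def expired_entries(displayed, now):
    """Return the IDs of security records that were received/displayed more
    than THRESHOLD ago and should be deleted from the smart glasses."""
    return [rec_id for rec_id, shown_at in displayed.items()
            if now - shown_at > THRESHOLD]

now = datetime(2022, 7, 25, 15, 0)
displayed = {"cam-feed": datetime(2022, 7, 25, 12, 30),
             "door-log": datetime(2022, 7, 25, 14, 30)}
print(expired_entries(displayed, now))  # → ['cam-feed']
```

The computing device would send the resulting IDs to the glasses in a delete command, so stale security data never lingers on the wearable.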
- the present disclosure describes the use of smart wearable glasses that can display historical security data in direct comparison to the wearer's live environment through real-time API integration.
- the historical data can also be displayed with metadata from security and building devices in the area, including but not limited to cameras, readers, controllers, smoke detectors, carbon monoxide detectors, fire detection devices, fire notification devices, HVAC systems, thermostats, lighting controls, etc.
- the glasses may be supported by wearable garments that provide inductive charging for the glasses to ensure sustained power.
- the glasses may include a camera, microphone, and speakers for full two-way communication with other glasses wearers and a command center.
Abstract
Description
TABLE 1

| Security Device ID | Device Type | Location | Status | ... | Historical Usage |
|---|---|---|---|---|---|
| H12FJIDF | Thermostat | Hallway 1 | 78 F. | ... | [1:00pm-80 F.] |
| DFDFS134 | Camera | Office 1 | Active | ... | [12:14pm-Motion Detected] |
| ... | ... | ... | ... | ... | ... |
| AF23KJ78 | Card Reader | Conference Room A18 | Locked | ... | [3:02pm-Last Entry-by Employee439] |
| AFZFS156 | Camera | Conference Room A18 | Active | ... | [3:14pm-Motion Detected] |
TABLE 2

| Element | Dimensions | Location | ... | Text | Visuals |
|---|---|---|---|---|---|
| CARDREADER | 480 × 100px | Reader position | ... | "CARD READER"; "Occupants:" [occ_value]; "Last Entry:" [lentry_val]; "Status:" [stat_val]; "Protection:" [interact_val] | AR Effect: Occupant Image Animation (50 × 50px; 20×) |
| CAMERA | 700 × 300px | (400, 400) | ... | "MOTION DETECTED" | AR Effect: Motion Capture Boundary |
| ... | ... | ... | ... | ... | ... |
| THERMOSTAT | 200 × 200px | Thermostat position | ... | "Current Temperature:" [temp_val] | AR Effect: Thermometer Object |
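The bracketed placeholders in the Text column (e.g., [occ_value], [temp_val]) suggest a template fill with live security values; the `render` helper below is an illustrative sketch of that substitution, not code from the patent:

```python
import re

# Hypothetical template using Table 2's bracketed placeholder convention.
TEMPLATE = 'Occupants: [occ_value]; Last Entry: [lentry_val]; Status: [stat_val]'

def render(template: str, values: dict) -> str:
    """Replace each [placeholder] with its live security value; unknown
    placeholders are left untouched."""
    return re.sub(r"\[(\w+)\]",
                  lambda m: str(values.get(m.group(1), m.group(0))),
                  template)

print(render(TEMPLATE, {"occ_value": 4, "lentry_val": "3:02pm", "stat_val": "Locked"}))
# → Occupants: 4; Last Entry: 3:02pm; Status: Locked
```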
Claims (20)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/814,797 (US12307550B2) | 2022-07-25 | 2022-07-25 | Systems and methods for providing security system information through smart glasses |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20240029319A1 (en) | 2024-01-25 |
| US12307550B2 (en) | 2025-05-20 |
Family
ID=89576866
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/814,797 (US12307550B2, Active) | Systems and methods for providing security system information through smart glasses | 2022-07-25 | 2022-07-25 |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US12307550B2 (en) |
Citations (25)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20110087988A1 (en) * | 2009-10-12 | 2011-04-14 | Johnson Controls Technology Company | Graphical control elements for building management systems |
| US20110275432A1 (en) * | 2006-08-31 | 2011-11-10 | Lutnick Howard W | Game of chance systems and methods |
| US20120105202A1 (en) * | 2010-11-03 | 2012-05-03 | CISCO TECHNOLOGY, INC. A Corporation of the state of California | Identifying locations within a building using a mobile device |
| US20130031202A1 (en) * | 2011-07-26 | 2013-01-31 | Mick Jason L | Using Augmented Reality To Create An Interface For Datacenter And Systems Management |
| US8489065B2 (en) * | 2011-05-03 | 2013-07-16 | Robert M Green | Mobile device controller application for any security system |
| US20140050455A1 (en) * | 2012-08-20 | 2014-02-20 | Gorilla Technology Inc. | Correction method for object linking across video sequences in a multiple camera video surveillance system |
| US20150028993A1 (en) * | 2013-07-26 | 2015-01-29 | Tyco Integrated Security, LLC | Method and System for Self-discovery and Management of Wireless Security Devices |
| US20150327010A1 (en) * | 2014-05-07 | 2015-11-12 | Johnson Controls Technology Company | Systems and methods for detecting and using equipment location in a building management system |
| US20150325047A1 (en) * | 2014-05-06 | 2015-11-12 | Honeywell International Inc. | Apparatus and method for providing augmented reality for maintenance applications |
| US20170091998A1 (en) * | 2015-09-24 | 2017-03-30 | Tyco Fire & Security Gmbh | Fire/Security Service System with Augmented Reality |
| US20170301165A1 (en) * | 2016-04-14 | 2017-10-19 | Schlage Lock Company Llc | Bi-directional access control system |
| US20180151039A1 (en) * | 2016-11-28 | 2018-05-31 | Ring Inc. | Neighborhood Security Cameras |
| US10097879B1 (en) * | 2017-12-29 | 2018-10-09 | Rovi Guides, Inc. | Systems and methods for extending storage space of a user device |
| US10481862B2 (en) * | 2016-12-02 | 2019-11-19 | Bank Of America Corporation | Facilitating network security analysis using virtual reality display devices |
| US20200105250A1 (en) * | 2018-09-28 | 2020-04-02 | Comcast Cable Communications, Llc | Monitoring of One or More Audio/Video Collection Devices |
| US10769935B2 (en) * | 2016-09-12 | 2020-09-08 | Sensormatic Electronics, LLC | Method and apparatus for unified mobile application for installation of security products |
| US10878240B2 (en) * | 2017-06-19 | 2020-12-29 | Honeywell International Inc. | Augmented reality user interface on mobile device for presentation of information related to industrial process, control and automation system, or other system |
| US10991162B2 (en) * | 2018-12-04 | 2021-04-27 | Curious Company, LLC | Integrating a user of a head-mounted display into a process |
| US11048647B1 (en) * | 2019-12-31 | 2021-06-29 | Axis Ab | Management of resources in a modular control system |
| US11468641B2 (en) * | 2018-02-06 | 2022-10-11 | Servicenow, Inc. | Augmented reality assistant |
| US20230052463A1 (en) * | 2021-03-16 | 2023-02-16 | Blocktag, Inc. | Systems and Methods for Authentication of Security Devices Having Chaosmetrics Features |
| US11676228B2 (en) * | 2020-01-23 | 2023-06-13 | Rebls, Inc. | Systems, methods, and program products for facilitating parcel combination |
| US20230260387A1 (en) * | 2022-02-15 | 2023-08-17 | Johnson Controls Tyco IP Holdings LLP | Systems and methods for detecting security events in an environment |
| US11749096B2 (en) * | 2021-12-15 | 2023-09-05 | Honeywell International Inc. | Event device operation |
| US20240196049A1 (en) * | 2022-12-08 | 2024-06-13 | Synamedia Limited | Client Device Switching to Low Latency Content |
- 2022-07-25: US application US17/814,797 filed, granted as US12307550B2 (en), status Active
Patent Citations (26)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20110275432A1 (en) * | 2006-08-31 | 2011-11-10 | Lutnick Howard W | Game of chance systems and methods |
| US20110087988A1 (en) * | 2009-10-12 | 2011-04-14 | Johnson Controls Technology Company | Graphical control elements for building management systems |
| US20120105202A1 (en) * | 2010-11-03 | 2012-05-03 | CISCO TECHNOLOGY, INC. A Corporation of the state of California | Identifying locations within a building using a mobile device |
| US8489065B2 (en) * | 2011-05-03 | 2013-07-16 | Robert M Green | Mobile device controller application for any security system |
| US20130031202A1 (en) * | 2011-07-26 | 2013-01-31 | Mick Jason L | Using Augmented Reality To Create An Interface For Datacenter And Systems Management |
| US20140050455A1 (en) * | 2012-08-20 | 2014-02-20 | Gorilla Technology Inc. | Correction method for object linking across video sequences in a multiple camera video surveillance system |
| US20150028993A1 (en) * | 2013-07-26 | 2015-01-29 | Tyco Integrated Security, LLC | Method and System for Self-discovery and Management of Wireless Security Devices |
| US20150325047A1 (en) * | 2014-05-06 | 2015-11-12 | Honeywell International Inc. | Apparatus and method for providing augmented reality for maintenance applications |
| US20150327010A1 (en) * | 2014-05-07 | 2015-11-12 | Johnson Controls Technology Company | Systems and methods for detecting and using equipment location in a building management system |
| US20170091998A1 (en) * | 2015-09-24 | 2017-03-30 | Tyco Fire & Security Gmbh | Fire/Security Service System with Augmented Reality |
| US10297129B2 (en) * | 2015-09-24 | 2019-05-21 | Tyco Fire & Security Gmbh | Fire/security service system with augmented reality |
| US20170301165A1 (en) * | 2016-04-14 | 2017-10-19 | Schlage Lock Company Llc | Bi-directional access control system |
| US10769935B2 (en) * | 2016-09-12 | 2020-09-08 | Sensormatic Electronics, LLC | Method and apparatus for unified mobile application for installation of security products |
| US20180151039A1 (en) * | 2016-11-28 | 2018-05-31 | Ring Inc. | Neighborhood Security Cameras |
| US10481862B2 (en) * | 2016-12-02 | 2019-11-19 | Bank Of America Corporation | Facilitating network security analysis using virtual reality display devices |
| US10878240B2 (en) * | 2017-06-19 | 2020-12-29 | Honeywell International Inc. | Augmented reality user interface on mobile device for presentation of information related to industrial process, control and automation system, or other system |
| US10097879B1 (en) * | 2017-12-29 | 2018-10-09 | Rovi Guides, Inc. | Systems and methods for extending storage space of a user device |
| US11468641B2 (en) * | 2018-02-06 | 2022-10-11 | Servicenow, Inc. | Augmented reality assistant |
| US20200105250A1 (en) * | 2018-09-28 | 2020-04-02 | Comcast Cable Communications, Llc | Monitoring of One or More Audio/Video Collection Devices |
| US10991162B2 (en) * | 2018-12-04 | 2021-04-27 | Curious Company, LLC | Integrating a user of a head-mounted display into a process |
| US11048647B1 (en) * | 2019-12-31 | 2021-06-29 | Axis Ab | Management of resources in a modular control system |
| US11676228B2 (en) * | 2020-01-23 | 2023-06-13 | Rebls, Inc. | Systems, methods, and program products for facilitating parcel combination |
| US20230052463A1 (en) * | 2021-03-16 | 2023-02-16 | Blocktag, Inc. | Systems and Methods for Authentication of Security Devices Having Chaosmetrics Features |
| US11749096B2 (en) * | 2021-12-15 | 2023-09-05 | Honeywell International Inc. | Event device operation |
| US20230260387A1 (en) * | 2022-02-15 | 2023-08-17 | Johnson Controls Tyco IP Holdings LLP | Systems and methods for detecting security events in an environment |
| US20240196049A1 (en) * | 2022-12-08 | 2024-06-13 | Synamedia Limited | Client Device Switching to Low Latency Content |
Also Published As
| Publication number | Publication date |
|---|---|
| US20240029319A1 (en) | 2024-01-25 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12155665B2 (en) | Methods and system for monitoring and assessing employee moods | |
| US20180018861A1 (en) | Holographic Technology Implemented Security Solution | |
| US20240161592A1 (en) | Proactive loss prevention system | |
| US11776308B2 (en) | Frictionless access control system embodying satellite cameras for facial recognition | |
| US12002046B2 (en) | Face authentication system and face authentication method | |
| KR102441599B1 (en) | occupancy control device | |
| JP2021072475A (en) | Monitoring system and monitoring system setting program | |
| US11308792B2 (en) | Security systems integration | |
| JP6440327B2 (en) | Crime prevention system, crime prevention method, and robot | |
| CN113490936A (en) | Face authentication device and face authentication method | |
| US20220313095A1 (en) | Thermal and optical analyses and assessments | |
| CN104468690B (en) | The device of the method and distributed systems that executed by the device in distributed system | |
| CN113508389A (en) | Facial authentication registration device and facial authentication registration method | |
| KR20190085376A (en) | Aapparatus of processing image and method of providing image thereof | |
| US12307550B2 (en) | Systems and methods for providing security system information through smart glasses | |
| US11587420B2 (en) | Systems and methods of combining RFID and VMS for people tracking and intrusion detection | |
| US20220027985A1 (en) | Information processing device, information processing system, and information processing method, and program | |
| US20190096153A1 (en) | Smart digital door lock and method for controlling the same | |
| US20210067521A1 (en) | Detecting and Identifying Devices at Enterprise Locations to Protect Enterprise-Managed Information and Resources | |
| Sad et al. | An interactive low-cost smart assistant system: Information kiosk as plug & play device | |
| JP5989289B1 (en) | Communication terminal identification information identification processing system | |
| US12211280B2 (en) | Vision system for classifying persons based on visual appearance and dwell locations | |
| US11361630B1 (en) | Identifying and logging mobile devices posing security threats | |
| US11330006B2 (en) | Detecting and identifying devices at enterprise locations to protect enterprise-managed information and resources | |
| US20250285520A1 (en) | Tiered motion detection for video surveillance systems |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| AS | Assignment |
Owner name: JOHNSON CONTROLS TYCO IP HOLDINGS LLP, WISCONSIN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OUELLETTE, JASON M;PARIPALLY, GOPAL;REEL/FRAME:060628/0821 Effective date: 20220725 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
| AS | Assignment |
Owner name: TYCO FIRE & SECURITY GMBH, SWITZERLAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JOHNSON CONTROLS TYCO IP HOLDINGS LLP;REEL/FRAME:068494/0384 Effective date: 20240201 Owner name: TYCO FIRE & SECURITY GMBH, SWITZERLAND Free format text: ASSIGNMENT OF ASSIGNOR'S INTEREST;ASSIGNOR:JOHNSON CONTROLS TYCO IP HOLDINGS LLP;REEL/FRAME:068494/0384 Effective date: 20240201 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |