WO2018150270A1 - Augmented reality enabled windows - Google Patents

Augmented reality enabled windows

Info

Publication number
WO2018150270A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
camera
image data
electronic system
support
Prior art date
Application number
PCT/IB2018/000233
Other languages
French (fr)
Inventor
Pak Kit LAM
Peter Han Joo CHONG
Original Assignee
Zyetric Logic Limited
Priority date
Filing date
Publication date
Application filed by Zyetric Logic Limited
Publication of WO2018150270A1

Classifications

    • F41H5/02 Armour plate construction
    • F41H5/04 Plate construction composed of more than one layer
    • F41H5/0407 Transparent bullet-proof laminates (informative reference: layered products essentially comprising glass in general B32B17/06, e.g. B32B17/10009; manufacture or composition of glass, e.g. joining glass to glass C03; permanent multiple-glazing windows, e.g. with spacing therebetween, E06B3/66)
    • F41H5/0442 Layered armour containing metal
    • F41H5/26 Peepholes; Windows; Loopholes
    • G08B13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19613 Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • G08B13/19632 Camera support structures, e.g. attachment means, poles
    • G08B13/19645 Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
    • G08B13/19652 Systems using zones in a single scene defined for different treatment, e.g. outer zone gives pre-alarm, inner zone gives alarm
    • G08B13/19663 Surveillance related processing done local to the camera
    • G08B13/19682 Graphic User Interface [GUI] presenting system data to the user, e.g. information on a screen helping a user interacting with an alarm system
    • G08B13/19691 Signalling events for better perception by user, e.g. indicating alarms by making display brighter, adding text, creating a sound
    • G08B15/001 Concealed systems, e.g. disguised alarm systems to make covert systems
    • G08B25/08 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems, characterised by the transmission medium using communication transmission lines
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
    • E06B5/003 Storm doors; Combination-screen-and-storm-doors
    • E06B5/10 Doors, windows, or like closures for protection against air-raid or other war-like action; for other protective purposes

Definitions

  • FIGS. 1 A-1B depict an embodiment of the present technology.
  • FIG. 2 depicts a cross-sectional view of an embodiment of the present technology.
  • FIG. 3 depicts the response of an embodiment of the present technology to a terror event.
  • FIG. 4 depicts an exemplary device that may be used to implement embodiments of the present technology.
  • Some embodiments of the present technology can be used to replace glass windows in buildings (e.g., street-level windows), such as the glass window of a coffee shop, the display windows of retail shops, or just a normal window at the ground level of a building, with an advanced augmented reality (AR) enabled window that can monitor everyone walking by on the street outside near the window and display information to the people inside if specific people (e.g., criminals or terrorists), events (e.g., a car accident or terrorist act), or conditions are present outside.
  • The two sides of such advanced AR-enabled windows are separated by an arbitrary thickness of various materials (e.g., metal, brick, etc.) that can provide various levels of protection to people behind the window.
  • For example, the windows may include a thickness of bullet-proof or explosive-proof material.
  • AR-enabled window system 100 includes a back layer 102 that is analogous to a smart tablet or large display or monitor equipped with an advanced camera (back layer camera 106) and screen (back layer display 104 facing away in FIG. 1 A) optionally enabled with AR and facial recognition technologies.
  • Back layer camera 106 is responsible for capturing the image from the "back" side of the AR-enabled window (e.g., a view of environment 108 within the field of view of back layer camera 106) and sending it to front layer display 114 through a communication medium (e.g., a high-speed wired communication bus) between the two layers.
  • Image data captured by front layer camera 116 of front layer 112 is sent to back layer display 104 of back layer 102 so that image data captured from front layer camera 116 can be displayed on back layer display 104.
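The cross-feed described in the two bullets above can be sketched in a few lines (a minimal illustration, not from the patent: NumPy arrays stand in for camera frames, and the function name is hypothetical):

```python
import numpy as np

def cross_feed(front_cam_frame: np.ndarray, back_cam_frame: np.ndarray):
    """Route each camera's frame to the display on the opposite layer.

    The front-layer display shows what the back-layer camera sees, and
    vice versa; this swap is what makes the window appear transparent.
    """
    front_display = back_cam_frame  # front side shows the back camera's view
    back_display = front_cam_frame  # back side shows the front camera's view
    return front_display, back_display

# Stand-in frames: a dark "street" view and a bright "shop interior" view.
street = np.zeros((480, 640, 3), dtype=np.uint8)
shop = np.full((480, 640, 3), 255, dtype=np.uint8)

front_display, back_display = cross_feed(front_cam_frame=shop, back_cam_frame=street)
```

In a real deployment the swap would run per frame over the communication medium between the layers rather than in memory.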
  • An optional middle layer 110 can be made of a range of materials, including wood, metal, composites, or plastics, depending on the requirements. This middle layer can even be absent in some cases (e.g., the front layer and the back layer are directly attached to each other).
  • One possible use of middle layer 110 is as a metal layer, which makes AR-enabled window system 100 bulletproof at a much lower cost than using bullet-proof glass.
  • Middle layer 110 can include a processing system (e.g., a computer) that locally processes various AR data received from front layer camera 116 and back layer camera 106.
  • the processing system can serve as a middle agent between front layer 112 and back layer 102 and external processing resources (e.g., cloud computing resources, server resources, workstation resources, etc.).
  • The processing system or external processing resources can also process the image data from the front and back cameras so as to make the images displayed on the respective displays appear as though each display is a translucent window.
  • For example, the image data may be resized, cropped, filtered, or otherwise modified as necessary to make the displayed image appear as though it is being viewed through a window.
  • The image processing may also take into account data describing a location, gaze, orientation, or other characteristic of one or more persons viewing the display that is displaying the image data.
  • This data may be extracted from image data from the front layer camera or the back layer camera (e.g., visible light data or depth information in the image data) or may be based on other sensor data (e.g., other cameras, depth sensors, etc.).
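As a rough sketch of the viewer-dependent processing above, the displayed region can be a crop of the camera frame that shifts with the viewer's position, mimicking parallax through a real window (all units and the offset-to-shift gain here are assumptions for illustration, not values from the patent):

```python
import numpy as np

def window_crop(frame: np.ndarray, viewer_offset=(0, 0), gain=0.5) -> np.ndarray:
    """Return a half-size crop of `frame`, shifted in proportion to the
    viewer's offset from the display centre to mimic window parallax.

    viewer_offset: (dx, dy) in frame pixels (hypothetical calibration).
    gain: how strongly viewer motion moves the crop; a crude stand-in
    for scene depth. Sign conventions would come from real calibration.
    """
    h, w = frame.shape[:2]
    crop_h, crop_w = h // 2, w // 2
    # Centre of the crop, displaced by the scaled viewer offset.
    cx = w // 2 + int(viewer_offset[0] * gain)
    cy = h // 2 + int(viewer_offset[1] * gain)
    # Clamp so the crop always stays inside the frame.
    x0 = min(max(cx - crop_w // 2, 0), w - crop_w)
    y0 = min(max(cy - crop_h // 2, 0), h - crop_h)
    return frame[y0:y0 + crop_h, x0:x0 + crop_w]

frame = np.arange(200 * 400).reshape(200, 400)
centered = window_crop(frame)
shifted = window_crop(frame, viewer_offset=(80, 0))
```

A production system would instead re-project the frame with a full off-axis perspective model, but the crop illustrates the idea of the view tracking the viewer.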
  • back layer 102 of AR-enabled window system 100 includes a back layer camera 106 and back layer display 104.
  • Front layer camera 116 is responsible for capturing the image from the "front" side of the AR-enabled window (e.g., a view of environment 118 within the field of view of front layer camera 116) and sending it to back layer display 104, through a communication medium (e.g., a high-speed wired communication bus, such as USB, Ethernet, PCIe, etc.) between the two layers, so that image 105 of environment 118 can be seen on back layer display 104 by the people at the "back" side of AR-enabled window system 100 in environment 108.
  • The communication interface between front layer 112 and back layer 102 is a high speed communication interface (HSCI), which is a network interface that allows front layer 112 to communicate with back layer 102 and vice versa.
  • the HSCI also allows either or both layers to communicate with a processing system associated with the system (e.g., located within middle layer 110 or next to system 100) and/or provide for additional communication capabilities.
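The patent does not specify a wire format for the HSCI; as one hedged sketch, frames could be exchanged between the layers as length-prefixed byte payloads over a stream socket (function names and the 4-byte big-endian header are assumptions):

```python
import socket
import struct

def send_frame(sock: socket.socket, payload: bytes) -> None:
    """Length-prefix the frame so the receiver knows where it ends."""
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_frame(sock: socket.socket) -> bytes:
    """Read the 4-byte length header, then exactly that many bytes."""
    header = b""
    while len(header) < 4:
        header += sock.recv(4 - len(header))
    (length,) = struct.unpack(">I", header)
    data = b""
    while len(data) < length:
        data += sock.recv(length - len(data))
    return data

# Loopback demonstration: one socket end stands in for each layer.
front_end, back_end = socket.socketpair()
send_frame(front_end, b"frame-from-front-camera")
received = recv_frame(back_end)
```

Real HSCI hardware (or a DisplayPort/Ethernet link) would replace the socket, but any transport needs the same framing discipline so video frames stay delimited.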
  • FIGS. 1A and 1B depict the operation of AR-enabled window system 100 from the "front-facing" view and "back-facing" view, respectively.
  • Front-facing view 115 lets a person looking into front layer display 114 see the video image captured by back layer camera 106 (e.g., the field of view of back layer camera 106 is displayed on front layer display 114). That is, effectively this person is "seeing through" AR-enabled window system 100 to see background environment 108.
  • In FIG. 1B, with back-facing view 105, a person looking into back layer display 104 will see the video image captured by front layer camera 116 (e.g., the field of view of front layer camera 116 is displayed on back layer display 104).
  • the person is also "seeing through” AR-enabled window system 100 to see background environment 118.
  • the AR-enabled window is effectively (or virtually) “transparent” with respect to the viewers from either side of it.
  • This "transparency" is implemented by front layer camera 116 and back layer camera 106 as well as front layer display 114 and back layer display 104.
  • Although FIGS. 1A and 1B depict front layer camera 116 and back layer camera 106 as separated from front layer display 114 and back layer display 104, respectively, the cameras may be integrated with the displays in some embodiments. Additionally, for ease of discussion, system 100 is shown so that both sides can be seen at the same time. In practice, system 100 may be implemented into a wall or building so that the displays appear as though they are a window of the wall or building. In some cases, the wall or other structure of the building can be middle layer 110.
  • This "transparency" effectively makes AR-enabled window system 100 appear as normal glass from the viewer's point of view.
  • AR-enabled window system 100 enables many possibilities. For example, some innovative applications of using AR-enabled window system 100 with a processing system are described below.
  • One example application is to provide bulletproof or other levels of protection by implementing middle layer 110 of AR-enabled window system 100 with a bulletproof metal or other appropriately protective material.
  • Glass windows are a very common feature at street level, but they are also very fragile.
  • Bulletproof windows are, however, very expensive.
  • With this approach, the cost of producing bulletproof or other forms of protective glass windows can be significantly reduced.
  • AR-enabled window system 100 includes facial recognition technology. For example, if AR-enabled window system 100 is deployed at street level, it can become a very efficient surveillance system that could prevent crimes or terrorist attacks from happening. For instance, in one example, many AR-enabled windows are deployed on a busy street, with a processing system providing a network interface connected to the nearest authorities. The back layer cameras (in this example, the "back layer" is the layer facing the outside environment) on the AR-enabled windows serve as an active surveillance system.
  • With this setup, any terrorists or dangerous suspects walking by, dangerous items (e.g., guns), and other features of interest can be identified. Once such a suspect or other feature is identified, the information can immediately be sent to the nearby authorities through the network interface attached to the system.
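The identify-then-notify flow might look like the following sketch (face detection and recognition are stubbed out as precomputed identifiers; the watchlist entries, `Alert` type, and window identifier are all hypothetical, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class Alert:
    """Minimal alert record to forward over the network interface."""
    window_id: str
    subject: str

def match_watchlist(detected_ids, watchlist, window_id="window-42"):
    """Compare identifiers produced by a (hypothetical) face recognizer
    against a watchlist and build alerts for the authorities."""
    return [Alert(window_id, s) for s in detected_ids if s in watchlist]

# Recognizer output for one frame: two passers-by and one watchlisted subject.
alerts = match_watchlist(
    detected_ids=["passerby-1", "suspect-A", "passerby-2"],
    watchlist={"suspect-A", "suspect-B"},
)
```

In practice the recognizer would run on the local processing system or external computing resources, and the alert would be transmitted through the network interface described below.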
  • FIG. 2 depicts a side view of AR-enabled window system 100.
  • HSCI 200 enables display of captured image data from back layer camera 106 on front layer display 114 and enables display of captured image data from front layer camera 116 on back layer display 104.
  • HSCI 200 can be implemented as a dedicated interface for communication between the cameras and displays (e.g., using standard interfaces such as DisplayPort, HDMI, Thunderbolt, USB, etc.) or as part of a processing system implemented within AR-enabled window system 100, possibly within middle layer 110.
  • Optional network interface 202 communicates data (e.g., captured image data from front layer camera 116 and/or back layer camera 106) to external computing resources 204 for further processing.
  • The surveillance system could identify suspicious actions, such as bomb installation or a person carrying or waving a gun, in front of the back layer cameras, and immediately warn the people with a view of the front layer display in some augmented reality manner, such as depicted in FIG. 3.
  • While back layer camera 106 of AR-enabled window system 100 is actively monitoring the street (e.g., by sending image data to local or remote processing resources), data from the back layer camera can identify such an action, which can result in an immediate warning message 306 on the corresponding front layer display 114 that is displaying image data 304 from back layer camera 106.
  • The people behind AR-enabled window system 100 can then immediately take protective actions against the attack (e.g., lock the door, hide, evacuate, etc.). If middle layer 110 is present and made of bulletproof material, then the people behind AR-enabled window system 100 will be further protected. Of course, if AR-enabled window system 100 is connected to the authorities, the police will be notified as well.
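Compositing warning message 306 onto the displayed camera feed can be sketched as a banner drawn over the frame (NumPy only, so the sketch stays self-contained; a real system would render warning text and richer AR graphics inside this region):

```python
import numpy as np

def add_warning_banner(frame: np.ndarray, banner_height: int = 40,
                       color=(255, 0, 0)) -> np.ndarray:
    """Paint a solid banner strip across the top of the frame; a real
    implementation would draw the warning text within this strip."""
    out = frame.copy()  # leave the original camera frame untouched
    out[:banner_height, :] = color
    return out

feed = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in camera frame
warned = add_warning_banner(feed)
```

Copying the frame before drawing keeps the raw camera data intact for recording or forwarding to the authorities while the annotated copy goes to the display.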
  • a processing system of AR-enabled window system 100 can include a high-performance CPU by itself, or can be connected to some other external computing resources (e.g., in the cloud).
  • computing system 400 may be used to implement portions of (or all of) system 100 described above that implements any combination of the above embodiments (e.g., the communication between the front and back layers, the processing system, etc.).
  • Computing system 400 may include, for example, a processor, memory, storage, and input/output peripherals (e.g., display, keyboard, stylus, drawing device, disk drive, Internet connection, camera/scanner, microphone, speaker, etc.).
  • computing system 400 may include circuitry or other specialized hardware for carrying out some or all aspects of the processes.
  • the main system 402 may include a motherboard 404 with a bus that connects an input/output (I/O) section 406, one or more microprocessors 408, and a memory section 410, which may have a flash memory card 412 related to it.
  • Memory section 410 may contain computer-executable instructions and/or data for carrying out the processes above.
  • The I/O section 406 may be connected to front display 412 and back display 413 (e.g., to display views), a front camera 428, a back camera 430, a microphone 416 (e.g., to obtain an audio recording), a speaker 418 (e.g., to play audio captured on one side of system 100 on the other side), a disk storage unit 420, and a media drive unit 422.
  • the media drive unit 422 can read/write a non-transitory computer-readable storage medium 424, which can contain programs 426 and/or data used to implement the techniques described above.
  • a non-transitory computer-readable storage medium can be used to store (e.g., tangibly embody) one or more computer programs for performing any one of the above-described processes by means of a computer.
  • The computer program may be written, for example, in a general-purpose programming language (e.g., Pascal, C, C++, Java, or the like) or some specialized application-specific language.
  • Computing system 400 may also include communication interface 440.
  • This interface includes one or more interfaces for communicating information with other computing systems.
  • communication interface 440 optionally includes one or more of a wireless interface (e.g., WiFi, 802.11, Bluetooth, 4G, LTE, etc.), wired interface (USB, Ethernet, Thunderbolt, etc.), or other type of interface (e.g., IR).
  • a system comprising:
  • a first camera configured to capture first image data of a first field of view in a first direction
  • a second camera configured to capture second image data of a second field of view in a second direction opposite the first direction
  • a first display facing the first direction and configured to display a portion of the second image data, the first display coupled to the second camera via a first communication line;
  • a second display facing the second direction and configured to display a portion of the first image data, the second display coupled to the first camera via a second communication line.
  • The electronic system of item 1, further comprising a support, wherein the first display is mounted on a first side of the support and the second display is mounted on a second side of the support.
  • the electronic system of any one of items 1-4 further comprising: a processing system coupled to the first and second cameras and the first and second displays.
  • The processing system includes a facial recognition system that processes captured image data from the first camera or the second camera.
  • The electronic device of any one of item 5 or item 6, wherein the processing system is configured to cause display on the first display of captured image information from the second camera together with information based on processing of captured image information from the second camera, wherein the displayed information is not captured image data from the second camera.
  • The first communication line and the second communication line are part of the same communication medium or are the same communication line.
  • The first communication line provides a direct connection between the first display and the second camera, and the second communication line provides a direct connection between the second display and the first camera.
  • The first display is disposed on a first side of the middle layer and the second display is disposed on a second side of the middle layer.
  • A method comprising:
  • capturing first image data (e.g., video data) with a first camera, wherein the first image data is of a field of view of the first camera;
  • capturing second image data (e.g., video data) with a second camera;
  • displaying the first image data on the second display while displaying the second image data on the first display so that the display of the first image data and the second image data gives an appearance that the first display and second display are a translucent window (e.g., processing the image data to give the correct perspective, correct size, correct color, etc.).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Ceramic Engineering (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A system includes a first camera configured to capture first image data of a first field of view in a first direction. The system also includes a second camera configured to capture second image data of a second field of view in a second direction opposite the first direction. A first display faces the first direction and is configured to display a portion of the second image data. The first display is coupled to the second camera via a first communication line. A second display facing the second direction is configured to display a portion of the first image data, the second display coupled to the first camera via a second communication line.

Description

AUGMENTED REALITY ENABLED WINDOWS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Patent Application Serial No. 62/460,594, entitled "Augmented Reality Enabled Windows," filed February 17, 2017, the content of which is hereby incorporated by reference for all purposes.
FIELD
[0002] The present disclosure relates to augmented reality and, more specifically, to augmented reality as applied to windows.
BACKGROUND
[0003] As the frequency of terrorist attacks increases to an alarming level in virtually every corner of the world, especially in well-developed countries, many governments spend significant amounts of money to deploy surveillance cameras on every busy street. The major drawback of this is that surveillance cameras can provide useful information only after an attack has happened.
[0004] Facial recognition technologies have gradually improved to a reasonably accurate level when cameras can capture the faces of potential suspects, a requirement that is virtually unachievable for surveillance camera networks. Security cameras are not typically deployed at eye level. In addition, terrorists are often intelligent enough to avoid surveillance cameras when they prepare for attacks (e.g., when hiding a bomb). This makes facial recognition useful only in cases such as corporate office entrance security, where the people entering a gate are usually cooperative and will look directly into the facial recognition camera.
[0005] A result of accidents or terrorist attacks of many types is that virtually all street-level glass windows will break. The broken glass can easily cause serious injury or even death to people near it, who are usually customers inside shops or restaurants on the street. However, glass windows are typically necessary because of their transparency.
SUMMARY
[0006] In one embodiment, an electronic device includes a first camera on a first side of the electronic device. There is a second camera on a second side of the electronic device. A first display on the first side of the electronic device is configured to display image data captured from the second camera. A second display on the second side of the electronic device is configured to display image data captured from the first camera.
BRIEF DESCRIPTION OF THE FIGURES
[0007] The present application can be best understood by reference to the figures described below, taken in conjunction with the accompanying drawings, in which like parts may be referred to by like numerals.
[0008] FIGS. 1A-1B depict an embodiment of the present technology.
[0009] FIG. 2 depicts a cross-sectional view of an embodiment of the present technology.
[0010] FIG. 3 depicts the response of an embodiment of the present technology to a terror event.
[0011] FIG. 4 depicts an exemplary device that may be used to implement embodiments of the present technology.
DETAILED DESCRIPTION
[0012] The following description is presented to enable a person of ordinary skill in the art to make and use the various embodiments. Descriptions of specific devices, techniques, and applications are provided only as examples. Various modifications to the examples described herein will be readily apparent to those of ordinary skill in the art, and the general principles defined herein may be applied to other examples and applications without departing from the spirit and scope of the present technology. Thus, the disclosed technology is not intended to be limited to the examples described and shown herein, but is to be accorded the scope consistent with the claims.
[0013] Some embodiments of the present technology can be used to replace glass windows in buildings (e.g., street-level windows), such as the glass window of a coffee shop, the display windows of retail shops, or just a normal window at the ground level of a building, with an advanced augmented reality (AR) enabled window that can monitor everyone walking by on the street outside the window and display information to the people inside if specific people (e.g., criminals or terrorists), events (e.g., a car accident or terrorist act), or conditions are present outside.
[0014] In some embodiments, the two sides of such advanced AR-enabled windows are separated by an arbitrary thickness of various materials (e.g., metal, brick, etc.) that can provide various levels of protection to the people behind the window. For example, the windows may have a thickness of bullet-proof or explosive-proof material. As a result, even if there is a dangerous or threatening event on one side of the window, the broken glass or other objects cannot reach the other side at all, which significantly minimizes injuries after an attack.
[0015] As depicted in FIG. 1A, in one embodiment, AR-enabled window system 100 includes a back layer 102 that is analogous to a smart tablet or large display or monitor equipped with an advanced camera (back layer camera 106) and screen (back layer display 104, facing away in FIG. 1A), optionally enabled with AR and facial recognition technologies. Back layer camera 106 is responsible for capturing the image from the "back" side of the AR-enabled window (e.g., a view of environment 108 within the field of view of back layer camera 106) and sending it to front layer display 114 through a communication medium (e.g., a high-speed wired communication bus, such as USB, Ethernet, PCIe, etc.) between the two layers, so that image 115 of environment 108 can be seen on front layer display 114 by the people at the "front" side of AR-enabled window system 100 in environment 118. The opposite is also true. Image data captured by front layer camera 116 of front layer 112 is sent to back layer display 104 of back layer 102 so that image data captured from front layer camera 116 can be displayed on back layer display 104.
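The cross-feed just described amounts to routing each camera's frames to the opposite display. The sketch below illustrates this routing as a pure function; the capture/show helpers in the comment are assumptions for illustration only, as the description does not specify an API.

```python
import numpy as np

def route_frames(front_frame: np.ndarray, back_frame: np.ndarray):
    """Cross-route the two camera feeds so that each display shows the
    opposite side's view, emulating a transparent pane."""
    # Front-side viewers see what the back layer camera captures,
    # and vice versa.
    return back_frame, front_frame  # (front display image, back display image)

# Illustrative driver loop (hypothetical helpers; e.g., OpenCV's
# cv2.VideoCapture could fill the capture role):
#   while True:
#       front = front_camera.read()
#       back = back_camera.read()
#       front_img, back_img = route_frames(front, back)
#       front_display.show(front_img)
#       back_display.show(back_img)
```

In a real deployment the two returned frames would travel over the communication medium between the layers before being shown.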
[0016] An optional middle layer 110 can be made of any of various materials, including wood, metal, composites, or plastics, depending on the requirements. This middle layer can even be absent in some cases (e.g., the front layer and the back layer are directly attached to each other). One possible use of middle layer 110 is as a metal layer that makes AR-enabled window system 100 bulletproof at a much lower cost than using bullet-proof glass. In some possible configurations, middle layer 110 can include a processing system (e.g., a computer) that processes various AR data received from front layer camera 116 and back layer camera 106 locally. Optionally, if a network interface is available for the processing system to connect to a network (e.g., the Internet or an intranet), the processing system can serve as a middle agent between front layer 112 and back layer 102 and external processing resources (e.g., cloud computing resources, server resources, workstation resources, etc.). The processing system or external processing resources can also process the image data from the front and back cameras so as to make the images displayed on the respective displays appear as though the displays are a translucent window. For example, the image data may be resized, cropped, filtered, or otherwise modified as necessary to make the displayed image appear as though it is being viewed through a window. The image processing may also take into account data describing a location, gaze, orientation, or other characteristic of one or more persons viewing the display that is displaying the image data. This data may be extracted from image data from the front layer camera or the back layer camera (e.g., visible light data or depth information in the image data) or may be based on other sensor data (e.g., other cameras, depth sensors, etc.).
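One simple way to realize the viewer-dependent cropping mentioned above is to take a display-sized crop of the camera frame and shift it opposite the viewer's offset from the window center, approximating window parallax. The sketch below is a simplification under stated assumptions (a pixel-space viewer offset and a frame at least as large as the crop); the description leaves the exact processing open.

```python
def window_crop(frame_w: int, frame_h: int,
                crop_w: int, crop_h: int,
                viewer_dx: int = 0, viewer_dy: int = 0):
    """Return (x, y, w, h): the region of the camera frame to display.

    Shifting the crop against the viewer's offset from the window's
    center approximates looking through a real pane; a fuller solution
    would use the gaze/depth data mentioned in the description.
    """
    # Desired crop center, shifted opposite the viewer's position.
    cx = frame_w // 2 - viewer_dx
    cy = frame_h // 2 - viewer_dy
    # Clamp so the crop stays inside the frame.
    x = min(max(cx - crop_w // 2, 0), frame_w - crop_w)
    y = min(max(cy - crop_h // 2, 0), frame_h - crop_h)
    return x, y, crop_w, crop_h
```

For example, a viewer centered on the window gets the centered crop, while a viewer standing to one side sees a crop shifted toward the opposite edge of the frame, as a real window would reveal.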
[0017] As depicted in FIG. 1B, in this embodiment, back layer 102 of AR-enabled window system 100 includes back layer camera 106 and back layer display 104. Front layer camera 116 is responsible for capturing the image from the "front" side of the AR-enabled window (e.g., a view of environment 118 within the field of view of front layer camera 116) and sending it to back layer display 104 through a communication medium (e.g., a high-speed wired communication bus, such as USB, Ethernet, PCIe, etc.) between the two layers, so that image 105 of environment 118 can be seen on back layer display 104 by the people at the "back" side of AR-enabled window system 100 in environment 108.
[0018] In some examples, the communication interface between front layer 112 and back layer 102 is a high speed communication interface (HSCI), which is a network interface that allows front layer 112 to communicate with back layer 102 and vice versa. Optionally, the HSCI also allows either or both layers to communicate with a processing system associated with the system (e.g., located within middle layer 110 or next to system 100) and/or provide for additional communication capabilities.
[0019] FIGS. 1A and 1B depict the operation of AR-enabled window system 100 from the "front-facing" view and "back-facing" view, respectively. In FIG. 1A, front-facing view 115 lets a person looking into front layer display 114 see the video image captured by back layer camera 106 (e.g., the field of view of back layer camera 106 is displayed on front layer display 114). That is, effectively this person is "seeing through" AR-enabled window system 100 to see background environment 108. Similarly, in FIG. 1B with back-facing view 105, a person looking into back layer display 104 will see the video image captured by front layer camera 116 (e.g., the field of view of front layer camera 116 is displayed on back layer display 104). In other words, the person is also "seeing through" AR-enabled window system 100 to see background environment 118. As a result, the AR-enabled window is effectively (or virtually) "transparent" with respect to viewers on either side of it. This "transparency" is implemented by front layer camera 116 and back layer camera 106 as well as front layer display 114 and back layer display 104.
[0020] While FIGS. 1A and 1B depict front layer camera 116 and back layer camera 106 as separated from front layer display 114 and back layer display 104, respectively, the cameras may be integrated with the displays in some embodiments. Additionally, for ease of discussion, system 100 is shown so that both sides can be seen at the same time. In practice, system 100 may be implemented into a wall or building so that the displays appear as though they are a window of the wall or building. In some cases, the wall or other structure of the building can be middle layer 110.
[0021] This "transparency" effectively makes AR-enabled window system 100 normal glass from the viewer's point of view. However, with the addition of a processing system in communication with front layer 112 and back layer 102, AR-enabled window system 100 enables many new possibilities. For example, some innovative applications of AR-enabled window system 100 with a processing system are described below.
[0022] One example application is to make AR-enabled window system 100 bulletproof, or protective to some other level, by implementing middle layer 110 with a bulletproof metal or other appropriately protective material. Glass windows are a very common decoration at street level, but they are also very fragile. Bulletproof windows are, however, very expensive. By using embodiments of AR-enabled window system 100, the cost of producing a bulletproof or other form of protective window can be significantly reduced.
[0023] In some embodiments, AR-enabled window system 100 includes facial recognition technology. For example, if AR-enabled window system 100 is deployed at the street level, it can become a very efficient surveillance system that could possibly prevent a crime or terrorist attack from happening. For instance, in one example, many AR-enabled windows are deployed on a busy street, with a processing system providing a network interface connected to the nearest authorities. The back layer cameras (in this example, the "back layer" is the layer facing the outside environment) on the AR-enabled windows serve as an active surveillance system. Through the built-in facial recognition and/or image recognition system available from the processing system or external processing resources in communication with the system, terrorists, dangerous suspects walking by, dangerous items (e.g., guns), and other features of interest can be identified. Once such a suspect or other feature is identified, the information can immediately be sent to the nearby authorities through the network interface attached to the system.
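The recognition step in this surveillance application reduces to comparing a detected face against a watchlist. The sketch below assumes a generic face-embedding model and a cosine-distance threshold; both are illustrative assumptions, since the description does not name a specific recognizer.

```python
import numpy as np

def match_watchlist(embedding: np.ndarray,
                    watchlist: dict,
                    threshold: float = 0.35):
    """Return the watchlist id whose reference embedding is closest to
    `embedding` under cosine distance, or None if none is within the
    threshold. The 0.35 threshold is an illustrative assumption."""
    best_id, best_d = None, threshold
    for wid, ref in watchlist.items():
        # Cosine distance: 1 - cos(angle between the two embeddings).
        d = 1.0 - float(np.dot(embedding, ref)
                        / (np.linalg.norm(embedding) * np.linalg.norm(ref)))
        if d < best_d:
            best_id, best_d = wid, d
    return best_id
```

A match would then be forwarded, together with the captured frame, to the authorities over the attached network interface.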
[0024] FIG. 2 depicts a side view of AR-enabled window system 100. HSCI 200 enables display of captured image data from back layer camera 106 on front layer display 114 and enables display of captured image data from front layer camera 116 on back layer display 104. HSCI 200 can be implemented as a dedicated interface for communication between the cameras and displays (e.g., using standard interfaces such as DisplayPort, HDMI, Thunderbolt, USB, etc.) or as part of a processing system implemented within AR-enabled window system 100, possibly within middle layer 110. Optional network interface 202 communicates data (e.g., captured image data from front layer camera 116 and/or back layer camera 106) to external computing resources 204 for further processing.
[0025] For example, the surveillance system could identify suspicious actions, such as bomb installation or a person carrying or waving a gun, in front of the back layer cameras and immediately warn the people with a view of the front layer display in some augmented reality manner, such as depicted in FIG. 3. For example, assume that gunman 302 suddenly lifts a gun on the street. Since back layer camera 106 of AR-enabled window system 100 is actively monitoring the street (e.g., by sending image data to local or remote processing resources), data from the back layer camera (or multiple back cameras) identifies such action, which can result in immediate warning message 306 on the corresponding front layer display 114 that is displaying image data 304 from back layer camera 106. As a result, the people behind AR-enabled window system 100 (e.g., the diners in FIG. 3) can immediately take protective actions against the attack (e.g., lock the door, hide, evacuate, etc.). If middle layer 110 is present and made of bulletproof material, then the people behind AR-enabled window system 100 will be further protected. Of course, if AR-enabled window system 100 is connected to the authorities, the police will be notified as well. In addition, since relatively intensive processing power may be needed to enable the active monitoring functionality, a processing system of AR-enabled window system 100 can include a high-performance CPU by itself, or can be connected to other external computing resources (e.g., in the cloud).
[0026] Note that from gunman 302's point of view, he/she only perceives a normal window because what he/she sees is the video image captured by front layer camera 116 (e.g., people inside the shop or restaurant).
[0027] Turning now to FIG. 4, components of an exemplary computing system 400, configured to perform any of the above-described processes and/or operations, are depicted. For example, computing system 400 may be used to implement portions of (or all of) system 100 described above in any combination of the above embodiments (e.g., the communication between the front and back layers, the processing system, etc.). Computing system 400 may include, for example, a processor, memory, storage, and input/output peripherals (e.g., display, keyboard, stylus, drawing device, disk drive, Internet connection, camera/scanner, microphone, speaker, etc.). However, computing system 400 may include circuitry or other specialized hardware for carrying out some or all aspects of the processes.
[0028] In computing system 400, the main system 402 may include a motherboard 404 with a bus that connects an input/output (I/O) section 406, one or more microprocessors 408, and a memory section 410, which may have a flash memory card 412 related to it. Memory section 410 may contain computer-executable instructions and/or data for carrying out the processes above. The I/O section 406 may be connected to front display 412 and back display 413 (e.g., to display views), a front camera 428, a back camera 430, a microphone 416 (e.g., to obtain an audio recording), a speaker 418 (e.g., to play audio captured on one side of system 100 on the other side), a disk storage unit 420, and a media drive unit 422. The media drive unit 422 can read/write a non-transitory computer-readable storage medium 424, which can contain programs 426 and/or data used to implement the techniques described above.
[0029] Additionally, a non-transitory computer-readable storage medium can be used to store (e.g., tangibly embody) one or more computer programs for performing any one of the above-described processes by means of a computer. The computer program may be written, for example, in a general-purpose programming language (e.g., Pascal, C, C++, Java, or the like) or some specialized application-specific language.
[0030] Computing system 400 may also include communication interface 440. This interface includes one or more interfaces for communicating information with other computing systems. For example, communication interface 440 optionally includes one or more of a wireless interface (e.g., WiFi, 802.11, Bluetooth, 4G, LTE, etc.), a wired interface (USB, Ethernet, Thunderbolt, etc.), or another type of interface (e.g., IR).
[0031] Various exemplary embodiments are described herein. Reference is made to these examples in a non-limiting sense. They are provided to illustrate more broadly applicable aspects of the disclosed technology. Various changes may be made and equivalents may be substituted without departing from the true spirit and scope of the various embodiments. In addition, many modifications may be made to adapt a particular situation, material, composition of matter, process, process act(s) or step(s) to the objective(s), spirit or scope of the various embodiments. Further, as will be appreciated by those with skill in the art, each of the individual variations described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several embodiments without departing from the scope or spirit of the various embodiments.
[0032] Exemplary methods, non-transitory computer-readable storage media, systems, and electronic devices are set out in the following items:
1. A system comprising:
a first camera configured to capture first image data of a first field of view in a first direction;
a second camera configured to capture second image data of a second field of view in a second direction opposite the first direction;
a first display facing the first direction and configured to display a portion of the second image data, the first display coupled to the second camera via a first communication line; and
a second display facing the second direction and configured to display a portion of the first image data, the second display coupled to the first camera via a second communication line.
2. The electronic system of item 1 further comprising:
a support, wherein the first display is mounted on a first side of the support and the second display is mounted on a second side of the support.
3. The electronic system of item 2, wherein the first side is opposite the second side.
4. The electronic system of any one of items 1-3, wherein the first display is mounted on a first support surface of the support and the second display is mounted on a second support surface of the support, wherein the first support surface is opposite the second support surface.
5. The electronic system of any one of items 1-4 further comprising: a processing system coupled to the first and second cameras and the first and second displays.
6. The electronic device of item 5, wherein the processing system includes a facial recognition system that processes captured image data from the first camera or the second camera.
7. The electronic device of any one of item 5 or item 6, wherein the processing system is configured to cause to display on the first display captured image information from the second camera with information based on processing of captured image information from the second camera, wherein the displayed information is not captured image data from the second camera.
8. The electronic device of any one of items 5-7, wherein the processing system is configured to identify a threat in the first field of view based on the first image data and to cause display of a message on the second display based on the threat.
9. The electronic device of item 8, wherein the processing system is configured to cause display of an indicator of the threat on the second display.
10. The electronic device of any one of item 2-9, wherein the support is made of a bulletproof material.
11. The electronic device of any one of item 2-10, wherein the support is made of metal.
12. The electronic device of any one of items 1-11, wherein the first camera is integrated with the first display.
13. The electronic device of any one of items 1-12, wherein the first display is integrated into a wall of a building.
14. The electronic device of any one of items 1-13, wherein the first display is integrated into a wall of a building.
15. The electronic device of any one of items 1-14, wherein the first communication line and the second communication line are part of the same communication medium or are the same communication line.
16. The electronic device of any one of items 1-15, wherein the first communication line provides a direct connection between the first display and the second camera and the second communication line provides a direct connection between the second display and the first camera.
17. The electronic system of item 1 further comprising:
a middle layer, wherein the first display is disposed on a first side of the middle layer and the second display is disposed on a second side of the middle layer.
18. The electronic system of item 17, wherein the first side of the middle layer is opposite a second side of the middle layer.
19. The electronic system of item 17 or 18, wherein the middle layer is made of a bulletproof material.
20. The electronic system of item 17 or 18, wherein the middle layer is made of metal.
21. A method comprising:
at an electronic system having a first display, a second display, a first camera, a second camera, and communication lines:
capturing first image data (e.g., video data) with the first camera, wherein the first image data is of a field of view of the first camera;
capturing second image data (e.g., video data) with the second camera, wherein the second image data is of a field of view of the second camera; and
displaying the first image data on the second display while displaying the second image data on the first display so that the display of the first image data and the second image data gives an appearance that the first display and second display are a translucent window (e.g., processing the image data to give the correct perspective, correct size, correct color, etc.).

Claims

CLAIMS
What is claimed is:
1. A system comprising:
a first camera configured to capture first image data of a first field of view in a first direction;
a second camera configured to capture second image data of a second field of view in a second direction opposite the first direction;
a first display facing the first direction and configured to display a portion of the second image data, the first display coupled to the second camera via a first communication line; and
a second display facing the second direction and configured to display a portion of the first image data, the second display coupled to the first camera via a second communication line.
2. The electronic system of claim 1 further comprising:
a support, wherein the first display is mounted on a first side of the support and the second display is mounted on a second side of the support.
3. The electronic system of claim 2, wherein the first side is opposite the second side.
4. The electronic system of claim 1, wherein the first display is mounted on a first support surface of the support and the second display is mounted on a second support surface of the support, wherein the first support surface is opposite the second support surface.
5. The electronic system of claim 1 further comprising:
a processing system coupled to the first and second cameras and the first and second displays.
6. The electronic system of claim 5, wherein the processing system includes a facial recognition system that processes captured image data from the first camera or the second camera.
7. The electronic system of claim 5, wherein the processing system is configured to cause to display on the first display captured image information from the second camera with information based on processing of captured image information from the second camera, wherein the displayed information is not captured image data from the second camera.
8. The electronic system of claim 5, wherein the processing system is configured to identify a threat in the first field of view based on the first image data and to cause display of a message on the second display based on the threat.
9. The electronic system of claim 8, wherein the processing system is configured to cause display of an indicator of the threat on the second display.
10. The electronic system of claim 2, wherein the support is made of a bulletproof material.
11. The electronic system of claim 2, wherein the support is made of metal.
12. The electronic system of claim 1, wherein the first camera is integrated with the first display.
13. The electronic system of claim 1, wherein the first display is integrated into a wall of a building.
14. The electronic system of claim 1, wherein the first display is integrated into a wall of a building.
15. The electronic system of claim 1, wherein the first communication line and the second communication line are part of the same communication medium or are the same communication line.
16. The electronic system of claim 1, wherein the first communication line provides a direct connection between the first display and the second camera and the second communication line provides a direct connection between the second display and the first camera.
17. The electronic system of claim 1 further comprising:
a middle layer, wherein the first display is disposed on a first side of the middle layer and the second display is disposed on a second side of the middle layer.
18. The electronic system of claim 17, wherein the first side of the middle layer is opposite a second side of the middle layer.
19. The electronic system of claim 17 or 18, wherein the middle layer is made of a bulletproof material.
20. The electronic system of claim 17 or 18, wherein the middle layer is made of metal.
21. A method comprising:
at an electronic system having a first display, a second display, a first camera, a second camera, and communication lines:
capturing first image data with the first camera, wherein the first image data is of a field of view of the first camera;
capturing second image data with the second camera, wherein the
second image data is of a field of view of the second camera; and
displaying the first image data on the second display while displaying the second image data on the first display so that the display of the first image data and the second image data gives an appearance that the first display and second display are a translucent window.
PCT/IB2018/000233 2017-02-17 2018-02-17 Augmented reality enabled windows WO2018150270A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762460594P 2017-02-17 2017-02-17
US62/460,594 2017-02-17

Publications (1)

Publication Number Publication Date
WO2018150270A1 true WO2018150270A1 (en) 2018-08-23

Family

ID=63170144

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2018/000233 WO2018150270A1 (en) 2017-02-17 2018-02-17 Augmented reality enabled windows

Country Status (1)

Country Link
WO (1) WO2018150270A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080201116A1 (en) * 2007-02-16 2008-08-21 Matsushita Electric Industrial Co., Ltd. Surveillance system and methods
US20120218191A1 (en) * 2011-02-25 2012-08-30 Amazon Technologies, Inc. Multi-display type device interactions

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080201116A1 (en) * 2007-02-16 2008-08-21 Matsushita Electric Industrial Co., Ltd. Surveillance system and methods
US20120218191A1 (en) * 2011-02-25 2012-08-30 Amazon Technologies, Inc. Multi-display type device interactions

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"BoldVu Outdoor LCD Displays", LG-MRI, 31 December 2016 (2016-12-31), XP055537827, Retrieved from the Internet <URL:https://web.archive.org/web/20170223204519/https://lg-mri.com/boldvu-outdoor-lcd-displays> [retrieved on 20180718] *
"Digital Signage 2017", VESTEL VISUAL SOLUTIONS, 31 December 2017 (2017-12-31), pages 1 - 60, XP055537822, Retrieved from the Internet <URL:http://www.vestelvisualsolutions.com/files/catalogue_1485448292.pdf> [retrieved on 20180701] *
WESTOVER, BRIAN: "Asus Taichi 21, Review & Rating", PCMAG.COM, 30 January 2013 (2013-01-30), XP055537826, Retrieved from the Internet <URL:https://www.pcmag.com/article2/0,2817,2414883,00.asp> [retrieved on 20180718] *

Similar Documents

Publication Publication Date Title
US10937290B2 (en) Protection of privacy in video monitoring systems
US10798313B2 (en) Preserving privacy in surveillance
US9928630B2 (en) Hiding sensitive content visible through a transparent display
US8860812B2 (en) Ambient presentation of surveillance data
CN104980653A (en) System and method of camera parameter updates in video surveillance systems
US9733881B2 (en) Managing digital object viewability for a transparent display system
TW200841737A (en) Video analytics for banking business process monitoring
US20150124091A1 (en) Door viewing system and apparatus
US20210390215A1 (en) Method for automatically protecting an object, a person or an item of information or visual work from a risk of unwanted viewing
KR102365995B1 (en) Display device capable of automatically adjusting displayed image and method thereof
KR101890134B1 (en) The analysis system and controlling method of moving image data by a CCTV monitor
US9292249B2 (en) System with content display management
TW201801520A (en) Image device and method for detecting a chief
WO2018150270A1 (en) Augmented reality enabled windows
KR20160003996A (en) Apparatus and method for video analytics
Mann et al. FreeGlass for developers,“haccessibility”, and Digital Eye Glass+ Lifeglogging research in a (sur/sous) veillance society
Michael Sousveillance: Implications for privacy, security, trust, and the law
JP2012120105A (en) Video communication system, video communication apparatus, and information disclosure degree adjusting apparatus
TWI480795B (en) Video monitoring and analyzing method
US10935796B2 (en) Sensor data conveyance
US11081089B2 (en) Flicker-fusion-based image capture privacy control system
WO2022022809A1 (en) Masking device
US20090284602A1 (en) Monitor with cameras
Michael Redefining surveillance: Implications for privacy, security, trust and the law
JP7180243B2 (en) surveillance systems and cameras

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18754148

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18754148

Country of ref document: EP

Kind code of ref document: A1