US20120229637A1 - System and method for detecting a camera - Google Patents
- Publication number
- US20120229637A1 (Application No. US 13/476,603)
- Authority
- US
- United States
- Prior art keywords
- image
- retro
- target area
- reflections
- axis
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/60—Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
Definitions
- Motion pictures are generally first released in movie theaters before being made available on consumer media. This limited monopoly ensures revenue so that production companies can recoup the costs of production.
- the method of the present embodiment includes, but is not limited to: acquiring a first image of a target area on the same axis as an illumination source; acquiring a second image of the target area on a different axis from the illumination source; and identifying retro-reflections in the target area by analyzing the first image and the second image by: filtering the first image; creating an image mask using the second image; applying the image mask to the first image; and separating features in the foreground in the first image larger than a predetermined size.
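The claimed analysis steps (filter the first image, create and apply a mask from the second image, separate features by size) can be sketched as follows. This is an illustrative NumPy implementation, not the patent's actual code; the brightness threshold and feature-size limit are assumed values.

```python
import numpy as np

def find_retro_reflections(on_axis, off_axis, threshold=200, max_size=25):
    """Sketch of the claimed analysis. `threshold` and `max_size`
    are illustrative values, not taken from the patent."""
    # Filter the first (on-axis) image: keep only bright pixels.
    bright = on_axis >= threshold
    # Create an image mask from the second (off-axis) image: anything
    # also bright off-axis is glint, not a retro-reflection.
    mask = off_axis < threshold
    # Apply the image mask to the first image.
    candidates = bright & mask
    # Separate foreground features by size: a true retro-reflection
    # stays small, so large connected blobs are rejected.
    hits = []
    seen = np.zeros_like(candidates)
    for y, x in zip(*np.nonzero(candidates)):
        if seen[y, x]:
            continue
        # Flood-fill the connected feature containing (y, x).
        stack, feature = [(y, x)], []
        while stack:
            cy, cx = stack.pop()
            if not (0 <= cy < candidates.shape[0] and 0 <= cx < candidates.shape[1]):
                continue
            if seen[cy, cx] or not candidates[cy, cx]:
                continue
            seen[cy, cx] = True
            feature.append((cy, cx))
            stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
        if len(feature) <= max_size:
            ys, xs = zip(*feature)
            hits.append((int(np.mean(ys)), int(np.mean(xs))))
    return hits
```

A small bright spot present only in the on-axis image is reported; anything bright in both images is masked out as glint.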
- the system of the present embodiment includes, but is not limited to: an illumination system for illuminating a target area; an image acquisition system for capturing a first image of the target area on the same axis as the illumination system and for capturing a second image of the target area on a different axis from the illumination system; and a processor and computer readable media having computer code for causing a processor to identify retro-reflections in the target area by analyzing the first image and the second image.
- FIG. 1 is a pictorial view of one embodiment of the system in a carrying case;
- FIG. 2 is a pictorial view of a hidden pirate camera being used in a movie theater;
- FIG. 3A is a pictorial view of a further embodiment of the system using three lenses;
- FIG. 3B is a schematic diagram of the embodiment depicted in FIG. 3A;
- FIG. 4 is a block diagram describing the functional layout of the embodiment depicted in FIGS. 3A and 3B;
- FIG. 5 is a pictorial view depicting a still further embodiment of the system employing a single illuminator for dual-axis illumination;
- FIG. 6 is a block diagram describing the functional layout of the embodiment depicted in FIG. 5;
- FIG. 7 is an electrical schematic diagram of one embodiment of the main controller board;
- FIGS. 8A, 8B, 8C and 8D are schematic diagrams depicting digital resampling;
- FIG. 9 is a schematic diagram depicting a still further embodiment of the system employing two scanning mirrors;
- FIG. 10 is a block diagram describing one embodiment of the method of detecting optical devices;
- FIG. 11 is a block diagram describing one embodiment of the method of image processing;
- FIG. 12 is a schematic diagram depicting a still further embodiment of the system employing a second off-axis camera; and
- FIG. 13 is a flowchart describing another embodiment of the method of detecting optical devices.
- the present teachings relate to the field of optical detection systems and methods and, more particularly, to a system and method for detecting hidden optical devices such as, although not limited thereto, cameras and video recorders.
- Disclosed herein are methods of and systems for locating the surreptitious use of “pirate” cameras in areas such as theaters, although not limited thereto.
- any place where one desires to prevent the use of cameras, video equipment, or other optical instruments may be a suitable use for this system. Examples include, although not limited thereto, sporting events, dramatic theater, political events, private functions, art galleries, trade shows, research laboratories, manufacturing facilities, bathrooms, hotel rooms, meetings, protection from snipers, etc.
- the system comprises an image capturing system (also referred to as a camera) and an illuminator, although not limited thereto.
- an area of interest may be illuminated with light by the illuminator and images of the area of interest may be taken with the camera at different exposure levels. Images of pirate cameras exhibit unique characteristics at different exposure levels because the light from the illuminator is reflected by their optical lenses. Comparing the images with the help of an algorithm, which may be implemented in software or hardware, although not limited thereto, helps to identify and locate a pirate camera.
- Detection of optical equipment using retro-reflection occurs whenever light entering the lens of an optical system is focused onto and reflected back from the focal plane of a lens system.
- the “lens system” can be, although not limited thereto, that of a camera, telescope, scope optics or the lens of an eye.
- the focal plane can be, although not limited thereto, a film plane, an electronic imaging device such as a charge-coupled device (CCD), or the retina of the eye.
- the amount of reflectivity at the focal plane can be very low and still produce detectable retro signals because the “gain” from the collection area of the lens system usually is quite large.
- the retro-reflection signal from optical equipment is along the same line-of-sight (LOS) as the interrogation beam of the illuminator (on-axis), but other sources of light in the vicinity of the target may produce a response as well, e.g., glare sources.
- These other random sources are “clutter” (also referred to as “glint”), and while detection without clutter rejection is possible, it often results in too many false positives.
- the system may comprise an off-axis illuminator 104 , a camera 100 and a filter 102 , although not limited to this embodiment.
- the off-axis illuminator 104 illuminates the target area in a specific band of light.
- the off-axis illuminator 104 may be close enough to the camera 100 that it is within the angle of incidence of any reflected light. Consequently, the off-axis illuminator 104 is able to generate retro-reflection from any optical devices in the target area.
- an infrared (IR) illuminator may be used.
- Other forms of light may also be used and the present teachings are not limited to this particular type of light.
- the IR illuminator may operate in the near-IR band with a center wavelength between approximately 700 nm and 1600 nm and a bandwidth of approximately 100 nm, although not limited to these particular ranges. Other types of light in various bandwidths may be more appropriate under different circumstances and are discussed further below.
- a laser may be used instead.
- the off-axis illuminator 104 provides light to the target area which will be reflected by certain objects such as optical equipment, which helps to identify the use of pirate cameras.
- the system may also include a camera 100 , which may be a charge-coupled device (CCD) digital camera, although not limited to this embodiment.
- the camera 100 may be able to detect the wavelength emitted by the off-axis illuminator 104 and record images that may be manipulated digitally. Additionally, the camera 100 may include means for manipulating exposure by aperture, exposure time or otherwise, although not limited to this embodiment.
- the camera 100 may include a filter 102 that accepts the wavelengths of light provided by the off-axis illuminator 104 while rejecting all other wavelengths, although not limited to this embodiment. This is helpful to identify true retro-reflections of the light from the off-axis illuminator 104 .
- the filter 102 will help to isolate the reflected IR light from visible light, assisting in identifying reflections.
- the system may be constructed as a single portable kit, or its components may be installed separately at a location in a permanent or semi-permanent fashion, although not limited to this embodiment.
- a carrying case 106 may contain the camera 100 , off-axis illuminator 104 and filter 102 , so that the system may be easily transported and brought to a temporary location such as an art gallery, although not limited to this embodiment.
- a more permanent location such as a movie theater may choose to incorporate the system into the stage or screen to assure a clear view of the gallery, although not limited to this embodiment.
- a large target area such as a theater, although not limited thereto, may be illuminated in sections by the off-axis illuminator 104 .
- the camera 100 may then capture multiple exposures of the illumination area, such as a short exposure image and a long exposure image, although not limited thereto.
- the amount of exposure may be controlled by changing the exposure time or other means, such as by changing the aperture size, and the system is not limited to this embodiment. Multiple images provide a more reliable method of detection of any pirate cameras.
- the short exposure image exposure time may be set to obtain the following image, although not limited to this embodiment:
- the intensity of the image of the pirate camera lens is near the detection camera's maximum possible intensity
- the long exposure image exposure time may be set to obtain the following image, although not limited to this embodiment:
- the image of the pirate camera lens still appears as a local maximum, e.g., a well-defined peak of bright pixels surrounded by darker pixels;
- the current exposure times for the short and long exposures are approximately 80 ms and approximately 750 ms, respectively, although not limited to these particular ranges.
- the exact exposure times depend on a number of settings/factors such as lens size, aperture size and camera sensitivity. It may be advantageous to keep the camera 100 relatively still between the exposures so that everything that is not moving in the illuminated area appears in substantially the same place in each image, although not limited to this embodiment.
- the output of the camera 100 may be coupled to a processor and storage device (neither shown in FIG. 1 ) such as a personal computer, which may be programmed according to the present system, although not limited to this embodiment.
- the coupling may be by physical connection to a proximate processor, or the camera 100 output may connect to the processor wirelessly, although not limited to this embodiment.
- the processor may contain software or have configured hardware in order to manipulate the images on the storage device, although not limited to this embodiment.
- the processor may manipulate the captured short and long exposure images according to a comparison algorithm, although not limited to this embodiment.
- the processor may determine the presence and location of any pirate cameras in the images taken by the camera 100 since the positives that are in both images are pirate cameras.
- the positives in the images are a result of light from the off-axis illuminator 104 reflected by the pirate camera lens.
- the positives (or reflected light) may have the following characteristics, although not limited to this embodiment:
- the signal is a local maximum for a wide range of exposure levels;
- the signal maintains roughly the same size across a wide range of exposure levels, while other image objects change apparent size as they are illuminated, saturating and 'bleeding' into other objects; and
- the signal does not move while images of it are acquired.
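One way to exploit these characteristics is to intersect the stationary local maxima of the short and long exposures. The sketch below is illustrative, not the patent's actual algorithm, and the brightness threshold is an assumed value.

```python
import numpy as np

def local_maxima(img, threshold):
    """Find well-defined peaks: bright pixels strictly brighter than
    their four neighbors. Saturated, 'bleeding' regions have equal
    neighbors and therefore do not qualify."""
    peaks = set()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = img[y, x]
            if v >= threshold and v > max(img[y - 1, x], img[y + 1, x],
                                          img[y, x - 1], img[y, x + 1]):
                peaks.add((y, x))
    return peaks

def detect_candidates(short_exp, long_exp, threshold=200):
    """Illustrative comparison: a pirate-camera lens appears as a bright
    local maximum in BOTH the short and the long exposure, at the same
    position, while background clutter does not."""
    return local_maxima(short_exp, threshold) & local_maxima(long_exp, threshold)
```

Because the camera is kept still between exposures, a peak present at the same pixel position in both images is treated as a retro-reflection candidate.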
- the unique reflected signal characteristic of pirate cameras may be due to the following, although not limited to this embodiment:
- the off-axis illuminator 104 is close enough to the camera 100 that it is within the angle of incidence of any reflected light. Consequently, the off-axis illuminator is able to generate retro-reflection from any optical devices in the target area.
- the high-exposure (e.g., 750 ms, etc.) image will include reflections from all background clutter.
- the low-exposure (e.g., 80 ms, etc.) image will include just the retro-reflections of optical devices. So optical devices may be identified by their presence in both images.
- Referring to FIG. 2, shown is a pictorial view of a hidden pirate camera being used in a movie theater.
- Pirate cameras 110 are known to have been smuggled into movie theaters and hidden in many different ways.
- a pirate camera 110 may be hidden in a popcorn box 112 .
- the pirate camera 110 may be smuggled into the movie theater and positioned in such a way that it has a clear view of the screen.
- the pirate camera 110 may record the entire motion picture and then be duplicated and sold on the black market.
- the system disclosed herein is able to identify a pirate camera 110 no matter how it is hidden since its lens will be directed toward the screen in order to record the movie. By positioning the present system at the screen facing outwards, any pirate camera 110 may be detected by light reflecting off of its lens.
- the system may have multiple lenses 130 , each with a specific purpose.
- the system may have: an off-axis illuminator 104 ; a camera 100 (or detector lens); and an on-axis illuminator 134 , each having lenses operationally connected.
- each lens may be a zoom lens, a wide-angle lens, or some other type of lens appropriate for the system.
- the illumination light sources for the off-axis illuminator 104 and on-axis illuminator 134 may be LEDs 128 (light-emitting diodes), although many other light sources and wavelengths are appropriate and the system is not limited to this particular embodiment. LEDs may be utilized because a pirate camera may employ an IR filter to block any retro-reflection from an IR illuminator.
- the system may also incorporate a beam splitter 120 and mirror 122 , although not limited to this embodiment, in order to provide for the on-axis illuminator 134 .
- the mirror 122 may reflect the light from the on-axis illuminator 134 to the beam splitter 120 , which in turn reflects the light to the target area along the axis of the camera 100 (on-axis). While the beam splitter 120 reflects the light from the on-axis illuminator 134 to the target area, it may still allow any light reflected from the target area to pass through to the camera 100. On-axis illumination is helpful to find true retro-reflection from pirate cameras.
- the off-axis illuminator 104 may then be used as a discriminator against false positives produced by the on-axis illuminator 134.
- when the system detects a pirate camera (or other optical device in the target area) by light reflected from the on-axis illuminator 134, it may activate the off-axis illuminator 104 to get reflected light from a different angle.
- a comparison of the reflections by the on-axis illuminator 134 and the off-axis illuminator 104 helps to minimize false positives.
- Optical equipment will only exhibit true retro-reflection, so if off-axis illumination of the same area also returns reflected light, it has to be background clutter and cannot be a pirate camera.
- the system may do subsequent interrogations to confirm any detections, although not limited to this embodiment.
- the system may be on a pan/tilt track 124 , although not limited thereto, permitting the scanning of an area larger than the field of view of a fixed camera 100 lens.
- the system may scan the desired target area—for example, although not limited thereto, a theater audience—with a low level interrogation source several times during a movie.
- the ability to pan may keep the feature size to a reasonable geometry, permitting the use of a lens with a smaller pixel count and an illumination source that requires less power, although not limited to this embodiment.
- the pixel count is used to make detections by measuring the size of the reflected light.
- Pirate camera lenses, for example, may exhibit more reflection than background clutter due to the retro-reflection of the optical lens.
- the system could also be used without a pan/tilt track 124 with a lens having a larger pixel count and an illumination source with power capable of illuminating the entire target area.
- the system may also incorporate forensic image acquisition, although not limited to this embodiment, and may take a forensic image of any people near a detected pirate camera.
- the system may do so by incorporating a capability to acquire and store images of the pirate camera and a nearby pirate camera operator in very low light levels, such as inside a theater, and transmit this image wirelessly to a system operator located elsewhere in the movie complex.
- the system may wirelessly notify an official, such as an usher or security guards, of the presence of a pirate camera, although not limited to this embodiment.
- the system may do so by incorporating a method to send an alert of positive camera detection to email, cell phone via MMS technology, or a pager, although not limited to these embodiments.
- This alert may include a copy of the forensic image of the pirate camera and operator as well as their physical location within the target area (e.g., a theater, etc.), although not limited to this embodiment.
- an usher may immediately receive notification of the pirate camera detection and immediately find the pirate camera and operator with minimal interruption to the rest of the audience.
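The alert described here might be assembled with Python's standard email library. In the sketch below the recipient address and function name are placeholders, and actual delivery (SMTP, or an email-to-MMS gateway) is left to the deployment.

```python
from email.message import EmailMessage

def build_alert(image_bytes, location, recipient="usher@example.com"):
    """Sketch of the positive-detection alert: an email carrying the
    forensic image and the physical location within the target area.
    The recipient address is a placeholder."""
    msg = EmailMessage()
    msg["Subject"] = f"Pirate camera detected at {location}"
    msg["To"] = recipient
    msg.set_content(f"Positive camera detection at {location}.")
    # Attach the forensic image of the pirate camera and operator.
    msg.add_attachment(image_bytes, maintype="image", subtype="jpeg",
                       filename="forensic.jpg")
    return msg
```

The resulting message could then be handed to `smtplib` for email, or addressed to a carrier's MMS gateway to reach a cell phone.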
- Control and monitoring of the system may be completely remote and in almost all cases such detection will be unknown to the operator of the pirate camera being interrogated.
- Features of the system may include active source control, high resolution imaging, remote zoom control, pre-programmed scan control, remote operation and image transmission, and low cost, although not limited to these embodiments.
- the on-axis illuminator 134 may use a highly directional LED 128 for its illumination source, which projects light through a zoom lens 206 positioned by the controller board underneath the pan and tilt platform 138 to a fixed mirror 122 .
- the light is then projected along the on-axis illumination light path 118 to the (dichroic) beam splitter 120 which projects the light along the axis of the camera 100 to the target area. Any reflection of the illumination light from objects in the target area is passed back through the beam splitter 120 and zoom lens 206 to the camera 100 .
- Signals from the camera pixels illuminated by the reflection are provided to the Local PC 146 as images and then sent by the controller board to an antenna which sends the image to a Remote PC (not shown in FIG. 3B ) controlled by a system operator.
- the image seen by the system operator could be either a retro-reflection from a pirate camera (or other optical device) in the target area if the path of the reflection is on the axis of camera 100 , or it could be background glint (clutter) if the path of the reflection is omni-directional.
- An off-axis illuminator 104 may be positioned above the camera 100 (as shown in FIG. 3A ) and used to reduce false positives of detections by the on-axis illuminator 134 .
- the light is projected from an LED 128 through a zoom lens 206 along the off-axis illumination light path 119 . Any reflection of the light from objects in the target area is passed back through the beam splitter 120 and zoom lens 206 to the camera 100 . Signals from the camera pixels illuminated by the reflection are provided to the Local PC 146 as images and sent by the controller board to an antenna which sends the image to the Remote PC controlled by a system operator.
- the image seen by the system operator can only be background glint (clutter) as a retro-reflection from a pirate camera illuminated by light projected from the off-axis illuminator 104 cannot be reflected back to the camera 100 .
- if the system operator gets a reflection when light is projected from the on-axis illuminator 134, but not when light is projected from the off-axis illuminator 104 (each pointed to the same location in the target area by the pan motor 137 and tilt motor 139 along the pan/tilt track 124 under the direction of the controller board), the reflection is a retro-reflection of a camera and not background glint.
- otherwise, the reflection is background glint.
- One embodiment of the method for identifying optical equipment in the target area includes the steps of illuminating an area with a light source on a first axis; capturing a first image of the illuminated area; identifying a potential optical device by its reflection characteristics in the first image; illuminating the potential optical device with a light source on a second axis; capturing a second image of the potential optical device; and identifying an actual optical device by its reflection characteristics in the second image.
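The discrimination step at the heart of this method reduces to a small truth table; the following sketch is illustrative (the function and label names are not from the patent):

```python
def classify(on_axis_bright: bool, off_axis_bright: bool) -> str:
    """Dual-axis discrimination: optical equipment exhibits true
    retro-reflection (on-axis only), while anything that also reflects
    off-axis illumination is omni-directional background glint."""
    if on_axis_bright and not off_axis_bright:
        return "retro-reflection"   # likely an optical device
    if on_axis_bright and off_axis_bright:
        return "glint"              # background clutter
    return "no detection"
```

A spot bright under both illumination axes cannot be a camera lens, so only on-axis-only returns survive as detections.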
- the Local PC may automatically detect the retro-reflection of pirate cameras with the dual-axis illumination method and image processing algorithm (discussed further below) by employing software stored on computer readable media.
- the Remote PC may also have computer instructions on computer readable media used to control the functions of the system.
- the Remote PC may have a graphical user interface (GUI) that permits a system operator, positioned remotely, although not limited thereto, to control the system functions via the Local PC, either wirelessly or otherwise.
- Each Remote PC may control a number of different Local PCs. For example, a movie theater complex may have 16 different screening rooms all employing the system and a single system operator may control and monitor them all from a single location.
- the GUI may have a map of the target area (e.g., theater) and communicate with the Local PC to identify the location of any pirate cameras and obtain forensic imaging of potential pirate camera operators.
- the Remote PC may monitor the system, accept alerts, and store images provided by the Local PC, although not limited thereto.
- the system operator may take control of the system through the GUI and pan/zoom the target area looking for a specific person, although not limited thereto.
- the Remote PC may also further analyze the images sent by the Local PC.
- the system may have a Remote PC 144 which acts as the master controller for all detectors (e.g., one Remote PC 144 could control several Local PCs), although not limited to this embodiment.
- the Local PC 146 may handle all functions of a local detector from image processing to motion control to illumination control, although not limited to this embodiment.
- the camera 172 may detect pirate cameras and also be used for real time situational awareness, although not limited to this embodiment.
- the controller board 170 may handle illuminators (e.g., on-axis lens 174 and off-axis lens 176 , etc.), light sensor 164 , pan motor 180 and tilt motor 182 , all lenses, and communicates with the Local PC 146 , although not limited to this embodiment.
- the light sensor 164 may keep track of the general illumination level of the target area to optimize accuracy, although not limited to this embodiment.
- Two LEDs may act as the main illuminators for detection and discrimination, although not limited to this embodiment. Both on-axis and off-axis light may be controlled with scanning mirrors.
- the system may incorporate non-sequential scanning, although not limited to this embodiment. The system may do so by incorporating a capability to randomly scan an entire movie theater several times during the showing of a movie. This minimizes the perception of the LED flash utilized by the device to detect camcorders.
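A minimal sketch of such a non-sequential scan schedule (the function name and the optional seed are illustrative; the seed exists only to make the sketch reproducible):

```python
import random

def scan_order(n_sections, seed=None):
    """Visit every section of the target area, but in a shuffled order,
    so the LED flash used for detection is less predictable and less
    perceptible to the audience."""
    order = list(range(n_sections))
    random.Random(seed).shuffle(order)
    return order
```

Every section is still interrogated on each pass; only the visiting order is randomized.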
- the system may also incorporate countermeasures mitigation, although not limited to this embodiment.
- the system may do so by incorporating image processing algorithms which provide the capability to be able to detect a retro reflection from a camcorder equipped with countermeasures such as circular polarizer filters, intended to defeat the pirate camera detection system. In this way, the user of a pirate camera will be unable to avoid detection.
- the system may also incorporate short pulse interrogation, although not limited to this embodiment.
- the system may do so by incorporating high powered LEDs that possess the capability to be pulsed at very short (e.g., sub-millisecond) durations to detect camcorders illegally in operation inside a movie theater.
- the QinetiQ-NA-developed LED drivers can be pulsed at sub-millisecond durations with a high degree of accuracy. Such short durations may be preferable to minimize perceptibility by the theater audience.
- Referring to FIG. 5, shown is a pictorial view depicting a still further embodiment of the system employing a single illuminator for dual-axis illumination.
- a mirror 122 may be movably controlled by a mirror controller 202 to enable the use of a single dual-axis illuminator 204 .
- the mirror controller 202 may allow the mirror 122 to change positions to reflect light from the dual-axis illuminator 204 as needed.
- the mirror 122 may reflect light from the dual-axis illuminator 204 to the beam splitter 120 , and then the light is reflected to the target area as on-axis light.
- the mirror controller 202 may move the mirror 122 out of the way (as shown in FIG. 5 ) so that the dual-axis illuminator 204 emits off-axis light directly to the target area.
- Light emitted by the dual-axis illuminator 204, either on-axis or off-axis, and reflected by the target area may travel through the beam splitter 120 and be recorded by the camera 100.
- the use of a mirror controller 202 and mirror 122 able to change positions eliminates the need for a second illuminator.
- the main controller board functions to direct the capabilities of each of the components of the system.
- the main controller board may be controlled locally by a Local PC.
- a Remote PC may permit an operator of the system to control the Local PC from another location.
- the detection equipment may be installed inconspicuously near a stage or presentation area facing the audience, but a system operator may be in another room or at a remote location controlling interrogation for pirate cameras.
- a system operator may even be in the audience and control the system by way of a wireless handheld computer device. This would allow the system operator to initiate an interrogation, identify a pirate camera, zoom and pan the camera to acquire forensic image information, as well as other functions.
- Referring to FIG. 7, shown is an electrical schematic diagram of one embodiment of the main controller board.
- Referring to FIGS. 8A, 8B, 8C and 8D, shown are schematic diagrams depicting digital resampling.
- the field of view (FOV) is adjusted to maintain a constant footprint (e.g., number of seats, etc.) in a scan view of the target area regardless of the distance from the camera 100.
- This is possible because the zoom range of the optics and the cross-sectional area of the fixed mirror in the system can accommodate the full range of the target area from the far-field FOV 250 to the near-field FOV 254 .
- because detection of pirate cameras (or other optical devices) is based on source size in the focal plane of the camera, maintaining a constant footprint as a function of range is desirable in order to use the same detection and imaging algorithms for each row in the theater.
- FIG. 8B shows reflection at mid-field FOV 252 over 4 pixels
- FIG. 8C shows reflection at mid-field FOV 252 enlarged by the ratio of mid-field/near-field (the signal average over mid-field/near-field pixels)
- FIG. 8D shows the reflection as resampled.
- the zoom range of the optics and the cross-sectional area of the scanning mirrors in one instance may not be sufficient to maintain the source size of the image in the focal plane of the camera 100 in the near-field FOV 254 as shown in FIGS. 8A and 8B .
- the size of the image is increased by the ratio between the distance from the camera at the zoom-optics limit used to adjust the mid-field FOV 252 and the distance from the camera to the near-field FOV 254, as shown in FIG. 8C . Therefore, in one instance digital re-sampling may be employed to decrease resolution in the near-field to the same level as images taken in the rest of the target area. In one instance this is accomplished by averaging the signal intensity over a pixel count equal to the same ratio by which the image has been enlarged, as shown in FIG. 8C . The portion of the image that is outside the FOV of each near-field scan, as shown in FIG. 8A , is captured in previous and subsequent scans and added to the re-sized image as shown in FIG. 8D .
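The averaging step described above can be sketched in code. This is a minimal sketch, not the patent's implementation; the function name and the list-of-rows image representation are assumptions, and the ratio is assumed to divide the image dimensions evenly:

```python
def downsample_by_averaging(image, ratio):
    """Reduce resolution by averaging signal intensity over ratio x ratio
    pixel blocks, as described for near-field digital re-sampling.
    `image` is a list of rows of pixel intensities."""
    rows, cols = len(image), len(image[0])
    out = []
    for r in range(0, rows, ratio):
        out_row = []
        for c in range(0, cols, ratio):
            # Average one ratio x ratio block into a single output pixel.
            block = [image[r + dr][c + dc]
                     for dr in range(ratio) for dc in range(ratio)]
            out_row.append(sum(block) / len(block))
        out.append(out_row)
    return out
```

For example, a near-field image enlarged by a ratio of 2 would be reduced by averaging each 2×2 block into a single pixel.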
- a dual-axis illuminator 204 projects light through a zoom lens 206 positioned by the controller board 170 to a mirror 122 capable of being moved in multiple positions.
- When the mirror 122 is positioned by a mirror controller 202 , which may be an electromagnet, although not limited thereto, controlled by a signal from the controller board 170 , the light is projected to a (dichroic) beam splitter 120 and is reflected as on-axis illumination light 118 to the on-axis scan mirror 212 .
- the on-axis scan mirror 212 is controlled by a pan/tilt motor 214 and associated pan/tilt controller 216 .
- the on-axis scan mirror 212 projects the light in a scanning motion left and right and up and down by the pan/tilt motor 214 .
- Illumination light reflected by objects in the target area is captured by the on-axis scan mirror 212 and projected through the beam splitter 120 and camera lens 222 to the camera 100 .
- Pixels illuminated by the reflection in the captured image are processed by the Local PC 146 and sent by the controller board 170 to a wireless network card 224 which sends the image to a Remote PC controlled by a system operator.
- the image seen by the system operator could be a retro-reflection off a pirate camera if the path of the reflection is on the axis of the camera, or background glint if the path of the reflection is omni-directional.
- When the mirror 122 is placed in the up position by the mirror controller 202 , such as by turning off the electromagnet controlled by a signal from the controller board 170 , the light is projected along the off-axis illumination light 119 path to the off-axis scan mirror 210 , which is controlled by a pan/tilt motor 214 and associated pan/tilt controller 216 .
- the off-axis scan mirror 210 projects the light in a scanning motion left and right and up and down by the pan/tilt motor 214 . Any reflection from objects in the target area is then captured by the on-axis scan mirror 212 and projected through the beam splitter 120 and camera lens 222 to the camera 100 . Pixels in the captured image illuminated by reflections off objects in the target area are imaged by the Local PC 146 and sent by the controller board 170 to the wireless network card 224 , which sends the image to the Remote PC controlled by a system operator.
- the image seen by the system operator can only be background glint (clutter) as the reflected light projected by off-axis scan mirror 210 is off-axis from the camera 100 .
- a retro-reflection from a pirate camera illuminated by light projected from the off-axis scan mirror 210 cannot be reflected back to the camera 100 .
- the reflection is a true retro-reflection off a pirate camera and not background glint.
- the reflection is background glint (clutter).
- Scanning mirrors may be employed for non-sequential scanning. This minimizes the perception of the LED flash by people in the target area.
- both an on-axis scan mirror 212 and an off-axis scan mirror 210 are synchronized so that they aim in the same direction.
- the on-axis scan mirror 212 may both disperse light from the dual-axis illuminator 204 and collect any reflections from objects in the target area. If a reflection is discovered, it may be the retro-reflection from an optical device. Off-axis light may then be used as a discriminator against false positives.
- the off-axis scan mirror 210 may disperse off-axis light from the dual-axis illuminator 204 and the on-axis scan mirror 212 may again collect any reflections from objects in the target area. If an off-axis light reflection is found at the same spot where an on-axis light reflection was found, it is not a true retro-reflection of an optical device, but is instead background clutter.
- Referring now to FIG. 10, shown is a block diagram describing one embodiment of the method of detecting optical devices.
- On-axis illumination may be used to first identify potential optical devices in the target area by their retro-reflection. Since optical devices can only exhibit retro-reflection on-axis, a subsequent, off-axis illumination source is then used. If the potential optical device also reflects light from the off-axis illumination source, it is not an optical device.
- the following steps may be performed, although not limited thereto: illuminating an area with a light source on a first axis; capturing a first image of the illuminated area; identifying a potential optical device by its reflection characteristics in the first image; illuminating the potential optical device with a light source on a second axis; capturing a second image of the potential optical device; and identifying an actual optical device by its reflection characteristics in the second image.
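The final comparison implied by these steps can be sketched as a set difference over detected reflection locations. This is a simplified illustration under the assumption that reflections in each captured image have already been reduced to (row, column) coordinates; the function name is hypothetical:

```python
def classify_reflections(on_axis_spots, off_axis_spots):
    """Given (row, col) locations where reflections were detected in the
    on-axis and off-axis images, keep only true retro-reflections: a spot
    that also appears under off-axis illumination is background glint,
    since an optical device can only retro-reflect on-axis."""
    return sorted(set(on_axis_spots) - set(off_axis_spots))
```

Here a spot found only in the on-axis image survives as a candidate optical device, while any spot also reflecting off-axis light is discarded as clutter.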
- the Local PC may have computer readable media running software to employ the image processing algorithm, which identifies the retro-reflection of potential optical devices in the captured images, although not limited thereto.
- the algorithm may use characteristics of the non-illuminated and off-axis images for comparison with a captured image. True retro-reflections will exhibit distinguishable reflection characteristics, so the algorithm may identify reflections over a minimum size.
- the thresholds in the algorithm are variable. Increasing the sensitivity may have the advantage of detecting optical devices equipped with a circular polarizer or other countermeasures.
- a disadvantage may include the false positive detection of human eye retro-reflections. To mitigate this, the system may conduct multiple interrogations of the same location in order to track and compare detection points.
- a blue LED illumination source may also be used to reflect significantly less light off of the human eye.
- the following steps may be performed, although not limited thereto: apply high-pass filter to the on-axis image; create a mask from the non-illuminated and off-axis images; apply mask to filtered on-axis image; auto threshold result; and accept features of a certain size.
- a camera or cameras may first capture several images of the region of interest for manipulation and analysis.
- the images may include, although not limited thereto: 1) an on-axis image of the illuminated target area; 2) an off-axis image of the illuminated target area; and 3) a non-illuminated image of the target area (multiple images may be taken, discussed further below).
- Each of these initial images may first be resized if they are taken at a distance which is less than the minimum optical zoom distance. Resizing may be based on a “distance ratio” equal to the actual distance divided by the minimum optical zoom distance. Resizing serves to normalize any features in the images for all distances.
- the acquired images may be used to identify retro-reflections of any optical devices found in the target area.
- a "working image" used for image processing may begin with a "maximum static background image," although not limited thereto, which may be created by computing the maximum pixel value at each pixel location of all of the non-illuminated images to obtain a single non-illuminated image. Taking multiple non-illuminated images reduces the illumination variability due to ambient light such as the picture screen as well as light having different illumination frequencies such as running lights.
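The pixel-wise maximum described above can be sketched as follows; the function name and the list-of-lists image representation are assumptions, not the patent's implementation:

```python
def maximum_static_background(images):
    """Compute the pixel-wise maximum across several non-illuminated
    images, producing a single 'maximum static background image' that
    captures ambient-light variability (e.g., the picture screen or
    running lights) over the capture interval."""
    rows, cols = len(images[0]), len(images[0][0])
    return [[max(img[r][c] for img in images) for c in range(cols)]
            for r in range(rows)]
```

Subtracting this background image from the on-axis image then suppresses static illumination sources, as described in the next step.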
- the working image may then be subtracted from the on-axis image, although not limited thereto, to create a new working image used in subsequent processing. This serves to minimize any static illumination sources and nullify inherent camera noise.
- the working image may be convolved with a low pass filter, although not limited thereto.
- a 3×3 Gaussian filter kernel may be used, although not limited thereto:
- a low pass filter removes high frequency detail (e.g., blurs) and reduces the optical differences caused by aliasing and a reduced depth of field.
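A minimal sketch of the low-pass step follows. The kernel values are not reproduced in this text, so the classic 3×3 Gaussian approximation [[1,2,1],[2,4,2],[1,2,1]]/16 is assumed here for illustration, with edge pixels replicated at the border:

```python
# Assumed 3x3 Gaussian kernel (illustrative; not taken from the patent).
KERNEL = [[1, 2, 1], [2, 4, 2], [1, 2, 1]]
KSUM = 16  # normalization factor (sum of kernel weights)

def convolve3x3(image):
    """Convolve an image with the kernel above, clamping coordinates at
    the border so the output has the same size as the input."""
    rows, cols = len(image), len(image[0])
    def px(r, c):  # replicate edge pixels outside the image
        return image[max(0, min(rows - 1, r))][max(0, min(cols - 1, c))]
    return [[sum(KERNEL[dr + 1][dc + 1] * px(r + dr, c + dc)
                 for dr in (-1, 0, 1) for dc in (-1, 0, 1)) / KSUM
             for c in range(cols)]
            for r in range(rows)]
```

Because the weights sum to the normalization factor, uniform regions pass through unchanged while isolated bright pixels are spread into their neighbors (the blurring described above).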
- the working image may be convolved with the high pass filter.
- a 5×5 high pass filter kernel may be used, although not limited thereto.
- a high pass filter extracts high frequency detail (e.g., sharpens) by removing low frequencies and separating any retro-reflection signals from background signals.
- An image mask may be created to accentuate all of the sources of reflection in the region of interest, although not limited thereto.
- the image mask may begin with a formation of a maximum image by computing the maximum pixel value at each pixel location of all of the non-illuminated images and the off-axis image to obtain a single image. From the maximum image all sources of reflection and illumination for a finite amount of time are gathered. The maximum image may then be blurred using a low pass filter to reduce the optical differences caused by aliasing and a reduced depth of field. The image mask may then be sharpened using a high pass filter to separate retro-reflection from the background. A maximum filter may then be applied to the image resulting from low pass filtering the maximum image.
- the maximum filter may be a 5×5 window, although not limited thereto, which slides across the image 1 pixel at a time and sets the pixel value of every pixel in the 5×5 window to the maximum value found inside that same window. This increases the contrast and magnifies the size of any features. Sliding a window 1 pixel at a time samples the neighboring region around every individual pixel as opposed to a subset of pixels.
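The sliding maximum filter can be sketched as below. This version centers the window on each pixel and takes the neighborhood maximum, clipping the window at the image border; it is one common reading of the windowing described above, not necessarily the exact implementation:

```python
def maximum_filter(image, size=5):
    """Slide a size x size window across the image one pixel at a time
    and set each output pixel to the maximum value inside the window
    centred on it, increasing contrast and magnifying features."""
    rows, cols = len(image), len(image[0])
    half = size // 2
    return [[max(image[rr][cc]
                 for rr in range(max(0, r - half), min(rows, r + half + 1))
                 for cc in range(max(0, c - half), min(cols, c + half + 1)))
             for c in range(cols)]
            for r in range(rows)]
```

A single bright pixel is thus dilated into a size×size bright block, which is what magnifies small retro-reflection features before thresholding.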
- a histogram based binary threshold using 256 bins may be applied to the image resulting from the application of the maximal filter.
- a threshold may be applied to the image using the histogram based on a predetermined value, which may be determined empirically. Applying this threshold separates the foreground (signal) from the background (noise).
- the image resulting from the application of the threshold is referred to as an image mask.
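A minimal sketch of the histogram and threshold steps, assuming 8-bit pixel values and a hypothetical, empirically chosen threshold:

```python
def histogram256(image):
    """256-bin histogram of 8-bit pixel values."""
    bins = [0] * 256
    for row in image:
        for v in row:
            bins[v] += 1
    return bins

def binary_threshold(image, threshold):
    """Apply a binary threshold: pixels at or above `threshold` become
    foreground (1), the rest background (0). The threshold value itself
    would be chosen empirically from the histogram."""
    return [[1 if v >= threshold else 0 for v in row] for row in image]
```

The binary result of this step is the image mask applied to the working image in the following operation.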
- the image mask may then be applied to the working image, although not limited thereto, by multiplying the two images together. This minimizes the false positive detection rate by masking out potential sources of reflection caused by screen illumination or off axis illumination.
- the signal (foreground) is separated from the noise (background).
- an ISODATA (Iterative Self-Organizing Data Analysis Techniques) algorithm may then be applied to the working image to find thresholds, although not limited thereto. (See, for example, Thresholding Using the ISODATA Clustering Algorithm, IEEE Transactions on Systems, Man and Cybernetics, Volume 10, Issue 11, November 1980, pages 771-774, which is incorporated by reference herein in its entirety.) This may be accomplished by using a 7×7 window, although not limited thereto:
- a starting threshold value is picked, which may be a midpoint pixel value for the neighboring area.
- the number of pixels above the threshold (foreground) and the number of pixels below the threshold (background) are counted in running subtotals.
- the new threshold may be equal to: ((foregroundTotal/foregroundCount)+(backgroundTotal/backgroundCount))/2. If new threshold is equal to the previous threshold, then the pixels within the window may use the new threshold, otherwise this process may be repeated. This serves to separate the foreground (signal) from the background (noise).
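The iteration described in the preceding steps can be sketched as follows; this is an illustrative ISODATA-style threshold for one window of pixel values, with the midpoint used as the starting threshold:

```python
def isodata_threshold(values):
    """Iterative (ISODATA-style) threshold for a window of pixel values:
    start from the midpoint, split into foreground/background, set the
    new threshold to the average of the two class means, and repeat
    until the threshold stops changing."""
    t = (max(values) + min(values)) / 2  # midpoint starting threshold
    while True:
        fg = [v for v in values if v > t]   # foreground pixels
        bg = [v for v in values if v <= t]  # background pixels
        if not fg or not bg:
            return t  # one class is empty; nothing left to separate
        new_t = (sum(fg) / len(fg) + sum(bg) / len(bg)) / 2
        if new_t == t:
            return t
        t = new_t
```

The loop terminates because the class partition can only change finitely many times; once the partition is stable, the recomputed threshold repeats and is returned.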
- these teachings are not limited to the ISODATA algorithm; other clustering and pattern recognition algorithms are within the scope of these teachings. (See, for example, Handbook of Pattern Recognition and Image Processing, T. Y. Young and K. S. Fu, Chapter 2, pp. 33-57, 1986, ISBN 0-12-774560-2, which is incorporated by reference herein in its entirety.)
- features found in the signal (foreground) that are larger than the maximum retro-reflection size may be excluded.
- feature size discrimination may be performed on the image resulting from the clustering operation or the image resulting from the multiplication of the image mask with the working image (both of which are referred to as the resulting working image, or simply the working image) using a window of predetermined size.
- In one exemplary embodiment, a window of predetermined size of 11×11 is utilized, although not limited thereto.
- the feature size discrimination determines if a blob, or group of pixels such as 2 or more, is inside the window.
- If the blob and all of its edges remain inside the window area, it passes the size constraints; however, if the blob extends out past the border of the window, the blob fails the size constraints. This process helps to exclude features found in the foreground which are larger than a maximum retro-reflection size.
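The size test can be sketched with a bounding-box check; representing a blob as a list of (row, col) foreground pixels is an assumption made here for illustration:

```python
def passes_size_constraint(blob_pixels, window=11):
    """A blob (list of (row, col) foreground pixels) passes the size
    constraint if its bounding box fits entirely inside a
    window x window area; a blob extending past that border is rejected
    as larger than the maximum retro-reflection size."""
    rows = [r for r, _ in blob_pixels]
    cols = [c for _, c in blob_pixels]
    height = max(rows) - min(rows) + 1
    width = max(cols) - min(cols) + 1
    return height <= window and width <= window
```

A compact retro-reflection of a few pixels passes, while an extended glare streak wider or taller than the window is excluded.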
- Referring now to FIG. 12, shown is a schematic diagram depicting a still further embodiment of the system employing a second off-axis camera 262 .
- This embodiment employs two cameras and a single dual-axis illuminator 204 .
- This embodiment allows for the detection of optical devices by taking two images of an illuminated target area at the same time.
- the dual-axis illuminator 204 projects light along the on-axis illumination light path 118 through the beam splitter 120 and to the on-axis scan mirror 212 for dispersal to the target area. Any reflected light from the target area is captured by the on-axis scan mirror 212 and sent back to the beam splitter 120 where it is now reflected and sent to the camera 100 .
- any reflected light is captured by the off-axis scan mirror 210 and travels along the off-axis reflected light path 260 to the off-axis camera 262 .
- any optical devices can be identified since retro-reflection from the dual-axis illuminator 204 will only be captured by the camera 100 . If the off-axis camera 262 also captures a reflection when both the on-axis scan mirror 212 and off-axis scan mirror 210 are directed at the same illuminated area, then it is not a retro-reflection and must instead be background glint.
- Referring now to FIG. 13, shown is a flowchart describing another embodiment of the method of detecting optical devices.
- the following steps may be performed, although not limited thereto: acquiring a first image of a target area on the same axis as an illumination source; acquiring a second image of the target area on a different axis from the illumination source; and identifying retro-reflections in the target area by analyzing the first image and the second image by: filtering the first image; creating an image mask using the second image; applying the image mask to the first image; and separating features in the foreground in the first image larger than a predetermined size.
- illumination refers to any source of electro-magnetic radiation and it is not limited to LEDs, infrared light, or any other form of light. As discussed above, electromagnetic radiation of different wavelengths may be preferable in certain circumstances and an illuminator that creates a retro-reflection in optical devices at any wavelength may be used with the system.
- a camera refers to any image acquisition system.
- a camera may comprise optical, electronic, and/or mechanical components. As discussed above, its only requirement is that it be able to acquire an image of the target area which may be used for detecting optical devices.
Abstract
A system and method for detecting a camera. In one embodiment, although not limited thereto, an illuminator illuminates an area of interest. A camera then takes multiple pictures of the illuminated area and an algorithm is then used to compare the pictures and locate any pirate cameras based on their reflection characteristics.
Description
- This application is a continuation of co-pending U.S. application Ser. No. 12/545,504 filed Aug. 21, 2009, entitled SYSTEM AND METHOD FOR DETECTING A CAMERA, which in turn claims priority of U.S. Provisional Application Ser. No. 61/091,955, filed on Aug. 26, 2008, and U.S. Provisional Application Ser. No. 61/176,700, filed on May 8, 2009, both of which this application incorporates by reference in their entirety for all purposes.
- Many people have the need or desire for privacy. With the advance of technology related to photographic and video recording equipment has come its widespread use. In fact, many cell phones now have cameras as standard equipment. Such accessibility to recording equipment enables users to easily and surreptitiously record images of people and their private property.
- Motion pictures are generally first released in movie theaters before being made available on consumer media. This limited monopoly ensures revenue so that production companies can recoup the costs of production. There have been numerous incidents in which movies have been pirated during screenings in theaters and then released on the black market. Movies are sometimes pirated by smuggling video cameras into a theater and filming the showing. The pirated video may then be copied and distributed illegally to consumers. Movie pirating is a major problem that is estimated to be costing the movie industry billions of dollars a year in lost profits.
- A need exists in the motion picture industry as well as many others to address problems associated with the unauthorized use of cameras, video recorders, or other optical devices. Therefore, it would be beneficial to have a superior system and method for detecting optical devices.
- The needs set forth herein as well as further and other needs and advantages are addressed by the present embodiments, which illustrate solutions and advantages described below.
- The method of the present embodiment includes, but is not limited to: acquiring a first image of a target area on the same axis as an illumination source; acquiring a second image of the target area on a different axis from the illumination source; and identifying retro-reflections in the target area by analyzing the first image and the second image by: filtering the first image; creating an image mask using the second image; applying the image mask to the first image; and separating features in the foreground in the first image larger than a predetermined size.
- The system of the present embodiment includes, but is not limited to: an illumination system for illuminating a target area; an image acquisition system for capturing a first image of the target area on the same axis as the illumination system and for capturing a second image of the target area on a different axis from the illumination system; and a processor and computer readable media having computer code for causing a processor to identify retro-reflections in the target area by analyzing the first image and the second image.
- Other embodiments of the system and method are described in detail below and are also part of the present teachings.
- For a better understanding of the present embodiments, together with other and further aspects thereof, reference is made to the accompanying drawings and detailed description, and its scope will be pointed out in the appended claims.
-
FIG. 1 is a pictorial view of one embodiment of the system in a carrying case; -
FIG. 2 is a pictorial view of a hidden pirate camera being used in a movie theater; -
FIG. 3A is a pictorial view of a further embodiment of the system using three lenses; -
FIG. 3B is a schematic diagram of the embodiment depicted in FIG. 3A ; -
FIG. 4 is a block diagram describing the functional layout of the embodiment depicted in FIGS. 3A and 3B ; -
FIG. 5 is a pictorial view depicting a still further embodiment of the system employing a single illuminator for dual-axis illumination; -
FIG. 6 is a block diagram describing the functional layout of the embodiment depicted in FIG. 5 ; -
FIG. 7 is an electrical schematic diagram of one embodiment of the main controller board; -
FIGS. 8A, 8B, 8C and 8D are schematic diagrams depicting digital resampling; -
FIG. 9 is a schematic diagram depicting a still further embodiment of the system employing two scanning mirrors; -
FIG. 10 is a block diagram describing one embodiment of the method of detecting optical devices; -
FIG. 11 is a block diagram describing one embodiment of the method of image processing; -
FIG. 12 is a schematic diagram depicting a still further embodiment of the system employing a second off-axis camera; and -
FIG. 13 is a flowchart describing another embodiment of the method of detecting optical devices. - The present teachings are described more fully hereinafter with reference to the accompanying drawings, in which the present embodiments are shown. The following description is presented for illustrative purposes only and the present teachings should not be limited to these embodiments.
- The present teachings relate to the field of optical detection systems and methods and, more particularly, to a system and method for detecting hidden optical devices such as, although not limited thereto, cameras and video recorders. Disclosed herein are methods of and systems for locating the surreptitious use of “pirate” cameras in areas such as theaters, although not limited thereto. In fact, any place where one desires to prevent the use of cameras, video equipment, or other optical instruments may be a suitable use for this system. Examples include, although not limited thereto, sporting events, dramatic theater, political events, private functions, art galleries, trade shows, research laboratories, manufacturing facilities, bathrooms, hotel rooms, meetings, protection from snipers, etc.
- Camera phones and other related consumer technology have made it much easier to take still photographs and video anywhere, creating a legitimate concern among those who wish to retain some level of privacy or secrecy. Companies are concerned about these devices since they compromise the security of their intellectual property, providing an easy way to steal ideas and proprietary information. But banning or confiscating such equipment is difficult and increasingly inappropriate given such widespread adoption and reliance on them.
- In one embodiment, the system comprises an image capturing system (also referred to as a camera) and an illuminator, although not limited thereto. In this embodiment, an area of interest may be illuminated with light by the illuminator and images of the area of interest may be taken with the camera at different exposure levels. Images of pirate cameras exhibit unique characteristics at different exposure levels because the light from the illuminator is reflected by their optical lenses. Comparing the images with the help of an algorithm, which may be implemented in software or hardware, although not limited thereto, helps to identify and locate a pirate camera.
- Detection of optical equipment using retro-reflection, sometimes called the “cat's eye” response, occurs whenever light entering the lens of an optical system is focused onto and reflected back from the focal plane of a lens system. The “lens system” can be, although not limited thereto, that of a camera, telescope, scope optics or the lens of an eye. The focal plane can be, although not limited thereto, a film plane, an electronic imaging device such as a charge-coupled device (CCD), or the retina of the eye. The amount of reflectivity at the focal plane can be very low and still produce detectable retro signals because the “gain” from the collection area of the lens system usually is quite large.
- The retro-reflection signal from optical equipment is along the same line-of-sight (LOS) as the interrogation beam of the illuminator (on-axis), but other sources of light in the vicinity of the target may produce a response as well, e.g., glare sources. These other random sources are “clutter” (also referred to as “glint”), and while detection without clutter rejection is possible, it often results in too many false positives.
- The system is discussed below in terms of pirate cameras, but the system and method of use are not limited to these particular devices. In fact, any type of optical equipment may be identified with the system disclosed herein and whenever it would be beneficial to identify such optical equipment is a potential application for the system and method of use.
- Referring now to
FIG. 1 , shown is a pictorial view of one embodiment of the system in a carryingcase 106. The system may comprise an off-axis illuminator 104, acamera 100 and afilter 102, although not limited to this embodiment. The off-axis illuminator 104 illuminates the target area in a specific band of light. In this embodiment, the off-axis illuminator 104 may be close enough to thecamera 100 that it is within the angle of incidence of any reflected light. Consequently, the off-axis illuminator 104 is able to generate retro-reflection from any optical devices in the target area. In one embodiment, although not limited thereto, an infrared (IR) illuminator may be used. Other forms of light may also be used and the present teachings are not limited to this particular type of light. The IR illuminator may operate in the near IR band with a center wavelength with a range of approximately between 700 nm and 1600 nm and a bandwidth around approximately 100 nm, although not limited to these particular ranges. Other types of light in various bandwidths may be more appropriate under difference circumstances and are discussed further below. For example, although not limited thereto, a laser may be used instead. The off-axis illuminator 104 provides light to the target area which will be reflected by certain objects such as optical equipment, which helps to identify the use of pirate cameras. - The system may also include a
camera 100, which may be a charge-coupled device (CCD) digital camera, although not limited to this embodiment. For example, although not limited thereto, camera technologies such as CMOS or film may also be used. Thecamera 100 may be able to detect the wavelength emitted by the off-axis illuminator 104 and record images that may be manipulated digitally. Additionally, thecamera 100 may include means for manipulating exposure by aperture, exposure time or otherwise, although not limited to this embodiment. - The
camera 100 may include afilter 102 that accepts the wavelengths of light provided by the off-axis illuminator 104 while rejecting all other wavelengths, although not limited to this embodiment. This is helpful to identify true retro-reflections of the light from the off-axis illuminator 104. For example, although not limited thereto, if the off-axis illuminator 104 emits invisible IR light, thefilter 102 will help to isolate the reflected IR light from visible light, assisting in identifying reflections. - The system may be constructed into a single kit which may be portable or they may be installed separately at a location in a permanent or semi-permanent fashion, although not limited to this embodiment. A carrying
case 106 may contain thecamera 100, off-axis illuminator 104 andfilter 102, so that the system may be easily transported and brought to a temporary location such as an art gallery, although not limited to this embodiment. In the alternative, a more permanent location such as a movie theater may choose to incorporate the system into the stage or screen to assure a clear view of the gallery, although not limited to this embodiment. - In operation, a large target area, such as a theater, although not limited thereto, may be illuminated in sections by the off-
axis illuminator 104. Thecamera 100 may then capture multiple exposures of the illumination area, such as a short exposure image and a long exposure image, although not limited thereto. The amount of exposure may be controlled by changing the exposure time or other means, such as by changing the aperture size, and the system is not limited to this embodiment. Multiple images provide a more reliable method of detection of any pirate cameras. - The short exposure image exposure time may be set to obtain the following image, although not limited to this embodiment:
- 1. There appears a local maximum, e.g., a well-defined peak of bright pixels representing the reflected light surrounded by darker pixels, roughly in the middle of the pirate camera lens, although not limited to this embodiment. This local maximum is referred to as the “pirate camera reflection;”
- 2. The intensity of the image of the pirate camera lens is near the detection camera's maximum possible intensity; and
- 3. Most other objects in the area of interest do not appear or are faint.
- The long exposure image exposure time may be set to obtain the following image, although not limited to this embodiment:
- 1. The image of the pirate camera lens still appears as a local maximum, e.g., a well defined peak of bright pixels surrounded by darker pixels; and
- 2. Images of other objects are bright, even perhaps over-exposed.
- The current exposure times for the short and long exposures are approximately 80 ms and approximately 750 ms, respectively, although not limited to these particular ranges. The exact exposure times depend on a number of settings/factors such as lens size, aperture size and camera sensitivity. It may be advantageous to keep the
camera 100 relatively still between the exposures so that everything that is not moving in the illuminated area appears in substantially the same place in each image, although not limited to this embodiment. - The output of the
camera 100 may be coupled to a processor and storage device (neither shown inFIG. 1 ) such as a personal computer, which may be programmed according to the present system, although not limited to this embodiment. The coupling may be by physical connection to a proximate processor, or thecamera 100 output may connect to the processor wirelessly, although not limited to this embodiment. The processor may contain software or have configured hardware in order to manipulate the images on the storage device, although not limited to this embodiment. - The processor may manipulate the captured short and long exposure images according to the following algorithm, although not limited to this embodiment:
- 1. For each image:
-
- a. Find local maxima; and
- b. Select maxima below a given size to find positives;
- 2. Choose only positives, discussed further below, that are in both images.
- In this way the processor may determine the presence and location of any pirate cameras in the images taken by the
camera 100 since the positives that are in both images are pirate cameras. - The positives in the images are a result of light from the off-
axis illuminator 104 reflected by the pirate camera lens. The positives (or reflected light) may have the following characteristics, although not limited to this embodiment: - 1. It is a strong signal that is brighter than most objects. For example, although not limited thereto, in a darkened theater setting the reflected light will be easy to identify;
- 2. The signal is a local maximum for a wide range of exposure levels;
- 3. The signal maintains roughly the same size across a wide range of exposure levels, while other image objects change apparent size as they are illuminated, saturating and ‘bleeding’ into other objects; and
- 4. It does not move while images of it are acquired.
- These characteristics make it possible to discriminate between light reflected from pirate cameras and light reflected from other objects.
- The unique reflected signal characteristic of pirate cameras may be due to the following, although not limited to this embodiment:
- 1. Reflection off pirate camera lenses;
- 2. Spoiled retro-reflection. Off-axis illuminator 104 light bounces off of a pirate camera's internal filter and returns to the detection camera 100. Since the pirate camera's internal filter is close to (but not right at) the focal point of its lensing system, the retro-reflection returns to the source, but at a slightly broader angle than it would with a pure retro-reflection; or
- 3. A combination of the above two signals.
- In this embodiment, the off-
axis illuminator 104 is close enough to the camera 100 that it is within the angle of incidence of any reflected light. Consequently, the off-axis illuminator is able to generate retro-reflection from any optical devices in the target area. The high-exposure (e.g., 750 ms, etc.) image will include reflections from all background clutter. The low-exposure (e.g., 80 ms, etc.) image will include just the retro-reflections of optical devices. So optical devices may be identified by their presence in both images. - Referring now to
FIG. 2, shown is a pictorial view of a hidden pirate camera being used in a movie theater. Pirate cameras 110 are known to have been smuggled into movie theaters and hidden in many different ways. For example, a pirate camera 110 may be hidden in a popcorn box 112. The pirate camera 110 may be smuggled into the movie theater and positioned in such a way that it has a clear view of the screen. The pirate camera 110 may record the entire motion picture, which may then be duplicated and sold on the black market. The system disclosed herein is able to identify a pirate camera 110 no matter how it is hidden, since its lens will be directed toward the screen in order to record the movie. By positioning the present system at the screen facing outwards, any pirate camera 110 may be detected by light reflecting off of its lens. - Referring now to
FIG. 3A, shown is a pictorial view of a further embodiment of the system using three lenses. The system may have multiple lenses 130, each with a specific purpose. For example, although not limited thereto, the system may have: an off-axis illuminator 104; a camera 100 (or detector lens); and an on-axis illuminator 134, each having lenses operationally connected. For example, although not limited thereto, each lens may be a zoom lens, a wide-angle lens, or some other type of lens appropriate for the system. - The illumination light sources for the off-
axis illuminator 104 and on-axis illuminator 134 may be LEDs 128 (light-emitting diodes), although many other wavelengths of light are appropriate and the system is not limited to this particular embodiment. LEDs may be utilized because a pirate camera may employ an IR filter to block any retro-reflection from an IR illuminator. The system may also incorporate a beam splitter 120 and mirror 122, although not limited to this embodiment, in order to provide for the on-axis illuminator 134. The mirror 122 may reflect the light from the on-axis illuminator 134 to the beam splitter 120, which in turn reflects the light to the target area along the axis of the camera 100 (on-axis). While the beam splitter 120 reflects the light from the on-axis illuminator 134 to the target area, it may still allow any light reflected from the target area to pass through to the camera 100. On-axis illumination is helpful to find true retro-reflection from pirate cameras. - The off-
axis illuminator 104 may then be used as a discriminator against false positives by the on-axis illuminator 134. When the system detects a pirate camera (or other optical device in the target area) by light reflected from the on-axis illuminator 134, it may activate the off-axis illuminator 104 to get reflected light from a different angle. A comparison of the reflections by the on-axis illuminator 134 and the off-axis illuminator 104 helps to minimize false positives. Optical equipment will only exhibit true retro-reflection, so if off-axis illumination of the same area also returns reflected light, it has to be background clutter and cannot be a pirate camera. The system may do subsequent interrogations to confirm any detections, although not limited to this embodiment. - The system may be on a pan/
tilt track 124, although not limited thereto, permitting the scanning of an area larger than the field of view of a fixed camera 100 lens. The system may scan the desired target area—for example, although not limited thereto, a theater audience—with a low level interrogation source several times during a movie. The ability to pan may keep the feature size to a reasonable geometry, permitting the use of a lens with a smaller pixel count and an illumination source that requires less power, although not limited to this embodiment. The pixel count is used to make detections by measuring the size of the reflected light. Pirate camera lenses, for example, may exhibit more reflection than background clutter due to the retro-reflection of the optical lens. The system could also be used without a pan/tilt track 124 with a lens having a larger pixel count and an illumination source with power capable of illuminating the entire target area. - The system may also incorporate forensic image acquisition, although not limited to this embodiment, and may take a forensic image of any people near a detected pirate camera. The system may do so by incorporating a capability to acquire and store images of the pirate camera and a nearby pirate camera operator in very low light levels, such as inside a theater, and transmit this image wirelessly to a system operator located elsewhere in the movie complex.
- The system may wirelessly notify an official, such as an usher or security guard, of the presence of a pirate camera, although not limited to this embodiment. The system may do so by incorporating a method to send an alert of positive camera detection to email, cell phone via MMS technology, or a pager, although not limited to these embodiments. This alert may include a copy of the forensic image of the pirate camera and operator as well as their physical location within the target area (e.g., a theater, etc.), although not limited to this embodiment. In this way, an usher may immediately receive notification of the pirate camera detection and find the pirate camera and operator with minimal interruption to the rest of the audience.
- Control and monitoring of the system may be completely remote and in almost all cases such detection will be unknown to the operator of the pirate camera being interrogated. Features of the system may include active source control, high resolution imaging, remote zoom control, pre-programmed scan control, remote operation and image transmission, and low cost, although not limited to these embodiments.
- Referring now to
FIG. 3B, shown is a schematic diagram of the embodiment depicted in FIG. 3A. The on-axis illuminator 134 may use a highly directional LED 128 for its illumination source, which projects light through a zoom lens 206 positioned by the controller board underneath the pan and tilt platform 138 to a fixed mirror 122. The light is then projected along the on-axis illumination light path 118 to the (dichroic) beam splitter 120, which projects the light along the axis of the camera 100 to the target area. Any reflection of the illumination light from objects in the target area is passed back through the beam splitter 120 and zoom lens 206 to the camera 100. Signals from the camera pixels illuminated by the reflection are provided to the Local PC 146 as images and then sent by the controller board to an antenna, which sends the image to a Remote PC (not shown in FIG. 3B) controlled by a system operator. The image seen by the system operator could be either a retro-reflection from a pirate camera (or other optical device) in the target area if the path of the reflection is on the axis of camera 100, or it could be background glint (clutter) if the path of the reflection is omni-directional. - An off-
axis illuminator 104 may be positioned above the camera 100 (as shown in FIG. 3A) and used to reduce false positives of detections by the on-axis illuminator 134. The light is projected from an LED 128 through a zoom lens 206 along the off-axis illumination light path 119. Any reflection of the light from objects in the target area is passed back through the beam splitter 120 and zoom lens 206 to the camera 100. Signals from the camera pixels illuminated by the reflection are provided to the Local PC 146 as images and sent by the controller board to an antenna, which sends the image to the Remote PC controlled by a system operator. In this case, the image seen by the system operator can only be background glint (clutter), as a retro-reflection from a pirate camera illuminated by light projected from the off-axis illuminator 104 cannot be reflected back to the camera 100. Accordingly, if the system operator gets a reflection when light is projected from the on-axis illuminator 134, but not when light is projected from the off-axis illuminator 104, pointed to the same location in the target area by the pan motor 137 and tilt motor 139 along the pan/tilt track 124 by the controller board, the reflection is a retro-reflection of a camera and not background glint. However, if the operator gets a reflection when both the on-axis illuminator 134 and off-axis illuminator 104 are pointed to the same location, the reflection is background glint.
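The on-axis/off-axis comparison described above reduces, in outline, to a set difference over detection locations. The following is a minimal sketch, assuming detections are reported as (row, column) coordinates and that a small matching tolerance is acceptable; both assumptions are illustrative and not specified in this text.

```python
def confirm_cameras(on_axis_hits, off_axis_hits, tolerance=1):
    """Keep on-axis detections with no off-axis reflection at the same spot.

    A spot that reflects the on-axis source but not the off-axis source is
    a true retro-reflection (a camera); one that reflects both sources is
    background glint.
    """
    def near(p, q):
        return abs(p[0] - q[0]) <= tolerance and abs(p[1] - q[1]) <= tolerance
    return {p for p in on_axis_hits
            if not any(near(p, q) for q in off_axis_hits)}
```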
- One embodiment of the method for identifying optical equipment in the target area, although not limited thereto, includes the steps of illuminating an area with a light source on a first axis; capturing a first image of the illuminated area; identifying a potential optical device by its reflection characteristics in the first image; illuminating the potential optical device with a light source on a second axis; capturing a second image of the potential optical device; and identifying an actual optical device by its reflection characteristics in the second image. The Local PC may automatically detect the retro-reflection of pirate cameras with the dual-axis illumination method and image processing algorithm (discussed further below) by employing software stored on computer readable media.
- The Remote PC may also have computer instructions on computer readable media used to control the functions of the system. The Remote PC may have a graphical user interface (GUI) that permits a system operator, positioned remotely, although not limited thereto, to control the system functions via the Local PC, either wirelessly or otherwise. Each Remote PC may control a number of different Local PCs. For example, a movie theater complex may have 16 different screening rooms all employing the system, and a single system operator may control and monitor them all from a single location. The GUI may have a map of the target area (e.g., theater) and communicate with the Local PC to identify the location of any pirate cameras and obtain forensic imaging of potential pirate camera operators. The Remote PC may monitor the system, accept alerts, and store images provided by the Local PC, although not limited thereto. For example, the system operator may take control of the system through the GUI and pan/zoom the target area looking for a specific person, although not limited thereto. The Remote PC may also further analyze the images sent by the Local PC.
- Referring now to
FIG. 4, shown is a block diagram describing the functional layout of the embodiment depicted in FIGS. 3A and 3B. The system may have a Remote PC 144 which acts as the master controller for all detectors (e.g., one Remote PC 144 could control several Local PCs), although not limited to this embodiment. The Local PC 146 may handle all functions of a local detector, from image processing to motion control to illumination control, although not limited to this embodiment. The camera 172 may detect pirate cameras and also be used for real time situational awareness, although not limited to this embodiment. The controller board 170 may handle the illuminators (e.g., on-axis lens 174 and off-axis lens 176, etc.), light sensor 164, pan motor 180 and tilt motor 182, and all lenses, and communicate with the Local PC 146, although not limited to this embodiment. The light sensor 164 may keep track of the general illumination level of the target area to optimize accuracy, although not limited to this embodiment. - Two LEDs (e.g., on-
axis lens 174 and off-axis lens 176, etc.) may act as the main illuminators for detection and discrimination, although not limited to this embodiment. Both on-axis and off-axis light may be controlled with scanning mirrors. The system may incorporate non-sequential scanning, although not limited to this embodiment. The system may do so by incorporating a capability to randomly scan an entire movie theater several times during the showing of a movie. This minimizes the perception of the LED flash utilized by the device to detect camcorders. - The system may also incorporate countermeasures mitigation, although not limited to this embodiment. The system may do so by incorporating image processing algorithms which provide the capability to detect a retro-reflection from a camcorder equipped with countermeasures, such as circular polarizer filters, intended to defeat the pirate camera detection system. In this way, the user of a pirate camera will be unable to avoid detection.
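The non-sequential scanning mentioned above amounts to visiting the pan/tilt cells of the target area in a randomized order on each pass. The sketch below is illustrative only; the grid dimensions and the number of passes are assumptions, not values from this text.

```python
import random

def nonsequential_scan_order(rows, cols, passes=3, seed=None):
    """Return a randomized visiting order covering every pan/tilt cell of
    the target area once per pass, so no sweeping illumination pattern is
    perceptible to the audience."""
    rng = random.Random(seed)
    cells = [(r, c) for r in range(rows) for c in range(cols)]
    order = []
    for _ in range(passes):
        rng.shuffle(cells)  # a fresh random permutation each pass
        order.extend(cells)
    return order
```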
- The system may also incorporate short pulse interrogation, although not limited to this embodiment. The system may do so by incorporating high powered LEDs that possess the capability to be pulsed at very short (e.g., sub-millisecond) durations to detect camcorders illegally in operation inside a movie theater. The QinetiQ-NA developed drivers possess the capability to be pulsed at sub-millisecond pulse durations with a high degree of accuracy. These durations may be preferable to minimize the perceptibility by the theater audience.
- Referring now to
FIG. 5, shown is a pictorial view depicting a still further embodiment of the system employing a single illuminator for dual-axis illumination. Instead of both an off-axis illuminator 104 and an on-axis illuminator 134 (shown in FIG. 3), a mirror 122 may be movably controlled by a mirror controller 202 to enable the use of a single dual-axis illuminator 204. The mirror controller 202 may allow the mirror 122 to change positions to reflect light from the dual-axis illuminator 204 as needed. For example, in one position the mirror 122 may reflect light from the dual-axis illuminator 204 to the beam splitter 120, and then the light is reflected to the target area as on-axis light. In another position, the mirror controller 202 may move the mirror 122 out of the way (as shown in FIG. 5) so that the dual-axis illuminator 204 emits off-axis light directly to the target area. Light from the dual-axis illuminator 204 reflected back from the target area, whether the illumination was on-axis or off-axis, may travel through the beam splitter 120 and be recorded by the camera 100. The use of a mirror controller 202 and mirror 122 able to change positions eliminates the need for a second illuminator. - Referring now to
FIG. 6 , shown is a block diagram describing the functional layout of the embodiment depicted inFIG. 5 . The main controller board functions to direct the capabilities of each of the components of the system. The main controller board may be controlled locally by a Local PC. A Remote PC may permit an operator of the system to control the Local PC from another location. In this way, the detection equipment may be installed inconspicuously near a stage or presentation area facing the audience, but a system operator may be in another room or at a remote location controlling interrogation for pirate cameras. For example, although not limited thereto, a system operator may even be in the audience and control the system by way of a wireless handheld computer device. This would allow the system operator to initiate an interrogation, identify a pirate camera, zoom and pan the camera to acquire forensic image information, as well as other functions. - Referring now to
FIG. 7 , shown is an electrical schematic diagram of one embodiment of the main controller board. - Referring now to
FIGS. 8A, 8B, 8C and 8D, shown are schematic diagrams depicting digital resampling. The field of view (FOV) is adjusted to maintain a constant footprint (e.g., number of seats, etc.) in a scan view of the target area regardless of the distance from the camera 100. This is possible because the zoom range of the optics and the cross-sectional area of the fixed mirror in the system can accommodate the full range of the target area from the far-field FOV 250 to the near-field FOV 254. Since detection of pirate cameras (or other optical devices) is based on source size in the focal plane of the camera, maintaining a constant footprint as a function of range is desirable in order to use the same detection and imaging algorithms for each row in the theater. -
FIG. 8B shows reflection at mid-field FOV 252 over 4 pixels, FIG. 8C shows reflection at mid-field FOV 252 enlarged by the ratio of mid-field/near-field (the signal average over mid-field/near-field pixels), and FIG. 8D shows the reflection as resampled. The zoom range of the optics and the cross-sectional area of the scanning mirrors in one instance may not be sufficient to maintain the source size of the image in the focal plane of the camera 100 in the near-field FOV 254, as shown in FIGS. 8A and 8B. As a result, the size of the image is increased by the ratio of the distance from the camera to the limit of the zoom optics to adjust the mid-field FOV 252 and the distance from the camera to the near-field FOV 254, as shown in FIG. 8C. Therefore, in one instance digital re-sampling may be employed to decrease resolution in the near-field to the same level as images taken in the rest of the target area. In one instance this is accomplished by averaging the signal intensity over a pixel count equal to the same ratio by which the image has been enlarged, as shown in FIG. 8C. The portion of the image that is outside the FOV of each near-field scan, as shown in FIG. 8A, is captured in previous and subsequent scans and added to the re-sized image, as shown in FIG. 8D. - Referring now to
FIG. 9, shown is a schematic diagram depicting a still further embodiment of the system employing two scanning mirrors. In this embodiment, a dual-axis illuminator 204, in this instance a highly directional LED, projects light through a zoom lens 206 positioned by the controller board 170 to a mirror 122 capable of being moved into multiple positions. When the mirror 122 is positioned in the down position (as shown) by a mirror controller 202, which may be an electromagnet, although not limited thereto, controlled by a signal from the controller board 170, the light is projected to a (dichroic) beam splitter 120 and is reflected as on-axis illumination light 118 to the on-axis scan mirror 212. The on-axis scan mirror 212 is controlled by a pan/tilt motor 214 and associated pan/tilt controller 216. The on-axis scan mirror 212 is swept left and right and up and down by the pan/tilt motor 214, projecting the light in a scanning motion. - Any reflection of the illumination light is reflected by objects in the target area, captured by the on-
axis scan mirror 212, and projected through the beam splitter 120 and camera lens 222 to the camera 100. Pixels illuminated by the reflection in the captured image are processed by the Local PC 146 and sent by the controller board 170 to a wireless network card 224, which sends the image to a Remote PC controlled by a system operator. The image seen by the system operator could be a retro-reflection off a pirate camera if the path of the reflection is on the axis of the camera, or background glint if the path of the reflection is omni-directional. - When the
mirror 122 is placed in the up position by the mirror controller 202, such as by turning off the electromagnet controlled by a signal from the controller board 170, the light is projected along the off-axis illumination light path 119 to the off-axis scan mirror 210, controlled by a pan/tilt motor 214 and associated pan/tilt controller 216. The off-axis scan mirror 210 is swept left and right and up and down by the pan/tilt motor 214, projecting the light in a scanning motion. Any reflection from objects in the target area is then captured by the on-axis scan mirror 212 and projected through the beam splitter 120 and camera lens 222 to the camera 100. Pixels in the captured image illuminated by reflection of objects in the target area are imaged by the Local PC 146 and sent by the controller board 170 to the wireless network card 224, which sends the image to the Remote PC controlled by a system operator. - In this case, the image seen by the system operator can only be background glint (clutter) as the reflected light projected by off-
axis scan mirror 210 is off-axis from the camera 100. A retro-reflection from a pirate camera illuminated by light projected from the off-axis scan mirror 210 cannot be reflected back to the camera 100. Accordingly, if the operator gets a reflection when light is projected from the on-axis scan mirror 212, but not when light is projected from the off-axis scan mirror 210 pointed to the same location, the reflection is a true retro-reflection off a pirate camera and not background glint. However, if the system operator gets a reflection when both the on-axis scan mirror 212 and the off-axis scan mirror 210 are pointed to the same location, the reflection is background glint (clutter). - Scanning mirrors may be employed for non-sequential scanning. This minimizes the perception of the LED flash to people in the target area. Typically, both an on-
axis scan mirror 212 and an off-axis scan mirror 210 are synchronized so that they aim in the same direction. In this way, the on-axis scan mirror 212 may both disperse light from the dual-axis illuminator 204 and collect any reflections from objects in the target area. If a reflection is discovered, it may be the retro-reflection from an optical device. Off-axis light may then be used as a discriminator against false positives. The off-axis scan mirror 210 may disperse off-axis light from the dual-axis illuminator 204, and the on-axis scan mirror 212 may again collect any reflections from objects in the target area. If an off-axis light reflection is found at the same spot where an on-axis light reflection was found, it is not a true retro-reflection of an optical device, but is instead background clutter. - Referring now to
FIG. 10 , shown is a block diagram describing one embodiment of the method of detecting optical devices. Using illumination sources on multiple axes, it is possible to reduce false positives. On-axis illumination may be used to first identify potential optical devices in the target area by their retro-reflection. Since optical devices can only exhibit retro-reflection on-axis, a subsequent, off-axis illumination source is then used. If the potential optical device also reflects light from the off-axis illumination source, it is not an optical device. - The following steps may be performed, although not limited thereto: illuminating an area with a light source on a first axis; capturing a first image of the illuminated area; identifying a potential optical device by its reflection characteristics in the first image; illuminating the potential optical device with a light source on a second axis; capturing a second image of the potential optical device; and identifying an actual optical device by its reflection characteristics in the second image.
- Referring now to
FIG. 11, shown is a block diagram describing one embodiment of the method of image processing. The Local PC may have computer readable media running software to employ the image processing algorithm, which identifies the retro-reflection of potential optical devices in the captured images, although not limited thereto. The algorithm may use characteristics of the non-illuminated and off-axis images for comparison with a captured image. True retro-reflections will exhibit distinguishable reflection characteristics, so the algorithm may identify reflections over a minimum size. - The thresholds in the algorithm are variable. Increasing the sensitivity may have the advantage of detecting optical devices equipped with a circular polarizer or other countermeasures. A disadvantage may include the false positive detection of human eye retro-reflections. To mitigate this, the system may conduct multiple interrogations of the same location in order to track and compare detection points. A blue LED illumination source may also be used because it reflects significantly less light off of the human eye.
- The following steps may be performed, although not limited thereto: apply high-pass filter to the on-axis image; create a mask from the non-illuminated and off-axis images; apply mask to filtered on-axis image; auto threshold result; and accept features of a certain size.
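By way of illustration, the five steps listed above may be sketched end-to-end as follows. This is an illustrative outline only: the box-blur high-pass construction, the threshold value, and the global pixel-count size test are stand-ins for the kernels and per-feature checks described in detail elsewhere in this text.

```python
import numpy as np

def convolve2d(img, kernel):
    """Same-size 2-D convolution with edge padding (naive, dependency-free)."""
    kh, kw = kernel.shape
    p = np.pad(img, ((kh // 2,), (kw // 2,)), mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def detect_retro_reflections(on_axis, off_axis, dark_frames, threshold=50.0):
    # 1. high-pass filter the on-axis image (original minus a 3x3 box blur)
    box = np.full((3, 3), 1.0 / 9.0)
    working = on_axis - convolve2d(on_axis, box)
    # 2. build a mask: anything bright without on-axis light is clutter
    clutter = np.maximum(off_axis, dark_frames.max(axis=0))
    mask = (clutter < threshold).astype(float)
    # 3.-4. apply the mask, then threshold the result
    candidates = (working * mask) > threshold
    # 5. accept only small features (simplified here to a global pixel budget)
    return candidates if candidates.sum() <= 25 else np.zeros_like(candidates)
```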
- In one embodiment of the image processing algorithm, although not limited thereto, a camera or cameras may first capture several images of the region of interest for manipulation and analysis. The images may include, although not limited thereto: 1) an on-axis image of the illuminated target area; 2) an off-axis image of the illuminated target area; and 3) a non-illuminated image of the target area (multiple images may be taken, discussed further below). Each of these initial images may first be resized if they are taken at a distance which is less than the minimum optical zoom distance. Resizing may be based on a “distance ratio” equal to the actual distance divided by the minimum optical zoom distance. Resizing serves to normalize any features in the images for all distances.
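The distance-ratio resizing described above can be illustrated with simple block averaging. The sketch below assumes an integer ratio and trims the image dimensions to a multiple of it; both simplifications are assumptions of this illustration.

```python
import numpy as np

def resample_by_distance_ratio(img, distance_ratio):
    """Average signal intensity over blocks whose side equals the distance
    ratio, reducing a near-field image's resolution to the level of images
    taken at or beyond the minimum optical-zoom distance."""
    r = int(distance_ratio)  # assumed integral for this sketch
    h = (img.shape[0] // r) * r
    w = (img.shape[1] // r) * r
    blocks = img[:h, :w].reshape(h // r, r, w // r, r)
    return blocks.mean(axis=(1, 3))
```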
- The acquired images may be used to identify retro-reflections of any optical devices found in the target area. A “working image” used for image processing may begin with a “maximum static background image,” although not limited thereto, which may be created by computing the maximum pixel value at each pixel location of all of the non-illuminated images to obtain a single non-illuminated image. Taking multiple non-illuminated images reduces the illumination variability due to ambient light, such as the picture screen, as well as light having different illumination frequencies, such as running lights.
- The working image may then be subtracted from the on-axis image, although not limited thereto, to create a new working image used in subsequent processing. This serves to minimize any static illumination sources and nullify inherent camera noise. In one instance, the working image may be convolved with a low pass filter, although not limited thereto. In one instance, a 3×3 Gaussian filter kernel may be used, although not limited thereto:
- A low pass filter removes high frequency detail (e.g., blurs) and reduces the optical differences caused by aliasing and a reduced depth of field.
- Next, in one instance, although not limited thereto, the working image may be convolved with a high pass filter. In one instance, a 5×5 high pass filter kernel may be used, although not limited thereto. A high pass filter extracts high frequency detail (e.g., sharpens) by removing low frequencies, separating any retro-reflection signals from background signals.
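The exact kernels referenced above appear only in figures not reproduced in this text. As illustrative stand-ins, a common binomial 3×3 Gaussian low-pass kernel and a generic 5×5 high-pass kernel (identity minus a box average) exhibit the properties described: the first sums to 1 and blurs; the second sums to 0 and suppresses low frequencies.

```python
import numpy as np

# A common 3x3 Gaussian (binomial) low-pass kernel; it sums to 1, so it
# blurs without changing overall brightness.
GAUSSIAN_3X3 = np.array([[1, 2, 1],
                         [2, 4, 2],
                         [1, 2, 1]], dtype=float) / 16.0

# A generic 5x5 high-pass kernel: the identity minus a 5x5 box average.
# It sums to 0, so flat (low-frequency) regions map to zero and only
# sharp, point-like detail such as a retro-reflection survives.
HIGHPASS_5X5 = -np.full((5, 5), 1.0 / 25.0)
HIGHPASS_5X5[2, 2] += 1.0
```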
- An image mask may be created to accentuate all of the sources of reflection in the region of interest, although not limited thereto. The image mask may begin with the formation of a maximum image, computed by taking the maximum pixel value at each pixel location across all of the non-illuminated images and the off-axis image to obtain a single image. The maximum image thus gathers all sources of reflection and illumination over a finite amount of time. The maximum image may then be blurred using a low pass filter to reduce the optical differences caused by aliasing and a reduced depth of field. The image mask may then be sharpened using a high pass filter to separate retro-reflection from the background. A maximum filter may then be applied to the image resulting from low pass filtering the maximum image. In one instance, the maximum filter may be a 5×5 window, although not limited thereto, which slides across the
image 1 pixel at a time and sets the pixel value of every pixel in the 5×5 window to the maximum value found inside that same window. This increases the contrast and magnifies the size of any features. Sliding a window 1 pixel at a time samples the neighboring region around every individual pixel, as opposed to a subset of pixels. In one instance, a histogram based binary threshold using 256 bins may be applied to the image resulting from the application of the maximum filter. A threshold may be applied to the image using the histogram, based on a predetermined value which may be determined empirically. Applying this threshold separates the foreground (signal) from the background (noise). The image resulting from the application of the threshold is referred to as an image mask. - The image mask may then be applied to the working image, although not limited thereto, by multiplying the two images together. This minimizes the false positive detection rate by masking out potential sources of reflection caused by screen illumination or off-axis illumination.
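Read as a standard maximum (dilation) filter, the 5×5 sliding-window operation described above may be sketched as follows; the edge-replication behavior at the image borders is an assumption of this illustration.

```python
import numpy as np

def maximum_filter(img, size=5):
    """Slide a size x size window one pixel at a time and replace each pixel
    with the maximum value found in the window centred on it, increasing
    contrast and magnifying the apparent size of bright features."""
    half = size // 2
    p = np.pad(img, half, mode="edge")  # replicate edges so borders are defined
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = p[y:y + size, x:x + size].max()
    return out
```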
- In one embodiment, using clustering or pattern recognition techniques, the signal (foreground) is separated from the noise (background). In one instance, an ISODATA (Iterative Self-Organizing Data Analysis Techniques) algorithm may then be applied to the working image to find thresholds, although not limited thereto. (See, for example, Thresholding Using the ISODATA Clustering Algorithm, IEEE Transactions on Systems, Man and Cybernetics,
Volume 10, Issue 11, November 1980, pp. 771-774, which is incorporated by reference herein in its entirety.) This may be accomplished by using a 7×7 window, although not limited thereto:
- First, a starting threshold value is picked, which may be a midpoint pixel value for the neighboring area. The number of pixels above the threshold (foreground) and the number of pixels below the threshold (background) are counted in running subtotals. The new threshold may be equal to: ((foregroundTotal/foregroundCount)+(backgroundTotal/backgroundCount))/2. If the new threshold is equal to the previous threshold, then the pixels within the window may use the new threshold; otherwise this process may be repeated. This serves to separate the foreground (signal) from the background (noise). It should be noted that these teachings are not limited only to the ISODATA algorithm, but other clustering and pattern recognition algorithms are within the scope of these teachings. (See, for example, Handbook of Pattern Recognition and Image Processing, T. Y. Young and K. S. Fu,
Chapter 2, pp. 33-57, 1986, ISBN 0-12-774560-2, which is incorporated by reference herein in its entirety.) - Finally, in one embodiment, features found in the signal (foreground) that may be larger than the maximum retro-reflection size may be excluded. In one instance, feature size discrimination may be performed on the image resulting from the clustering operation or the image resulting from the multiplication of the image mask with the working image (both of which are referred to as the resulting working image, or simply the working image) using a window of predetermined size. In one exemplary embodiment, an 11×11 window is utilized, although not limited thereto. The feature size discrimination determines if a blob, or group of pixels such as 2 or more, is inside the window. If the blob and all of its edges remain inside the window area, it passes the size constraints; however, if the blob extends out past the border of the window, the blob fails the size constraints. This process helps to exclude features found in the foreground which are larger than a maximum retro-reflection size.
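The per-window ISODATA iteration described above may be sketched as follows. The starting value is taken here as the midpoint of the window's pixel range, an assumption, since the text says only "a midpoint pixel value," and an iteration cap is added as a safeguard.

```python
import numpy as np

def isodata_threshold(window, max_iter=100):
    """Iterate t -> (mean(foreground) + mean(background)) / 2 until the
    threshold stops changing, per the ISODATA clustering scheme."""
    w = window.astype(float).ravel()
    t = (w.min() + w.max()) / 2.0  # starting threshold: midpoint of the range
    for _ in range(max_iter):
        fg, bg = w[w > t], w[w <= t]
        if fg.size == 0 or bg.size == 0:  # degenerate window: keep current t
            return t
        new_t = (fg.mean() + bg.mean()) / 2.0
        if new_t == t:
            return t
        t = new_t
    return t
```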
- Referring now to
FIG. 12, shown is a schematic diagram depicting a still further embodiment of the system employing a second off-axis camera 262. This embodiment, although not limited thereto, employs two cameras and a single dual-axis illuminator 204. This embodiment allows for the detection of optical devices by taking two images of an illuminated target area at the same time. The dual-axis illuminator 204 projects light along the on-axis illumination light path 118 through the beam splitter 120 and to the on-axis scan mirror 212 for dispersal to the target area. Any reflected light from the target area is captured by the on-axis scan mirror 212 and sent back to the beam splitter 120, where it is now reflected and sent to the camera 100. - At the same time, any reflected light is captured by the off-axis scan mirror 210 and travels along the off-axis reflected light path 260 to the off-axis camera 262. With a single illumination of the target area, any optical devices can be identified, since a retro-reflection from the dual-axis illuminator 204 will only be captured by the camera 100. If the off-axis camera 262 also captures a reflection when both the on-axis scan mirror 212 and the off-axis scan mirror 210 are directed at the same illuminated area, then it is not a retro-reflection and must instead be background glint. - Referring now to
FIG. 13, shown is a flowchart describing another embodiment of the method of detecting optical devices. The following steps may be performed, although not limited thereto: acquiring a first image of a target area on the same axis as an illumination source; acquiring a second image of the target area on a different axis from the illumination source; and identifying retro-reflections in the target area by analyzing the first image and the second image by: filtering the first image; creating an image mask using the second image; applying the image mask to the first image; and separating features in the foreground of the first image larger than a predetermined size. - The term "illuminator" as used herein refers to any source of electromagnetic radiation; it is not limited to LEDs, infrared light, or any other particular form of light. As discussed above, electromagnetic radiation of different wavelengths may be preferable in certain circumstances, and an illuminator that creates a retro-reflection in optical devices at any wavelength may be used with the system.
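The steps recited above can be sketched end to end. This pure-Python illustration is an assumption-laden sketch, not the patent's implementation: the "filter" step is reduced to a simple brightness test, the off-axis image serves directly as the glint mask, and the size constraint is checked with a flood fill over connected bright pixels:

```python
from collections import deque

def detect_retro_reflections(on_axis, off_axis, bright=128, max_blob=11):
    """Illustrative sketch of the claimed method on two same-size
    grayscale images (2-D lists). Returns (row, col) pixels kept as
    candidate retro-reflections."""
    h, w = len(on_axis), len(on_axis[0])
    # Steps 1-3: filter the first image and apply a mask built from
    # the second. A pixel survives only if it is bright on-axis but
    # not bright off-axis, since a retro-reflection returns light only
    # along the illumination axis while glint is visible on both axes.
    keep = [[on_axis[r][c] >= bright and off_axis[r][c] < bright
             for c in range(w)] for r in range(h)]
    # Step 4: separate features larger than a predetermined size.
    # Flood-fill each connected bright region and discard blobs whose
    # bounding box would not fit inside a max_blob x max_blob window.
    seen = [[False] * w for _ in range(h)]
    hits = []
    for r in range(h):
        for c in range(w):
            if not keep[r][c] or seen[r][c]:
                continue
            blob, queue = [], deque([(r, c)])
            seen[r][c] = True
            while queue:
                y, x = queue.popleft()
                blob.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < h and 0 <= nx < w
                            and keep[ny][nx] and not seen[ny][nx]):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            rows = [p[0] for p in blob]
            cols = [p[1] for p in blob]
            if (max(rows) - min(rows) < max_blob
                    and max(cols) - min(cols) < max_blob):
                hits.extend(blob)
    return hits
```

A spot bright in both images is rejected as glint; a spot bright only in the on-axis image survives, provided it fits the size window.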
- Similarly, the term "camera" as used herein refers to any image acquisition system. A camera may comprise optical, electronic, and/or mechanical components. As discussed above, its only requirement is that it be able to acquire an image of the target area which may be used for detecting optical devices.
- While the present teachings have been described above in terms of specific embodiments, it is to be understood that they are not limited to these disclosed embodiments. Many modifications and other embodiments will come to mind to those skilled in the art to which these teachings pertain, and these are intended to be and are covered by both this disclosure and the appended claims. It is intended that the scope of the present teachings should be determined by proper interpretation and construction of the appended claims and their legal equivalents, as understood by those of skill in the art relying upon the disclosure in this specification and the attached drawings.
Claims (21)
1. A system for detecting an optical device, comprising:
an illumination system for illuminating a target area;
an image acquisition system for capturing a first image of the target area on the same axis as the illumination system and for capturing a second image of the target area on a different axis from the illumination system; and
a processor and computer readable media having computer code for causing a processor to identify retro-reflections in the target area by analyzing the first image and the second image;
wherein the computer code for causing a processor to identify retro-reflections filters the first image to isolate retro-reflections.
2. The system of claim 1 wherein the computer code for causing a processor to identify retro-reflections filters the first image to isolate retro-reflections by applying a low pass filter and applying a high pass filter.
3. The system of claim 1 further comprising an image acquisition system for capturing a third image of the non-illuminated target area and wherein the computer code for causing a processor to identify retro-reflections subtracts the third image from the first image before it filters the first image to isolate retro-reflections.
4. The system of claim 3 wherein the computer code for causing a processor to identify retro-reflections creates an image mask and applies the image mask to the first image after it filters the first image to isolate retro-reflections.
5. The system of claim 4 wherein the image mask comprises:
a maximum image created using the second image and the third image.
6. The system of claim 4 wherein the computer code for causing a processor to identify retro-reflections applies a low pass filter to the image mask and applies a high pass filter to the image mask before it applies the image mask.
7. The system of claim 4 wherein the computer code for causing a processor to identify retro-reflections magnifies the image mask to separate the foreground from the background in the image mask before it applies the image mask.
8. The system of claim 4 wherein the computer code for causing a processor to identify retro-reflections applies a histogram of thresholds to the image mask before it applies the image mask.
9. The system of claim 4 wherein the computer code for causing a processor to identify retro-reflections separates the foreground from the background in the first image after it applies the image mask.
10. The system of claim 9 wherein the computer code for causing a processor to identify retro-reflections separates features in the foreground in the first image larger than a predetermined size.
11. The system of claim 1 wherein the illumination system comprises an LED light.
12. The system of claim 11 wherein the LED light pulses in periods of less than approximately 5 milliseconds.
13. The system of claim 11 further comprising a steering mirror that directs the LED light.
14. The system of claim 1 further comprising a filter adjacent to the image acquisition system.
15. The system of claim 1 further comprising a forensic image acquisition system for acquiring a forensic image of a person when an optical device is detected.
16. The system of claim 1 further comprising a user alert system for alerting a user when an optical device is detected.
17. The system of claim 1 wherein the processor and computer readable media are remote from the illumination system.
18. The system of claim 17 wherein the processor and computer readable media communicate with more than one image acquisition system.
19. A method of detecting an optical device, comprising the steps of:
(a) acquiring a first image of a target area on the same axis as an illumination source;
(b) acquiring a second image of the target area on a different axis from the illumination source; and
(c) identifying retro-reflections in the target area by analyzing the first image and the second image by:
filtering the first image;
creating an image mask using the second image;
applying the image mask to the first image; and
separating features in the foreground in the first image larger than a predetermined size.
20. The method of claim 19 wherein the target area is a movie theater.
21. A system for detecting an optical device, comprising:
means for illuminating a target area with illumination;
means for capturing a first image of the target area on the same axis as the illumination;
means for capturing a second image of the target area on a different axis from the illumination;
a processor and computer readable media having computer code for causing a processor to identify retro-reflections in the target area by:
filtering the first image;
creating an image mask using the second image;
applying the image mask to the first image; and
separating features in the foreground in the first image larger than a predetermined size.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/476,603 US20120229637A1 (en) | 2008-08-26 | 2012-05-21 | System and method for detecting a camera |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US9195508P | 2008-08-26 | 2008-08-26 | |
US17670009P | 2009-05-08 | 2009-05-08 | |
US12/545,504 US8184175B2 (en) | 2008-08-26 | 2009-08-21 | System and method for detecting a camera |
US13/476,603 US20120229637A1 (en) | 2008-08-26 | 2012-05-21 | System and method for detecting a camera |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/545,504 Continuation US8184175B2 (en) | 2008-08-26 | 2009-08-21 | System and method for detecting a camera |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120229637A1 true US20120229637A1 (en) | 2012-09-13 |
Family
ID=41724801
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/545,504 Expired - Fee Related US8184175B2 (en) | 2008-08-26 | 2009-08-21 | System and method for detecting a camera |
US13/476,603 Abandoned US20120229637A1 (en) | 2008-08-26 | 2012-05-21 | System and method for detecting a camera |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/545,504 Expired - Fee Related US8184175B2 (en) | 2008-08-26 | 2009-08-21 | System and method for detecting a camera |
Country Status (7)
Country | Link |
---|---|
US (2) | US8184175B2 (en) |
EP (1) | EP2329456B1 (en) |
JP (1) | JP5255122B2 (en) |
KR (1) | KR20110069027A (en) |
AU (1) | AU2009288425A1 (en) |
BR (1) | BRPI0917168A2 (en) |
WO (1) | WO2010027772A2 (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120314085A1 (en) * | 2010-02-25 | 2012-12-13 | Research Organization Of Information And Systems | Video image display screen, video image display system, and method for detecting camera used in illegal camcording |
US20140157442A1 (en) * | 2012-12-04 | 2014-06-05 | Optishell Technologies Ltd. | Device and methods for detecting a camera |
WO2015020703A1 (en) * | 2013-08-04 | 2015-02-12 | Eyesmatch Ltd | Devices, systems and methods of virtualizing a mirror |
US8970569B2 (en) | 2005-03-01 | 2015-03-03 | Eyesmatch Ltd | Devices, systems and methods of virtualizing a mirror |
US8976160B2 (en) | 2005-03-01 | 2015-03-10 | Eyesmatch Ltd | User interface and authentication for a virtual mirror |
US8982110B2 (en) | 2005-03-01 | 2015-03-17 | Eyesmatch Ltd | Method for image transformation, augmented reality, and teleperence |
US8982109B2 (en) | 2005-03-01 | 2015-03-17 | Eyesmatch Ltd | Devices, systems and methods of capturing and displaying appearances |
US9091775B1 (en) * | 2013-02-04 | 2015-07-28 | Sierra Innotek, Inc. | Method and apparatus for detecting and locating camera illuminators |
US9269157B2 (en) | 2005-03-01 | 2016-02-23 | Eyesmatch Ltd | Methods for extracting objects from digital images and for performing color change on the object |
US10274979B1 (en) * | 2018-05-22 | 2019-04-30 | Capital One Services, Llc | Preventing image or video capture of input data provided to a transaction device |
US10438010B1 (en) | 2018-12-19 | 2019-10-08 | Capital One Services, Llc | Obfuscation of input data provided to a transaction device |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10880035B2 (en) * | 2009-07-28 | 2020-12-29 | The United States Of America, As Represented By The Secretary Of The Navy | Unauthorized electro-optics (EO) device detection and response system |
NO332204B1 (en) * | 2009-12-16 | 2012-07-30 | Cisco Systems Int Sarl | Method and apparatus for automatic camera control at a video conferencing endpoint |
US9189697B1 (en) | 2010-02-22 | 2015-11-17 | Isaac S. Daniel | System and method for detecting recording devices |
EP2498520B1 (en) | 2011-03-10 | 2014-04-30 | Alcatel Lucent | Procedure for obtaining video contents generated by cameras and relative to a communication equipment user, and associated device |
JP5339316B1 (en) * | 2012-05-31 | 2013-11-13 | 楽天株式会社 | IDENTIFICATION INFORMATION MANAGEMENT SYSTEM, IDENTIFICATION INFORMATION MANAGEMENT SYSTEM CONTROL METHOD, INFORMATION PROCESSING DEVICE, AND PROGRAM |
US9208753B2 (en) | 2012-09-17 | 2015-12-08 | Elwha Llc | Unauthorized viewer detection system and method |
US9081413B2 (en) | 2012-11-20 | 2015-07-14 | 3M Innovative Properties Company | Human interaction system based upon real-time intention detection |
US9140444B2 (en) | 2013-08-15 | 2015-09-22 | Medibotics, LLC | Wearable device for disrupting unwelcome photography |
CN107547830B (en) * | 2016-06-27 | 2018-09-21 | 宁夏新航信息科技有限公司 | Cloud computing system |
CN107257449B (en) * | 2016-06-27 | 2018-02-09 | 杨春 | A kind of remote supervision system based on cloud computing |
CN105955058B (en) * | 2016-06-27 | 2018-10-23 | 广东优派家私集团有限公司 | Wireless intelligent house system |
CN106101624B (en) * | 2016-06-27 | 2018-10-16 | 宁皓 | Big data manages system |
CN106162077A (en) * | 2016-06-27 | 2016-11-23 | 石鹏 | IN service based on the Internet management system |
CN106878104B (en) * | 2017-01-13 | 2019-05-24 | 浙江大学 | A kind of wireless camera head inspecting method based on network flow |
US10719592B1 (en) | 2017-09-15 | 2020-07-21 | Wells Fargo Bank, N.A. | Input/output privacy tool |
US11017115B1 (en) * | 2017-10-30 | 2021-05-25 | Wells Fargo Bank, N.A. | Privacy controls for virtual assistants |
EP3704558A4 (en) * | 2017-11-01 | 2021-07-07 | Nokia Technologies Oy | Depth-aware object counting |
US10719832B1 (en) | 2018-01-12 | 2020-07-21 | Wells Fargo Bank, N.A. | Fraud prevention tool |
KR102121273B1 (en) * | 2018-08-08 | 2020-06-10 | 주식회사 에스원 | Method and apparatus for detection by using digital filter |
US11654375B2 (en) | 2019-08-07 | 2023-05-23 | Universal City Studios Llc | Systems and methods for detecting specular surfaces |
KR102216694B1 (en) * | 2020-05-06 | 2021-02-16 | 김선욱 | Glasses Capable of Detecting Hidden Cameras and Detecting Method thereof |
FR3111452B1 (en) * | 2020-06-11 | 2023-01-06 | Sangle Ferriere Bruno | Process for the automatic protection of an object, a person or visual information or work against a risk of unwanted observation |
US11610457B2 (en) | 2020-11-03 | 2023-03-21 | Bank Of America Corporation | Detecting unauthorized activity related to a computer peripheral device by monitoring voltage of the peripheral device |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6665079B1 (en) * | 1999-03-24 | 2003-12-16 | Science & Engineering Associates, Inc. | Method and apparatus for locating electromagnetic imaging and detection systems/devices |
US6704447B2 (en) * | 2001-02-21 | 2004-03-09 | Justsystem Corporation | Method and apparatus for using illumination from a display for computer vision based user interfaces and biometric authentication |
US6977366B2 (en) * | 2002-11-14 | 2005-12-20 | Light Elliott D | Detecting and thwarting imaging systems at theatrical performances |
US7006630B2 (en) * | 2003-06-03 | 2006-02-28 | Matsushita Electric Industrial Co., Ltd. | Methods and apparatus for digital content protection |
US20060228003A1 (en) * | 2005-04-06 | 2006-10-12 | Silverstein D A | Method and apparatus for detection of optical elements |
WO2007012098A2 (en) * | 2005-07-26 | 2007-02-01 | Tissue Gnostics Gmbh | Method and device for the segmentation of regions |
US20070103552A1 (en) * | 2005-11-10 | 2007-05-10 | Georgia Tech Research Corporation | Systems and methods for disabling recording features of cameras |
US20100097312A1 (en) * | 2006-10-12 | 2010-04-22 | Koninklijke Philips Electronics N.V. | System and method for light control |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6603134B1 (en) | 1967-03-10 | 2003-08-05 | Bae Systems Information And Electronic Systems Integration Inc. | Optical detection system |
US6018374A (en) | 1996-06-25 | 2000-01-25 | Macrovision Corporation | Method and system for preventing the off screen copying of a video or film presentation |
US6134013A (en) * | 1997-09-15 | 2000-10-17 | Optimet, Optical Metrology Ltd. | Optical ball grid array inspection system |
JP2001313006A (en) | 2000-04-29 | 2001-11-09 | Nishisaka Kiyotaka | Anti-photograph-stealing device by infra-red light |
US6771349B2 (en) | 2000-09-27 | 2004-08-03 | David H. Sitrick | Anti-piracy protection system and methodology |
JP3468228B2 (en) * | 2001-05-16 | 2003-11-17 | ソニー株式会社 | Imaging disturbance method and system |
US6742901B2 (en) * | 2001-05-16 | 2004-06-01 | Sony Corporation | Imaging prevention method and system |
US6868229B2 (en) | 2001-09-20 | 2005-03-15 | Intel Corporation | Interfering with illicit recording activity by emitting non-visible radiation |
JP4024030B2 (en) * | 2001-10-26 | 2007-12-19 | 株式会社トラフィック・シム | Image recognition device, program, and recording medium |
US6801642B2 (en) | 2002-06-26 | 2004-10-05 | Motorola, Inc. | Method and apparatus for limiting storage or transmission of visual information |
US8045760B2 (en) * | 2003-02-21 | 2011-10-25 | Gentex Corporation | Automatic vehicle exterior light control systems |
GB2400514B (en) | 2003-04-11 | 2006-07-26 | Hewlett Packard Development Co | Image capture method |
US20040252835A1 (en) | 2003-04-23 | 2004-12-16 | Odgers Christopher R. | Method for spoiling copies of a theatrical motion picture made using a video camera and recorder |
DE10335190A1 (en) | 2003-07-30 | 2005-03-03 | Daimlerchrysler Ag | Sensor arrangement with a plurality of types of optical sensors |
JP2006258651A (en) * | 2005-03-17 | 2006-09-28 | Fuji Photo Film Co Ltd | Method and device for detecting unspecified imaging apparatus |
JP4341680B2 (en) * | 2007-01-22 | 2009-10-07 | セイコーエプソン株式会社 | projector |
-
2009
- 2009-08-21 US US12/545,504 patent/US8184175B2/en not_active Expired - Fee Related
- 2009-08-25 JP JP2011525135A patent/JP5255122B2/en not_active Expired - Fee Related
- 2009-08-25 EP EP09812001.7A patent/EP2329456B1/en not_active Not-in-force
- 2009-08-25 AU AU2009288425A patent/AU2009288425A1/en not_active Abandoned
- 2009-08-25 WO PCT/US2009/054860 patent/WO2010027772A2/en active Application Filing
- 2009-08-25 KR KR1020117006900A patent/KR20110069027A/en not_active Application Discontinuation
- 2009-08-25 BR BRPI0917168A patent/BRPI0917168A2/en not_active IP Right Cessation
-
2012
- 2012-05-21 US US13/476,603 patent/US20120229637A1/en not_active Abandoned
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6665079B1 (en) * | 1999-03-24 | 2003-12-16 | Science & Engineering Associates, Inc. | Method and apparatus for locating electromagnetic imaging and detection systems/devices |
US6704447B2 (en) * | 2001-02-21 | 2004-03-09 | Justsystem Corporation | Method and apparatus for using illumination from a display for computer vision based user interfaces and biometric authentication |
US6977366B2 (en) * | 2002-11-14 | 2005-12-20 | Light Elliott D | Detecting and thwarting imaging systems at theatrical performances |
US7006630B2 (en) * | 2003-06-03 | 2006-02-28 | Matsushita Electric Industrial Co., Ltd. | Methods and apparatus for digital content protection |
US20060228003A1 (en) * | 2005-04-06 | 2006-10-12 | Silverstein D A | Method and apparatus for detection of optical elements |
WO2007012098A2 (en) * | 2005-07-26 | 2007-02-01 | Tissue Gnostics Gmbh | Method and device for the segmentation of regions |
US20080193014A1 (en) * | 2005-07-26 | 2008-08-14 | Tissue Gnostics Gmbh | Method and Device for the Segmentation of Regions and Related Computer Program Product |
US20070103552A1 (en) * | 2005-11-10 | 2007-05-10 | Georgia Tech Research Corporation | Systems and methods for disabling recording features of cameras |
US20100097312A1 (en) * | 2006-10-12 | 2010-04-22 | Koninklijke Philips Electronics N.V. | System and method for light control |
Non-Patent Citations (2)
Title |
---|
Trakstar, An Eye on Movie Theater Pirates, 11/12/2004, Wired, http://www.wired.com/entertainment/music/news/2004/11/65683?currentPage=all * |
Trakstar, Hollywood Strikes Back at Video Pirates, 11/23/2004, Gizmag, http://www.gizmag.com/go/3504/ * |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9269157B2 (en) | 2005-03-01 | 2016-02-23 | Eyesmatch Ltd | Methods for extracting objects from digital images and for performing color change on the object |
US8970569B2 (en) | 2005-03-01 | 2015-03-03 | Eyesmatch Ltd | Devices, systems and methods of virtualizing a mirror |
US8976160B2 (en) | 2005-03-01 | 2015-03-10 | Eyesmatch Ltd | User interface and authentication for a virtual mirror |
US8982110B2 (en) | 2005-03-01 | 2015-03-17 | Eyesmatch Ltd | Method for image transformation, augmented reality, and teleperence |
US8982109B2 (en) | 2005-03-01 | 2015-03-17 | Eyesmatch Ltd | Devices, systems and methods of capturing and displaying appearances |
US20120314085A1 (en) * | 2010-02-25 | 2012-12-13 | Research Organization Of Information And Systems | Video image display screen, video image display system, and method for detecting camera used in illegal camcording |
US20140157442A1 (en) * | 2012-12-04 | 2014-06-05 | Optishell Technologies Ltd. | Device and methods for detecting a camera |
WO2014087405A1 (en) * | 2012-12-04 | 2014-06-12 | Optishell Technologies Ltd. | Device and methods for detecting a camera |
US9482779B2 (en) * | 2012-12-04 | 2016-11-01 | Optishell Technologies Ltd. | Device and methods for detecting a camera |
US9091775B1 (en) * | 2013-02-04 | 2015-07-28 | Sierra Innotek, Inc. | Method and apparatus for detecting and locating camera illuminators |
WO2015020703A1 (en) * | 2013-08-04 | 2015-02-12 | Eyesmatch Ltd | Devices, systems and methods of virtualizing a mirror |
US10274979B1 (en) * | 2018-05-22 | 2019-04-30 | Capital One Services, Llc | Preventing image or video capture of input data provided to a transaction device |
US20190361471A1 (en) * | 2018-05-22 | 2019-11-28 | Capital One Services, Llc | Preventing image or video capture of input data provided to a transaction device |
US10877499B2 (en) * | 2018-05-22 | 2020-12-29 | Capital One Services, Llc | Preventing image or video capture of input data provided to a transaction device |
US20210116950A1 (en) * | 2018-05-22 | 2021-04-22 | Capital One Services, Llc | Preventing image or video capture of input data provided to a transaction device |
US11747837B2 (en) * | 2018-05-22 | 2023-09-05 | Capital One Services, Llc | Preventing image or video capture of input data provided to a transaction device |
US10438010B1 (en) | 2018-12-19 | 2019-10-08 | Capital One Services, Llc | Obfuscation of input data provided to a transaction device |
US11386211B2 (en) | 2018-12-19 | 2022-07-12 | Capital One Services, Llc | Obfuscation of input data provided to a transaction device |
US11868491B2 (en) | 2018-12-19 | 2024-01-09 | Capital One Services, Llc | Obfuscation of input data provided to a transaction device |
Also Published As
Publication number | Publication date |
---|---|
WO2010027772A3 (en) | 2010-06-17 |
JP2012501588A (en) | 2012-01-19 |
WO2010027772A2 (en) | 2010-03-11 |
US20100053359A1 (en) | 2010-03-04 |
EP2329456B1 (en) | 2018-10-03 |
BRPI0917168A2 (en) | 2015-11-24 |
EP2329456A4 (en) | 2014-06-04 |
JP5255122B2 (en) | 2013-08-07 |
AU2009288425A1 (en) | 2010-03-11 |
US8184175B2 (en) | 2012-05-22 |
EP2329456A2 (en) | 2011-06-08 |
KR20110069027A (en) | 2011-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8184175B2 (en) | System and method for detecting a camera | |
CA3004029C (en) | Apparatus, systems and methods for improved facial detection and recognition in vehicle inspection security systems | |
US8594389B2 (en) | Security system and method | |
US7574021B2 (en) | Iris recognition for a secure facility | |
US5956122A (en) | Iris recognition apparatus and method | |
US8437517B2 (en) | Latent fingerprint detectors and fingerprint scanners therefrom | |
US9482779B2 (en) | Device and methods for detecting a camera | |
KR101530255B1 (en) | Cctv system having auto tracking function of moving target | |
US20120128330A1 (en) | System and method for video recording device detection | |
US9165201B2 (en) | Systems and methods for detecting cell phone usage by a vehicle operator | |
KR101450733B1 (en) | Apparatus and method for inspecting lower portion of vehicle | |
CN102598072A (en) | Automated teller machine comprising at least one camera that produces image data to detect manipulation attempts | |
CN107290752B (en) | Optical lens detection method and detector | |
US20060114322A1 (en) | Wide area surveillance system | |
JPH1188870A (en) | Monitoring system | |
CA2217366A1 (en) | Facial recognition system | |
KR102121273B1 (en) | Method and apparatus for detection by using digital filter | |
Patil et al. | Integrating Artificial Intelligence with Camera Systems for Automated Surveillance and Analysis | |
WO2023156449A1 (en) | System for identifying a display device | |
WO2012065241A1 (en) | System and method for video recording device detection | |
Venugopal et al. | Image Processing Technique for Digital Camera Deactivation | |
Creek et al. | Analysis of Image Thresholding Algorithms for Automated Machine Learning Training Data Generation | |
WO2006002466A1 (en) | Image processing apparatus and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |