US20080285802A1 - Tailgating and reverse entry detection, alarm, recording and prevention using machine vision - Google Patents
- Publication number
- US20080285802A1
- Authority
- US
- United States
- Prior art keywords
- access
- area
- machine vision
- controlled
- access control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/10—Movable barriers with registering means
- G07C9/15—Movable barriers with registering means with arrangements to prevent the passage of more than one individual at a time
Definitions
- the present invention relates to detection, alarming, recording and prevention of unauthorized entry at entrances, doors, gates, passages, and the like. More specifically, this invention relates to application of machine vision methods to the detection, alarming, recording and prevention of tailgating (also known as piggybacking) and/or reverse entry events.
- Tailgating (also known as piggybacking) is a significant problem in a wide variety of security and access control applications. Tailgating or piggybacking is the entry into or out of a controlled area or through a controlled access portal of more persons, objects or vehicles than are allowed by access control rules. For example, a tailgating event occurs when persons, generally on foot or in a vehicle, attempt to gain access to an area for which they do not have the required credentials. Another example of tailgating occurs when an unauthorized person on foot or in a vehicle attempts to follow a person (again on foot or in a vehicle) with proper access credentials into a controlled access area. A variation on this approach is for the unauthorized person on foot or in a vehicle to attempt to enter the controlled access area when an authorized person (in a vehicle or on foot) leaves the area.
- a person sits on the shoulders of another person or is carried in some other way by the other person into the controlled access area.
- the participation of the authorized individual may be inadvertent, voluntary, or coerced.
- these cases and other similar cases are referred to as tailgating.
- This unauthorized use of an exit portal can be referred to as reverse entry.
- This access control violation arises, for example, when a person attempts to gain access to a controlled area using the exit of a one-way elevator (an elevator intended to be accessible only from inside the controlled area), escalators (by running the wrong direction), one-way revolving doors, or an exit passage.
- persons on foot may attempt to enter a controlled area by going over or under a gate at a vehicle-only entry or exit point.
- the consequences of tailgating and reverse entry can vary widely. For example, the consequences may be purely economic as in the case of a successful perpetrator gaining access to an event venue, transportation or other such area without paying the admission or fare. Operators of entertainment venues, sporting facilities, parking facilities, and transportation systems typically wish to prevent revenue loss from unauthorized entrants. In other cases, a successful perpetrator may steal something of value from the controlled area. Operators of industrial and manufacturing facilities, warehouse and other storage facilities, and housing areas, such as apartments or hotels, wish to prevent loss from theft. In yet other cases, a successful perpetrator may cause serious damage to property or harm to individuals in the controlled area.
- Modern access control systems use a wide variety of technologies and methods, including mechanical keypad or cipher locks, electronic keypad or cipher locks, contact-based or contactless smart cards or smart tokens (generally employing radio frequency or infrared communications), magnetic strip cards, and biometric control methods, such as retinal scans, fingerprint or handprint identification, facial feature identification, and voice print identification.
- Known access control methods do not prevent tailgating or reverse entry on their own.
- human guards and persons authorized to access a controlled area may assist the perpetrators willingly or unwillingly, further complicating the situation.
- Break-beam systems have several weaknesses. People can crawl under or jump over a pair of break-beams. A person with another person on their shoulders, or being carried in some other way, is not detected. Since the break-beam requires a light source directly opposite the detector, the beam must not be interrupted by the swing of a door; architectural modifications may thus be required for installation.
- the above systems also disclose no provision for interfacing with external access control or other security systems. Further, the optical break-beams may not work in high ambient light conditions.
- this approach cannot detect cases where a person carrying another person on their shoulders or in some other way passes through the array.
- a four-legged animal passing through the array will likely trigger a false alarm.
- Architectural modifications may be required to force each person to pass through each beam.
- the system disclosed has no provision for interfacing with external access control or other security systems.
- optical break-beams may not work in high ambient light conditions.
- U.S. Pat. No. 4,303,851 issued to Mottier, discloses a system using a pair of video cameras focused on two adjacent tracks through a flat mirror and connected to counter circuits. The tracks are perpendicular to a person's direction of travel. Persons passing through this array are detected and counted. While this approach removes some of the ambiguities associated with break-beam methods, problems still remain. A person with another person on their shoulders or being carried in some other way is not detected.
- the system disclosed has no provision for interfacing with external access control or other security systems. Further, architectural modifications may be required to keep all persons within the field of view of the cameras and to prevent the swing of a door from interfering with the field of view of the cameras.
- U.S. Pat. No. 4,847,485, issued to Koelsch, and U.S. Pat. No. 4,799,243, issued to Zepke, disclose systems applying arrays of pyroelectric infrared sensors to directionally count people entering or leaving through a passage.
- the system in U.S. Pat. No. 4,799,243 employs a single linear array of sensors that may not detect a person with another person on their shoulders or being carried in some other way. Further, it is unclear whether several people entering or leaving in close physical proximity would be correctly detected.
- U.S. Pat. No. 4,847,485 attempts to overcome these deficiencies through the use of multiple sensor arrays. This approach has the drawback that it requires architectural modifications since each person must be forced to walk through all the arrays.
- the systems disclosed have no provision for interfacing with external access control or other security systems. Further, architectural modifications may be required since the swing of a door must not affect the area monitored by the sensors. Both systems are also subject to environmental restrictions since they use pyroelectric sensors, and are unsuitable for vehicle entrances.
- U.S. Pat. No. 5,866,887 issued to Hashimoto et al., discloses a system that applies a similar approach but uses a moving sensor and pattern recognition to reduce both the cost and the ambiguity inherent in detecting multiple people at the same time. This system does not overcome all the aforementioned deficiencies, since the approach still relies on sensing biomass through detection of the body heat of people.
- U.S. Pat. No. 5,201,906 issued to Schwarz et al., discloses a system that applies a set of ultrasonic sensors in a revolving door structure.
- the sensors are interfaced to a local access control system to prevent or detect piggybacking.
- the ultrasonic sensors determine if more than one person is in one compartment or more than one compartment is occupied.
- This approach requires architectural modifications to most facilities, since a revolving door is required. The rate at which people can pass through the revolving door is likely less than a conventional door with security access. Further, this approach is unsuited for vehicle entrances.
- U.S. Pat. No. 6,081,619 issued to Hashimoto et al., discloses a system that employs either linear or angular infrared distance or range-finding arrays.
- This approach has drawbacks in that some embodiments require architectural modifications, since each person must be forced to walk through the array and the observation area of the sensors must not be affected by the swing of a door.
- the system disclosed has no provision for interfacing with external access control or other security systems.
- this system is subject to environmental restrictions since it uses infrared technology and is unsuitable for vehicle entrances.
- Motion-detection video systems use frame-differencing and related methods applied to the output of a video camera. These methods suffer from problems such as changes in lighting and shadowing. Overlapping objects are often difficult to separate, since there is no depth (three-dimensional) analysis and no attempt is made to analyze and track individual objects.
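The lighting sensitivity of frame-differencing can be seen in a minimal sketch. This is an illustrative example only (NumPy, with an arbitrary threshold and synthetic frames), not part of the disclosed system:

```python
import numpy as np

def motion_mask(prev_frame, curr_frame, threshold=25):
    """Classic frame-differencing motion detection (illustrative sketch).

    Pixels whose grayscale intensity changes by more than `threshold`
    between consecutive frames are flagged as motion.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

# A static 8x8 scene...
prev = np.full((8, 8), 100, dtype=np.uint8)
# ...with one small moving object (bright patch) in the next frame.
curr = prev.copy()
curr[2:4, 2:4] = 200

print(motion_mask(prev, curr).sum())  # 4 -- only the object pixels

# A mere lighting change (every pixel brightens by 40) floods the mask
# with false motion -- the weakness noted above.
lit = (prev + 40).astype(np.uint8)
print(motion_mask(prev, lit).sum())   # 64 -- every pixel flagged
```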
- U.S. Pat. No. 5,581,625, issued to Connell discloses a system that uses a pair of stereoscopic video cameras and associated machine vision analysis to separate individual objects (e.g., people) in a line that partially overlap from the perspective of the stereo camera pair. This system makes no attempt to determine the motion of the people observed. Furthermore, there is no provision for interfacing the system with external access control or other security systems.
- the present invention overcomes the deficiencies of prior art systems by using advanced machine vision methods and providing optional integration with access control systems and other security apparatus. Unlike prior art systems, preferred embodiments of the present invention do not require modification of buildings or other facilities. Further, preferred embodiments of the invention provide more reliable operation in practical circumstances, such as when multiple people are using an entrance or exit and under variable light and shadow conditions.
- Embodiments of the invention are preferably configured to operate on any type of entrance or exit, including those with doors, gates, passages, elevators, escalators, and the like.
- the invention may be applied to persons on foot, animals, vehicles, persons in vehicles, packages (e.g., parcels, luggage, etc.), and any other type of object entering or exiting a controlled access area.
- the invention can be used to monitor a single designated access point, such as an entry or exit point, or can be applied to any number of distributed entry or exit points at one or more controlled access areas.
- Machine vision methods employed in preferred embodiments of the invention include three-dimensional (3D) surface analysis of objects in the image. This allows the system to distinguish and classify multiple objects that may overlap in the field of view or be observed under conditions of variable light and shadow.
- the system may use other machine vision methods, including feature extraction and pattern analysis and recognition, to enhance the identification and tracking of objects.
- Systems constructed according to preferred embodiments may interface with a variety of electronic access control equipment, including electronic keypad or cipher locks, contact-based or contactless smart cards or tokens (generally employing radio frequency or infrared communications), magnetic strip cards, and biometric control methods, such as retinal scans, fingerprint or handprint identification, facial feature identification, and voice print identification.
- the systems may also employ machine vision methods to monitor doors, gates, elevators, passages, escalators, etc., with mechanical access control or no access control at all.
- the systems may further be used to provide supplemental automated monitoring of entrances and exits that are monitored with human guards.
- a preferred system of the invention can be fully integrated with security monitoring and access control systems.
- the integrated system provides audible and visible alarms to alert security personnel when a tailgating or reverse entry event has occurred.
- a preferred system of the invention also provides one or more video outputs from event monitoring cameras. These video outputs can be switched to display video of an incident scene for security personnel and to make a video record of the event.
- the video display and record may include textual and graphical information indicating the location of the incident, relevant personnel identification or access codes, and date and time, for example.
- Ambiguous situations arising from objects that cannot be accurately classified by machine vision methods may signal an alert to security personnel to manually investigate the situation.
- An example of an ambiguous situation is where two people attempt to enter using only one access credential and throw a blanket or coat over their heads in an attempt to evade the system.
- Another example arises when a person enters with a large animal (e.g., guide dog).
- the system provides video that can be switched to a display to aid in resolution of the event.
- One aspect is to use machine vision methods to detect and prevent unauthorized individuals employing tailgating or reverse entry methods from gaining access to a controlled area.
- Such tailgating or reverse entry methods may involve two or more persons on foot traveling in the same or different directions, persons in two or more vehicles traveling in the same or opposite directions, a single person attempting to enter through an exit only access point, a person using a vehicle entrance while on foot, etc.
- Machine vision methods that include stereo image analysis, feature detection, color analysis, and pattern recognition or matching, for example, may be used to detect and prevent such unauthorized access.
- Another aspect is to capture, route, display, and/or record event video and other information about a tailgating or reverse entry event, including identification of the person(s) involved (if known), date, time, location, etc.
- This data processing may be under the control of the machine vision processing system.
- a further aspect of the invention is to prevent tailgating or reverse entry at one or more possibly remote locations with doors, gates, escalators, elevators, passages, and other entry and exit points without the need for architectural modifications of existing facilities.
- Preferred embodiments of the invention are configured to operate with any available access control system, including human guards, electronic systems, and mechanical systems, through appropriate interfaces and using machine vision capability to monitor doors, gates, and other entrances or exits.
- the cost of employing machine vision methods may be reduced by using a single processing system to monitor multiple controlled areas through the use of video switching to share image processing capabilities.
- other event information from sources such as electronic access control systems and door or gate sensors, may be combined with the machine vision methods of the invention to more effectively prevent tailgating or reverse entry.
- Yet another aspect of the invention is to provide built-in initialization, calibration, and on-going test methods, including continuous monitoring and analysis of a background image or scene using machine vision methods, including stereo image analysis, to ensure the integrity and accuracy of the machine vision system and to prevent attempts by perpetrators to alter the background.
- Still another aspect of the invention is to provide a tailgating and reverse entry detection system with greater resistance to environmental conditions, including changing light and shadowing, through the use of electronic camera control, interactive user interface, and machine vision methods including feature extraction and stereo analysis. Interference from doors, gates, and other access limiting structures may be eliminated, regardless of the direction of opening or swing of the structures, particularly in embodiments where the cameras are placed above and beyond the reach of the access limiting structures.
- FIG. 1 is an overall block diagram of one preferred embodiment of a system constructed according to the invention;
- FIG. 2 is a block diagram of a machine vision processing system that may be used in the embodiment of the invention shown in FIG. 1;
- FIGS. 3A, 3B, 3C, 3D, 3E, 3F, 3G, and 3H depict process flow diagrams that may be employed by the embodiment of the invention shown in FIG. 1;
- FIGS. 4A, 4B, 4C, 4D, 4E, 4F, 4G, and 4H depict examples of stereo image analysis performed by the embodiment of the invention shown in FIG. 1; and
- FIGS. 5A, 5B, 5C, 5D, and 5E depict an example of a set of interactive displays for an operator interface.
- a first preferred embodiment of the invention uses a stereo pair of tracking cameras and a single event capture camera combined with machine vision processing to detect or prevent tailgating and/or reverse entry events.
- FIG. 1 depicts a machine vision processing system 10 that is capable of processing video images and detecting tailgating or reverse entry events.
- the machine vision processing system 10 receives video inputs from cameras, here a stereo pair of tracking cameras 12 , and an event capture camera 14 . Other camera inputs to the machine vision processing system 10 may be included if desired.
- the overlapping viewing area of these cameras is shown in FIG. 1 as an area of observation 16 in which objects 18 are identified, classified, and possibly tracked by the machine vision processing system 10 .
- Part of the area of observation 16 may optionally be defined by a door, gate, portal, elevator door, turnstile, or other access limiting structure 20 . It will be understood that the invention is applicable to any type of access limiting structure which people or objects can pass through, and which can include (but are not limited to):
- One or more planar doors possibly interlocking when closed, that open and close with hinges on the top, bottom or side, and are operated manually or automatically;
- planar doors possibly interlocking when closed, that slide up, down or sideways, and are operated manually or automatically;
- One or more gates or doors possibly interlocking when closed, that do not completely obstruct the opening or portal and may open in any manner, and are operated manually or automatically;
- Revolving structures such as revolving doors or turnstiles, which partially or fully occupy the opening or portal and are operated manually or automatically.
- the area of observation 16 can be in a passage, escalator, or other limited access area that is not defined by an access limiting structure.
- the stereo pair of tracking cameras 12 are preferably placed overhead of the area of observation 16 , but may be placed at any convenient location with a clear view of the area of observation.
- the stereo tracking cameras 12 are held in an adjustable bracket that allows their positions to be adjusted but maintains their alignment, and is sturdy enough not to move from these settings with time. Camera positions may be adjusted manually or by motor drive that is locally or remotely controlled.
- the event camera 14 is ideally placed at a location that gives a clear view of persons, vehicles, or other objects in the area of observation 16 . Brackets for the cameras may be attached to any solid surface, such as a ceiling or wall, and thus do not require architectural modification of the area.
- the cameras 12 and 14 can be of an analog or digital type. In a preferred embodiment, these cameras produce stereoscopic color images with two or more color-band signals. Standard color cameras that produce images in three color-bands may be used for cost reasons.
- the machine vision processing system 10 interfaces with an optional access control system such as area access controller 24 and one or more optional local access controllers 22 .
- the area access controller 24 monitors and controls access at all designated access points, such as entry and exit points, to a particular controlled area or areas. Centralized control and monitoring is often provided at a centralized security desk.
- Local access controllers 22 control one or more doors or other access limiting structures 20 .
- the area access controller 24 may interface with the local access controllers 22 via wired or wireless communication of signals to maintain centralized control and monitoring of all access limiting structures 20 in the area or areas being controlled.
- Local access controllers 22 may include, for example, electronic keypad or cipher locks, contact-based or contact-less smart cards or electronic tokens (generally employing radio frequency or infrared communications), magnetic strip cards, and biometric control methods, such as retinal scans, fingerprint or handprint identification, facial feature identification, and voice print identification.
- the area access controller 24 and local access controllers 22 may be configured to send access control information to the machine vision processing system 10 .
- This data may include status of doors or other access limiting structures 20 , data from other access monitoring systems such as break-beams (not illustrated), and personnel identification or access codes.
- the machine vision processing system 10 may send control data or signals to the area access controller 24 or the local access controllers 22 to “lock down” a facility or to close a particular door or gate when a tailgating or reverse entry incident is detected.
- the machine vision processing system 10 may further interact with alarms and/or annunciators 26 which can include, for example, bells, sirens, machine-generated or recorded speech, lights, image displays, and text displays.
- the alarms and annunciators 26 may be local to the area of observation 16 or near the area access controller 24 or a centralized security desk.
- the alarms and annunciators 26 may be connected to the area access controller 24 or the one or more local access controllers 22 .
- the machine vision processing system 10 can send a signal to these controllers to trigger the alarms and annunciators. This alternative configuration does not change the scope, spirit, or functionality of the invention.
- the machine vision processing system 10 supplies video, typically from the event camera 14 , incident data, and control data or signals to an optional video display 28 and optional video recorder 32 .
- the video display 28 may be used by security or other personnel to observe the area of observation 16 , especially when a tailgating or reverse entry incident is detected.
- the video display 28 may have the ability to receive multiple video signals and switch between the multiple video signals.
- the machine vision processing system 10 preferably switches video produced by the event camera 14 at the scene of the incident for display on the video display 28 .
- the video display 28 may include textual or graphical data describing the location of the incident and pertinent personnel identification or access codes.
- the video display 28 may be at a centralized security desk or at some local location. Preferably, video of a period of time before and after detection of an incident is preserved by the video recorder 32 for future reference. Incident data, including incident location, date, time, and personnel identification or access codes, may also be recorded.
- the machine vision processing system 10 preferably interfaces with an operator interface 30 .
- the operator interface 30 may be used by an operator to provide commands for configuring, initializing, calibrating, and testing of the machine vision processing system 10 .
- the operator interface 30 may provide a textual or graphical user interface for presenting data and command prompts to the operator.
- the user interface may be interactive and enable the operator to configure properties of the system, including properties of the image analysis, properties of the machine vision processing system, and properties of access control systems, such as the local and area access controllers.
- Some embodiments of the invention may use different types of alarms for different situations. These alarms can have different audible and visual properties. Some possible examples of these different alarms can include:
- a warning alarm that may be used in non-critical situations such as someone starting to travel in the wrong direction through a portal (but not yet violating the security policy);
- An alarm specific to the type of security violation such as tailgating, which may have distinctive audible or visible patterns; and
- An alert indicating an unusual situation such as people or objects traveling too close together, or an object in the area of observation that cannot be unambiguously identified.
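The alarm categories above might be organized as a simple mapping from detected situations to alarm types. The event names and enum below are hypothetical illustrations, not identifiers from the disclosed system:

```python
from enum import Enum

class AlarmType(Enum):
    WARNING = "warning"        # non-critical, e.g. wrong-direction start
    VIOLATION = "violation"    # confirmed breach with distinctive signal
    AMBIGUOUS = "ambiguous"    # unresolvable situation, alert a guard

# Hypothetical mapping mirroring the examples listed above.
ALARM_FOR_EVENT = {
    "wrong_direction_started": AlarmType.WARNING,
    "tailgating": AlarmType.VIOLATION,
    "reverse_entry": AlarmType.VIOLATION,
    "objects_too_close": AlarmType.AMBIGUOUS,
    "unidentified_object": AlarmType.AMBIGUOUS,
}

print(ALARM_FOR_EVENT["tailgating"].value)  # violation
```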
- FIG. 2 illustrates a block diagram of one suitable embodiment of the machine vision processing system 10 .
- Other embodiments may include additional or fewer components than those shown in FIG. 2 .
- the machine vision processing system 10 may be housed in a single cabinet, but in other applications its components can be physically distributed across multiple enclosures or locations.
- the machine vision processing system 10 may also be built using redundant components to resist failure, damage, and tampering.
- video input from the stereo tracking cameras 12 and the event camera 14 is received through a video switch 50 .
- the video switch 50 may also receive input from other video cameras, possibly at other areas of observation.
- the video switch 50 feeds the video signals from the cameras to a frame grabber 52 where one or more video frames from the signals are captured and stored for processing.
- the frame grabber 52 may need to accommodate only one input signal at a time. If analog cameras are used, the frame grabber 52 preferably digitizes the video images before storing them.
- a stereo image processor 54 and feature extractor 56 operate on the image frames stored by the frame grabber 52 .
- the stereo image processor 54 operates on the stereoscopic color images received from the stereo tracking cameras 12 , and preferably performs 3D surface analysis of the objects 18 and background in the area of observation 16 . Suitable methods for 3D surface analysis and background analysis are known, and can include those disclosed in U.S. Pat. No. 5,581,625, issued to Connell, the disclosure of which is entirely incorporated by reference herein.
- the stereo image processor 54 may also use input from the feature extractor 56 .
- the feature extractor 56 identifies features such as edges and corners in images received from the stereo tracking cameras 12 and possibly the event camera 14 .
- the stereo image processor 54 can use the feature information from the feature extractor 56 to test different hypotheses in determining the best stereo image depth map.
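As one illustration of how a stereo depth map can be computed by testing shift hypotheses, the sketch below performs brute-force block matching on a single scanline. The patch size, disparity range, and synthetic scanline are assumptions for illustration; the patent does not disclose a specific algorithm:

```python
import numpy as np

def disparity_row(left_row, right_row, patch=3, max_disp=8):
    """Brute-force 1D block matching for one scanline (illustrative).

    For each patch in the left image, find the horizontal shift into
    the right image with minimum sum-of-absolute-differences; that
    shift is the disparity, which is inversely related to depth.
    """
    n = len(left_row)
    disp = np.zeros(n, dtype=int)
    half = patch // 2
    for x in range(half, n - half):
        best_cost, best_d = float("inf"), 0
        lo = left_row[x - half:x + half + 1]
        for d in range(0, min(max_disp, x - half) + 1):
            ro = right_row[x - d - half:x - d + half + 1]
            cost = np.abs(lo.astype(int) - ro.astype(int)).sum()
            if cost < best_cost:
                best_cost, best_d = cost, d
        disp[x] = best_d
    return disp

# Synthetic textured scanline; the right view is shifted 3 pixels.
scene = np.arange(0, 200, 10)
left, right = scene[:16], scene[3:19]
print(disparity_row(left, right)[8])  # 3
```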
- the outputs of the stereo image processor 54 and feature extractor 56 are fed to an object identification processor 58 .
- the object identification processor 58 uses the 3D surface data and image features extracted from the images to identify and possibly also classify objects 18 in the area of observation 16 .
- the objects 18 are identified and classified using pattern recognition methods.
- pattern recognition involves comparing patterns in the 3D surface data and image features with equivalent data and features of known or previously-identified objects. A pattern match within a specified tolerance constitutes recognition of the pattern.
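The tolerance-based match described above can be sketched as a nearest-neighbor comparison of feature vectors. The feature choices (height, footprint area, aspect ratio) and the class values below are hypothetical, and an out-of-tolerance result corresponds to the ambiguous case that raises an alert for manual investigation:

```python
import numpy as np

# Hypothetical feature vectors (height in m, footprint area in m^2,
# aspect ratio) for known object classes -- values are illustrative.
KNOWN_PATTERNS = {
    "person":  np.array([1.7, 0.25, 4.0]),
    "vehicle": np.array([1.5, 7.50, 0.4]),
    "cart":    np.array([1.0, 1.00, 1.2]),
}

def classify(features, tolerance=0.5):
    """Return the nearest known pattern within `tolerance`, else "ambiguous"."""
    best_label, best_dist = None, float("inf")
    for label, pattern in KNOWN_PATTERNS.items():
        dist = float(np.linalg.norm(features - pattern))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= tolerance else "ambiguous"

print(classify(np.array([1.68, 0.3, 3.9])))  # person
print(classify(np.array([3.0, 4.0, 9.0])))   # ambiguous
```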
- the object identification processor 58 stores image and object data in the object and image memory 62 .
- the object identification processor 58 may use object and image data stored in the memory 62 from previous video frames to aid in the identification and classification of objects. Correlation of the objects from image to image can be based on one or more properties of the object, including size, shape, or color. This may be useful in cases, for example, where a more certain identification of an object in one image can be used to identify the object in another image where the identification may be less certain.
- Data is provided from the object identification processor 58 to the object tracking processor 60 .
- the object tracking processor 60 uses the object and image data from the current frame and object and image data stored in the memory 62 from previous frames to maintain track records or track files of the objects 18 in the area of observation 16 . Using this information, the object tracking processor 60 determines the trajectory and speed of objects 18 in the area of observation. This can be done, for example, where an object in an image is correlated with an object having the same or similar classification in one or more prior images and with motion along some expected trajectory, or where the motion from one image to the next is limited.
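A minimal sketch of the frame-to-frame correlation and speed estimation, assuming object centroids in metres and a fixed frame interval. The greedy nearest-neighbour matching and the `max_jump` motion limit are illustrative simplifications of the limited inter-frame motion mentioned above:

```python
import math

def update_tracks(tracks, detections, dt=1 / 30.0, max_jump=0.5):
    """Greedy nearest-neighbour track update (illustrative sketch).

    `tracks` maps a track id to its last (x, y) centroid in metres;
    `detections` lists centroids found in the current frame.  Each
    detection is matched to the closest unclaimed track within
    `max_jump` metres; unmatched detections start new tracks.  Returns
    the updated tracks and the speed (m/s) of each matched track.
    """
    speeds = {}
    used = set()
    next_id = max(tracks, default=-1) + 1
    for det in detections:
        best_id, best_d = None, max_jump
        for tid, pos in tracks.items():
            if tid in used:
                continue
            d = math.dist(det, pos)
            if d <= best_d:
                best_id, best_d = tid, d
        if best_id is None:
            tracks[next_id] = det      # new object entered the scene
            next_id += 1
        else:
            used.add(best_id)
            speeds[best_id] = best_d / dt
            tracks[best_id] = det
    return tracks, speeds

tracks = {0: (1.0, 1.0)}
tracks, speeds = update_tracks(tracks, [(1.1, 1.0), (5.0, 5.0)])
# Track 0 moved 0.1 m in one 1/30 s frame -> 3 m/s; the far detection
# is beyond max_jump and starts a new track.
print(round(speeds[0], 1))  # 3.0
print(sorted(tracks))       # [0, 1]
```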
- the object tracking processor 60 provides tracking information to an alarm and system controller 64 .
- the alarm and system controller 64 applies access control rules to the object tracking data to determine if a tailgating or reverse entry event or other questionable incident has occurred.
- Alarm decisions may also be made by applying access control rules to data received from the access controllers 24 , 22 , which can include access credentials or access codes identifying the individual or individuals.
- the rules applied in any particular application of the invention for signaling an alarm may depend on the security regulations that are specified for the controlled area being protected. Examples of access control rules that may actuate an alarm include (but are not limited to):
- Reverse entry situations, e.g., objects coming into an area of observation from an exit-only location.
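The rule evaluation might be sketched as follows, under assumed inputs: the number of objects the tracker saw crossing the portal, the number of valid credentials presented, the travel direction, and the portal mode. The rule and parameter names are illustrative, not part of the disclosure:

```python
def check_access_rules(objects_crossed, credentials_presented,
                       direction, portal_mode):
    """Apply simple access-control rules to per-event tracking data.

    Returns the list of detected violations (empty means the event
    is clean).  `portal_mode` is one of "entry", "exit", or "both".
    """
    violations = []
    # More objects crossed than credentials presented -> tailgating.
    if objects_crossed > credentials_presented:
        violations.append("tailgating")
    # Inbound travel through an exit-only portal -> reverse entry.
    if direction == "in" and portal_mode == "exit":
        violations.append("reverse_entry")
    return violations

# Two people crossed on one badge.
print(check_access_rules(2, 1, "in", "entry"))  # ['tailgating']
# Someone entering through an exit-only portal.
print(check_access_rules(1, 1, "in", "exit"))   # ['reverse_entry']
```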
- the alarm and system controller 64 alerts operators or security personnel as required using an interface to the alarms and annunciators 26 .
- the alarm and system controller 64 may also initiate a display of video from the event camera 14 of the area of observation 16 , possibly along with other event data, including location and access code or personnel identification, on the video display 28 .
- Event camera video transmitted to the video recorder 32 possibly with associated data, including event location, date, time, and access code or personnel identification, may be preserved for future reference.
- Operators and technical personnel using the operator interface 30 can interact with the machine vision processing system 10 through the alarm and system controller 64 .
- the operator interface 30 provides commands for configuration, initialization, calibration, and testing of the alarm and system controller 64 .
- the alarm and system controller 64 can provide textual, audio, and graphical information to the operator through the operator interface 30 .
- the first embodiment described herein processes data from one or more sets of the stereo tracking cameras 12 and possibly the event camera 14 on a frame-by-frame basis.
- the machine vision processing system 10 operates on the frame-by-frame image data to detect, announce, and where possible prevent, tailgating or reverse entry events.
- FIGS. 3A , 3 B, 3 C, 3 D, 3 E, 3 F, 3 G, and 3 H One exemplary process used by a machine vision processing system 10 as shown in FIG. 1 is described in FIGS. 3A , 3 B, 3 C, 3 D, 3 E, 3 F, 3 G, and 3 H. It will be understood that the particular sequence and nature of the actions described can be altered without changing the spirit, function, or capability of the invention.
- the machine vision processing system 10 starts processing, generally as a result of power-up, a boot event, or other initialization event.
- the electronics are initialized at block 102 , followed by software initialization at block 104 .
- failure information is preferably displayed at block 116 to an operator using the operator interface 30 .
- the information presented to the operator will be as diagnostic as possible to assist in correcting the failure or fault.
- the operator or other personnel may take corrective action at block 118 and attempt to start or initialize the system again at block 100 . Examples of failures that may be detected on initialization or during on-going operation include:
- a background image for the stereo tracking cameras 12 and possibly the event camera 14 is detected and saved for reference at block 108 .
- a previously detected and saved background image may be used.
- Objects in the background image are identified by the machine vision processing system 10 and noted as belonging to the background image. Accordingly, during run-time operation, new objects that appear in the area of observation 16 may be identified and classified separately from the background image.
- a calibration procedure may be performed, as indicated at block 110 .
- a typical calibration procedure involves moving a target of known size, shape, and pattern across the area of observation 16 at different heights above the base surface or floor and collecting data with the stereo tracking cameras 12 . Once the calibration data is collected, the operation of the cameras and object identification and tracking processing is verified and the calibration parameters are saved 112 . If, at block 114 , any failures are detected during the collection and saving of the background image (block 108 ) or the calibration process (block 110 ), information relating to the failure is displayed at block 116 to the operator using the operator interface 30 . Generally, the information presented will be as diagnostic as possible to assist in correction of the failure or fault. In the case of a failure, the operator may take corrective action (block 118 ) and then attempt to start or initialize the system again (block 100 ).
- the system 10 starts at block 120 to capture video frames from the one or more sets of stereo tracking cameras 12 .
- Video from the event camera 14 may also be captured.
- image features are extracted at block 122 , and stereo image analysis of the color image is performed at block 124 . This may include methods such as those taught in U.S. Pat. No. 5,581,625, referenced earlier and incorporated by reference herein.
- the image from the event camera is stored.
- tests such as thresholding are applied to the stereo image analysis to determine if the background image has changed significantly.
- objects in the background are identified in the image or depth map, and if they are not clearly visible, an alarm is sounded. This may occur, for example, if someone has covered one of more of the cameras. Changes in the background can indicate an attempt to alter the background by a perpetrator or an equipment failure. If required, the focus, aperture, and shutter settings of the cameras may be electronically updated at block 130 . The need for and availability of these settings is determined in large part by the particular model of cameras chosen.
- objects in the image are extracted at block 131 .
- a 3D surface analysis of the objects may be updated at block 132 ( FIG. 3C ).
- Classification of the objects may be verified at block 134 to ensure that the classification of the objects remains a good match.
- the objects identified in the current frame are compared to those identified in a previously collected frame, as shown at block 136 .
- the machine vision processing system 10 determines whether new objects appear in the image or if unexpected changes in any objects are detected. If objects seen in the previous frame are no longer observed in the current image, as determined at decision block 140 , the data on these objects is removed at block 144 from the tracking file and the history file is marked at block 146 with the event as desired.
- the motion of each object in the image is determined with respect to those objects in the previous frame, as indicated at block 148 , and the track files for the objects are updated at block 150 .
- the machine vision processing system 10 receives available information from external sources such as the area or local access controllers 24 and 22 . Based on the images, the track files for the objects, and data from the access controllers, access control rules as previously discussed are applied at block 154 to determine at decision block 156 if there is an alarm condition. For example, the locations of one or more objects entering one side of the area of observation 16 and leaving at another side of the area of observation are recorded in the track file.
- This information along with access credentials and other access control information can be used to determine if tailgating (piggybacking) or reverse entry is occurring.
- information for an object in the track file and the possible lack of connection to other objects can be used to determine if thrown objects have traversed the area of observation in violation of access control rules.
- objects are identified as being thrown based on their speed and/or trajectory of motion through the area of observation.
- the machine vision processing system 10 triggers audible and visible alarms at block 158 , and optionally annunciates alarm condition information at block 160 using the alarms and annunciators 26 ( FIG. 1 ).
- pertinent video and alarm condition information for the area of observation 16 may be displayed at block 162 ( FIG. 3E ) using the video display 28 .
- control signals or data indicating the alarm condition and possible required actions are sent to the area and local access controllers 24 , 22 , as shown at block 164 .
- Alarm information may be logged at block 166 for future analysis as required and relevant video segments obtained from the event camera 14 may be saved along with other event information at block 168 by the video recorder 32 .
- the machine vision processing system 10 preferably triggers audible and visible alarms at block 170 ( FIG. 3F ), to alert personnel of the failure.
- the machine vision processing system 10 may also annunciate the failure condition at block 172 and display the failure condition information at block 174 using the alarms and annunciators 26 , the operator interface 30 , and possibly the video display 28 .
- the failure information is logged at block 176 for later analysis if required. The failure may be resolved by an operator or technical personnel at block 178 , who then reinitialize the system at block 100 ( FIG. 3A ).
- the stereo image analysis performed at block 124 is performed for these objects at block 180 ( FIG. 3G ).
- This analysis preferably includes 3 D surface analysis. Based on the 3D surface analysis (and possibly also features extracted as in block 122 with other image characteristics), the new object is classified at block 182 , which may use pattern recognition or matching methods, as discussed earlier. If the object is determined to match a known object type or types, as indicated at block 184 , the object identification and object characteristics and position are logged to a tracking file as indicated at block 188 .
- the characteristics of the object are tested at block 190 ( FIG. 3H ) to determine if the changes exceed significance limits with respect to either the object previously identified or with respect to known image classification data. This situation may arise, for example, when an object or some characteristic of the object appears to change more than a certain amount from one image to the next. If these limits are not exceeded, the object position and characteristic information is added to or updated in the track file at block 204 .
- the machine vision processing system 10 triggers audible and visible alarms at block 192 to alert personnel of the existence of the unexpected or ambiguous object or change in object.
- the system 10 may also annunciate information at block 194 about the object and its location using the alarms and annunciators 26 .
- Information about the object, its location and other data, such as personnel identification or access codes, may further be displayed at block 196 , along with video from the event camera 14 , using the video display 28 .
- Information on the unknown object is logged at block 198 as required for later investigation.
- Personnel may investigate the object, as indicated at block 200 , and if desired, enter updated classification information at block 202 using the operator interface 30 . This updated information will typically be used to improve or extend the classification of common objects.
- a reverse entry detection system can be identical to a piggyback detection system, but configured with access control rules that, when applied, do not accept any object traveling in a certain direction.
- access control rules that, when applied, do not accept any object traveling in a certain direction.
- the machine vision processing system 10 may have the ability to directly control the door 20 or other access limiting device. In other embodiments, the machine vision processing system 10 may indirectly control the door or other access limiting device by sending signals to the area access controller 24 and/or local access controllers 22 . This alternative configuration does not change the functionality, scope, or spirit of the invention. Examples of these controls can include the following:
- the machine vision processing system can close and perhaps lock the door.
- the machine vision processing system can directionally lock (perhaps after closing) the door (i.e., allowing exits but not entrances).
- the machine vision processing can close and lock (or directionally lock) one or both of the doors in a man-trap (see discussion below regarding a seventh embodiment of the invention).
- the machine vision processing system can stop and perhaps reverse a rotating door or escalator.
- the machine vision processing system 10 may receive information from a position sensor attached to the door 20 or access limiting device.
- the position sensor may be comprised of known electronic components that produce a variable signal based upon the position of the door. For example, when the door is closed, the position sensor may encode and transmit one signal, and when open, encode and transmit another signal. Door positions of a partially open door may also be encoded and transmitted so the relative position of the door is known to the system 10 .
- the machine vision processing system 10 can use this door position information to make inferences on the background image in the area of observation 16 by a priori predicting where the image of the door should be seen.
- position sensor information received for various positions of objects, such as a door may be correlated and recorded for determining the background image that is used during normal run-time analysis.
- FIGS. 4A , 4 B, 4 C, 4 D, 4 E, 4 F, 4 G, and 4 H These images depict the outcome of a real-time 3 D surface analysis performed by the machine vision processor 10 shown in FIG. 2 .
- the lighter the shade of the image the closer the surface is to the vantage point of the stereo tracking camera pair 12 .
- FIG. 4A depicts the results of a 3D surface analysis of a background image.
- Two objects 250 , 252 are seen in this image, each of which are classified as part of an open door that has swung into the field of view.
- one or more background images are captured and saved for reference. These background images are used for self-test purposes to prevent unauthorized tampering with the background.
- FIG. 4B depicts a real-time 3D surface analysis of the same background as shown in FIG. 4A (including the door objects 250 , 252 ), now with three people present in the area of observation.
- the three people are labeled here as “A” 254 , “O” 256 , and “P” 258 .
- the arrows associated with the letter labels indicate each person's direction of motion as determined by the machine vision processing system 10 . It will be understood that the machine vision processing system 10 can identify, classify, and track an essentially arbitrary number of moving objects within the field of view. Not only can the machine vision processing system 10 determine and evaluate direction of motion, but also the speed and/or trajectory of each object's motion using known techniques.
- FIG. 4C depicts a 3D surface analysis of the background image ( FIG. 4A ) with a single person 260 entering the field of view. This person has been identified and classified by the machine vision processing system 10 and is labeled here with the letter “K.”
- FIG. 4D depicts the same field of view and background image as FIG. 4C , but the person 260 has moved a few steps. In this case, the person is being tracked as he or she is entering the controlled access area.
- FIG. 4E depicts the results of a 3D surface analysis of the background image ( FIG. 4A ) with two people 264 , 266 (respectively labeled “I” and “C”) traveling in the same direction to the left.
- FIG. 4F depicts the same field of view and background image as FIG. 4E with the two people 264 , 266 having moved further to the left.
- This example illustrates a typical tailgating situation where the first person 266 , labeled with the letter “C” and presumably having correct credentials for the controlled access area, is being closely followed by a second person 264 , here labeled with the letter “I.”
- FIG. 4G depicts the results of a 3D surface analysis of the background image ( FIG. 4A ) with two people 272 , 274 , respectively labeled “J” and “K,” traveling in opposite directions.
- FIG. 4H depicts the same field of view and background image as FIG. 4G with the two people 272 , 274 having moved further in their respective directions.
- This example illustrates a typical reverse entry situation where the first person 274 , labeled with a “K” and presumably having correct credentials for the controlled access area, is exiting the controlled area while the second person 272 , labeled with the letter “J,” attempts to maintain proximity to the first person 274 and enter while the exit door is still open.
- FIGS. 5A , 5 B, 5 C, 5 D, and 5 E provide one example of a set of interactive screen displays for an operator interface 30 . It will be understood that the organization of the operator interface shown in FIGS. 5A , 5 B, 5 C, 5 D, and 5 E is an example only, and that many possible arrangements for the functionality of the operator interface 30 are available. Further, additional functionality can be added to the interface 30 , or alternatively, functionality shown may not be required in all cases.
- FIG. 5A depicts an interactive display screen 500 of the operator interface 30 , with the “Monitor” tab 502 selected.
- This interactive display is used to monitor the operation of the machine vision processing system 10 .
- a display control area 504 interacts with the operator to select the designated access point (e.g., portal) for monitoring, particularly in cases where the machine vision processing system 10 is monitoring more than one portal.
- the properties of the view displayed can be selected using the radio buttons labeled “Normal,” “Show camera views,” “Show tracking image,” “Show image tracking camera 1 ,” “Show image tracking camera 2 ,” and “Show event camera.”
- the one or more views chosen will be routed to the video display 28 .
- For each portal one or more sets of alarm statistics for alarm events 510 can be displayed.
- a reset 508 for the event statistics can be provided for each of the portals. These statistics can also be saved in a log file for further analysis.
- FIG. 5B depicts the “Installer I/O” tab 512 of the interactive display 500 .
- This interactive display can organize functionality useful to personnel installing and maintaining the system of the invention.
- a user can select between one or more possible portals 514 as required.
- Parameters required for correcting the images for camera geometry 516 can be entered interactively, possibly including “Camera height,” “Camera distance to door,” and “Door width.”
- the trigger for one or more possible types of “Outputs” 518 from the machine vision processing system 10 can be selected in a “cross-connect” fashion based on any of the one or more inputs.
- the outputs include (a) suspicious entry, (b) tailgates, and (c) warnings
- the inputs can include one or more relays (four in the example shown) from the one or more portals.
- the machine vision processing system 10 can have multiple levels of security or password access.
- three levels of password-controlled security are used:
- a general user level typically used by security personnel.
- An administrator level with limited configuration privileges, but with the ability to administer user (e.g., security personnel) accounts.
- the Installer I/O display 512 of the interactive operator interface 500 can be used to change the installer password 520 or the administrator password 522 .
- the Installer I/O interactive display can be used to configure properties for alarm sensors 524 , which may include, for example, whether the door contact is closed when the door is “open” or “closed,” whether a valid access contact “opens” or “closes” the door when activated, and whether the alarm is reset when the reset contact is “open” or “closed.”
- FIG. 5C depicts the “Installer Camera Settings” tab 540 of the interactive display 500 .
- This set of interactive tools is typically used by installation or maintenance personnel to configure machine vision parameters, often in conjunction with functionality on the “Installer I/O” tab 512 .
- a display control area 542 allows the operator to select the portal for monitoring interaction, for cases where the machine vision processing system 10 is monitoring more than one portal.
- the “Image Type” 544 of the view displayed can be selected using the radio buttons labeled, “Show Tracking Camera 1 ,” “Show Tracking Camera 2 ,” “Show Tracking Camera 1 (unwarped),” “Show Tracking Camera 2 (unwarped),” “Show Tracking Image,” and “Show Event Camera.”
- the one or more Image Type views chosen may be routed to the built-in display 548 . Updates to camera properties can be invoked using the “Update Now” 546 button.
- the “Physical Setup” 550 or calibration may be controlled from this screen.
- a calibration process used to calibrate the machine vision processing system 10 with respect to the background, is initiated using the “Calibrate” button.
- objects in the background image are identified and registered as pertaining to the background so they do not interfere with run-time identification and analysis of new objects entering and leaving the field of view.
- the “Clear Calibration” button allows the user to remove an old calibration before creating a new one.
- One or more “Sensitivity” 552 settings used by the machine vision processing system to identify objects in the image, can be adjusted with the Installer Camera Settings tab 540 . In this example, a slider control and a numeric display are used for these sensitivity settings, including:
- Cart sensitivity used to properly identify moving carts or other objects on wheels or slides.
- Crawler sensitivity used to determine when a person is crawling in an attempt to evade the system.
- One or more “Configuration” 554 parameters may be used to restrict the portion of the area of observation 16 within which the machine vision processing system 10 searches for one or more types of behavior.
- the types of behavior to search for are selectable and numeric parameters setting the limits of the zones within the area of observation may be established.
- an interactive tool can be used to select the search zones by drawing on an image, for example.
- FIG. 5D depicts the “Setup I/O” tab 560 of the interactive display 500 .
- This set of interactive tools is typically used by installation or maintenance personnel to configure the I/O properties of the machine vision processing system 10 , often in conjunction with functionality on the “Installer I/O” 512 tab and the “Installer Camera Settings” tab 540 .
- a user can select between one or more portals 562 as required.
- the types of “Alarms” 564 enabled can be set. In this example, four choices are available, including (a) warning voice, (b) suspicious voice, (c) buzzer, and (d) light.
- a timeout time for the alarm can be set and a “Reset Alarm” button can be used to reset the alarm state.
- the “Reset Event Statistics” button 566 resets the alarm statistics for the portal.
- a system administrator can change the password using the “Change Admin Password” button 568 .
- Portals can be assigned a name using a text box 570 .
- Time on the area access controller 24 or local access controller 22 can be read and set with a “Time” control tool 572 .
- one or more “Policy” 574 properties can be selected. In this example, there is provided an option to allow multiple entries (or people) per access cycle. Other applicable policies may also be displayed and selected.
- FIG. 5D shows the “Setup View” tab 580 of the interactive display 500 , which is used to set camera views and calibrate the machine vision processing system 10 .
- a portal selection 584 allows the operator to select the portal for setup, for cases where the machine vision processing system 10 is monitoring more than one portal.
- the “Image Type” 586 of the view displayed can be selected using the radio buttons labeled, “Show Tracking Camera 1 ,” “Show Tracking Camera 2 ,” “Show Tracking Camera 1 (unwarped),” “Show Tracking Camera 2 (unwarped),” “Show Tracking Image,” and “Show Event Camera,” for this example.
- the one or more Image Type views chosen will be routed to the built-in display 588 .
- Updates to camera properties can be invoked using the “Update Now” 590 button.
- the properties of the view “Display” 592 can be selected using the radio buttons labeled, “Normal,” “Show camera views,” “Show tracking image,” “Show tracking camera 1 ,” “Show tracking camera 2 ,” and “Show event camera,” in this example.
- the one or more views chosen will be routed to the video display 28 .
- the “Physical Setup” 594 or calibration can be controlled from this screen.
- a calibration process, used to calibrate the machine vision processing system 10 with respect to the background, may be initiated using the “Calibrate” button.
- the “Clear Calibration” button allows the user to remove an old calibration before creating a new one.
- a second preferred embodiment operates in much the same manner as the first embodiment, which has already been described in detail.
- the second embodiment uses a single camera in place of the stereo tracking camera pair 12 that is used in the first embodiment.
- This embodiment loses the benefits of stereoscopic image analysis and 3D surface analysis, but still can use other machine vision methods, including motion tracking, background differencing, image segmentation, texture analysis, and shape analysis.
- a gray scale (non-color) camera may be used as a further cost reduction, but losing the benefits of color analysis.
- a third preferred embodiment operates in much the same manner as the first embodiment, which has already been described in detail.
- the third embodiment uses gray scale cameras in place of the color stereo tracking camera pair 12 and color event camera 14 used in the first embodiment.
- This embodiment loses the benefits of color analysis, but still uses machine vision methods, including stereoscopic image and 3D surface analysis, to classify and track objects and determine whether their presence and/or activity is authorized.
- a gray scale (non-color) camera is used for the event camera 14 and color cameras are used for the stereo camera pair 12 .
- This alternative retains the benefits of color analysis for the object identification and tracking capability, and only reduces the cost and capability of the event camera.
- a fourth preferred embodiment extends the functionality of the first embodiment by providing two or more pairs of stereo tracking cameras 12 and/or two or more event cameras 14 to an area of observation 16 .
- the fourth embodiment operates in much the same manner as the first embodiment, which has already been described in detail, but with an improved ability to resolve certain ambiguous situations.
- An example of an ambiguous situation is where one object fully or mostly obscures another object from the point of view of the first set of cameras.
- Another advantage of this extended embodiment is the ability to better identify and classify objects since stereoscopic analysis (including, but not limited to, 3D surface analysis) can be performed from more than one vantage point.
- This alternative extended embodiment retains all other functionality and scope of the first embodiment.
- a fifth preferred embodiment employs the same techniques of the first embodiment, but extends the functionality to include the detection, tracking, and alarming of thrown or dropped objects.
- a perpetrator may wish to pass an unauthorized object into a controlled area.
- the stereo camera 12 tracks any objects that are thrown or dropped in the area of observation 16 .
- a perpetrator may wish to leave an unauthorized object in the area of observation.
- the machine vision processing system 10 tracks the thrown or dropped objects using machine vision techniques that may include stereoscopic image processing, video motion detection, analysis of connections to other objects (to determine if the object is really traveling through the air or is making some other type of motion, such as being swung by a person), creation and maintenance of track files (to determine the trajectory of the thrown or dropped objects), shape analysis, image segmentation, and pattern recognition (to identify the type of object being dropped or thrown).
- the machine vision processing system 10 may determine if the dropped or thrown object is entering or leaving the controlled area, and may trigger an alarm only if the object is entering the controlled area (or vice versa, exiting the controlled area).
- the alarms or annunciators 26 can be triggered.
- a sixth preferred embodiment extends the functionality of the first embodiment to the counting of people or objects entering or leaving a controlled area.
- the stereo camera 12 tracks any number of people or objects entering or leaving the controlled area through the area of observation 16 .
- the machine vision processing system 10 tracks the people or objects using machine vision techniques that may include stereoscopic image processing, video motion detection, analysis of connections to other objects (to determine which people or objects are moving independently of others), creation and maintenance of track files (to determine, for example, if a person or an object has really entered or left the controlled area or merely entered the area of observation, turned around, and left the area of observation traveling in the other direction), shape analysis, image segmentation, and pattern recognition (to identify and classify the type of object traveling though the area of observation).
- the machine vision processing system is interfaced with the area access controller 24 or local access controller 22 . In this case, the machine vision processing system can identify a number of situations in which security procedures may not be followed correctly, including:
- a person or object i.e., a vehicle of some type
- the machine vision processing system can notify the area access controller or local access controller that the authorized person or objects have not actually entered or left the controlled area as expected. These controllers can then appropriately update their state (e.g., noting which persons or objects are within the controlled area).
- the machine vision processing system 10 can count the numbers of persons or objects (possibly of specific types) that have entered or left a controlled area. This information can be used by, for example, security personnel or an access controller to determine how many people or objects are within the controlled area at any one time.
- the area of observation can be around a fire door or other emergency exit or other portal that is not ordinarily used. In normal circumstances, no person or object would pass through this door or portal. In case of an emergency, any number of people or objects (e.g., vehicles) may pass through.
- the machine vision processing system 10 can then count the number of people or objects (and perhaps types) leaving the controlled area. In some embodiments, the machine vision processing system 10 will notify the area access controller or the local access controller of these activities, and if the rules governing the controlled area are violated, then trigger the alarms and annunciators.
- people or objects may receive authorization to enter or exit the controlled area, possibly from the area access controller or local access controller, but may not actually enter or exit the controlled area.
- the area access controller or local access controller can notify the machine vision processing system 10 of the number of persons and objects (perhaps including information indicating the types of objects) authorized to enter or exit the controlled access area and cross the area of observation 16 .
- a period of time is allowed for the authorized persons or objects to enter or exit (e.g., a timeout is set). If the persons or objects are not observed to enter (or exit) the controlled access area, the machine vision processing system 10 can notify the area access controller or local access controller of these activities.
- the controller may then not allow the authorization to enter (or exit) to be repeated.
- the controllers can also prevent the repeated use of security credentials or pass codes to exit (or enter) when the authorized person or object has not actually used the first authorization, or prevent the same credentials or codes from being used for multiple entrances (or exits) when no exits (or entrances) have been recorded.
- the machine vision processing system 10 can trigger the alarms and annunciators when these access control rules are violated.
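The authorization timeout and credential-reuse (anti-passback) rules described above can be sketched as a small ledger; the structure, method names, and 30-second timeout below are assumptions for illustration only:

```python
class AuthorizationLedger:
    """Sketch of the timeout and anti-passback rules described above.
    Each credential holds at most one pending authorization; if it is not
    used before `timeout` seconds elapse it expires, and a credential
    already recorded as inside cannot be authorized to enter again until
    an exit is observed."""

    def __init__(self, timeout=30.0):
        self.timeout = timeout
        self._pending = {}    # credential -> time authorization was granted
        self._inside = set()  # credentials recorded as inside the area

    def authorize_entry(self, credential, now):
        if credential in self._inside:
            return False  # anti-passback: no second entry without an exit
        self._pending[credential] = now
        return True

    def observe_entry(self, credential, now):
        granted = self._pending.pop(credential, None)
        if granted is None or now - granted > self.timeout:
            return False  # entry without a live authorization: violation
        self._inside.add(credential)
        return True

    def observe_exit(self, credential):
        self._inside.discard(credential)

ledger = AuthorizationLedger(timeout=30.0)
ledger.authorize_entry("badge-17", now=0.0)
```

In the arrangement described, a `False` return would cause the machine vision processing system to trigger the alarms and annunciators.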
- a seventh preferred embodiment extends the functionality of the first embodiment, which has already been described in detail, by adding a second door or other access-limiting structure at the end of a passage (and typically at the end of the area of observation 16 ).
- This configuration creates a “man trap” or vehicle trap to contain the perpetrator of a tailgating or reverse entry attempt in a defined area.
- the local access controllers 22 for both doors or access-limiting structures are under the control of the machine vision processing system 10 . This arrangement allows the system 10 to automatically contain the perpetrator until security personnel have a chance to investigate the incident.
- This alternative extended embodiment retains all other functionality and scope of the first embodiment.
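The two-door containment ("man trap") behavior of this embodiment might be modeled as a simple state machine; the door names and lock states below are illustrative assumptions:

```python
class ManTrap:
    """Sketch of the two-door containment logic described above. The
    machine vision system controls both access limiting structures; on a
    detected tailgating or reverse entry event it locks both ends of the
    passage, containing the subject until security personnel investigate."""

    def __init__(self):
        self.doors = {"outer": "unlocked", "inner": "unlocked"}
        self.contained = False

    def report_violation(self):
        # Lock both ends of the passage to contain the perpetrator.
        for door in self.doors:
            self.doors[door] = "locked"
        self.contained = True

    def release(self):
        # Only security personnel release the trap after investigating.
        for door in self.doors:
            self.doors[door] = "unlocked"
        self.contained = False

trap = ManTrap()
trap.report_violation()
```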
Abstract
Unauthorized entry into controlled access areas using tailgating or reverse entry methods is detected using machine vision methods. Camera images of the controlled area are processed to identify and track objects in the controlled area. In a preferred embodiment, this processing includes 3D surface analysis to distinguish and classify objects in the field of view. Feature extraction, color analysis, and pattern recognition may also be used for identification and tracking of objects. Integration with security monitoring and control systems provides notification when a tailgating or reverse entry event has occurred. More reliable operation in practical circumstances is thus obtained, such as when multiple people are using an entrance or exit under variable light and shadow conditions. Electronic access control systems may further be combined with the machine vision methods of the invention to more effectively prevent tailgating or reverse entry.
Description
- This application is a divisional of U.S. application Ser. No. 10/410,884, filed Apr. 8, 2003, now U.S. Pat. No. 7,382,895, which claims the benefit of the filing date of U.S. Provisional Application No. 60/370,837, filed Apr. 8, 2002, the entire disclosure of which is incorporated herein by reference.
- The present invention relates to detection, alarming, recording and prevention of unauthorized entry at entrances, doors, gates, passages, and the like. More specifically, this invention relates to application of machine vision methods to the detection, alarming, recording and prevention of tailgating (also known as piggybacking) and/or reverse entry events.
- Tailgating (also known as piggybacking) is a significant problem in a wide variety of security and access control applications. Tailgating or piggybacking is the entry into or out of a controlled area or through a controlled access portal of more persons, objects or vehicles than are allowed by access control rules. For example, a tailgating event occurs when persons, generally on foot or in a vehicle, attempt to gain access to an area for which they do not have the required credentials. Another example of tailgating occurs when an unauthorized person on foot or in a vehicle attempts to follow a person (again on foot or in a vehicle) with proper access credentials into a controlled access area. A variation on this approach is for the unauthorized person on foot or in a vehicle to attempt to enter the controlled access area when an authorized person (in a vehicle or on foot) leaves the area.
- Another example of tailgating (or piggybacking) is where a person sits on the shoulders of another person or is carried in some other way by the other person into the controlled access area. In all of the foregoing, the participation of the authorized individual may be inadvertent, voluntary, or coerced. In the remainder of this document, these cases and other similar cases are referred to as tailgating.
- A related problem arises when someone attempts to enter on foot or in a vehicle through an “exit only” access point. This unauthorized use of an exit portal can be referred to as reverse entry. This access control violation arises, for example, when a person attempts to gain access to a controlled area using the exit of a one-way elevator (an elevator intended to be accessible only from inside the controlled area), escalators (by running the wrong direction), one-way revolving doors, or an exit passage. Alternatively, persons on foot may attempt to enter a controlled area by going over or under a gate at a vehicle-only entry or exit point. These methods and related events are collectively referred to herein as reverse entry.
- The consequences of tailgating and reverse entry can vary widely. For example, the consequences may be purely economic as in the case of a successful perpetrator gaining access to an event venue, transportation or other such area without paying the admission or fare. Operators of entertainment venues, sporting facilities, parking facilities, and transportation systems typically wish to prevent revenue loss from unauthorized entrants. In other cases, a successful perpetrator may steal something of value from the controlled area. Operators of industrial and manufacturing facilities, warehouse and other storage facilities, and housing areas, such as apartments or hotels, wish to prevent loss from theft. In yet other cases, a successful perpetrator may cause serious damage to property or harm to individuals in the controlled area. Airports, facilities handling hazardous materials, power plants, and other utility facilities and large public places need to prevent the entry of persons wishing to cause property damage or harm to other people. To achieve these goals, it is necessary that doors, gates, passageways, and other entry or exit areas be protected against unauthorized entry by perpetrators using tailgating and reverse entry methods.
- Prior art access control systems have a long history starting with human guards and various types of mechanical locks. Modern access control systems use a wide variety of technologies and methods, including mechanical keypad or cipher locks, electronic keypad or cipher locks, contact-based or contactless smart cards or smart tokens (generally employing radio frequency or infrared communications), magnetic strip cards, and biometric control methods, such as retinal scans, fingerprint or handprint identification, facial feature identification, and voice print identification. Known access control methods do not prevent tailgating or reverse entry on their own. Moreover, human guards and persons authorized to access a controlled area may assist the perpetrators willingly or unwillingly, further complicating the situation.
- Prior art approaches to the problems of detecting tailgating or reverse entry have not been successful. Most prior art approaches have applied either visible light or infrared break-beam technology. U.S. Pat. No. 3,727,034, issued to Pope, for example, discloses a system employing a pair of break-beams to determine the direction a person is traveling and count the number of people traveling through a passageway. U.S. Pat. No. 4,000,400, issued to Elder, discloses a system applying a similar approach that also uses a pair of break-beams. These approaches suffer from a number of well-documented problems. If multiple people or vehicles pass the break-beam pair at the same time, traveling in the same or opposite directions, the system cannot detect or resolve the ambiguity. People can crawl under or jump over a pair of break-beams. A person with another person on their shoulders or being carried in some other way is not detected. Since a break-beam requires a light source directly opposite the detector, the beam path must be kept clear of the swing of a door. Architectural modifications may thus be required for installation. The above systems also disclose no provision for interfacing with external access control or other security systems. Further, the optical break-beams may not work in high ambient light conditions.
- U.S. Pat. No. 5,519,784, issued to Vermeulen, discloses a system that attempts to overcome some of the deficiencies of break-beam systems by employing an array of four or more sensors at floor level. However, this approach cannot detect cases where a person carrying another person on their shoulders or in some other way passes through the array. A four-legged animal passing through the array will likely trigger a false alarm. Architectural modifications may be required to force each person to pass through each beam. In addition, the system disclosed has no provision for interfacing with external access control or other security systems. Finally, as noted before, optical break-beams may not work in high ambient light conditions.
- U.S. Pat. No. 4,303,851, issued to Mottier, discloses a system using a pair of video cameras focused on two adjacent tracks through a flat mirror and connected to counter circuits. The tracks are perpendicular to a person's direction of travel. Persons passing through this array are detected and counted. While this approach removes some of the ambiguities associated with break-beam methods, problems still remain. A person with another person on their shoulders or being carried in some other way is not detected. The system disclosed has no provision for interfacing with external access control or other security systems. Further, architectural modifications may be required to keep all persons within the field of view of the cameras and to prevent the swing of a door from interfering with the field of view of the cameras.
- U.S. Pat. No. 4,847,485, issued to Koelsch, and U.S. Pat. No. 4,799,243, issued to Zepke, disclose systems applying arrays of pyroelectric infrared sensors to directionally count people entering or leaving through a passage. The system in U.S. Pat. No. 4,799,243 employs a single linear array of sensors that may not detect a person with another person on their shoulders or being carried in some other way. Further, it is unclear whether several people entering or leaving in close physical proximity would be correctly detected. U.S. Pat. No. 4,847,485 attempts to overcome these deficiencies through the use of multiple sensor arrays. This approach has the drawback that it requires architectural modifications since each person must be forced to walk through all the arrays. The systems disclosed have no provision for interfacing with external access control or other security systems. Further, architectural modifications may be required since the swing of a door cannot affect the area monitored by the sensors. Both systems are also subject to environmental restrictions since they use pyroelectric sensors and are unsuitable for vehicle entrances.
- U.S. Pat. No. 5,866,887, issued to Hashimoto et al., discloses a system that applies a similar approach but uses a moving sensor and pattern recognition to reduce both the cost and the ambiguity inherent in detecting multiple people at the same time. This system does not overcome all the aforementioned deficiencies, since the approach still relies on sensing biomass through detection of the body heat of people.
- U.S. Pat. No. 5,201,906, issued to Schwarz et al., discloses a system that applies a set of ultrasonic sensors in a revolving door structure. The sensors are interfaced to a local access control system to prevent or detect piggybacking. The ultrasonic sensors determine if more than one person is in one compartment or more than one compartment is occupied. This approach requires architectural modifications to most facilities, since a revolving door is required. The rate at which people can pass through the revolving door is likely lower than through a conventional door with security access. Further, this approach is unsuited for vehicle entrances.
- U.S. Pat. No. 6,081,619, issued to Hashimoto et al., discloses a system that employs either linear or angular infrared distance or range-finding arrays. This approach has drawbacks in that some embodiments require architectural modifications since each person must be forced to walk through the array, and the observation area of the sensors cannot be affected by the swing of a door. The system disclosed has no provision for interfacing with external access control or other security systems. Finally, this system is subject to environmental restrictions since it uses infrared technology and is unsuitable for vehicle entrances.
- The use of simple motion detection video is known in the security technology industry. Motion detection video uses frame-differencing and related methods applied to the output of a video camera. These methods suffer from problems such as changes in lighting and shadowing. Overlapping objects are often difficult to separate since there is no depth (three-dimensional) analysis and no attempt is made to analyze and track individual objects.
- U.S. Pat. No. 5,581,625, issued to Connell, discloses a system that uses a pair of stereoscopic video cameras and associated machine vision analysis to separate individual objects (e.g., people) in a line that partially overlap from the perspective of the stereo camera pair. This system makes no attempt to determine the motion of the people observed. Furthermore, there is no provision for interfacing the system with external access control or other security systems.
- The present invention overcomes the deficiencies of prior art systems by using advanced machine vision methods and providing optional integration with access control systems and other security apparatus. Unlike prior art systems, preferred embodiments of the present invention do not require modification of buildings or other facilities. Further, preferred embodiments of the invention provide more reliable operation in practical circumstances, such as when multiple people are using an entrance or exit and under variable light and shadow conditions.
- Embodiments of the invention are preferably configured to operate on any type of entrance or exit, including those with doors, gates, passages, elevators, escalators, and the like. The invention may be applied to persons on foot, animals, vehicles, persons in vehicles, packages (e.g., parcels, luggage, etc.), and any other type of object entering or exiting a controlled access area. The invention can be used to monitor a single designated access point, such as an entry or exit point, or can be applied to any number of distributed entry or exit points at one or more controlled access areas.
- Machine vision methods employed in preferred embodiments of the invention include three-dimensional (3D) surface analysis of objects in the image. This allows the system to distinguish and classify multiple objects that may overlap in the field of view or be observed under conditions of variable light and shadow. The system may use other machine vision methods, including feature extraction and pattern analysis and recognition, to enhance the identification and tracking of objects.
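The 3D surface analysis referred to above is built on stereo disparity: a surface point appears horizontally shifted between the two camera views, and the shift is inversely proportional to depth. The following toy one-scanline sketch shows the core disparity search and depth conversion; the window size, cost metric, and calibration values are illustrative assumptions, and a practical system performs this search over full 2-D images:

```python
def disparity(left, right, window=1, max_d=4):
    """Toy 1-D block matching along one scanline: for each pixel in the
    left image, find the horizontal shift into the right image with the
    smallest sum of absolute differences. Real stereo analysis works on
    2-D images and produces a full depth map; this shows only the core
    disparity search."""
    out = []
    for x in range(window, len(left) - window):
        patch = left[x - window : x + window + 1]
        best_d, best_cost = 0, float("inf")
        for d in range(0, max_d + 1):
            if x - d - window < 0:
                break  # candidate window would fall off the image edge
            cand = right[x - d - window : x - d + window + 1]
            cost = sum(abs(a - b) for a, b in zip(patch, cand))
            if cost < best_cost:
                best_d, best_cost = d, cost
        out.append(best_d)
    return out

def depth_from_disparity(d, focal_px, baseline_m):
    # Standard pinhole stereo relation: depth = focal length * baseline / d.
    return focal_px * baseline_m / d if d else float("inf")

# A pattern appearing 2 pixels further right in the left view than in the
# right view has disparity 2; nearer surfaces shift more.
right_line = [0, 0, 9, 5, 9, 0, 0, 0]
left_line  = [0, 0, 0, 0, 9, 5, 9, 0]
```

This is the sense in which 3D surface analysis separates overlapping objects: two people at different distances produce distinct disparity (hence depth) bands even when their silhouettes merge in a single camera view.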
- Systems constructed according to preferred embodiments may interface with a variety of electronic access control equipment, including electronic keypad or cipher locks, contact-based or contactless smart cards or tokens (generally employing radio frequency or infrared communications), magnetic strip cards, and biometric control methods, such as retinal scans, fingerprint or handprint identification, facial feature identification, and voice print identification. The systems may also employ machine vision methods to monitor doors, gates, elevators, passages, escalators, etc., with mechanical access control or no access control at all. The systems may further be used to provide supplemental automated monitoring of entrances and exits that are monitored with human guards.
- A preferred system of the invention can be fully integrated with security monitoring and access control systems. The integrated system provides audible and visible alarms to alert security personnel when a tailgating or reverse entry event has occurred. A preferred system of the invention also provides one or more video outputs from event monitoring cameras. These video outputs can be switched to display video of an incident scene for security personnel and to make a video record of the event. The video display and record may include textual and graphical information indicating the location of the incident, relevant personnel identification or access codes, and date and time, for example.
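Preserving video from before as well as after a detected event is commonly implemented with a fixed-capacity ring buffer of recent frames; the following minimal sketch assumes that common technique, which the text does not prescribe in detail:

```python
from collections import deque

class PreEventBuffer:
    """Keeps the most recent `capacity` frames so that, when an incident
    is detected, video from before the event can be saved along with the
    frames recorded afterward. A sketch of one common way to preserve
    pre/post event video."""

    def __init__(self, capacity):
        self._frames = deque(maxlen=capacity)  # old frames fall off the front

    def push(self, frame):
        self._frames.append(frame)

    def snapshot(self):
        # Called at detection time to hand the pre-event frames to the recorder.
        return list(self._frames)

buf = PreEventBuffer(capacity=3)
for frame_id in range(5):  # frames 0..4 arrive; only the newest 3 are kept
    buf.push(frame_id)
pre_event_video = buf.snapshot()
```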
- Ambiguous situations arising from objects that cannot be accurately classified by machine vision methods may signal an alert to security personnel to manually investigate the situation. An example of an ambiguous situation is where two people attempt to enter using only one access credential and throw a blanket or coat over their heads in an attempt to evade the system. Another example arises when a person enters with a large animal (e.g., guide dog). The system provides video that can be switched to a display to aid in resolution of the event.
- There are several aspects of the invention, each of which may be used singly or in combination with the others when constructing a system according to the invention. One aspect is to use machine vision methods to detect and prevent unauthorized individuals employing tailgating or reverse entry methods from gaining access to a controlled area. Such tailgating or reverse entry methods may involve two or more persons on foot traveling in the same or different directions, persons in two or more vehicles traveling in the same or opposite directions, a single person attempting to enter through an exit only access point, a person using a vehicle entrance while on foot, etc. Machine vision methods that include stereo image analysis, feature detection, color analysis, and pattern recognition or matching, for example, may be used to detect and prevent such unauthorized access.
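The pattern recognition or matching mentioned above can be illustrated as nearest-prototype classification within a tolerance; the feature choices (height, footprint area), prototype values, and distance metric below are assumptions for illustration:

```python
def classify(features, prototypes, tolerance):
    """Compare an object's feature vector with prototypes of known object
    classes; a match within `tolerance` counts as recognition, and an
    object matching no prototype is reported as ambiguous so that security
    personnel can investigate."""
    best_label, best_dist = "ambiguous", tolerance
    for label, proto in prototypes.items():
        dist = max(abs(a - b) for a, b in zip(features, proto))
        if dist <= best_dist:
            best_label, best_dist = label, dist
    return best_label

# Assumed feature order: (height in meters, footprint area in square meters).
PROTOTYPES = {
    "person":  (1.7, 0.25),
    "vehicle": (1.5, 8.0),
}
```

Under this sketch, an object that fits no known pattern (for example, two people under a blanket) falls into the "ambiguous" case described above and would raise an alert rather than a definite classification.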
- Another aspect is to capture, route, display, and/or record event video and other information about a tailgating or reverse entry event, including identification of the person(s) involved (if known), date, time, location, etc. This data processing may be under the control of the machine vision processing system.
- A further aspect of the invention is to prevent tailgating or reverse entry at one or more possibly remote locations with doors, gates, escalators, elevators, passages, and other entry and exit points without the need for architectural modifications of existing facilities.
- Preferred embodiments of the invention are configured to operate with any available access control system, including human guards, electronic systems, and mechanical systems, through appropriate interfaces and using machine vision capability to monitor doors, gates, and other entrances or exits. The cost of employing machine vision methods may be reduced by using a single processing system to monitor multiple controlled areas through the use of video switching to share image processing capabilities. Moreover, other event information from sources, such as electronic access control systems and door or gate sensors, may be combined with the machine vision methods of the invention to more effectively prevent tailgating or reverse entry.
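Sharing one processing system among several cameras via video switching can be sketched as a round-robin frame grab; the camera names and the modeling of cameras as callables returning a frame are illustrative assumptions:

```python
from itertools import cycle

class VideoSwitch:
    """Sketch of sharing one frame grabber among several cameras by
    switching inputs frame by frame, as described above. Real hardware
    control and digitization are out of scope for this illustration."""

    def __init__(self, cameras):
        self._cameras = cycle(cameras.items())  # endless round-robin order

    def grab_next(self):
        # Select the next input, then capture a single frame from it.
        name, camera = next(self._cameras)
        return name, camera()

cameras = {
    "stereo_left":  lambda: "frame-L",
    "stereo_right": lambda: "frame-R",
    "event":        lambda: "frame-E",
}
switch = VideoSwitch(cameras)
grabbed = [switch.grab_next() for _ in range(3)]
```

The design choice this illustrates is economic: one frame grabber and one processing pipeline serve many areas of observation, at the cost of a lower per-camera frame rate.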
- Yet another aspect of the invention is to provide built-in initialization, calibration, and on-going test methods, including continuous monitoring and analysis of a background image or scene using machine vision methods, including stereo image analysis, to ensure the integrity and accuracy of the machine vision system and to prevent attempts by perpetrators to alter the background.
- Still another aspect of the invention is to provide a tailgating and reverse entry detection system with greater resistance to environmental conditions, including changing light and shadowing, through the use of electronic camera control, interactive user interface, and machine vision methods including feature extraction and stereo analysis. Interference from doors, gates, and other access limiting structures may be eliminated, regardless of the direction of opening or swing of the structures, particularly in embodiments where the cameras are placed above and beyond the reach of the access limiting structures.
- It will be appreciated that the foregoing description of features and aspects of the invention is not intended to be exhaustive or to limit the scope, functionality, or operation of the invention.
- The invention will be described by reference to the embodiments described herein in conjunction with the drawings in which:
-
FIG. 1 is an overall block diagram of one preferred embodiment of a system constructed according to the invention; -
FIG. 2 is a block diagram of a machine vision processing system that may be used in the embodiment of the invention shown in FIG. 1; -
FIGS. 3A, 3B, 3C, 3D, 3E, 3F, 3G, and 3H depict process flow diagrams that may be employed by the embodiment of the invention shown in FIG. 1; -
FIGS. 4A, 4B, 4C, 4D, 4E, 4F, 4G, and 4H depict examples of stereo image analysis performed by the embodiment of the invention shown in FIG. 1; and -
FIGS. 5A, 5B, 5C, 5D, and 5E depict an example of a set of interactive displays for an operator interface. - The following detailed description refers to the accompanying drawings and describes exemplary embodiments of the present invention. Other embodiments are possible and modifications may be made to the embodiments described herein without departing from the spirit and scope of the invention. The section titles that follow are provided for convenience and are not meant to limit the invention.
- A first preferred embodiment of the invention uses a stereo pair of tracking cameras and a single event capture camera combined with machine vision processing to detect or prevent tailgating and/or reverse entry events.
- A block diagram of the first embodiment of the invention is shown in
FIG. 1. FIG. 1 depicts a machine vision processing system 10 that is capable of processing video images and detecting tailgating or reverse entry events. The machine vision processing system 10 receives video inputs from cameras, here a stereo pair of tracking cameras 12, and an event capture camera 14. Other camera inputs to the machine vision processing system 10 may be included if desired. The overlapping viewing area of these cameras is shown in FIG. 1 as an area of observation 16 in which objects 18 are identified, classified, and possibly tracked by the machine vision processing system 10. Part of the area of observation 16 may optionally be defined by a door, gate, portal, elevator door, turnstile, or other access limiting structure 20. It will be understood that the invention is applicable to any type of access limiting structure which people or objects can pass through, and which can include (but are not limited to): - 1. One or more planar doors, possibly interlocking when closed, that open and close with hinges on the top, bottom or side, and are operated manually or automatically;
- 2. One or more planar doors, possibly interlocking when closed, that slide up, down or sideways, and are operated manually or automatically;
- 3. Folding or rolling doors, possibly interlocking when closed, that are operated manually or automatically;
- 4. One or more gates or doors, possibly interlocking when closed, that do not completely obstruct the opening or portal and may open in any manner, and are operated manually or automatically; and
- 5. Revolving structures, such as revolving doors or turnstiles, which partially or fully occupy the opening or portal and are operated manually or automatically.
- Alternatively, the area of
observation 16 can be in a passage, escalator, or other limited access area that is not defined by an access limiting structure. - The stereo pair of tracking
cameras 12 are preferably placed overhead of the area of observation 16, but may be placed at any convenient location with a clear view of the area of observation. Preferably, the stereo tracking cameras 12 are held in an adjustable bracket that allows their positions to be adjusted but maintains their alignment, and is sturdy enough not to move from these settings with time. Camera positions may be adjusted manually or by motor drive that is locally or remotely controlled. The event camera 14 is ideally placed at a location that gives a clear view of persons, vehicles, or other objects in the area of observation 16. Brackets for the cameras may be attached to any solid surface, such as a ceiling or wall, and thus do not require architectural modification of the area. The cameras - The machine
vision processing system 10 interfaces with an optional access control system such as area access controller 24 and one or more optional local access controllers 22. The area access controller 24 monitors and controls access at all designated access points, such as entry and exit points, to a particular controlled area or areas. Centralized control and monitoring is often provided at a centralized security desk. Local access controllers 22 control one or more doors or other access limiting structures 20. The area access controller 24 may interface with the local access controllers 22 via wired or wireless communication of signals to maintain centralized control and monitoring of all access limiting structures 20 in the area or areas being controlled. Local access controllers 22 may include, for example, electronic keypad or cipher locks, contact-based or contact-less smart cards or electronic tokens (generally employing radio frequency or infrared communications), magnetic strip cards, and biometric control methods, such as retinal scans, fingerprint or handprint identification, facial feature identification, and voice print identification. - The
area access controller 24 and local access controllers 22 may be configured to send access control information to the machine vision processing system 10. This data may include status of doors or other access limiting structures 20, data from other access monitoring systems such as break-beams (not illustrated), and personnel identification or access codes. The machine vision processing system 10 may send control data or signals to the area access controller 24 or the local access controllers 22 to “lock down” a facility or to close a particular door or gate when a tailgating or reverse entry incident is detected. - The machine
vision processing system 10 may further interact with alarms and/or annunciators 26 which can include, for example, bells, sirens, machine-generated or recorded speech, lights, image displays, and text displays. The alarms and annunciators 26 may be local to the area of observation 16 or near the area access controller 24 or a centralized security desk. In some embodiments, the alarms and annunciators 26 may be connected to the area access controller 24 or the one or more local access controllers 22. In this case, the machine vision processing system 10 can send a signal to these controllers to trigger the alarms and annunciators. This alternative configuration does not change the scope, spirit, or functionality of the invention. - The machine
vision processing system 10 supplies video, typically from the event camera 14, incident data, and control data or signals to an optional video display 28 and optional video recorder 32. The video display 28 may be used by security or other personnel to observe the area of observation 16, especially when a tailgating or reverse entry incident is detected. The video display 28 may have the ability to receive multiple video signals and switch between the multiple video signals. When an incident arises, the machine vision processing system 10 preferably switches video produced by the event camera 14 at the scene of the incident for display on the video display 28. The video display 28 may include textual or graphical data describing the location of the incident and pertinent personnel identification or access codes. The video display 28 may be at a centralized security desk or at some local location. Preferably, video of a period of time before and after detection of an incident is preserved by the video recorder 32 for future reference. Incident data, including incident location, date, time, and personnel identification or access codes, may also be recorded. - The machine
vision processing system 10 preferably interfaces with an operator interface 30. The operator interface 30 may be used by an operator to provide commands for configuring, initializing, calibrating, and testing of the machine vision processing system 10. Optionally, the operator interface 30 may provide a textual or graphical user interface for presenting data and command prompts to the operator. The user interface may be interactive and enable the operator to configure properties of the system, including properties of the image analysis, properties of the machine vision processing system, and properties of access control systems, such as the local and area access controllers. - Some embodiments of the invention may use different types of alarms for different situations. These alarms can have different audible and visual properties. Some possible examples of these different alarms can include:
- 1. A warning alarm that may be used in non-critical situations such as someone starting to travel in the wrong direction through a portal (but not yet violating the security policy);
- 2. An alarm specific to the type of security violation, such as tailgating, which may have distinctive audible or visible patterns; and
- 3. An alert indicating an unusual situation, such as people or objects traveling too close together, or an object in the area of observation that cannot be unambiguously identified.
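The mapping from detected events to these three alarm classes might be sketched as a small lookup table; the event names and annunciator properties below are illustrative assumptions:

```python
# Sketch of dispatching detected events to the alarm classes listed above.
ALARM_TABLE = {
    "wrong_direction_started": ("warning", "short chirp, amber light"),
    "tailgating":              ("violation", "continuous siren, red strobe"),
    "reverse_entry":           ("violation", "continuous siren, red strobe"),
    "ambiguous_object":        ("alert", "intermittent tone, operator prompt"),
}

def select_alarm(event):
    # Unknown events are treated as alerts so an operator always investigates.
    return ALARM_TABLE.get(event, ("alert", "intermittent tone, operator prompt"))
```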
-
FIG. 2 illustrates a block diagram of one suitable embodiment of the machine vision processing system 10. Other embodiments may include additional or fewer components than those shown in FIG. 2. Furthermore, in many applications, the machine vision processing system 10 may be housed in a single cabinet, but in other applications, it can be distributed in packaging or space. The machine vision processing system 10 may also be built using redundant components to resist failure, damage, and tampering. - In the embodiment shown in
FIG. 2, video input from the stereo tracking cameras 12 and the event camera 14 is received through a video switch 50. The video switch 50 may also receive input from other video cameras, possibly at other areas of observation. The video switch 50 feeds the video signals from the cameras to a frame grabber 52 where one or more video frames from the signals are captured and stored for processing. By switching between the different sets of cameras and capturing video signals on a frame-by-frame basis, the use of a video switch 50 reduces the cost of the frame grabber 52. The frame grabber 52 may need to accommodate only one input signal at a time. If analog cameras are used, the frame grabber 52 preferably digitizes the video images before storing them. - A
stereo image processor 54 and feature extractor 56 operate on the image frames stored by the frame grabber 52. The stereo image processor 54 operates on the stereoscopic color images received from the stereo tracking cameras 12, and preferably performs 3D surface analysis of the objects 18 and background in the area of observation 16. Suitable methods for 3D surface analysis and background analysis are known, and can include those disclosed in U.S. Pat. No. 5,581,625, issued to Connell, the disclosure of which is entirely incorporated by reference herein. The stereo image processor 54 may also use input from the feature extractor 56. The feature extractor 56 identifies features such as edges and corners in images received from the stereo tracking cameras 12 and possibly the event camera 14. The stereo image processor 54 can use the feature information from the feature extractor 56 to test different hypotheses as to the best stereo image depth map. - The outputs of the
stereo image processor 54 and feature extractor 56 are fed to an object identification processor 58. The object identification processor 58 uses the 3D surface data and image features extracted from the images to identify and possibly also classify objects 18 in the area of observation 16. In one embodiment, the objects 18 are identified and classified using pattern recognition methods. Those skilled in the art will be fully familiar with many suitable techniques used for pattern recognition. Typically, pattern recognition involves comparing patterns in the 3D surface data and image features with equivalent data and features of known or previously-identified objects. A pattern match within a specified tolerance constitutes recognition of the pattern. The object identification processor 58 stores image and object data in the object and image memory 62. These data may include, but are not limited to, location (which can be defined by centroid computation or other suitable methods), depth, shape, color, size, and connection to other objects. Optionally, the object identification processor 58 may use object and image data stored in the memory 62 from previous video frames to aid in the identification and classification of objects. Correlation of the objects from image to image can be based on one or more properties of the object, including size, shape, or color. This may be useful in cases, for example, where a more certain identification of an object in one image can be used to identify the object in another image where the identification may be less certain. - Data is provided from the
object identification processor 58 to the object tracking processor 60. Using the object and image data from the current frame and object and image data stored in the memory 62 from previous frames, the object tracking processor 60 maintains track records or track files of the objects 18 in the area of observation 16. Using this information, the object tracking processor 60 determines the trajectory and speed of objects 18 in the area of observation. This can be done, for example, where an object in an image is correlated with an object having the same or similar classification in one or more prior images and with motion along some expected trajectory, or where the motion from one image to the next is limited. - The
object tracking processor 60 provides tracking information to an alarm and system controller 64. In a preferred embodiment, the alarm and system controller 64 applies access control rules to the object tracking data to determine if a tailgating or reverse entry event or other questionable incident has occurred. Alarm decisions may also be made by applying access control rules to data received from the access controllers 22, 24. Situations that can trigger an alarm include: - 1. More people or vehicles passing through the area of observation than are allowed, under the rules, by the credentials or codes presented to the access controller by one or more people. For example, some codes or credentials may allow one person to “host” any number of other persons through the area of observation, whereas some persons must pass through individually.
- 2. More people or vehicles passing through the area of observation than have been authorized by the access controller, based on credentials or codes presented.
- 3. Reverse entry situations (e.g., objects coming into an area of observation from an exit-only location).
- 4. Objects of unusual size or shape (possibly a person riding on the shoulders or being carried by another) passing through the area of observation.
- 5. Objects that cannot be well classified (unknown type) passing through the area of observation.
- 6. Door, gate, or other access limiting structure left open for longer than a threshold period of time.
- 7. Objects of types that are not allowed (e.g., a cart, a box, a wheelchair) for a person or object with a particular credential or code, when passing through the area of observation.
- 8. Objects of a type that are not allowed for any access credential or code (e.g., a person on foot using a vehicle-only access or a person on a package-only conveyor belt) passing through the area of observation.
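Several of the rules above reduce to counts and directions accumulated over a single access cycle. The following Python sketch illustrates rules 1-3 and 5 under an assumed encoding of tracker output as (object_type, direction) pairs; the function name, encoding, and rule subset are illustrative assumptions, not the patent's implementation.

```python
def check_passage(authorized_count, observed_crossings, host_allowed=False):
    """Apply simple counting rules to one access cycle (illustrative).

    authorized_count: passages granted by the access controller for the
    credentials presented. observed_crossings: objects the tracker saw
    traverse the area of observation, as (object_type, direction) pairs,
    where direction is "in" or "out" relative to an entry-only portal.
    Returns a list of alarm reasons; an empty list means no alarm.
    """
    alarms = []
    entries = [c for c in observed_crossings if c[1] == "in"]
    if any(c[1] == "out" for c in observed_crossings):
        alarms.append("reverse entry")        # rule 3: wrong-way traffic
    if not host_allowed and len(entries) > authorized_count:
        alarms.append("tailgating")           # rules 1-2: too many passages
    if any(c[0] == "unknown" for c in observed_crossings):
        alarms.append("unclassified object")  # rule 5: unrecognized object
    return alarms
```

A credential that permits hosting would set host_allowed=True, disabling the count check while leaving the direction and classification checks active.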
- If one or more access control rules applied by the alarm and
system controller 64 indicate an alarm situation, the alarm and system controller 64 alerts operators or security personnel as required using an interface to the alarms and annunciators 26. The alarm and system controller 64 may also initiate a display of video from the event camera 14 of the area of observation 16, possibly along with other event data, including location and access code or personnel identification, on the video display 28. Event camera video transmitted to the video recorder 32, possibly with associated data, including event location, date, time, and access code or personnel identification, may be preserved for future reference. - Operators and technical personnel using the
operator interface 30 can interact with the machinevision processing system 10 through the alarm andsystem controller 64. Theoperator interface 30 provides commands for configuration, initialization, calibration, and testing of the alarm andsystem controller 64. Optionally, the alarm andsystem controller 64 can provide textual, audio, and graphical information to the operator through theoperator interface 30. - Those skilled in the art will be familiar with many machine vision methods that are well documented in the scientific and engineering literature. These methods may be employed by the
video switch 50, the frame grabber 52, the stereo image processor 54, the feature extractor 56, the object identification processor 58, the object tracking processor 60, and the object and frame memory 62. - The first embodiment described herein processes data from one or more sets of the
stereo tracking cameras 12 and possibly theevent camera 14 on a frame-by-frame basis. The machinevision processing system 10 operates on the frame-by-frame image data to detect, announce, and where possible prevent, tailgating or reverse entry events. One exemplary process used by a machinevision processing system 10 as shown inFIG. 1 is described inFIGS. 3A , 3B, 3C, 3D, 3E, 3F, 3G, and 3H. It will be understood that the particular sequence and nature of the actions described can be altered without changing the spirit, function, or capability of the invention. - Turning first to
FIG. 3A , atblock 100, the machinevision processing system 10 starts processing, generally as a result of power-up, a boot event, or other initialization event. The electronics are initialized atblock 102, followed by software initialization atblock 104. If, atblock 106, faults or failures are detected during the initialization process, failure information is preferably displayed atblock 116 to an operator using theoperator interface 30. Generally, the information presented to the operator will be as diagnostic as possible to assist in correcting the failure or fault. In the case of a failure, the operator or other personnel may take corrective action atblock 118 and attempt to start or initialize the system again atblock 100. Examples of failures that may be detected on initialization or during on-going operation include: - 1. Failure of one or more electronic components.
- 2. Failure of communications with the
access controllers 22, 24, the alarms and annunciators 26, the video display 28, or the video recorder 32. - 3. Failure to receive correct signals from any of the
cameras - Once the machine
vision processing system 10 is initialized and determined to be operating properly, a background image for thestereo tracking cameras 12 and possibly theevent camera 14 is detected and saved for reference atblock 108. Alternatively, a previously detected and saved background image may be used. Objects in the background image are identified by the machinevision processing system 10 and noted as belonging to the background image. Accordingly, during run-time operation, new objects that appear in the area ofobservation 16 may be identified and classified separately from the background image. - Upon initialization, and periodically over time, a calibration procedure may be performed, as indicated at
block 110. A typical calibration procedure involves moving a target of known size, shape, and pattern across the area of observation 16 at different heights above the base surface or floor and collecting data with the stereo tracking cameras 12. Once the calibration data is collected, the operation of the cameras and object identification and tracking processing is verified, and the calibration parameters are saved at block 112. If, at block 114, any failures are detected during the collection and saving of the background image (block 108) or the calibration process (block 110), information relating to the failure is displayed at block 116 to the operator using the operator interface 30. Generally, the information presented will be as diagnostic as possible to assist in correction of the failure or fault. In the case of a failure, the operator may take corrective action (block 118) and then attempt to start or initialize the system again (block 100). - Turning now to
FIG. 3B, once the machine vision processing system 10 is initialized and calibrated as required, the system 10 starts at block 120 to capture video frames from the one or more sets of stereo tracking cameras 12. Video from the event camera 14 may also be captured. Either during or after a set of frames is captured, image features are extracted at block 122, and stereo image analysis of the color image is performed at block 124. This may include methods such as those taught in U.S. Pat. No. 5,581,625, referenced earlier and incorporated by reference herein. At block 126, the image from the event camera is stored. At block 128, tests such as thresholding are applied to the stereo image analysis to determine if the background image has changed significantly. For example, objects in the background are identified in the image or depth map, and if they are not clearly visible, an alarm is sounded. This may occur, for example, if someone has covered one or more of the cameras. Changes in the background can indicate an attempt to alter the background by a perpetrator or an equipment failure. If required, the focus, aperture, and shutter settings of the cameras may be electronically updated at block 130. The need for and availability of these settings is determined in large part by the particular model of cameras chosen. - Using the image features extracted at
block 122 and stereo image analysis performed atblock 124, objects in the image, such asobject 18 shown inFIG. 1 , are extracted atblock 131. If required by changes in the image or in the identified objects, a 3D surface analysis of the objects may be updated at block 132 (FIG. 3C ). Classification of the objects may be verified atblock 134 to ensure that the classification of the objects remains a good match. The objects identified in the current frame are compared to those identified in a previously collected frame, as shown atblock 136. Atblock 138, the machinevision processing system 10 determines whether new objects appear in the image or if unexpected changes in any objects are detected. If objects seen in the previous frame are no longer observed in the current image, as determined atdecision block 140, the data on these objects is removed atblock 144 from the tracking file and the history file is marked atblock 146 with the event as desired. - Turning now to
FIG. 3D, the motion of each object in the image is determined with respect to those objects in the previous frame, as indicated at block 148, and the track files for the objects are updated at block 150. At block 152, the machine vision processing system 10 receives available information from external sources such as the area or local access controllers 22, 24. Access control rules are applied at block 154 to determine at decision block 156 if there is an alarm condition. For example, the locations of one or more objects entering one side of the area of observation 16 and leaving at another side of the area of observation are recorded in the track file. This information, along with access credentials and other access control information, can be used to determine if tailgating (piggybacking) or reverse entry is occurring. At the same time, information for an object in the track file and the possible lack of connection to other objects can be used to determine if thrown objects have traversed the area of observation in violation of access control rules. In some embodiments, objects are identified as being thrown based on their speed and/or trajectory of motion through the area of observation. - If there is an alarm condition, the machine
vision processing system 10 triggers audible and visible alarms at block 158, and optionally annunciates alarm condition information at block 160 using the alarms and annunciators 26 (FIG. 1). At the same time, pertinent video and alarm condition information for the area of observation 16 may be displayed at block 162 (FIG. 3E) using the video display 28. If required, control signals or data indicating the alarm condition and possible required actions are sent to the area and local access controllers 22, 24, as indicated at block 164. Alarm information may be logged at block 166 for future analysis as required, and relevant video segments obtained from the event camera 14 may be saved along with other event information at block 168 by the video recorder 32. - If an unexpected change in the background image or other errors occur at block 128 (
FIG. 3B ) the machinevision processing system 10 preferably triggers audible and visible alarms at block 170 (FIG. 3F ), to alert personnel of the failure. The machinevision processing system 10 may also annunciate the failure condition atblock 172 and display the failure condition information atblock 174 using the alarms andannunciators 26, theoperator interface 30, and possibly thevideo display 28. The failure information is logged atblock 176 for later analysis if required. The failure may be resolved by an operator or technical personnel atblock 178, who then reinitialize the system at block 100 (FIG. 3A ). - Whenever a
new object 18 or unexpected change in anobject 18 is detected in the image, as indicated at block 138 (FIG. 3C ), the stereo image analysis performed atblock 124 is performed for these objects at block 180 (FIG. 3G ). This analysis, as noted earlier, preferably includes 3D surface analysis. Based on the 3D surface analysis (and possibly also features extracted as inblock 122 with other image characteristics), the new object is classified atblock 182, which may use pattern recognition or matching methods, as discussed earlier. If the object is determined to match a known object type or types, as indicated atblock 184, the object identification and object characteristics and position are logged to a tracking file as indicated atblock 188. - If a new or changed
object 18 cannot be matched to one of known characteristics, as indicated atdecision block 184, the characteristics of the object are tested at block 190 (FIG. 3H ) to determine if the changes exceed significance limits with respect to either the object previously identified or with respect to known image classification data. This situation may arise, for example, when an object or some characteristic of the object appears to change more than a certain amount from one image to the next. If these limits are not exceeded, the object position and characteristic information is added to or updated in the track file atblock 204. - If characteristics of the new or changed
object 18 do exceed the significance limits atblock 190, and, thus, cannot be satisfactorily classified or has changed characteristics to an unacceptable degree, the machinevision processing system 10 triggers audible and visible alarms atblock 192 to alert personnel of the existence of the unexpected or ambiguous object or change in object. Thesystem 10 may also annunciate information atblock 194 about the object and its location using the alarms andannunciators 26. Information about the object, its location and other data, such as personnel identification or access codes, may further be displayed atblock 196, along with video from theevent camera 14, using thevideo display 28. Information on the unknown object is logged atblock 198 as required for later investigation. Personnel may investigate the object, as indicated atblock 200, and if desired, enter updated classification information atblock 202 using theoperator interface 30. This updated information will typically be used to improve or extend the classification of common objects. - In some embodiments, a reverse entry detection system can be identical to a piggyback detection system, but configured with access control rules that, when applied, do not accept any object traveling in a certain direction. Thus, any attempt by a person, object, or vehicle to pass through the area of observation in a certain direction (e.g., entering the area through an exit-only portal) will trigger an alarm.
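The reverse-entry rule described above can be expressed as a direction test on a completed track. A minimal sketch, assuming hypothetical (x, y) track positions and an illustrative displacement threshold (nothing here is taken from the patent):

```python
def violates_direction_rule(track, forbidden_direction, min_displacement=1.0):
    """Flag a track whose net motion runs along a forbidden direction.

    track: chronological (x, y) positions of one object. The forbidden
    direction is a (dx, dy) vector pointing the disallowed way (e.g.
    inward through an exit-only portal). A net displacement component
    along that vector larger than `min_displacement` triggers the rule.
    """
    if len(track) < 2:
        return False
    dx = track[-1][0] - track[0][0]
    dy = track[-1][1] - track[0][1]
    along = dx * forbidden_direction[0] + dy * forbidden_direction[1]
    return along > min_displacement
```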
- In some embodiments, the machine
vision processing system 10 may have the ability to directly control thedoor 20 or other access limiting device. In other embodiments, the machinevision processing system 10 may indirectly control the door or other access limiting device by sending signals to thearea access controller 24 and/orlocal access controllers 22. This alternative configuration does not change the functionality, scope, or spirit of the invention. Examples of these controls can include the following: - 1. The machine vision processing system can close and perhaps lock the door.
- 2. The machine vision processing system can directionally lock (perhaps after closing) the door (i.e., allowing exits but not entrances).
- 3. The machine vision processing system can close and lock (or directionally lock) one or both of the doors in a man-trap (see discussion below regarding a seventh embodiment of the invention).
- 4. The machine vision processing system can stop and perhaps reverse a rotating door or escalator.
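The four controls above amount to a mapping from detected events to door commands. A hypothetical sketch follows; the event names, command strings, and the mapping itself are invented for illustration, and a real system would drive the area or local access controllers rather than return strings:

```python
# Hypothetical event-to-command table covering the four controls above.
DOOR_ACTIONS = {
    "tailgating":      "close_and_lock",    # control 1: close, perhaps lock
    "reverse_entry":   "directional_lock",  # control 2: allow exits only
    "mantrap_breach":  "lock_both_doors",   # control 3: man-trap doors
    "revolving_abuse": "stop_and_reverse",  # control 4: rotating door
}

def door_command(event):
    """Return the door-control command for an event, or None for no action."""
    return DOOR_ACTIONS.get(event)
```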
- In some embodiments, the machine
vision processing system 10 may receive information from a position sensor attached to the door 20 or access limiting device. The position sensor may comprise known electronic components that produce a variable signal based upon the position of the door. For example, when the door is closed, the position sensor may encode and transmit one signal, and when open, encode and transmit another signal. Door positions of a partially open door may also be encoded and transmitted so the relative position of the door is known to the system 10. The machine vision processing system 10 can use this door position information to make inferences on the background image in the area of observation 16 by predicting a priori where the image of the door should be seen. During calibration, position sensor information received for various positions of objects, such as a door, may be correlated and recorded for determining the background image that is used during normal run-time analysis. - Examples showing the results of a stereo image analysis as described above are provided in
FIGS. 4A , 4B, 4C, 4D, 4E, 4F, 4G, and 4H. These images depict the outcome of a real-time 3D surface analysis performed by themachine vision processor 10 shown inFIG. 2 . In this particular embodiment, the lighter the shade of the image, the closer the surface is to the vantage point of the stereotracking camera pair 12. -
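Depth maps like those shaded in FIGS. 4A-4H are, in a conventional stereo rig, recovered from the disparity between the two tracking-camera views. Below is a minimal sketch of the standard pinhole-stereo relation; the function name and any particular parameter values are illustrative, not taken from the patent:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a matched feature from a calibrated stereo pair.

    Standard pinhole-stereo relation Z = f * B / d: f is the focal
    length in pixels, B the camera baseline in metres, and d the
    horizontal disparity in pixels between the two views. Nearby
    surfaces produce large disparities (lighter shades in the figures);
    distant surfaces produce small ones.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```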
FIG. 4A depicts the results of a 3D surface analysis of a background image. Two objects 250, 252, corresponding to doors, are identified as part of the background. -
FIG. 4B depicts a real-time 3D surface analysis of the same background as shown inFIG. 4A (including the door objects 250, 252), now with three people present in the area of observation. For convenience of illustration, the three people are labeled here as “A” 254, “O” 256, and “P” 258. The arrows associated with the letter labels indicate each person's direction of motion as determined by the machinevision processing system 10. It will be understood that the machinevision processing system 10 can identify, classify, and track an essentially arbitrary number of moving objects within the field of view. Not only can the machinevision processing system 10 determine and evaluate direction of motion, but also the speed and/or trajectory of each object's motion using known techniques. -
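Frame-to-frame correlation of this kind can be sketched as a greedy nearest-neighbour update of track files, from which each object's direction and speed follow directly. The data layout, names, and threshold below are illustrative assumptions, not the patent's implementation:

```python
def update_tracks(tracks, detections, max_jump):
    """Greedy nearest-neighbour association of detections to track files.

    tracks: dict mapping track id -> list of (x, y) positions (the track
    file). detections: (x, y) centroids in the current frame. A detection
    extends the nearest track whose last position lies within `max_jump`;
    unmatched detections start new tracks.
    """
    next_id = max(tracks, default=-1) + 1
    for det in detections:
        best_id, best_d2 = None, max_jump ** 2
        for tid, path in tracks.items():
            lx, ly = path[-1]
            d2 = (det[0] - lx) ** 2 + (det[1] - ly) ** 2
            if d2 <= best_d2:
                best_id, best_d2 = tid, d2
        if best_id is None:
            tracks[next_id] = [det]   # new object entered the view
            next_id += 1
        else:
            tracks[best_id].append(det)
    return tracks

def velocity(path):
    """Per-frame velocity vector from the last two track positions."""
    if len(path) < 2:
        return (0.0, 0.0)
    (x0, y0), (x1, y1) = path[-2], path[-1]
    return (x1 - x0, y1 - y0)
```

A production tracker would use a globally optimal assignment and a motion model rather than this greedy single pass, but the track-file bookkeeping is the same.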
FIG. 4C depicts a 3D surface analysis of the background image (FIG. 4A ) with asingle person 260 entering the field of view. This person has been identified and classified by the machinevision processing system 10 and is labeled here with the letter “K.”FIG. 4D depicts the same field of view and background image asFIG. 4C , but theperson 260 has moved a few steps. In this case, the person is being tracked as he or she is entering the controlled access area. -
FIG. 4E depicts the results of a 3D surface analysis of the background image (FIG. 4A) with two people 264, 266 (respectively labeled “I” and “C”) traveling in the same direction to the left. FIG. 4F depicts the same field of view and background image as FIG. 4E, with the two people 264, 266 now in close proximity. The first person 266, labeled with the letter “C” and presumably having correct credentials for the controlled access area, is being closely followed by a second person 264, here labeled with the letter “I.” -
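The close-following pattern illustrated in FIGS. 4E and 4F can be reduced to a proximity test over two track files. A minimal sketch, assuming hypothetical per-frame (x, y) positions in metres and illustrative thresholds:

```python
def close_follow(track_a, track_b, max_gap_m, min_frames):
    """Detect sustained close following between two tracked people.

    Flags the pattern where two tracks stay within `max_gap_m` of each
    other for at least `min_frames` consecutive frames while moving
    through the portal. Threshold values are illustrative, not values
    from the patent.
    """
    run = best = 0
    for (ax, ay), (bx, by) in zip(track_a, track_b):
        if ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 <= max_gap_m:
            run += 1
            best = max(best, run)
        else:
            run = 0
    return best >= min_frames
```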
FIG. 4G depicts the results of a 3D surface analysis of the background image (FIG. 4A) with two people 272, 274 near the exit door. FIG. 4H depicts the same field of view and background image as FIG. 4G, with the two people 272, 274 now adjacent to one another. The first person 274, labeled with a “K” and presumably having correct credentials for the controlled access area, is exiting the controlled area while the second person 272, labeled with the letter “J,” attempts to maintain proximity to the first person 274 and enter while the exit door is still open. - In some embodiments, one or more operators can interact with the machine
vision processing system 10 through anoperator interface 30. Some possible functions of theoperator interface 30 have already been discussed.FIGS. 5A , 5B, 5C, 5D, and 5E provide one example of a set of interactive screen displays for anoperator interface 30. It will be understood that the organization of the operator interface shown inFIGS. 5A , 5B, 5C, 5D, and 5E is an example only, and that many possible arrangements for the functionality of theoperator interface 30 are available. Further, additional functionality can be added to theinterface 30, or alternatively, functionality shown may not be required in all cases. -
FIG. 5A depicts aninteractive display screen 500 of theoperator interface 30, with the “Monitor”tab 502 selected. This interactive display is used to monitor the operation of the machinevision processing system 10. Adisplay control area 504 interacts with the operator to select the designated access point (e.g., portal) for monitoring, particularly in cases where the machinevision processing system 10 is monitoring more than one portal. In this example, the properties of the view displayed can be selected using the radio buttons labeled “Normal,” “Show camera views,” “Show tracking image,” “Showimage tracking camera 1,” “Showimage tracking camera 2,” and “Show event camera.” In some embodiments, the one or more views chosen will be routed to thevideo display 28. For each portal, one or more sets of alarm statistics foralarm events 510 can be displayed. Areset 508 for the event statistics can be provided for each of the portals. These statistics can also be saved in a log file for further analysis. -
FIG. 5B depicts the “Installer I/O”tab 512 of theinteractive display 500. This interactive display can organize functionality useful to personnel installing and maintaining the system of the invention. Using the Installer I/O screen, a user can select between one or morepossible portals 514 as required. Parameters required for correcting the images forcamera geometry 516 can be entered interactively, possibly including “Camera height,” “Camera distance to door,” and “Door width.” The trigger for one or more possible types of “Outputs” 518 from the machinevision processing system 10 can be selected in a “cross-connect” fashion based on any of the one or more inputs. In this example, the outputs include (a) suspicious entry, (b) tailgates, and (c) warnings, and the inputs can include one or more relays (four in the example shown) from the one or more portals. - The machine
vision processing system 10 can have multiple levels of security or password access. In this example, three levels of password-controlled security are used: - 1. A general user level, typically used by security personnel.
- 2. An installer level, with complete access to all system configuration capabilities and the ability to administer all levels of accounts.
- 3. An administrator level, with limited configuration privileges, but with the ability to administer user (e.g., security personnel) accounts.
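The three password levels above can be modeled as a simple permission table. The permission names below are invented for illustration and are not drawn from the patent:

```python
# Hypothetical permission sets for the three password levels above.
PERMISSIONS = {
    "user":          {"monitor", "reset_alarms"},
    "installer":     {"monitor", "reset_alarms", "configure_system",
                      "administer_all_accounts"},
    "administrator": {"monitor", "reset_alarms", "administer_user_accounts"},
}

def authorize(level, action):
    """Return True if the given password level permits the action."""
    return action in PERMISSIONS.get(level, set())
```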
- The Installer I/
O display 512 of theinteractive operator interface 500 can be used to change theinstaller password 520 or theadministrator password 522. The Installer I/O interactive display can be used to configure properties foralarm sensors 524, which may include, for example, whether the door contact is closed when the door is “open” or “closed,” whether a valid access contact “opens” or “closes” the door when activated, and whether the alarm is reset when the reset contact is “open” or “closed.” -
FIG. 5C depicts the “Installer Camera Settings”tab 540 of theinteractive display 500. This set of interactive tools is typically used by installation or maintenance personnel to configure machine vision parameters, often in conjunction with functionality on the “Installer I/O”tab 512. Adisplay control area 542 allows the operator to select the portal for monitoring interaction, for cases where the machinevision processing system 10 is monitoring more than one portal. In this example, the “Image Type” 544 of the view displayed can be selected using the radio buttons labeled, “Show Tracking Camera 1,” “Show Tracking Camera 2,” “Show Tracking Camera 1 (unwarped),” “Show Tracking Camera 2 (unwarped),” “Show Tracking Image,” and “Show Event Camera.” In some embodiments, the one or more Image Type views chosen may be routed to the built-indisplay 548. Updates to camera properties can be invoked using the “Update Now” 546 button. - The “Physical Setup” 550 or calibration may be controlled from this screen. A calibration process, used to calibrate the machine
vision processing system 10 with respect to the background, is initiated using the “Calibrate” button. As discussed earlier, objects in the background image are identified and registered as pertaining to the background so they do not interfere with run-time identification and analysis of new objects entering and leaving the field of view. The “Clear Calibration” button allows the user to remove an old calibration before creating a new one. One or more “Sensitivity” 552 settings, used by the machine vision processing system to identify objects in the image, can be adjusted with the InstallerCamera Settings tab 540. In this example, a slider control and a numeric display are used for these sensitivity settings, including: - 1. The maximum head size used as a filter for the accurate identification of people.
- 2. Cart sensitivity, used to properly identify moving carts or other objects on wheels or slides.
- 3. The minimum head size used as a filter for the accurate identification of people.
- 4. Crawler sensitivity, used to determine when a person is crawling in an attempt to evade the system.
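The minimum and maximum head-size sliders above act as a band-pass filter on candidate person detections. A trivial sketch under assumed pixel units (the function name and bounds are illustrative):

```python
def plausible_person(head_size_px, min_head_px, max_head_px):
    """Band-pass filter on a candidate detection's apparent head size.

    Head size is in pixels at the tracking camera; the bounds correspond
    to the minimum and maximum head-size sensitivity settings. Detections
    outside the band are rejected as non-person objects.
    """
    return min_head_px <= head_size_px <= max_head_px
```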
- One or more “Configuration” 554 parameters may be used to restrict the portion of the area of
observation 16 within which the machinevision processing system 10 searches for one or more types of behavior. In this example, the types of behavior to search for are selectable and numeric parameters setting the limits of the zones within the area of observation may be established. In alternative embodiments, an interactive tool can be used to select the search zones by drawing on an image, for example. -
FIG. 5D depicts the “Setup I/O”tab 560 of theinteractive display 500. This set of interactive tools is typically used by installation or maintenance personnel to configure the I/O properties of the machinevision processing system 10, often in conjunction with functionality on the “Installer I/O” 512 tab and the “Installer Camera Settings”tab 540. Using the “Setup I/O”screen 560, a user can select between one ormore portals 562 as required. The types of “Alarms” 564 enabled can be set. In this example, four choices are available, including (a) warning voice, (b) suspicious voice, (c) buzzer, and (d) light. A timeout time for the alarm can be set and a “Reset Alarm” button can be used to reset the alarm state. The “Reset Event Statistics”button 566 resets the alarm statistics for the portal. A system administrator can change the password using the “Change Admin Password”button 568. Portals can be assigned a name using atext box 570. Time on thearea access controller 24 orlocal access controller 22 can be read and set with a “Time”control tool 572. Moreover, one or more “Policy” 574 properties can be selected. In this example, there is provided an option to allow multiple entries (or people) per access cycle. Other applicable policies may also be displayed and selected. -
FIG. 5E shows the “Setup View” tab 580 of the interactive display 500, which is used to set camera views and calibrate the machine vision processing system 10. A portal selection 584 allows the operator to select the portal for setup, for cases where the machine vision processing system 10 is monitoring more than one portal. The “Image Type” 586 of the view displayed can be selected using the radio buttons labeled, “Show Tracking Camera 1,” “Show Tracking Camera 2,” “Show Tracking Camera 1 (unwarped),” “Show Tracking Camera 2 (unwarped),” “Show Tracking Image,” and “Show Event Camera,” for this example. In some embodiments, the one or more Image Type views chosen will be routed to the built-in display 588. Updates to camera properties can be invoked using the “Update Now” 590 button. The properties of the view “Display” 592 can be selected using the radio buttons labeled, “Normal,” “Show camera views,” “Show tracking image,” “Show tracking camera 1,” “Show tracking camera 2,” and “Show event camera,” in this example. In some embodiments, the one or more views chosen will be routed to the video display 28. The “Physical Setup” 594 or calibration can be controlled from this screen. A calibration process, used to calibrate the machine vision processing system 10 with respect to the background, may be initiated using the “Calibrate” button. The “Clear Calibration” button allows the user to remove an old calibration before creating a new one. - A second preferred embodiment operates in much the same manner as the first embodiment, which has already been described in detail. To reduce cost, the second embodiment uses a single camera in place of the stereo
tracking camera pair 12 that is used in the first embodiment. This embodiment loses the benefits of stereoscopic image analysis and 3D surface analysis, but still can use other machine vision methods, including motion tracking, background differencing, image segmentation, texture analysis, and shape analysis. Alternatively, a gray scale (non-color) camera may be used as a further cost reduction, but losing the benefits of color analysis. These alternative embodiments retain all other functionality and scope of the first embodiment for detection and reporting of tailgating or reverse entry events. - A third preferred embodiment operates in much the same manner as the first embodiment, which has already been described in detail. To reduce cost, the third embodiment uses gray scale cameras in place of the color stereo
tracking camera pair 12 andcolor event camera 14 used in the first embodiment. This embodiment loses the benefits of color analysis, but still uses machine vision methods, including stereoscopic image and 3D surface analysis, to classify and track objects and determine whether their presence and/or activity is authorized. In a further alternative, a gray scale (non-color) camera is used for theevent camera 14 and color cameras are used for thestereo camera pair 12. This alternative retains the benefits of color analysis for the object identification and tracking capability, and only reduces the cost and capability of the event camera. These alternative embodiments retain all other functionality and scope of the first embodiment for detection and reporting of tailgating or reverse entry events. - A fourth preferred embodiment extends the functionality of the first embodiment by providing two or more pairs of
stereo tracking cameras 12 and/or two or more event cameras 14 to an area of observation 16. The fourth embodiment operates in much the same manner as the first embodiment, which has already been described in detail, but with an improved ability to resolve certain ambiguous situations. An example of an ambiguous situation is where one object fully or mostly obscures another object from the point of view of the first set of cameras. Another advantage of this extended embodiment is the ability to better identify and classify objects, since stereoscopic analysis (including, but not limited to, 3D surface analysis) can be performed from more than one vantage point. This alternative extended embodiment retains all other functionality and scope of the first embodiment.

- A fifth preferred embodiment employs the same techniques of the first embodiment, but extends the functionality to include the detection, tracking, and alarming of thrown or dropped objects. In some situations, a perpetrator may wish to pass an unauthorized object into a controlled area. The
stereo camera 12 tracks any objects that are thrown or dropped in the area of observation 16. In other situations, a perpetrator may wish to leave an unauthorized object in the area of observation. The machine vision processing system 10 tracks the thrown or dropped objects using machine vision techniques that may include stereoscopic image processing, video motion detection, analysis of connections to other objects (to determine if the object is really traveling through the air or is making some other type of motion, such as being swung by a person), creation and maintenance of track files (to determine the trajectory of the thrown or dropped objects), shape analysis, image segmentation, and pattern recognition (to identify the type of object being dropped or thrown). In some embodiments, the machine vision processing system 10 may determine if the dropped or thrown object is entering or leaving the controlled area, and may trigger an alarm only if the object is entering the controlled area (or vice versa, exiting the controlled area). In cases where the machine vision processing system 10 determines that a thrown or dropped object is entering (or exiting) the controlled area, or passing through or remaining in the area of observation, in violation of the established access control rules, the alarms or annunciators 26 can be triggered.

- A sixth preferred embodiment extends the functionality of the first embodiment to the counting of people or objects entering or leaving a controlled area. The
stereo camera 12 tracks any number of people or objects entering or leaving the controlled area through the area of observation 16. The machine vision processing system 10 tracks the people or objects using machine vision techniques that may include stereoscopic image processing, video motion detection, analysis of connections to other objects (to determine which people or objects are moving independently of others), creation and maintenance of track files (to determine, for example, if a person or an object has really entered or left the controlled area or merely entered the area of observation, turned around, and left the area of observation traveling in the other direction), shape analysis, image segmentation, and pattern recognition (to identify and classify the type of object traveling through the area of observation). In some embodiments, the machine vision processing system is interfaced with the area access controller 24 or local access controller 22. In this case, the machine vision processing system can identify a number of situations in which security procedures may not be followed correctly, including:

- 1. A person or object (e.g., a vehicle of some type) that is authorized to enter or leave the controlled area by the area access controller or local access controller but does not actually enter or leave the controlled area. In this case, the machine vision processing system can notify the area access controller or local access controller that the authorized person or objects have not actually entered or left the controlled area as expected. These controllers can then appropriately update their state (e.g., noting which persons or objects are within the controlled area).
- 2. The machine
vision processing system 10 can count the number of persons or objects (possibly of specific types) that have entered or left a controlled area. This information can be used by, for example, security personnel or an access controller to determine how many people or objects are within the controlled area at any one time.

- 3. The area of observation can be around a fire door or other emergency exit or other portal that is not ordinarily used. In normal circumstances, no person or object would pass through this door or portal. In case of an emergency, any number of people or objects (e.g., vehicles) may pass through. The machine
vision processing system 10 can then count the number of people or objects (and perhaps types) leaving the controlled area. In some embodiments, the machine vision processing system 10 will notify the area access controller or the local access controller of these activities, and if the rules governing the controlled area are violated, then trigger the alarms and annunciators.

- 4. In some situations, people or objects (such as vehicles) may receive authorization to enter or exit the controlled area, possibly from the area access controller or local access controller, but may not actually enter or exit the controlled area. In this case, the area access controller or local access controller can notify the machine
vision processing system 10 of the number of persons and objects (perhaps including information indicating the types of objects) authorized to enter or exit the controlled access area and cross the area of observation 16. In some cases, a period of time is allowed for the authorized persons or objects to enter or exit (e.g., a timeout is set). If the persons or objects are not observed to enter (or exit) the controlled access area, the machine vision processing system 10 can notify the area access controller or local access controller of these activities. The controller may then disallow repetition of the authorization to enter (or exit). The controllers can also prevent the repeated use of security credentials or pass codes to exit (or enter) when the authorized person or object has not actually used the first authorization, or prevent the same credentials or codes from being used for multiple entrances (or exits) when no exits (or entrances) have been recorded. In some embodiments, the machine vision processing system 10 can trigger the alarms and annunciators when these access control rules are violated.

- A seventh preferred embodiment extends the functionality of the first embodiment, which has already been described in detail, by adding a second door or other access-limiting structure at the end of a passage (and typically at the end of the area of observation 16). This configuration creates a “man trap” or vehicle trap to contain the perpetrator of a tailgating or reverse entry attempt in a defined area. Preferably, the
local access controllers 22 for both doors or access-limiting structures are under the control of the machine vision processing system 10. This arrangement allows the system 10 to automatically contain the perpetrator until security personnel have a chance to investigate the incident. This alternative extended embodiment retains all other functionality and scope of the first embodiment.

- Other embodiments and uses of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. The specification and examples herein should be considered exemplary only, with the scope of the invention indicated by the following claims and equivalents thereto. As will be readily understood by those of ordinary skill in the art, variations can be made within the scope of the invention as defined by the following claims.
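As an illustrative sketch only (not part of the patent disclosure), the track-file reasoning described for the sixth embodiment, deciding whether a tracked person or object actually crossed into or out of the controlled area, or merely turned around inside the area of observation, can be expressed as a comparison of where a track starts and ends relative to a portal line. The portal-line coordinate, the `Crossing` labels, and the function names here are all hypothetical assumptions:

```python
# Minimal sketch of track-file entry/exit classification and occupancy
# counting. Coordinates are (x, y); the portal line y = PORTAL_Y is an
# illustrative assumption (y > PORTAL_Y is treated as "inside").
from enum import Enum
from typing import List, Tuple

PORTAL_Y = 0.0  # hypothetical portal line position


class Crossing(Enum):
    ENTERED = "entered"
    EXITED = "exited"
    NO_CROSSING = "no_crossing"  # e.g., turned around in the area of observation


def classify_track(track: List[Tuple[float, float]]) -> Crossing:
    """Compare where a track file starts and ends relative to the portal line.

    A track starting outside and ending inside counts as an entry; the
    reverse counts as an exit; anything else (including a loop back out
    the way it came) is not counted.
    """
    if len(track) < 2:
        return Crossing.NO_CROSSING
    start_inside = track[0][1] > PORTAL_Y
    end_inside = track[-1][1] > PORTAL_Y
    if not start_inside and end_inside:
        return Crossing.ENTERED
    if start_inside and not end_inside:
        return Crossing.EXITED
    return Crossing.NO_CROSSING


def occupancy_delta(tracks: List[List[Tuple[float, float]]]) -> int:
    """Net change in the count of objects inside the controlled area."""
    delta = 0
    for t in tracks:
        c = classify_track(t)
        if c is Crossing.ENTERED:
            delta += 1
        elif c is Crossing.EXITED:
            delta -= 1
    return delta
```

A system maintaining a running occupancy count would apply `occupancy_delta` to the batch of track files closed in each observation interval; a track that enters and leaves the area of observation on the same side contributes nothing, matching the turned-around case described above.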
Claims (45)
1. A method for detecting unauthorized passage of an object in regard to security of a controlled access area having a designated access point, comprising:
(a) acquiring one or more stereoscopic images of an area of observation associated with the designated access point;
(b) analyzing the one or more images using a machine vision processing system to identify and classify an object in the area of observation, and further to determine the direction of motion of the object with respect to the designated access point; and
(c) applying one or more access control rules to the information obtained from the image analysis to determine whether the object is attempting to breach the controlled access area by using the designated access point in violation of the security of the controlled access area.
2. The method of claim 1, in which the designated access point is an exit-only access point and the one or more access control rules determine whether the object is attempting to enter the controlled access area through the exit-only access point.
3. The method of claim 1, in which the designated access point is limited to passage of objects that are classified in a defined object type and the one or more access control rules determine whether the classification of the object in the area of observation is in the defined object type.
4. The method of claim 3, in which the designated access point is limited to passage of objects classified as vehicles.
5. The method of claim 3, in which the designated access point is limited to passage of objects classified as packages.
6. The method of claim 3, in which the designated access point is limited to passage of objects classified as persons.
7. The method of claim 1, in which the designated access point is a one-way elevator and the one or more access control rules determine whether the object is attempting to enter the controlled access area via the exit of the one-way elevator.
8. The method of claim 1, in which the designated access point is an escalator and the one or more access control rules determine whether the object is attempting to access the controlled access area by entering the escalator in a direction opposite to the operation of the escalator.
9. The method of claim 1, in which the designated access point is a revolving structure and the one or more access control rules determine whether the object is attempting to access the controlled access area by entering the revolving structure in a direction opposite to the operation of the revolving structure.
10. The method of claim 1, in which analyzing the one or more images further comprises determining the speed of motion of the object and the one or more access control rules determine whether the speed of motion of the object indicates a thrown object.
11. The method of claim 1, in which analyzing the one or more images further comprises determining the trajectory of motion of the object and the one or more access control rules determine whether the trajectory of motion of the object indicates a thrown object.
12. The method of claim 1, further comprising triggering an alarm when the one or more access control rules determine that the object is attempting to breach the controlled access area by using the designated access point in violation of the security of the controlled access area.
13. The method of claim 1, further comprising causing an access limiting device associated with the designated access point to prevent breach of the controlled access area by the object.
14. The method of claim 13, in which causing the access limiting device to prevent breach of the controlled access area comprises causing the access limiting device to block passage by the object with respect to the designated access point.
15. The method of claim 13, in which causing the access limiting device to prevent breach of the controlled access area comprises locking the access limiting device.
16. The method of claim 1, further comprising causing one or more access limiting devices to block passage of the object at the designated access point and contain the object within a defined area.
17. The method of claim 1, further comprising receiving access control information from an access control system and applying the one or more access control rules to the access control information in combination with the information obtained from the image analysis to determine whether the object is attempting to breach the controlled access area.
18. The method of claim 17, further comprising triggering an alarm when the one or more access control rules applied to the access control information and the information obtained from the image analysis determine that the object is attempting to breach the controlled access area by using the designated access point in violation of the security of the controlled access area.
19. The method of claim 17, further comprising causing an access limiting device associated with the designated access point to prevent breach of the controlled access area by the object.
20. The method of claim 19, in which causing the access limiting device to prevent breach of the controlled access area comprises causing the access limiting device to block passage by the object with respect to the designated access point.
21. The method of claim 19, in which causing the access limiting device to prevent breach of the controlled access area comprises locking the access limiting device.
22. The method of claim 1, in which analyzing the one or more images using a machine vision processing system further comprises using a three-dimensional surface analysis to identify the object.
23. The method of claim 1, in which analyzing the one or more images using a machine vision processing system further comprises using a three-dimensional surface analysis to classify the object.
24. The method of claim 1, further comprising recording the one or more images.
25. The method of claim 1, further comprising recording information obtained from the image analysis.
26. The method of claim 1, further comprising recording information resulting from applying the one or more access control rules to the information obtained from the image analysis.
27. The method of claim 1, further comprising defining a region of interest within the area of observation.
28. The method of claim 27, further comprising defining a plurality of regions of interest and applying the one or more access control rules based on the region of interest in which the object is located.
29. The method of claim 27, further comprising using an interactive user interface that displays the area of observation to define the region of interest.
30. The method of claim 1, further comprising using an interactive user interface to configure properties of the image analysis.
31. The method of claim 1, further comprising using an interactive user interface to configure properties of the machine vision processing system.
32. The method of claim 1, further comprising using an interactive user interface to configure properties of an access control system that provides access control information to the machine vision processing system.
33. An apparatus for detecting a security violation with respect to a controlled access area having a designated access point, comprising:
(a) a stereoscopic imaging device for acquiring one or more stereographic images of an area of observation; and
(b) a machine vision processing system in communication with the stereoscopic imaging device to receive the one or more images, the machine vision processing system being further configured to analyze the one or more images to identify and classify an object in the area of observation, and further to determine the direction of motion of the object with respect to the designated access point and apply one or more access control rules to the information obtained from the image analysis to determine whether the object is attempting to breach the controlled access area by using the designated access point in violation of security of the controlled access area.
34. The apparatus of claim 33, in which the object in the area of observation is a suspect object, the apparatus further comprising a position sensor disposed on a background object in the area of observation, the machine vision processing system being further configured to use information received from the position sensor to define the background object from the suspect object.
35. The apparatus of claim 34, in which the position sensor is disposed on a door.
36. The apparatus of claim 34, in which the machine vision processing system is further configured to perform a calibration procedure that correlates position sensor information to a position of the background object for determining a background image.
37. The apparatus of claim 33, in which the machine vision processing system is further configured to analyze the one or more images to determine the speed of motion of the object and apply the one or more access control rules to determine whether the speed of motion of the object indicates a thrown object.
38. The apparatus of claim 33, in which the machine vision processing system is further configured to analyze the one or more images to determine the trajectory of motion of the object and apply the one or more access control rules to determine whether the trajectory of motion of the object indicates a thrown object.
39. The apparatus of claim 33, further comprising an alarm that is triggered when the one or more access control rules determine that the object is attempting to breach the controlled access area in violation of the security of the controlled access area.
40. The apparatus of claim 33, in which the object in the area of observation is a second object, the machine vision processing system being further configured to identify and classify a first object in the area of observation having supplied an authorization with respect to the controlled access area, and to apply the one or more access control rules to determine whether the second object is attempting to breach the controlled access area by utilizing the authorization supplied by the first object in violation thereof.
41. The apparatus of claim 40, in which the one or more access control rules are configured to determine whether the second object is attempting to breach the controlled access area based on proximity of the second object to the first object in the area of observation.
42. The apparatus of claim 41, in which the authorization of the first object is to enter the controlled access area, and the one or more access control rules are configured to determine whether the second object is attempting to enter the controlled access area by maintaining proximity to the first object as the first object enters the controlled access area.
43. The apparatus of claim 41, in which the authorization of the first object is to exit the controlled access area, and the one or more access control rules are configured to determine whether the second object is attempting to enter the controlled access area by maintaining proximity to the first object as the first object exits the controlled access area.
44. The apparatus of claim 33, further comprising an access control system associated with the controlled access area in which the access control system is configured to receive information from the object for accessing the controlled access area.
45. The apparatus of claim 44, further comprising an access limiting device in communication with the access control system, in which the access control system is configured to control the operation of the access limiting device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/131,850 US20080285802A1 (en) | 2002-04-08 | 2008-06-02 | Tailgating and reverse entry detection, alarm, recording and prevention using machine vision |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US37083702P | 2002-04-08 | 2002-04-08 | |
US10/410,884 US7382895B2 (en) | 2002-04-08 | 2003-04-08 | Tailgating and reverse entry detection, alarm, recording and prevention using machine vision |
US12/131,850 US20080285802A1 (en) | 2002-04-08 | 2008-06-02 | Tailgating and reverse entry detection, alarm, recording and prevention using machine vision |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/410,884 Division US7382895B2 (en) | 2002-04-08 | 2003-04-08 | Tailgating and reverse entry detection, alarm, recording and prevention using machine vision |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080285802A1 true US20080285802A1 (en) | 2008-11-20 |
Family
ID=29250591
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/410,884 Active 2025-05-21 US7382895B2 (en) | 2002-04-08 | 2003-04-08 | Tailgating and reverse entry detection, alarm, recording and prevention using machine vision |
US12/131,850 Abandoned US20080285802A1 (en) | 2002-04-08 | 2008-06-02 | Tailgating and reverse entry detection, alarm, recording and prevention using machine vision |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/410,884 Active 2025-05-21 US7382895B2 (en) | 2002-04-08 | 2003-04-08 | Tailgating and reverse entry detection, alarm, recording and prevention using machine vision |
Country Status (5)
Country | Link |
---|---|
US (2) | US7382895B2 (en) |
EP (1) | EP1493130A1 (en) |
AU (1) | AU2003221893A1 (en) |
CA (1) | CA2481250C (en) |
WO (1) | WO2003088157A1 (en) |
US20060244828A1 (en) * | 2005-04-29 | 2006-11-02 | Ho Li-Pen J | Vehicle passenger occupancy alert system using passenger image recognition |
TWM277062U (en) * | 2005-04-29 | 2005-10-01 | Jia Fu Internat Dev Co Ltd | Dactyloscopy entrance guard devices |
EP1752931A1 (en) * | 2005-07-21 | 2007-02-14 | Scheidt & Bachmann Gesellschaft mit beschränkter Haftung | Method for automatically opening the barrier of a passage way for persons |
WO2007022111A1 (en) * | 2005-08-17 | 2007-02-22 | Honeywell International Inc. | Physical security management system |
US20070047837A1 (en) * | 2005-08-29 | 2007-03-01 | John Schwab | Method and apparatus for detecting non-people objects in revolving doors |
ITUD20050152A1 (en) * | 2005-09-23 | 2007-03-24 | Neuricam Spa | ELECTRO-OPTICAL DEVICE FOR THE COUNTING OF PEOPLE, OR OTHERWISE, BASED ON STEREOSCOPIC VISION, AND ITS PROCEDURE |
US20070083915A1 (en) * | 2005-10-06 | 2007-04-12 | Janani Janakiraman | Method and system for dynamic adjustment of computer security based on personal proximity |
US8111904B2 (en) | 2005-10-07 | 2012-02-07 | Cognex Technology And Investment Corp. | Methods and apparatus for practical 3D vision system |
FR2895122B1 (en) | 2005-12-16 | 2008-02-01 | Sagem Defense Securite | METHOD OF SECURING PHYSICAL ACCESS AND PROVIDING ACCESS TO THE PROCESS |
US8380696B1 (en) * | 2005-12-20 | 2013-02-19 | Emc Corporation | Methods and apparatus for dynamically classifying objects |
GB2447829B (en) | 2006-01-12 | 2011-11-09 | Otis Elevator Co | Video aided system for elevator control |
US7764167B2 (en) * | 2006-01-18 | 2010-07-27 | British Telecommunications Plc | Monitoring movement of an entity in an environment |
US20070268145A1 (en) * | 2006-05-19 | 2007-11-22 | Bazakos Michael E | Automated tailgating detection via fusion of video and access control |
US7733043B2 (en) * | 2006-06-27 | 2010-06-08 | B.E.A., Inc. | Revolving door control system |
US8074271B2 (en) | 2006-08-09 | 2011-12-06 | Assa Abloy Ab | Method and apparatus for making a decision on a card |
US9985950B2 (en) | 2006-08-09 | 2018-05-29 | Assa Abloy Ab | Method and apparatus for making a decision on a card |
US8432448B2 (en) | 2006-08-10 | 2013-04-30 | Northrop Grumman Systems Corporation | Stereo camera intrusion detection system |
US7900398B2 (en) | 2006-11-14 | 2011-03-08 | Overhead Door Corporation | Security door system |
US8126260B2 (en) * | 2007-05-29 | 2012-02-28 | Cognex Corporation | System and method for locating a three-dimensional object using machine vision |
ITMI20071301A1 (en) * | 2007-06-29 | 2008-12-30 | Business Gates S R L | "APPARATUS FOR THE DISTANCE OPENING OF DOORS OR GATES OF A BUILDING" |
US8203426B1 (en) | 2007-07-11 | 2012-06-19 | Precision Edge Access Control, Inc. | Feed protocol used to report status and event information in physical access control system |
US8269602B2 (en) * | 2007-08-22 | 2012-09-18 | Utc Fire & Security Americas Corporation, Inc. | Security access control system and method for making same |
US8009013B1 (en) * | 2007-09-21 | 2011-08-30 | Precision Control Systems of Chicago, Inc. | Access control system and method using user location information for controlling access to a restricted area |
JP4442682B2 (en) * | 2007-11-27 | 2010-03-31 | ソニー株式会社 | Optical element |
US8108055B2 (en) * | 2007-12-28 | 2012-01-31 | Larry Wong | Method, system and apparatus for controlling an electrical device |
DE102008016516B3 (en) | 2008-01-24 | 2009-05-20 | Kaba Gallenschütz GmbH | Access control device for use in entry point of e.g. building for determining fingerprint of person, has CPU with control unit for adjusting default security steps, where each security step is associated with defined parameter of CPU |
DE102008006449A1 (en) * | 2008-01-29 | 2009-07-30 | Kaba Gallenschütz GmbH | Method and device for monitoring a volume of space |
GB0804472D0 (en) * | 2008-03-11 | 2008-04-16 | Patterson Kieran | An evacuation lighting system |
JP5159390B2 (en) * | 2008-03-28 | 2013-03-06 | キヤノン株式会社 | Object detection method and apparatus |
EP2131306A1 (en) * | 2008-06-02 | 2009-12-09 | THOMSON Licensing | Device and method for tracking objects in a video, system and method for audience measurement |
US9288449B2 (en) | 2008-08-05 | 2016-03-15 | University Of Florida Research Foundation, Inc. | Systems and methods for maintaining multiple objects within a camera field-of-view |
US8791817B2 (en) * | 2008-10-22 | 2014-07-29 | Centurylink Intellectual Property Llc | System and method for monitoring a location |
US8760510B2 (en) * | 2008-11-26 | 2014-06-24 | Robert T. Aloe | Apparatus and methods for three-dimensional imaging using a static light screen |
US8983488B2 (en) | 2008-12-11 | 2015-03-17 | Centurylink Intellectual Property Llc | System and method for providing location based services at a shopping facility |
DE102009011348A1 (en) * | 2009-03-05 | 2010-09-09 | Wolfgang Konrad | Device for monitoring monitored area, has image detecting mediums for identifying persons in monitored area and releasing medium for releasing person identifying unit |
EP2234073A1 (en) * | 2009-03-23 | 2010-09-29 | SkiData AG | Access control device and method for operating same |
JP4737316B2 (en) * | 2009-03-25 | 2011-07-27 | コニカミノルタビジネステクノロジーズ株式会社 | Authentication system, authentication method, and information processing apparatus |
US9307037B2 (en) * | 2009-04-15 | 2016-04-05 | Centurylink Intellectual Property Llc | System and method for utilizing attendee location information with an event planner |
US8428620B2 (en) * | 2009-04-22 | 2013-04-23 | Centurylink Intellectual Property Llc | Mass transportation service delivery platform |
US8284993B2 (en) * | 2009-06-18 | 2012-10-09 | Hytrol Conveyor Company, Inc. | Decentralized tracking of packages on a conveyor |
US8655693B2 (en) * | 2009-07-08 | 2014-02-18 | Centurylink Intellectual Property Llc | System and method for automating travel related features |
US20120069192A1 (en) * | 2009-10-20 | 2012-03-22 | Qing-Hu Li | Data Processing System and Method |
US9918048B2 (en) | 2009-11-18 | 2018-03-13 | Verizon Patent And Licensing Inc. | System and method for providing automatic location-based imaging |
US9417312B2 (en) * | 2009-11-18 | 2016-08-16 | Verizon Patent And Licensing Inc. | System and method for providing automatic location-based imaging using mobile and stationary cameras |
JP5441749B2 (en) * | 2010-02-12 | 2014-03-12 | セコム株式会社 | Security system |
JP5448899B2 (en) * | 2010-02-12 | 2014-03-19 | セコム株式会社 | Security system |
CN106570307B (en) | 2010-04-09 | 2020-01-07 | 卓尔医学产品公司 | System and method for streaming patient information from a defibrillator |
WO2011151232A1 (en) * | 2010-05-31 | 2011-12-08 | Universiteit Gent | An optical system for occupancy sensing, and corresponding method |
CN102034328B (en) * | 2010-12-10 | 2012-06-27 | 山东申普交通科技有限公司 | Intelligent monitoring method |
TWI619660B (en) * | 2010-12-15 | 2018-04-01 | 辛波提克有限責任公司 | Storage and retrieval system with bot position sensing and method of operating the same |
US8694152B2 (en) | 2010-12-15 | 2014-04-08 | Symbotic, LLC | Maintenance access zones for storage and retrieval systems |
US20120169880A1 (en) * | 2010-12-31 | 2012-07-05 | Schneider Electric Buildings Llc | Method and system for video-based gesture recognition to assist in access control |
US8659643B2 (en) * | 2011-01-18 | 2014-02-25 | Disney Enterprises, Inc. | Counting system for vehicle riders |
DE102011011929A1 (en) | 2011-02-18 | 2012-08-23 | Hella Kgaa Hueck & Co. | Method for detecting target objects in a surveillance area |
US9268054B2 (en) * | 2011-07-07 | 2016-02-23 | Robert Osann, Jr. | Synchronized robotic baggage portal for secure access |
KR101233608B1 (en) * | 2011-11-10 | 2013-02-14 | 조희문 | Door lock apparatus |
US9208554B2 (en) | 2012-01-16 | 2015-12-08 | Intelliview Technologies Inc. | Apparatus for detecting humans on conveyor belts using one or more imaging devices |
EP2805188A4 (en) * | 2012-01-16 | 2016-02-24 | Intelliview Technologies Inc | Apparatus for detecting humans on conveyor belts using one or more imaging devices |
JP5888172B2 (en) * | 2012-08-02 | 2016-03-16 | ソニー株式会社 | Data storage device and program |
TWI448990B (en) * | 2012-09-07 | 2014-08-11 | Univ Nat Chiao Tung | Real-time people counting system using layer scanning method |
WO2014052802A2 (en) * | 2012-09-28 | 2014-04-03 | Zoll Medical Corporation | Systems and methods for three-dimensional interaction monitoring in an ems environment |
FI124131B (en) * | 2012-11-14 | 2014-03-31 | Kone Corp | Lift arrangement |
FI124166B (en) * | 2013-01-08 | 2014-04-15 | Kone Corp | An elevator call system and a method for providing lift calls in an elevator call system |
WO2014122357A1 (en) * | 2013-02-07 | 2014-08-14 | Kone Corporation | Personalization of an elevator service |
ES2496665B1 (en) * | 2013-02-14 | 2015-06-16 | Holding Assessoria I Lideratge, S.L. | FRAUDULENT ACCESS DETECTION METHOD IN CONTROLLED ACCESS POINTS |
KR102350530B1 (en) | 2013-03-15 | 2022-01-14 | 심보틱 엘엘씨 | Automated storage and retrieval system with integral secured personnel access zones and remote rover shutdown |
TWI642028B (en) | 2013-03-15 | 2018-11-21 | 辛波提克有限責任公司 | Transportation system and automated storage and retrieval system with integral secured personnel access zones and remote rover shutdown |
US8948457B2 (en) * | 2013-04-03 | 2015-02-03 | Pillar Vision, Inc. | True space tracking of axisymmetric object flight using diameter measurement |
US10373470B2 (en) | 2013-04-29 | 2019-08-06 | Intelliview Technologies, Inc. | Object detection |
US9948359B2 (en) | 2013-09-20 | 2018-04-17 | At&T Intellectual Property I, L.P. | Secondary short-range wireless assist for wireless-based access control |
JP6113369B2 (en) * | 2013-11-18 | 2017-04-12 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Video surveillance for MRI safety monitoring |
EP2876610A1 (en) * | 2013-11-25 | 2015-05-27 | Inventio AG | Controlling passenger traffic |
FR3015746B1 (en) * | 2013-12-20 | 2016-02-05 | Thales Sa | ACCESS CONTROL SYSTEM |
US20150220783A1 (en) * | 2014-02-06 | 2015-08-06 | Rf Spot Inc. | Method and system for semi-automated venue monitoring |
US9792129B2 (en) | 2014-02-28 | 2017-10-17 | Tyco Fire & Security Gmbh | Network range extender with multi-RF radio support for plurality of network interfaces |
US9316720B2 (en) * | 2014-02-28 | 2016-04-19 | Tyco Fire & Security Gmbh | Context specific management in wireless sensor network |
US10878323B2 (en) | 2014-02-28 | 2020-12-29 | Tyco Fire & Security Gmbh | Rules engine combined with message routing |
CA2847707C (en) | 2014-03-28 | 2021-03-30 | Intelliview Technologies Inc. | Leak detection |
KR102256474B1 (en) * | 2014-04-08 | 2021-05-26 | 한화테크윈 주식회사 | System and Method for Network Security |
CN105096406A (en) | 2014-04-30 | 2015-11-25 | 开利公司 | Video analysis system used for architectural energy consumption equipment and intelligent building management system |
DE102014110506A1 (en) * | 2014-07-25 | 2016-01-28 | Bircher Reglomat Ag | Procedure for monitoring |
US10943357B2 (en) | 2014-08-19 | 2021-03-09 | Intelliview Technologies Inc. | Video based indoor leak detection |
US10212319B1 (en) | 2014-11-04 | 2019-02-19 | Amazon Technologies, Inc. | Camera positioning fixture |
US10438277B1 (en) * | 2014-12-23 | 2019-10-08 | Amazon Technologies, Inc. | Determining an item involved in an event |
US9710712B2 (en) * | 2015-01-16 | 2017-07-18 | Avigilon Fortress Corporation | System and method for detecting, tracking, and classifying objects |
US10380486B2 (en) * | 2015-01-20 | 2019-08-13 | International Business Machines Corporation | Classifying entities by behavior |
CN106144795B (en) * | 2015-04-03 | 2020-01-31 | 奥的斯电梯公司 | System and method for passenger transport control and security by identifying user actions |
CN106144861B (en) * | 2015-04-03 | 2020-07-24 | 奥的斯电梯公司 | Depth sensor based passenger sensing for passenger transport control |
CN106144862B (en) * | 2015-04-03 | 2020-04-10 | 奥的斯电梯公司 | Depth sensor based passenger sensing for passenger transport door control |
CN106144801B (en) * | 2015-04-03 | 2021-05-18 | 奥的斯电梯公司 | Depth sensor based sensing for special passenger transport vehicle load conditions |
CN112850406A (en) | 2015-04-03 | 2021-05-28 | 奥的斯电梯公司 | Traffic list generation for passenger transport |
CN106315316A (en) * | 2015-06-16 | 2017-01-11 | 奥的斯电梯公司 | Elevator system and control method for same |
CN106256744B (en) | 2015-06-19 | 2019-12-10 | 奥的斯电梯公司 | Elevator riding user management method and system |
US20160378268A1 (en) * | 2015-06-23 | 2016-12-29 | Honeywell International Inc. | System and method of smart incident analysis in control system using floor maps |
US9972149B2 (en) * | 2015-08-24 | 2018-05-15 | Cubic Corporation | Vision-based fare collection |
US10373412B2 (en) * | 2016-02-03 | 2019-08-06 | Sensormatic Electronics, LLC | System and method for controlling access to an access point |
EP3203447B1 (en) | 2016-02-04 | 2019-05-01 | Holding Assessoria I Lideratge, S.L. (HAL SL) | Detection of fraudulent access at control gates |
JP6603151B2 (en) * | 2016-02-15 | 2019-11-06 | 日本電信電話株式会社 | Person traffic management system and person traffic management method |
JP6393360B2 (en) * | 2016-05-11 | 2018-09-19 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Imaging control method, imaging control system, and imaging control server |
US10407275B2 (en) * | 2016-06-10 | 2019-09-10 | Otis Elevator Company | Detection and control system for elevator operations |
CN117902441A (en) | 2016-07-29 | 2024-04-19 | 奥的斯电梯公司 | Monitoring system for passenger conveyor, passenger conveyor and monitoring method thereof |
EP3312762B1 (en) * | 2016-10-18 | 2023-03-01 | Axis AB | Method and system for tracking an object in a defined area |
WO2018094515A1 (en) * | 2016-11-22 | 2018-05-31 | Avigilon Corporation | Location control system and method |
DE102017207754A1 (en) * | 2017-05-08 | 2018-11-08 | Bundesdruckerei Gmbh | Identify potential users of pyrotechnic articles |
CN107644506A (en) * | 2017-10-26 | 2018-01-30 | 扬州制汇互联信息技术有限公司 | A kind of intelligent security protection system |
US10586432B2 (en) | 2017-12-29 | 2020-03-10 | Ademco Inc. | Systems and methods for intrusion detection using selective masking |
US11989836B2 (en) * | 2018-06-12 | 2024-05-21 | Current Lighting Solutions, Llc | Integrated management of sensitive controlled environments and items contained therein |
CN109064616A (en) * | 2018-10-16 | 2018-12-21 | 珠海数图信息技术有限公司 | A kind of intelligent gate control system based on testimony of a witness unification |
CN109376639B (en) * | 2018-10-16 | 2021-12-17 | 上海弘目智能科技有限公司 | Accompanying personnel early warning system and method based on portrait recognition |
CN113228034B (en) * | 2018-12-26 | 2024-03-01 | 浙江大华技术股份有限公司 | Gate equipment system and control method thereof |
IT201900007232A1 (en) * | 2019-05-24 | 2020-11-24 | Marco Tiso | ENTRANCE GATE WITH CONTROLLED ACCESS |
WO2020240071A1 (en) * | 2019-05-24 | 2020-12-03 | Kone Corporation | An access control solution for a passage device |
US10850709B1 (en) * | 2019-08-27 | 2020-12-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | Facial recognition and object detection for vehicle unlocking scenarios |
US11080955B2 (en) | 2019-09-06 | 2021-08-03 | Motorola Solutions, Inc. | Device, system and method for controlling a passage barrier mechanism |
US11330395B2 (en) * | 2020-02-14 | 2022-05-10 | Aurora Flight Services Corp., A Subsidiary Of The Boeing Company | Access control system and method |
CN113593099B (en) * | 2020-04-30 | 2023-06-13 | 深圳云天励飞技术有限公司 | Gate control method, device and system, electronic equipment and storage medium |
CN111565300B (en) * | 2020-05-22 | 2020-12-22 | 深圳市百川安防科技有限公司 | Object-based video file processing method, device and system |
WO2021245098A1 (en) | 2020-06-03 | 2021-12-09 | Dormakaba Schweiz Ag | Access gate |
AU2022338939A1 (en) * | 2021-08-31 | 2024-03-07 | Assa Abloy Entrance Systems Ab | Method for operating a person separation device as well as person separation device |
CN113723372B (en) * | 2021-11-01 | 2022-01-18 | 北京卓建智菡科技有限公司 | Prompting method and device, computer equipment and computer readable storage medium |
DE102022124737A1 (en) | 2022-09-27 | 2024-03-28 | Scheidt & Bachmann Gmbh | Gate arrangement, especially for a passenger transport system |
Citations (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3564132A (en) * | 1966-01-17 | 1971-02-16 | Mardix | Apparatus for controlling the passage of persons and objects between two areas utilizing closed circuit television |
US3727034A (en) * | 1972-01-19 | 1973-04-10 | Gen Electric | Counting system for a plurality of locations |
US4000400A (en) * | 1975-04-09 | 1976-12-28 | Elder Clarence L | Bidirectional monitoring and control system |
US4303851A (en) * | 1979-10-16 | 1981-12-01 | Otis Elevator Company | People and object counting system |
US4799243A (en) * | 1987-09-01 | 1989-01-17 | Otis Elevator Company | Directional people counting arrangement |
US4847485A (en) * | 1986-07-15 | 1989-07-11 | Raphael Koelsch | Arrangement for determining the number of persons and a direction within a space to be monitored or a pass-through |
US5201906A (en) * | 1989-10-11 | 1993-04-13 | Milan Schwarz | Anti-piggybacking: sensor system for security door to detect two individuals in one compartment |
US5519784A (en) * | 1992-10-07 | 1996-05-21 | Vermeulen; Pieter J. E. | Apparatus for classifying movement of objects along a passage by type and direction employing time domain patterns |
US5581625A (en) * | 1994-01-31 | 1996-12-03 | International Business Machines Corporation | Stereo vision system for counting items in a queue |
US5866887A (en) * | 1996-09-04 | 1999-02-02 | Matsushita Electric Industrial Co., Ltd. | Apparatus for detecting the number of passers |
US6081619A (en) * | 1995-07-19 | 2000-06-27 | Matsushita Electric Industrial Co., Ltd. | Movement pattern recognizing apparatus for detecting movements of human bodies and number of passed persons |
US20020070858A1 (en) * | 2000-12-12 | 2002-06-13 | Philips Electronics North America Corporation | Placement of camera in door for face recognition-based security systems |
US6493878B1 (en) * | 1988-10-17 | 2002-12-10 | Lord Samuel A Kassatly | Method and apparatus for tv broadcasting and reception |
US20030002712A1 (en) * | 2001-07-02 | 2003-01-02 | Malcolm Steenburgh | Method and apparatus for measuring dwell time of objects in an environment |
US20030095186A1 (en) * | 1998-11-20 | 2003-05-22 | Aman James A. | Optimizations for live event, real-time, 3D object tracking |
US20030107649A1 (en) * | 2001-12-07 | 2003-06-12 | Flickner Myron D. | Method of detecting and tracking groups of people |
US6720874B2 (en) * | 2000-09-29 | 2004-04-13 | Ids Systems, Inc. | Portal intrusion detection apparatus and method |
US20040145658A1 (en) * | 2000-01-13 | 2004-07-29 | Ilan Lev-Ran | Video-based system and method for counting persons traversing areas being monitored |
US20040240542A1 (en) * | 2002-02-06 | 2004-12-02 | Arie Yeredor | Method and apparatus for video frame sequence-based object tracking |
US20080100704A1 (en) * | 2000-10-24 | 2008-05-01 | Objectvideo, Inc. | Video surveillance system employing video primitives |
US7382895B2 (en) * | 2002-04-08 | 2008-06-03 | Newton Security, Inc. | Tailgating and reverse entry detection, alarm, recording and prevention using machine vision |
US20080266395A1 (en) * | 2003-02-10 | 2008-10-30 | Activeye, Inc. | User assisted customization of automated video surveillance systems |
US20100026802A1 (en) * | 2000-10-24 | 2010-02-04 | Object Video, Inc. | Video analytic rule detection system and method |
US20130179034A1 (en) * | 1997-08-22 | 2013-07-11 | Timothy R. Pryor | Interactive video based games using objects sensed by tv cameras |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0567264A (en) | 1991-06-28 | 1993-03-19 | Furukawa Electric Co Ltd:The | Illegal passage monitoring method |
FR2725278B1 (en) | 1994-10-04 | 1997-08-14 | Telecommunications Sa | THREE-DIMENSIONAL SHAPE RECOGNITION EQUIPMENT |
GB9511140D0 (en) | 1995-06-02 | 1995-07-26 | Mayor Limited | Security control system |
GB9617592D0 (en) | 1996-08-22 | 1996-10-02 | Footfall Limited | Video imaging systems |
IT1289712B1 (en) * | 1996-12-04 | 1998-10-16 | Ist Trentino Di Cultura | PROCEDURE AND DEVICE FOR THE DETECTION AND AUTOMATIC COUNTING OF BODIES CROSSING A GATE |
- 2003
- 2003-04-08 US US10/410,884 patent/US7382895B2/en active Active
- 2003-04-08 WO PCT/US2003/011216 patent/WO2003088157A1/en not_active Application Discontinuation
- 2003-04-08 CA CA2481250A patent/CA2481250C/en not_active Expired - Fee Related
- 2003-04-08 AU AU2003221893A patent/AU2003221893A1/en not_active Abandoned
- 2003-04-08 EP EP03718353A patent/EP1493130A1/en not_active Ceased
- 2008
- 2008-06-02 US US12/131,850 patent/US20080285802A1/en not_active Abandoned
Cited By (91)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8630478B2 (en) | 2004-06-09 | 2014-01-14 | Cognex Technology And Investment Corporation | Method and apparatus for locating objects |
US8290238B2 (en) | 2004-06-09 | 2012-10-16 | Cognex Technology And Investment Corporation | Method and apparatus for locating objects |
US8891852B2 (en) | 2004-06-09 | 2014-11-18 | Cognex Technology And Investment Corporation | Method and apparatus for configuring and testing a machine vision detector |
US20070146491A1 (en) * | 2004-06-09 | 2007-06-28 | Cognex Corporation | Human-machine-interface and method for manipulating data in a machine vision system |
US20100318936A1 (en) * | 2004-06-09 | 2010-12-16 | Cognex Corporation | Human-machine-interface and method for manipulating data in a machine vision system |
US20090273668A1 (en) * | 2004-06-09 | 2009-11-05 | Cognex Corporation | Method for setting parameters of a vision detector using production line information |
US8127247B2 (en) * | 2004-06-09 | 2012-02-28 | Cognex Corporation | Human-machine-interface and method for manipulating data in a machine vision system |
US8249297B2 (en) | 2004-06-09 | 2012-08-21 | Cognex Technology And Investment Corporation | Method and apparatus for automatic visual event detection |
US20130141591A1 (en) * | 2004-06-09 | 2013-06-06 | Cognex Corporation | Human-machine-interface and method for manipulating data in a machine vision system |
US8295552B2 (en) | 2004-06-09 | 2012-10-23 | Cognex Technology And Investment Corporation | Method for setting parameters of a vision detector using production line information |
US9183443B2 (en) | 2004-06-09 | 2015-11-10 | Cognex Technology And Investment Llc | Method and apparatus for configuring and testing a machine vision detector |
US8782553B2 (en) * | 2004-06-09 | 2014-07-15 | Cognex Corporation | Human-machine-interface and method for manipulating data in a machine vision system |
US8243986B2 (en) | 2004-06-09 | 2012-08-14 | Cognex Technology And Investment Corporation | Method and apparatus for automatic visual event detection |
US9094588B2 (en) * | 2004-06-09 | 2015-07-28 | Cognex Corporation | Human machine-interface and method for manipulating data in a machine vision system |
US8355046B2 (en) * | 2004-07-14 | 2013-01-15 | Panasonic Corporation | Object tracing device, object tracing system, and object tracing method |
USRE44353E1 (en) | 2004-11-12 | 2013-07-09 | Cognex Technology And Investment Corporation | System and method for assigning analysis parameters to vision detector using a graphical interface |
US8582925B2 (en) | 2004-11-12 | 2013-11-12 | Cognex Technology And Investment Corporation | System and method for displaying and using non-numeric graphic elements to control and monitor a vision system |
US9292187B2 (en) | 2004-11-12 | 2016-03-22 | Cognex Corporation | System, method and graphical user interface for displaying and controlling vision system operating parameters |
US20100026786A1 (en) * | 2006-10-25 | 2010-02-04 | Norbert Link | Method and device for monitoring a spatial volume as well as calibration method |
US8384768B2 (en) * | 2006-10-25 | 2013-02-26 | Vitracom Ag | Pass-through compartment for persons and method for monitoring a spatial volume enclosed by a pass-through compartment for persons |
US20080118106A1 (en) * | 2006-11-22 | 2008-05-22 | Regents Of The University Of Minnesota | Crowd counting and monitoring |
US7787656B2 (en) * | 2007-03-01 | 2010-08-31 | Huper Laboratories Co., Ltd. | Method for counting people passing through a gate |
US20080212099A1 (en) * | 2007-03-01 | 2008-09-04 | Chao-Ho Chen | Method for counting people passing through a gate |
US8237099B2 (en) | 2007-06-15 | 2012-08-07 | Cognex Corporation | Method and system for optoelectronic detection and location of objects |
US8472672B2 (en) | 2007-07-03 | 2013-06-25 | Shoppertrak Rct Corporation | System and process for detecting, tracking and counting human objects of interest |
US11670086B2 (en) * | 2007-07-03 | 2023-06-06 | Shoppertrak Rct Llc | System and process for detecting, tracking and counting human objects of interest |
US20090010490A1 (en) * | 2007-07-03 | 2009-01-08 | Shoppertrak Rct Corporation | System and process for detecting, tracking and counting human objects of interest |
US7965866B2 (en) * | 2007-07-03 | 2011-06-21 | Shoppertrak Rct Corporation | System and process for detecting, tracking and counting human objects of interest |
US11232326B2 (en) * | 2007-07-03 | 2022-01-25 | Shoppertrak Rct Corporation | System and process for detecting, tracking and counting human objects of interest |
US20220148321A1 (en) * | 2007-07-03 | 2022-05-12 | Shoppertrak Rct Corporation | System and process for detecting, tracking and counting human objects of interest |
US10558890B2 (en) | 2007-07-03 | 2020-02-11 | Shoppertrak Rct Corporation | System and process for detecting, tracking and counting human objects of interest |
US8238607B2 (en) | 2007-07-03 | 2012-08-07 | Shoppertrak Rct Corporation | System and method for detecting, tracking and counting human objects of interest |
US9384407B2 (en) | 2007-07-03 | 2016-07-05 | Shoppertrak Rct Corporation | System and process for detecting, tracking and counting human objects of interest |
US8103085B1 (en) | 2007-09-25 | 2012-01-24 | Cognex Corporation | System and method for detecting flaws in objects using machine vision |
WO2011018078A1 (en) | 2009-08-11 | 2011-02-17 | Magnetic Autocontrol Gmbh | Installation for blocking passage by walking or driving, having a device for monitoring the passage area by walking or driving |
DE202009010858U1 (en) | 2009-08-11 | 2009-10-22 | Magnetic Autocontrol Gmbh | Passage or transit barrier with a device for monitoring the passage or passage area |
US20110169917A1 (en) * | 2010-01-11 | 2011-07-14 | Shoppertrak Rct Corporation | System And Process For Detecting, Tracking And Counting Human Objects of Interest |
US10909695B2 (en) | 2010-01-11 | 2021-02-02 | Shoppertrak Rct Corporation | System and process for detecting, tracking and counting human objects of interest |
US9355556B2 (en) | 2010-04-15 | 2016-05-31 | Iee International Electronics & Engineering S.A. | Configurable access control sensing device |
WO2011128408A1 (en) * | 2010-04-15 | 2011-10-20 | Iee International Electronics & Engineering S.A. | Configurable access control sensing device |
US20110291841A1 (en) * | 2010-05-27 | 2011-12-01 | Infrared Integrated Systems Limited | Monitoring hand hygiene |
US9000926B2 (en) * | 2010-05-27 | 2015-04-07 | Stephen Hollock | Monitoring hand hygiene |
EP2395451A1 (en) * | 2010-06-09 | 2011-12-14 | Iee International Electronics & Engineering S.A. | Configurable access control sensing device |
US8988188B2 (en) * | 2010-11-18 | 2015-03-24 | Hyundai Motor Company | System and method for managing entrance and exit using driver face identification within vehicle |
US20120126939A1 (en) * | 2010-11-18 | 2012-05-24 | Hyundai Motor Company | System and method for managing entrance and exit using driver face identification within vehicle |
US11025865B1 (en) * | 2011-06-17 | 2021-06-01 | Hrl Laboratories, Llc | Contextual visual dataspaces |
US9305363B2 (en) | 2011-09-23 | 2016-04-05 | Shoppertrak Rct Corporation | System and method for detecting, tracking and counting human objects of interest using a counting system and a data capture device |
US10936859B2 (en) | 2011-09-23 | 2021-03-02 | Sensormatic Electronics, LLC | Techniques for automatically identifying secondary objects in a stereo-optical counting system |
US10733427B2 (en) | 2011-09-23 | 2020-08-04 | Sensormatic Electronics, LLC | System and method for detecting, tracking, and counting human objects of interest using a counting system and a data capture device |
US10410048B2 (en) | 2011-09-23 | 2019-09-10 | Shoppertrak Rct Corporation | System and method for detecting, tracking and counting human objects of interest using a counting system and a data capture device |
US9734388B2 (en) | 2011-09-23 | 2017-08-15 | Shoppertrak Rct Corporation | System and method for detecting, tracking and counting human objects of interest using a counting system and a data capture device |
US12039803B2 (en) | 2011-09-23 | 2024-07-16 | Sensormatic Electronics, LLC | Techniques for automatically identifying secondary objects in a stereo-optical counting system |
US9177195B2 (en) | 2011-09-23 | 2015-11-03 | Shoppertrak Rct Corporation | System and method for detecting, tracking and counting human objects of interest using a counting system and a data capture device |
US20130314232A1 (en) * | 2012-05-23 | 2013-11-28 | Honeywell International Inc. | Tailgating detection |
US9142106B2 (en) * | 2012-05-23 | 2015-09-22 | Honeywell International, Inc. | Tailgating detection |
US20140063191A1 (en) * | 2012-08-27 | 2014-03-06 | Accenture Global Services Limited | Virtual access control |
EP2704107A3 (en) * | 2012-08-27 | 2017-08-23 | Accenture Global Services Limited | Virtual Access Control |
US10453278B2 (en) * | 2012-08-27 | 2019-10-22 | Accenture Global Services Limited | Virtual access control |
US9367733B2 (en) | 2012-11-21 | 2016-06-14 | Pelco, Inc. | Method and apparatus for detecting people by a surveillance system |
US10009579B2 (en) | 2012-11-21 | 2018-06-26 | Pelco, Inc. | Method and system for counting people using depth sensor |
US9639747B2 (en) * | 2013-03-15 | 2017-05-02 | Pelco, Inc. | Online learning method for people detection and counting for retail stores |
US20140270358A1 (en) * | 2013-03-15 | 2014-09-18 | Pelco, Inc. | Online Learning Method for People Detection and Counting for Retail Stores |
US10185965B2 (en) * | 2013-09-27 | 2019-01-22 | Panasonic Intellectual Property Management Co., Ltd. | Stay duration measurement method and system for measuring moving objects in a surveillance area |
US20150095107A1 (en) * | 2013-09-27 | 2015-04-02 | Panasonic Corporation | Stay duration measurement device, stay duration measurement system and stay duration measurement method |
US10127754B2 (en) | 2014-04-25 | 2018-11-13 | Vivint, Inc. | Identification-based barrier techniques |
US10657749B2 (en) | 2014-04-25 | 2020-05-19 | Vivint, Inc. | Automatic system access using facial recognition |
US10235822B2 (en) | 2014-04-25 | 2019-03-19 | Vivint, Inc. | Automatic system access using facial recognition |
US10274909B2 (en) | 2014-04-25 | 2019-04-30 | Vivint, Inc. | Managing barrier and occupancy based home automation system |
US10235854B2 (en) * | 2014-08-19 | 2019-03-19 | Sensormatic Electronics, LLC | Tailgating detection in frictionless access control system |
US10373408B2 (en) | 2014-08-19 | 2019-08-06 | Sensormatic Electronics, LLC | Method and system for access control proximity location |
US10158550B2 (en) | 2014-08-19 | 2018-12-18 | Sensormatic Electronics, LLC | Access control system with omni and directional antennas |
US20160284183A1 (en) * | 2014-08-19 | 2016-09-29 | Sensormatic Electronics, LLC | Tailgating Detection in Frictionless Access Control System |
US10209698B2 (en) | 2014-12-26 | 2019-02-19 | Industrial Technology Research Institute | Calibration method and automation machining apparatus using the same |
US10403066B2 (en) | 2015-05-20 | 2019-09-03 | Sensormatic Electronics, LLC | Portable device having directional BLE antenna |
US9947155B2 (en) | 2015-05-20 | 2018-04-17 | Sensormatic Electronics, LLC | Frictionless access system for public access point |
US9794446B2 (en) | 2016-03-11 | 2017-10-17 | Fuji Xerox Co., Ltd. | Information processing apparatus, information processing method, and non-transitory computer readable medium |
WO2017176876A1 (en) * | 2016-04-07 | 2017-10-12 | Vivint, Inc. | Identification-based barrier techniques |
US20170364743A1 (en) * | 2016-06-15 | 2017-12-21 | Google Inc. | Object rejection system and method |
US10402643B2 (en) * | 2016-06-15 | 2019-09-03 | Google Llc | Object rejection system and method |
TWI735588B (en) * | 2016-07-11 | 2021-08-11 | 日商迪思科股份有限公司 | Management system |
US11036972B2 (en) * | 2016-07-11 | 2021-06-15 | Disco Corporation | Management system for supervising operator |
DE102016119343A1 (en) * | 2016-10-11 | 2018-04-12 | Bircher Reglomat Ag | Object monitoring with infrared image recording and infrared pulse illumination |
DE102016119348A1 (en) * | 2016-10-11 | 2018-04-12 | Bircher Reglomat Ag | Stereometric object flow interaction |
US10386460B2 (en) | 2017-05-15 | 2019-08-20 | Otis Elevator Company | Self-calibrating sensor for elevator and automatic door systems |
US10221610B2 (en) | 2017-05-15 | 2019-03-05 | Otis Elevator Company | Depth sensor for automatic doors |
US10867161B2 (en) * | 2017-09-06 | 2020-12-15 | Pixart Imaging Inc. | Auxiliary filtering device for face recognition and starting method for electronic device |
US20190073521A1 (en) * | 2017-09-06 | 2019-03-07 | Pixart Imaging Inc. | Auxiliary filtering device for face recognition and starting method for electronic device |
DE102018104202A1 (en) * | 2018-02-23 | 2019-08-29 | Marantec Antriebs- Und Steuerungstechnik Gmbh & Co. Kg | Method for operating a gate system and gate system |
CN111599064A (en) * | 2020-05-29 | 2020-08-28 | 武汉虹信技术服务有限责任公司 | Bidirectional access control method, system, terminal and computer readable medium |
TWI720903B (en) * | 2020-06-03 | 2021-03-01 | 南開科技大學 | Home life safety warning system for living alone and method thereof |
TWI826784B (en) * | 2021-05-11 | 2023-12-21 | 大陸商星宸科技股份有限公司 | Object detection apparatus and method |
Also Published As
Publication number | Publication date |
---|---|
WO2003088157A1 (en) | 2003-10-23 |
CA2481250C (en) | 2011-09-27 |
US7382895B2 (en) | 2008-06-03 |
AU2003221893A1 (en) | 2003-10-27 |
US20040017929A1 (en) | 2004-01-29 |
CA2481250A1 (en) | 2003-10-23 |
EP1493130A1 (en) | 2005-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7382895B2 (en) | | Tailgating and reverse entry detection, alarm, recording and prevention using machine vision |
US8005267B2 (en) | | Intelligent vehicle access control system |
US20220228419A1 (en) | | Controlled access gate |
US20070268145A1 (en) | | Automated tailgating detection via fusion of video and access control |
US20060225352A1 (en) | | Method and device for pass-through control and/or singling-out of persons |
EP1430449B1 (en) | | Security method for gaining access, access verification device, elevator |
US8749344B2 (en) | | Exit lane monitoring system |
US7733043B2 (en) | | Revolving door control system |
US6720874B2 (en) | | Portal intrusion detection apparatus and method |
CN105518749A (en) | | System and method for controlling and monitoring access to restricted areas |
WO2002058404A1 (en) | | Access control method and apparatus for members and guests |
JP4747611B2 (en) | | Entrance/exit management device |
KR101492799B1 (en) | | Entrance control integrated video recording system and method thereof |
KR101964374B1 (en) | | Access control system and method |
JP2010204719A (en) | | Entrance management system, entrance management device, and entrance management method |
JP2010211514A (en) | | Intruder monitoring device |
US11587385B2 (en) | | Security revolving door assembly |
JP4493521B2 (en) | | Access control device |
JP3690361B2 (en) | | Outsider monitoring system |
JP2000295598A (en) | | Remote monitor system |
KR20230114650A (en) | | Integrity entrance management apparatus using image and method for unmanned space |
AU2003286990B2 (en) | | Intelligent vehicle access control system |
KR20020074900A (en) | | An exit and entry management system and methods |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |