WO2018037392A1 - System and method for access control in open restricted areas - Google Patents

System and method for access control in open restricted areas

Info

Publication number
WO2018037392A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
person
central control
validator
control unit
Prior art date
Application number
PCT/IB2017/055147
Other languages
English (en)
Inventor
Nuno Miguel FRADIQUE VIEIRA
Original Assignee
Outmind, Lda
Priority date
Filing date
Publication date
Application filed by Outmind, Lda filed Critical Outmind, Lda
Priority to EP17780518.1A (published as EP3535734A1)
Publication of WO2018037392A1

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 Individual registration on entry or exit
    • G07C9/20 Individual registration on entry or exit involving the use of a pass
    • G07C9/28 Individual registration on entry or exit involving the use of a pass the pass enabling tracking or indicating presence
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07B TICKET-ISSUING APPARATUS; FARE-REGISTERING APPARATUS; FRANKING APPARATUS
    • G07B15/00 Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points

Definitions

  • This application relates to the field of access control systems for open restricted areas.
  • Physical barrier access control methods, from now on referred to as physical barriers: the entry channel, or both the entry and the exit channels, are physically blocked with, for example, barriers, turnstiles, gates or doors; people can only enter or leave the restricted area after some kind of mandatory validation of their credentials at the entrance (ticket, card, biometric identification, etc.), which opens the physical barrier and allows the passage of one person at a time in each direction.
  • if the validation fails, the physical barrier blocks the person's access to the restricted area.
  • Open access control methods, also known as "proof-of-payment" in public transport systems, from now on referred to as open access: in this type of solution the entry and exit channels are open and have no physical barriers, so multiple persons can freely enter and exit the restricted areas simultaneously.
  • users must perform a voluntary validation of their credentials (ticket, card, biometric identification, etc.) before entering the restricted area. This validation can be done by a human operator, but is most commonly performed by an electronic validation system (a validator, reader or biometric sensor, for example).
  • when compared, physical barriers and open restricted areas are characterized by the following:
  • the physical barrier method ensures a much higher effectiveness of access control, preventing anyone without a valid ticket from entering the restricted area;
  • however, it requires more space to implement, is more expensive, constitutes an obstacle to emergency evacuation and has a lower flow rate of people at entry and at exit;
  • in station-confined transport systems with heavy passenger loads, such as subways, fare evasion issues are therefore usually tackled using physical barrier access control.
  • in state-of-the-art open access control systems [1], [2], passengers bring their tickets close to the validator device, which detects the ticket and checks whether it is valid.
  • state-of-the-art open access control systems use any form of machine-readable electronic ticket, such as magnetic stripe, proximity contactless smart card (RFID) or near field communication tickets.
  • the main drawback of open access control systems is their lower effectiveness, as they only control those people who make a voluntary validation at entry. Therefore, they do not detect users who bypass the validator device and enter the restricted area without a valid ticket or authorization. In fact, most of the technological efforts to tackle fare evasion in state-of-the-art open access control systems are directed towards the ticketing system itself, including tamper-proof systems and hard-to-fake tickets.
  • the present application intends to solve the problem of controlling access to open restricted areas, which have no physical barriers, so that multiple persons can freely enter and leave the restricted area simultaneously. It covers both people who voluntarily follow the rules that require validation of their credentials on a validator unit and those who do not follow those rules, i.e., who do not validate their credentials before entering the restricted area, thus ensuring that only authorized people are allowed inside the restricted areas.
  • the proposed system falls under the category of open access control systems, therefore respecting their basic principle of operation, which requires a prior validation step by the user before entering the area to which access is restricted.
  • the proposed system comprises a central control unit (CCU) having a special-purpose processor module, responsible for all processing tasks such as control and logical and mathematical computations, which handles people detection and validation-status management.
  • the system also comprises one or more sensor units, one or more validator units, an interface unit and at least one alarm unit.
  • the sensor units are responsible for capturing spatial data that, together with the execution of a real-time people detection and tracking algorithm in the CCU, can detect and track people in both the restricted area and the approach area where the validator units are located.
  • the spatial data collected must have enough detail and frame rate to differentiate people from the background, which, in moving vehicles, for example, is extremely subject to changes both in position (vibrations) and in light conditions (e.g., sudden sun incidence), making the technique of background subtraction a problematic approach.
  • said collected data must have a low enough data rate to allow processing in real-time with reasonable computer processing capacity.
  • the collected data could therefore be, for example:
  • a pseudo-3D image, also known as a range image (2D image + depth map or disparity map), obtained by a special type of sensor called a "range camera"; these can be based on several technologies, including stereoscopy, structured light, time-of-flight, etc.
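  • As a non-limiting illustration of how such range data could be exploited, the following Python sketch turns an overhead depth map into a height-above-floor map and keeps the pixels tall enough to belong to a person; the sensor height, the threshold and the function name are assumptions introduced for this sketch only, and a real system would follow this step with blob extraction and tracking.

```python
import numpy as np

def person_pixels(depth_map: np.ndarray,
                  sensor_height_m: float = 2.3,
                  min_person_height_m: float = 1.2) -> np.ndarray:
    """Convert an overhead range camera's depth map (metres to the nearest
    surface) into a height-above-floor map and keep the pixels tall enough
    to belong to a person. Sensor height and threshold are illustrative."""
    height_above_floor = sensor_height_m - depth_map
    return height_above_floor >= min_person_height_m  # boolean mask of candidate person pixels

# Example: a frame where every pixel is 1.0 m from the sensor (1.3 m above
# the floor) is classified entirely as person pixels.
mask = person_pixels(np.full((480, 640), 1.0))
```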
  • the validator units are provided with means for reading, processing and validating the person's credentials.
  • the interface unit allows the connection of any type of validator unit, being able to decode the validator's communication protocol and to convey the data to the CCU.
  • the alarm unit has the purpose of alerting whenever a non-authorized person has entered the restricted area.
  • the proposed system constantly monitors all validation events and all persons' positions.
  • the operation method of the proposed system is based on the assumption that, at the time of a given validation event, the person closest to the validator who has not yet performed a validation is the one who has executed the operation. This assumption being true, such person is then assigned a "VALID" status that indicates his or her right to access the restricted area.
  • the system operates by constantly keeping track of all persons' positions and all validation events and then matching every validation event with the person closest to the relevant validator, assigning that person a "VALID" identification tag, which certifies that from then on such person is clear to enter the restricted area and will not be considered in future validation-event matchings.
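  • The following Python sketch illustrates one possible reading of this matching rule; the Person and ValidationEvent structures and their fields are assumptions made for illustration and are not defined in the disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class Person:
    person_id: int
    position: tuple            # (x, y, z) in the sensor unit's frame of reference
    status: str = "NOT VALID"  # "NOT VALID", "UNCERTAIN" or "VALID"

@dataclass
class ValidationEvent:
    validator_id: str
    validator_position: tuple  # (x, y, z) of the validator unit, known from setup

def match_validation_event(event, people):
    """Assign a VALID status to the not-yet-validated person closest to the validator."""
    candidates = [p for p in people if p.status != "VALID"]
    if not candidates:
        return None
    nearest = min(candidates,
                  key=lambda p: math.dist(p.position, event.validator_position))
    nearest.status = "VALID"
    return nearest
```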
  • Figure 1 illustrates the conceptual operation of the system, divided into three moments in time (steps 1, 2 and 3); in each of these steps, reference numbers represent:
  • CCU Central control unit
  • FIG. 2 shows a conceptual illustration of the system disclosed, in which reference numbers represent:
  • CCU Central control unit
  • FIG. 3 illustrates a conceptual flowchart of the method for access control in open restricted areas using the system disclosed in each of the three steps identified in figure 1, in which reference signs represent:
  • I - process action of the central control unit: looking up the detected person in the person database and checking his/her authorization status
  • Figure 4 shows a conceptual illustration of the system implemented onboard a public transport bus, in which reference numbers represent:
  • Figure 5 shows another conceptual illustration of the system implemented onboard a public transport bus in which reference numbers represent:
  • the proposed system falls under the category of open access control systems to a restricted area (9), in which the entry channel is open, having no physical barriers, so that multiple people can freely enter and exit the restricted area (9) simultaneously. It therefore respects their basic principle of operation, which requires that when a given person approaches (8) the restricted area (step 1), he or she must perform a voluntary validation (step 2) of the access credential on a validator unit (3) before entering (step 3) the open restricted area (9).
  • the approach now followed focuses its operation on the persons throughout the three steps, detecting those who, by not voluntarily performing step 2, wrongly enter the open restricted zone (9) without authorization.
  • the method of operation of the access control system disclosed consists in constantly monitoring all persons' positions and all validation events.
  • whenever a validation event occurs, the central control unit decides that it should be allocated to the detected person holding a "NOT VALID" or "UNCERTAIN" validation status whose location, obtained by the people detection and tracking algorithm, is nearest to the validator unit (3) coordinates, and assigns a "VALID" validation status to that person. If two or more persons are located near the validator unit at a similar distance, an "UNCERTAIN" status is assigned to them.
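  • Building on the structures assumed in the previous sketch, the fragment below illustrates how this tie-handling branch could look; the 0.3 m tolerance used to decide what counts as a "similar distance" is an illustrative assumption, since the disclosure does not specify a value.

```python
import math

def match_with_uncertainty(event, people, tolerance_m=0.3):
    """Assign VALID to the single nearest unvalidated person, or mark all
    candidates within `tolerance_m` of the nearest as UNCERTAIN when the
    event cannot be attributed to one person unambiguously."""
    candidates = [p for p in people if p.status != "VALID"]
    if not candidates:
        return []
    dist = {p.person_id: math.dist(p.position, event.validator_position) for p in candidates}
    best = min(dist.values())
    near = [p for p in candidates if dist[p.person_id] - best <= tolerance_m]
    if len(near) == 1:
        near[0].status = "VALID"
    else:
        for p in near:
            p.status = "UNCERTAIN"
    return near
```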
  • if a person (7) does not perform the validation operation in step 2, he or she will keep the "NOT VALID" validation status initially assigned when such person was detected for the first time in step 1. Since persons' positions are continuously monitored by the system through the people detection and tracking algorithm, based on the data gathered by the sensor unit, whenever that person enters the restricted area (9) in step 3, the "NOT VALID" validation status will cause the alarm (5) to be triggered.
  • the system for access control in restricted areas shown comprises a main central control unit (1), one or more sensor units (2), one or more validator units (3), an interface unit (4) and one or more alarm units (5). All units are connected by means of high-speed transmission links. The necessary bandwidth will depend on the type and resolution of the sensors used and on the frame capture rate.
  • the central control unit (1) comprises a processing module with adequate computational capacity and is responsible for managing the operation of the system, implementing a finite state machine that controls all the other technical units integrated with it, and performing, in a separate process that runs concurrently, the intensive mathematical computations needed to detect people, in real time, from the spatial data gathered by the sensors.
  • the central control unit (1) also has the ability to temporarily store data in a person's record regarding each detected person and to perform the association of each validation event with the detected person who executed it at the validator unit (3), by assigning a positive validation status to that person's record.
  • the central control unit (1) triggers the alarm unit (5) whenever a person with a "NOT VALID" validation status is detected inside the restricted area.
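  • A minimal sketch of this monitoring rule follows, assuming a simplified axis-aligned boundary for the restricted area and an externally supplied trigger_alarm callback; both are illustrative assumptions, not elements specified in the disclosure.

```python
def inside(position, area):
    """`area` is ((xmin, xmax), (ymin, ymax)) in the sensor frame, a simplified
    axis-aligned boundary used only for illustration."""
    (xmin, xmax), (ymin, ymax) = area
    return xmin <= position[0] <= xmax and ymin <= position[1] <= ymax

def check_alarms(people, restricted_area, trigger_alarm):
    """Trigger the alarm unit for every tracked person found inside the
    restricted area while still carrying a NOT VALID status."""
    for person in people:
        if person.status == "NOT VALID" and inside(person.position, restricted_area):
            trigger_alarm(person.person_id)
```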
  • the position of the sensor units (2) will depend on the area being covered and on other factors, such as the lenses' field of view whenever optical sensors are used, but they will generally be located above the approach and restricted areas in order to be able to properly cover said areas.
  • the sensors gather spatial data of the approach and restricted areas with an equilibrium of i) a high enough detail and frame-rate, to allow an accurate representation of space; and ii) a low enough data rate, which is necessary for the operation of the real time people detection and tracking algorithms running in the CCU (1) with reasonable computer processing capacity.
  • said sensors can be implemented through 2D cameras, stereoscopic, structured-light, time-of-flight or other range cameras, thermal or ultrasound technologies, etc. It should be noted that the aforementioned list is merely illustrative and does not intend to limit the scope of sensor technologies that can be used for this purpose.
  • one or more validator units (3) are provided with means for reading, processing and validating the person's credentials and transmitting the validation event data to the central control unit (1) through the interface unit (4).
  • the validator unit (3) may use any form of machine-readable electronic validation system, such as, for example, magnetic stripe, proximity contactless smart card (RFID), near field communication systems or biometric readers.
  • the interface unit (4) is responsible for handling and processing the low-level communication protocols between the validator units (3) and the central control unit (1). It is a general-purpose interface, specifically designed to be compatible with all types of validator units (3), and can be configured, at software and hardware levels, as needed for each particular system installation.
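  • By way of illustration only, the sketch below shows the kind of translation the interface unit performs in software: a raw, validator-specific frame is decoded into a normalized event for the central control unit. The frame layout and field names are invented for this example; each real validator has its own proprietary protocol, for which the unit would be configured.

```python
import struct

def decode_validator_frame(raw: bytes) -> dict:
    """Decode a hypothetical raw frame from a validator into a normalized event
    for the CCU. The assumed layout is: 1-byte validator id, 1-byte result
    code, 4-byte ticket number, big-endian."""
    validator_id, result, ticket_no = struct.unpack(">BBI", raw[:6])
    return {"validator_id": validator_id,
            "valid": result == 0x01,
            "ticket": ticket_no}

# Example frame: validator 2 reports a successful validation of ticket 0x12345678.
event = decode_validator_frame(bytes([0x02, 0x01, 0x12, 0x34, 0x56, 0x78]))
```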
  • the alarm unit (5) is controlled by the central control unit (1) and emits a visual alert, an audio alert, a signal alert or other types of alert whenever the central control unit (1) detects a person (7) carrying a "NOT VALID" validation status inside the restricted area.
  • the alarm unit (5) can consist, for example, of a luminous alarm, an audible alarm, a signal transmitted alarm or any other type of external or internal (computer-generated) alarm.
  • the alarm includes an external red luminous portal and an internally-stored multimedia file.
  • the sound (or other multimedia content) is output through the dedicated connectors of the central control unit (1) and reproduced through optional speakers/monitors.
  • the file contents can include both buzzer/horn sounds and a recorded voice message. It should be noted that the aforementioned list of alarm types is merely illustrative and does not intend to limit the scope of alarm technologies that can be used to this purpose.
  • the method comprises two different phases: a setup phase and an operation phase.
  • the setup phase is performed once at every location before the system can operate. It consists of:
  • a sensor unit calibration process, where the variable sensor parameters (those which are sensor-specific and, for manufacturing reasons, have slight variations from one unit to the next) are measured and registered in a calibration file so that the central control unit can, from then on, compensate for errors caused by those variations; for example, when optical cameras are used, lens distortion must be measured and stored by means of a checkered target placed in various positions;
  • the registration of each validator unit's location (x, y, z coordinates) in the sensor unit's frame of reference, to enable the central control unit to calculate the distance between the validator unit and each detected person;
  • the definition of the approach area (e.g., a vehicle entrance door zone) in the sensor unit's frame of reference (x, y, z coordinates), where new persons can be detected by the system for the first time. Any person who is detected for the first time outside the approach area is disregarded by the CCU in order to prevent erroneous detections caused by noise or by people coming back from inside the restricted area.
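  • The sketch below shows one possible way of representing this setup data and of discarding first detections outside the approach area; the coordinate values, file name and rectangular area shapes are illustrative assumptions, not values from the disclosure.

```python
SETUP = {
    # Illustrative setup data gathered during the setup phase.
    "calibration_file": "sensor_calibration.json",  # per-sensor parameters measured at setup
    "validators": {"X": (0.4, 1.1, 1.0)},           # validator coordinates in the sensor frame
    "approach_area": ((-1.0, 1.0), (0.0, 1.5)),     # ((xmin, xmax), (ymin, ymax))
    "restricted_area": ((-1.0, 1.0), (1.5, 6.0)),
}

def in_approach_area(position, setup=SETUP):
    """Persons first detected outside the approach area are disregarded by the CCU."""
    (xmin, xmax), (ymin, ymax) = setup["approach_area"]
    return xmin <= position[0] <= xmax and ymin <= position[1] <= ymax
```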
  • 2D or 3D data is permanently gathered by the sensor unit (B, 2) to allow a representation of the space, in order to detect and determine the location of every person in the approach and restricted areas. Said data is transmitted to the central control unit in order to create a model of the analyzed space in real time.
  • the gathered data is then mathematically processed by the processing module of the central control unit (C,l), using adequate algorithms to perform real time detection and tracking of people in space and time.
  • C,1 Central control unit
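  • The disclosure does not prescribe a particular detection and tracking algorithm; as one possible illustration of the association step, the following sketch greedily matches current-frame detections to existing tracks by nearest neighbour, with an assumed 0.5 m gating distance.

```python
import math

def associate_detections(tracks, detections, max_jump_m=0.5):
    """Greedy nearest-neighbour association of current-frame detections to
    existing tracks. `tracks` maps a track id to the last known (x, y, z)
    position; `detections` is a list of (x, y, z) centroids from the new frame."""
    assignments = {}                          # track id -> updated position
    unmatched = list(range(len(detections)))
    for track_id, last_pos in tracks.items():
        if not unmatched:
            break
        j = min(unmatched, key=lambda i: math.dist(detections[i], last_pos))
        if math.dist(detections[j], last_pos) <= max_jump_m:
            assignments[track_id] = detections[j]
            unmatched.remove(j)
    new_person_candidates = [detections[i] for i in unmatched]
    return assignments, new_person_candidates
```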
  • if the person validates his or her access authorization (F) on a given validator unit "X", the validator unit reads, processes and validates the person's authorization (G).
  • the interface unit receives and decodes (H) the signal transmitted by the validator "X”, which is then conveyed to the central control unit.
  • data continuously gathered by the sensor unit (B) is mathematically processed by the processing module of the central control unit (C) in order to detect, in real time, each person's position in time and space.
  • the central control unit looks up the detected persons in the database and checks the authorization status of each (I).
  • statuses marked as "UNCERTAIN" may remain as such until the person enters the restricted area, or be upgraded to "VALID" if certain future conditions are met; for example, a validation event received while there are no candidates to have performed that operation other than a person marked as "UNCERTAIN" means that that person's status should have been "NOT VALID" before and will from then on be "VALID" with 100% certainty.
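  • A minimal sketch of this retroactive upgrade rule, reusing the Person structure assumed in the earlier sketches, could look as follows.

```python
def resolve_uncertain(people):
    """Apply the retroactive rule above: if a new validation event arrives and
    the only person who could have performed it is one marked UNCERTAIN, that
    person's earlier status must in fact have been NOT VALID, so the event
    upgrades him or her to VALID."""
    candidates = [p for p in people if p.status != "VALID"]
    if len(candidates) == 1 and candidates[0].status == "UNCERTAIN":
        candidates[0].status = "VALID"
        return candidates[0]
    return None
```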
  • the data continuously gathered by the sensor unit (B) is mathematically processed by the processing module of the central control unit (C) in order to detect, in real time, each person's position in time and space.
  • the central control unit looks up the detected person's record in the database and checks its authorization status (I).
  • Figures 4 and 5 intend to illustrate the application of the system and method described in a public transport bus, as a non-limiting implementation scenario.
  • the preferred embodiment of the system proposed under these conditions requires a compact-size central control unit (1) specially designed for onboard use.
  • the sensor unit (2) must not be affected by sunlight and must have a wide field-of-view angle.
  • one possibility is to use a stereoscopic optical sensor with megapixel-class cameras, which allows a 3D representation of the space, but other sensor approaches can be used, depending on the type of real-time people detection and tracking algorithm to be used. All components must be resistant to vibrations and protected from electrical noise through a DC/DC regulated power supply (10).
  • a manual control unit (11) can also be used to allow human intervention - for example, in a bus, the driver must be able to manually override the automatic operation of the system to cope with exceptional or unexpected situations, such as an on-board manual ticket sale, an inspection team boarding the bus, a system malfunction, etc.
  • the manual control unit includes one or more buttons/switches that allow a human to perform one or more pre-defined functions, which are transmitted to the central control unit, overriding the automatic operation of the system. These can be configured differently on each installation.
  • the buttons/switches are binary (on/off) and so is the transmitted signal, which is also handled by the I/O interface.
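  • As an illustration of how such binary override signals could be folded into the central control unit's state machine, the following sketch keeps a simple override state; the switch names and their semantics are assumptions made for this example.

```python
class ManualOverride:
    """Keeps the state of the driver's binary switches, as read through the I/O
    interface, and tells the CCU whether automatic operation is suspended."""

    def __init__(self):
        self.switches = {"manual_sale": False, "inspection": False, "malfunction": False}

    def set_switch(self, name, state):
        # Called by the I/O interface whenever a binary (on/off) signal changes.
        self.switches[name] = bool(state)

    def automatic_operation_enabled(self):
        # Any active switch overrides the automatic operation of the system.
        return not any(self.switches.values())
```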

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Alarm Systems (AREA)

Abstract

The present invention aims to solve the problem of controlling access to open restricted areas (that is, areas with no physical means of blocking access, where persons must voluntarily validate their credentials before entering), to enforce the control both of persons who validate their credentials at entry and of those who improperly do not, and thus to guarantee that only authorized persons are allowed inside these areas. The system proposed by the invention and the respective method detect the location of each person in the restricted areas and in the approach areas to said areas, in order to identify those who have performed the voluntary validation of their credentials and those who have not and are therefore in an irregular situation. The proposed system can be implemented in any type of open restricted area, being particularly suited to controlling passenger access in public transport systems having open restricted areas, either on board the vehicles themselves or in stations.
PCT/IB2017/055147 2016-08-26 2017-08-28 Système et procédé de contrôle d'accès dans des zones réglementées ouvertes WO2018037392A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP17780518.1A EP3535734A1 (fr) 2016-08-26 2017-08-28 Système et procédé de contrôle d'accès dans des zones réglementées ouvertes

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
PT109597 2016-08-26
PT10959716 2016-08-26

Publications (1)

Publication Number Publication Date
WO2018037392A1 true WO2018037392A1 (fr) 2018-03-01

Family

ID=60022130

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2017/055147 WO2018037392A1 (fr) 2016-08-26 2017-08-28 Système et procédé de contrôle d'accès dans des zones réglementées ouvertes

Country Status (2)

Country Link
EP (1) EP3535734A1 (fr)
WO (1) WO2018037392A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040105006A1 (en) * 2002-12-03 2004-06-03 Lazo Philip A. Event driven video tracking system
DE102009000006A1 (de) * 2009-01-02 2010-07-08 Robert Bosch Gmbh Kontrollvorrichtung, Verfahren zur Kontrolle eines Objektes in einem Überwachungsbereich und Computerprogramm
EP2270761A1 (fr) * 2009-07-01 2011-01-05 Thales Architecture de système et procédé de suivi des individus dans des environnements de foules
US20140015978A1 (en) * 2012-07-16 2014-01-16 Cubic Corporation Barrierless gate
EP2704107A2 (fr) * 2012-08-27 2014-03-05 Accenture Global Services Limited Commande d'accès virtuel
FR3000266A1 (fr) * 2012-12-26 2014-06-27 Thales Sa Procede de lutte contre la fraude, et systeme correspondant

Non-Patent Citations (12)

* Cited by examiner, † Cited by third party
Title
A. J. LIPTON; H. FUJIYOSHI; R. S. PATIL: "Moving target classification and tracking from real-time video", PROCEEDINGS OF THE DARPA IMAGE UNDERSTANDING WORKSHOP (IUW'98), November 1998 (1998-11-01), pages 129 - 136
A. M. BAUMBERG: "PhD thesis, School of Computer Studies", October 1995, UNIVERSITY OF LEEDS, article "Learning Deformable Models for Tracking Human Motion"
C. WREN; A. AZARBAYEJANI; T. DARRELL; A. PENTLAND: "Pfinder: Real-time tracking of the human body", TECH. REP. 353, MIT MEDIA LABORATORY PERCEPTUAL COMPUTING SECTION, 1995
D. M. GAVRILA; L. S. DAVIS: "ARPA Image Understanding Workshop", February 1996, PALM SPRINGS, article "Tracking of humans in action: A 3-D model-based approach", pages: 737 - 746
F. BRÉMOND; M. THONNAT: "Tracking multiple non-rigid objects in a cluttered scene", PROCEEDINGS OF THE 10TH SCANDINAVIAN CONFERENCE ON IMAGE ANALYSIS (SCIA '97), vol. 2, 1997, pages 643 - 650
H. SIDENBLADH; M. J. BLACK; D. J. FLEET: "ECCV 2000, 6th European Conference on Computer Vision", 2000, SPRINGER VERLAG, article "Stochastic tracking of 3D human figures using 2D image motion", pages: 702 - 718
I. HARITAOGLU; D. HARWOOD; L. S. DAVIS: "W4 : Real-time surveillance of people and their actions", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. 22, August 2000 (2000-08-01), pages 809 - 830, XP000976488, DOI: doi:10.1109/34.868683
KEITH E. MAYES, TRANSPORT TICKETING SECURITY AND FRAUD CONTROLS
MARC SEL, THE SECURITY OF MASS TRANSPORT TICKETING SYSTEM
Q. CAI; A. MITICHE; J. K. AGGARWAL: "Tracking human motion in an indoor environment", PROCEEDINGS OF THE 2ND INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP'95, 1995, pages 215 - 218, XP010196798, DOI: doi:10.1109/ICIP.1995.529584
S. KHAN; O. JAVED; Z. RASHEED; M. SHAH: "Human tracking in multiple cameras", PROCEEDINGS OF THE 8TH IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2001), 9 July 2001 (2001-07-09), pages 331 - 336, XP010554001
TRANSPORT SYSTEM WITH ARTIFICIAL INTELLIGENCE FOR SAFETY AND FARE EVASION, Retrieved from the Internet <URL:https://ec.europa.eu/easme/en/sme/5840/transport-system-artificial-intelligence-safety-and-fare-evasion>

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019014440A1 (fr) * 2017-07-12 2019-01-17 Cubic Corporation Système de validation de ticket suivi et de rétroaction
US10497193B2 (en) 2017-07-12 2019-12-03 Cubic Corporation Tracked ticket validation and feedback system
US10600264B2 (en) 2017-07-12 2020-03-24 Cubic Corporation Tracked ticket validation and feedback system
CN110378179A (zh) * 2018-05-02 2019-10-25 上海大学 基于红外热成像的地铁逃票行为检测方法及系统
CN110378179B (zh) * 2018-05-02 2023-07-18 上海大学 基于红外热成像的地铁逃票行为检测方法及系统
US11423452B2 (en) 2018-10-02 2022-08-23 Capital One Services, Llc Systems and methods for establishing identity for order pick up
US10733645B2 (en) 2018-10-02 2020-08-04 Capital One Services, Llc Systems and methods for establishing identity for order pick up
WO2020072583A1 (fr) * 2018-10-02 2020-04-09 Capital One Services, Llc Systèmes et procédés d'établissement d'identité pour retrait de commande
US12106341B2 (en) 2018-10-02 2024-10-01 Capital One Services, Llc Systems and methods for establishing identity for order pick up
IT202000022108A1 (it) * 2020-09-22 2020-12-22 Improve Public Mobility S R L Procedimento e sistema di controllo del flusso di persone
WO2022064534A1 (fr) * 2020-09-22 2022-03-31 Improve Public Mobility S.R.L. Procédé et système de contrôle d'un flux de personnes
CN113658325A (zh) * 2021-08-05 2021-11-16 郑州轻工业大学 数字孪生环境下的生产线不确定对象智能识别与预警方法
CN113658325B (zh) * 2021-08-05 2022-11-11 郑州轻工业大学 数字孪生环境下的生产线不确定对象智能识别与预警方法

Also Published As

Publication number Publication date
EP3535734A1 (fr) 2019-09-11

Similar Documents

Publication Publication Date Title
EP3535734A1 (fr) System and method for access control in open restricted areas
US10650650B2 (en) Parcel theft deterrence for A/V recording and communication devices
US10848719B2 (en) System and method for gate monitoring during departure or arrival of an autonomous vehicle
US8804997B2 (en) Apparatus and methods for video alarm verification
US10878249B2 (en) Border inspection with aerial cameras
US9679425B2 (en) Control and monitoring system and method for access to a restricted area
US11096022B2 (en) Tailgating detection
CN101356108A (zh) Video-aided system for elevator control
CN108053525B (zh) Control method based on image processing
US20150077550A1 (en) Sensor and data fusion
Garibotto et al. White paper on industrial applications of computer vision and pattern recognition
US11349707B1 (en) Implementing security system devices as network nodes
KR20190078688A (ko) Parking recognition device using artificial intelligence
KR101370982B1 (ko) Movement detection device using wireless tags, security device equipped with the same, and movement detection method using the same
US20190304220A1 (en) Systems and methods for monitoring and controlling access to a secured area
US11525937B2 (en) Registration system
KR20220031258A (ko) Method for providing an active security control service based on learning data corresponding to counseling events
Yoon et al. Tracking model for abnormal behavior from multiple network CCTV using the Kalman Filter
Cheh et al. Leveraging physical access logs to identify tailgating: Limitations and solutions
Alhelali et al. Vision-Based Smart Parking Detection System Using Object Tracking
Rayte et al. Crime monitoring and controlling system by mobile device
MBONYUMUVUNYI Contribution of Smart Intelligent Video surveillance solutions for public safety in Kigali City: Case study of Rwanda National Police
KR20220031266A (ko) Device for providing an active security control service using machine learning of surveillance information from monitoring equipment and security control counseling information
KR20220031327A (ko) Program recording medium for building learning data for active security control counseling responses
KR20200063293A (ko) Object tracking system and method using a vehicle black box

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 17780518

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017780518

Country of ref document: EP

Effective date: 20190326