WO2018072520A1 - Security check system and method for configuring a security check device - Google Patents

Security check system and method for configuring a security check device

Info

Publication number
WO2018072520A1
WO2018072520A1 PCT/CN2017/095118 CN2017095118W WO2018072520A1 WO 2018072520 A1 WO2018072520 A1 WO 2018072520A1 CN 2017095118 W CN2017095118 W CN 2017095118W WO 2018072520 A1 WO2018072520 A1 WO 2018072520A1
Authority
WO
WIPO (PCT)
Prior art keywords
security
examinee
safety factor
identity
user
Prior art date
Application number
PCT/CN2017/095118
Other languages
English (en)
French (fr)
Inventor
陈志强
赵自然
吴万龙
金颖康
丁先利
隆姣
沈宗俊
李峥
Original Assignee
同方威视技术股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 同方威视技术股份有限公司 filed Critical 同方威视技术股份有限公司
Priority to JP2019507210A priority Critical patent/JP7178343B2/ja
Priority to US16/321,251 priority patent/US11631152B2/en
Priority to EP17861967.2A priority patent/EP3480776A4/en
Publication of WO2018072520A1 publication Critical patent/WO2018072520A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • G06Q50/265Personal security, identity or safety
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28Databases characterised by their database models, e.g. relational or object models
    • G06F16/284Relational databases
    • G06F16/285Clustering or classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/245Classification techniques relating to the decision surface
    • G06F18/2451Classification techniques relating to the decision surface linear, e.g. hyperplane
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/46Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • G06V40/169Holistic features and representations, i.e. based on the facial image taken as a whole
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/44Event detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/05Recognition of patterns representing particular kinds of hidden objects, e.g. weapons, explosives, drugs

Definitions

  • the invention relates to the technical field of security inspection, in particular to a security inspection system capable of realizing a customized security inspection strategy based on big data and a method for configuring a security inspection device.
  • The human-body security inspection device scans the human body in a non-contact manner and can detect objects hidden under clothing, detecting not only metallic substances but also non-metallic substances carried on the body, such as ceramics, plastics, powders, liquids, colloids, etc.
  • Because the inspection is contactless and the detection result does not show any physical details of the body, the personal privacy of the inspected person can be fully protected.
  • This kind of detection is also more efficient and can work continuously without interruption, far exceeding the efficiency of traditional manual security checks.
  • This kind of equipment can adopt automatic interpretation method, which can accurately locate the position of suspicious objects, effectively reduce the influence of human factors and reduce the labor intensity of security personnel.
  • Human security equipment is widely used in the security inspection field because of its unique advantages.
  • However, existing equipment is relatively simplistic in judging the potential danger of users: a concealed-object recognition algorithm is simply applied to the scanned images to determine the possible threat posed by the inspected person.
  • Moreover, the concealed-object recognition algorithms in current human-body security screening equipment use a common algorithm framework that does not take into account the identity of the user or the differences between groups of people; all inspected persons are analyzed with the same type of algorithm framework, built on the assumption that every user carries dangerous goods with the same probability. As a result, the recognition algorithms occasionally produce false negatives and false positives, which degrades the user experience.
  • an object of the present disclosure is at least in part to provide a security system capable of customizing security policies to different users based on user data and a method of configuring security devices.
  • According to one aspect of the present disclosure, a security check system is provided, including: an identity information entry device for entering an identity of an examinee; a parameter determination device for determining parameters for performing a security check on the examinee based on a safety factor of the examinee determined from user data corresponding to the identity of the examinee; and a security check device for performing the security check on the examinee based on the determined parameters.
  • The security check system may further include a safety factor determination device configured to acquire user data related to the examinee based on the identity of the examinee and to determine the examinee's safety factor from the acquired user data.
  • For example, the safety factor determination device may determine the safety factor of the examinee by substituting the examinee's user data into a relationship model between user data and safety factors, in which the user data are classified into several classes and each class is assigned a different safety factor (a lookup of this kind is sketched below).
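  • As a minimal illustration of how such a relationship-model lookup could work (the names, the feature encoding and the default value for missing data are illustrative assumptions, not taken from the disclosure): the user data are assumed to be encoded as a numeric feature vector, and the class centers with their per-class safety factors are assumed to have been learned offline.

    ```python
    import numpy as np

    def assign_safety_factor(x, class_centers, class_safety_factors, default=0.3):
        """Look up the safety factor for one examinee's encoded user data.

        x                    : 1-D numeric feature vector for the examinee
        class_centers        : (K, d) array of class centers C_0 .. C_{K-1}
        class_safety_factors : length-K sequence, one safety factor per class
        default              : deliberately low value used when data are missing
        """
        if x is None or np.isnan(x).any():
            # Missing or incomplete user data -> assign a low safety factor,
            # i.e. strengthen the inspection of this examinee.
            return default
        distances = np.linalg.norm(class_centers - x, axis=1)  # Euclidean distance to each center
        k = int(np.argmin(distances))                          # nearest class
        return class_safety_factors[k]
    ```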
  • the identity information entry device may also be used to obtain a registered photo of the examinee.
  • the security system may further include: a video device for capturing an image of the examinee in real time.
  • The security check system may further include an identity verification device for extracting a face image of the examinee from the images captured by the video device and verifying the identity of the examinee by comparing the extracted face image with the registered photo and/or with face images in an untrusted-person database.
  • the security check system may further include: an abnormal behavior determining device configured to determine whether the subject has an abnormal behavior according to the image of the examinee photographed by the video device.
  • the user data may include one or more of personal data, credit, social relationships, and historical behavior of the examinee.
  • the parameter determination device may determine a parameter applied in the concealment recognition algorithm employed by the security device based on the safety factor of the examinee.
  • The security check system may further include a display device for displaying the corresponding security level of the examinee during the security check.
  • According to another aspect of the present disclosure, a method for configuring the security check device in the security check system described above includes: obtaining the identity of the examinee through the identity information entry device; acquiring user data related to the examinee based on that identity and determining the examinee's safety factor from the acquired user data; determining, by the parameter determination device and based on the safety factor, the parameters for performing the security check on the examinee; and configuring the security check device based on the determined parameters (see the sketch after the parameter bullets below).
  • the parameters may include parameters applied in a concealment recognition algorithm employed by the security device.
  • the parameters may include one or more of a type of classifier, a parameter of a classifier, and an alarm threshold.
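  • The configuration method just described can be summarized in a short orchestration sketch; all of the objects below (identity entry, data sources, factor model, parameter policy, device) are assumed interfaces used for illustration only, not APIs defined by the disclosure.

    ```python
    def configure_security_device(identity_entry, data_sources, factor_model,
                                  param_policy, security_device):
        """Identity -> user data -> safety factor -> parameters -> configured device."""
        identity = identity_entry.read_identity()               # e.g. ID-card or passport number
        user_data = data_sources.fetch(identity)                # personal data, credit, history, ...
        safety_factor = factor_model.safety_factor(user_data)   # e.g. the clustering-based model
        params = param_policy.parameters_for(safety_factor)     # classifier type, penalty, threshold
        security_device.configure(params)                       # applied before scanning the examinee
        return params
    ```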
  • Through analytical means such as data mining, high-value information that characterizes user behavior can be extracted from massive data of low value density. A model of each user's personal safety factor can be built on these data and combined with the parameter settings of the security check equipment, so that a tailored security check strategy can be applied to different users, making the security check more humane and personalized and achieving the goal of precise security inspection.
  • FIG. 1 is a schematic view showing a security check system according to an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram showing various user data
  • FIG. 3 is a schematic diagram showing an operational flow of a security check system according to an embodiment of the present disclosure
  • FIG. 4 is a schematic diagram showing an example clustering of user data.
  • the techniques of this disclosure may be implemented in the form of hardware and/or software (including firmware, microcode, etc.). Additionally, the techniques of this disclosure may take the form of a computer program product on a computer readable medium storing instructions for use by or in connection with an instruction execution system.
  • a computer readable medium can be any medium that can contain, store, communicate, propagate or transport the instructions.
  • a computer readable medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium.
  • the computer readable medium include: a magnetic storage device such as a magnetic tape or a hard disk (HDD); an optical storage device such as a compact disk (CD-ROM); a memory such as a random access memory (RAM) or a flash memory; and/or a wired /Wireless communication link.
  • FIG. 1 is a schematic diagram showing a security check system in accordance with an embodiment of the present disclosure.
  • the security system 100 includes security devices (103, 105) disposed on the security channel 101.
  • the security channel 101 can be located at any entrance to a location that requires secure access, such as an airport, station, stadium, museum, and the like.
  • the security channel 101 can be an actual channel surrounded by a barrier such as a fence or a virtual channel defined by, for example, a video device.
  • the subject can travel within the surveillance range of the video device, which can be considered a "channel.”
  • The examinee passes through the security check channel 101, for example in the direction indicated by the arrow in the figure, while undergoing the security check performed by the security check device. If the check is passed, the examinee can be allowed to enter the place to be visited.
  • the security device can include a scanning device 103 and a control device 105.
  • the scanning device 103 can be non-contact.
  • the scanning device 103 can illuminate the subject by rays (for example, millimeter waves) under the control of the control device 105, and collect rays from the subject (for example, by scattering).
  • The data received by the scanning device 103 can be sent to the control device 105 via a wired link (e.g., a cable) or a wireless link (e.g., WiFi), where a (body-surface) image of the examinee can be reconstructed by a reconstruction algorithm.
  • The control device 105 can also identify items that the examinee may have hidden inside worn clothing by means of a concealed-object recognition algorithm.
  • the imaging results and the recognition results can be output by an output device such as a display, as will be further described below.
  • Control device 105 can include various computing devices such as computers, servers, and the like.
  • A concealed-object recognition algorithm for human-body scan images may include two parts, classifier training and online recognition, both of which operate on partition images.
  • The scanned image of the human body can first be divided into several regions, such as the arms, the trunk and the legs, according to the positions of key points of the body.
  • In one example, classifier training can be performed as follows: 1) build positive and negative sample databases for each partition image; 2) feature extraction: describe each partition image with dense scale-invariant feature transform (SIFT) features; 3) train an over-complete dictionary of the dense SIFT features using the sparse-coding principle; 4) project the feature descriptors from step 2) onto the dictionary to obtain coding vectors; 5) train a support vector machine (SVM) on the coding vectors to obtain the classifier.
  • Online recognition can be performed as follows: 1) extract the dense SIFT features of the partition image to be recognized; 2) compute the projection vectors of those features in the dictionary; 3) feed the projection vectors into the trained SVM classifier for classification (a sketch of this pipeline follows below).
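  • A minimal sketch of this training/recognition pipeline, assuming OpenCV and scikit-learn are available; the grid step, dictionary size, max-pooling of the coding vectors and the SVM settings are illustrative choices that the disclosure does not specify.

    ```python
    import cv2
    import numpy as np
    from sklearn.decomposition import DictionaryLearning
    from sklearn.svm import SVC

    def dense_sift(gray, step=8, size=8):
        """Describe one partition image with dense SIFT features on a regular grid."""
        sift = cv2.SIFT_create()
        kps = [cv2.KeyPoint(float(x), float(y), size)
               for y in range(step, gray.shape[0] - step, step)
               for x in range(step, gray.shape[1] - step, step)]
        _, desc = sift.compute(gray, kps)
        return desc                                   # (n_keypoints, 128)

    def train_partition_classifier(images, labels, n_atoms=256):
        """Train one classifier for one body partition from positive/negative samples."""
        feats = [dense_sift(img) for img in images]
        dico = DictionaryLearning(n_components=n_atoms,           # over-complete dictionary
                                  transform_algorithm='lasso_lars')
        dico.fit(np.vstack(feats))                                 # sparse-coding dictionary
        codes = np.array([dico.transform(f).max(axis=0) for f in feats])  # pooled projections
        clf = SVC(kernel='linear', C=1.0).fit(codes, labels)
        return dico, clf

    def recognize(dico, clf, image):
        """Online recognition for one partition image to be identified."""
        code = dico.transform(dense_sift(image)).max(axis=0, keepdims=True)
        return clf.predict(code)[0]
    ```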
  • Besides the SVM classifier, other classifiers such as Linear Spatial Pyramid Matching (LSPM) or Locality-constrained Linear Coding (LLC), or combinations thereof, may also be used. Many other concealed-object recognition algorithms exist in the field and are not described here one by one.
  • the scanning device 103 is illustrated as passing through a portal structure, but the disclosure is not limited thereto.
  • Scanning device 103 can have a different form, such as a rotary scan.
  • There are various types of security check devices in the field; currently common human-body security check devices are based, for example, on millimeter-wave holographic imaging technology.
  • The embodiment may further include a scanning device (not shown) for articles such as luggage carried by the examinee, for example an X-ray scanning device, which may share the same control device as the human-body scanning device 103 or have a separate control device. For instance, when the examinee enters the security check channel 101, the carried luggage can be placed on a conveyor belt to pass through the X-ray scanning device while the examinee walks through the scanning device 103.
  • As noted above, in conventional technology the security check device employs the same security policy for all examinees, specifically the same parameters (including the parameters used in the scanning device 103 and the parameters used in the control device 105).
  • the possibility of carrying dangerous goods between different groups of people is very different. If the principle of "Everyone is equal" is still followed during the security inspection of the inspected personnel, the accuracy of the security inspection results will be affected.
  • different security policies may be employed for different examinees.
  • a security model specific to the individual can be established for differences in identity and behavior of each individual.
  • In this way, the efficiency of security inspection can be greatly improved, false alarms of the security equipment on safe people can be reduced, and the inspection intensity for suspicious persons can be strengthened.
  • Illegal acts that such persons might attempt can then be detected and identified in time during the security check, prompting law-enforcement personnel to take corresponding measures.
  • the system 100 can include an identity information entry device 107 for entering the identity of the subject.
  • the identity information entry device 107 can include various types of information input devices.
  • the identity information entry device 107 can be a keyboard and/or a touch screen.
  • Security staff can check the examinee's identity document, such as an ID card or passport, and manually input the examinee's identity through the identity information entry device 107. This identity uniquely identifies the examinee, for example the number of the identity document, such as the ID-card number or passport number (the name is generally not used, because duplicate names are relatively likely).
  • the identity information entry device 107 can be an optical or electronic reading device.
  • the identity information entry device 107 can read a one-dimensional code (eg, a barcode) or a two-dimensional code present on the identity document, or can wirelessly read information on the identity document, such as based on radio frequency identification (RFID) or near field. Communication (NFC) technology.
  • It is not necessary to use an identity document; other documents associated with the user's identity may also be used.
  • For example, at an airport a boarding pass can be used: the user's identity and the corresponding identifier, such as the ID-card number or passport number, can be obtained by scanning the barcode on the boarding pass and querying the airline's database.
  • Similarly, at a railway station with real-name ticketing, the user's identity and the corresponding identifier can be obtained by scanning the two-dimensional code on the ticket and querying the station's ticketing database.
  • the identity information entry device 107 is not limited to entering the identity of the subject, and may also input other identity information.
  • the identity information entry device 107 can also acquire a registered photo of the examinee.
  • the "registered photo” refers to a photograph used by the examinee to clearly reflect the appearance of the subject when the registered institution registers the identity.
  • such a registered photo may be a photo used by the examinee when the public security organ handles the identity card or passport.
  • the identity information entry device 107 can obtain the above-mentioned registered photo directly from the photo ID (for example, by image recognition), or can obtain the above from the relevant trusted database (for example, the public security agency database) based on the entered identity. Register photos.
  • the identity information entry device 107 can be placed at the entrance of the security channel 101.
  • the control device 105 can obtain the user data related to the examinee identified by the identity identification from the databases 111-1, 111-2, ..., 111-n via the network 109.
  • the user data may include various data capable of embodying user characteristics, such as data related to the identity and behavior of the examinee, including but not limited to the personal data, credit, social relationship, and historical behavior of the examinee.
  • user data can reflect the sociological characteristics of the user, rather than biological features (eg, height, weight, blood type, fingerprint, iris, etc.).
  • FIG. 2 is a schematic diagram showing various user data.
  • Through telecom operators, the user's call records, messaging records and Internet access records can be obtained; analyzing these data yields the user's contacts, call durations, call frequency, the time and location of each network access, and the IP addresses of visited websites.
  • Through banks, the user's bank-card information, everyday transaction information, credit status and loan information can be obtained.
  • Through e-commerce and B2C companies, the user's personal consumption details, browsed product categories, payment habits, website transaction details and contact information can be obtained.
  • According to an embodiment of the present disclosure, the system 100 may include a safety factor determination device for acquiring user data related to the examinee based on the identity of the examinee and determining the safety factor of the examinee from the acquired user data.
  • the safety factor determining device can be implemented, for example, by the control device 105 running program instructions.
  • The safety factor determination device can also be implemented by a separate entity, such as a computing device.
  • the control device 105 can extract data information related to the identity and behavior of the person from a large amount of data, for example, through data mining technology, and classify the population accordingly, and different levels of security policies can be applied to different categories of people.
  • a corresponding scoring mechanism can be established to characterize the safety of the examinee using the safety factor.
  • The examinee's safety factor can then guide the parameters the security check device uses when inspecting that examinee (for example, the parameters applied in the concealed-object recognition algorithm), so that the security check is differentiated and personalized for different examinees. In practice, some examinees' data may be missing or incomplete; to avoid dangerous items being missed during the check, such examinees are assigned a lower safety factor value, i.e., the inspection of this group is strengthened. The determination of the safety factor is described in further detail below.
  • The user data may also include historical security check data, such as past departure and destination locations and the results of previous checks (including human-body checks of the examinee and of accompanying persons, checks of carried luggage, and so on). When building the user safety factor model, the weight of data involving certain special locations or past adverse records can be increased, i.e., such data are given greater influence on the safety factor.
  • the system 100 may include a parameter determination device for determining a parameter for performing a security check on a subject based on the determined safety factor of the examinee.
  • the parameter determination device can be implemented, for example, by the control device 105 running program instructions.
  • the present disclosure is not limited thereto.
  • the parameter determination device can also be implemented by a separate entity, such as a computing device.
  • The basic principle of parameter determination is: for examinees with relatively high safety factor values, relatively loose parameters can be set, the inspection intensity for this group can be appropriately reduced, the experience of these relatively safe persons with the security equipment can be improved, and their pass rate can be accelerated; for users with relatively low safety factor values, relatively strict parameters can be set to strengthen the inspection intensity for such users.
  • These parameters may be any parameters in the system related to the security strength, such as parameters related to hardware (eg, scanning parameters in scanning device 103) and/or parameters related to software (eg, various algorithms employed in the system).
  • these parameters may be parameters applied in the obscurant recognition algorithm employed by the security device, including one or more of the type of classifier, the parameters of the classifier, and the alarm threshold.
  • an SVM classifier can be used.
  • the type of the classifier can be controlled by adjusting the kernel function.
  • the parameters of the classifier may include a penalty coefficient or the like.
  • the alarm threshold is the distance between the classification plane corresponding to the classifier and the optimal segmentation hyperplane. The closer the classification surface is to the suspect category, the smaller the alarm threshold.
  • For examinees with a high safety factor, a linear classifier with stronger generalization ability and an SVM classifier with a smaller penalty coefficient can be used, while the alarm threshold is raised; for examinees with a low safety factor, a non-linear classifier and an SVM classifier with a larger penalty coefficient can be used, since increasing the penalty coefficient raises the detection rate for dangerous goods, and the alarm threshold is lowered at the same time to avoid safety losses caused by missed dangerous items (a sketch of such a mapping follows below).
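  • A small sketch of how a safety factor could be mapped to these settings; the cut-off values, kernels and penalty coefficients below are illustrative only, and the alarm threshold is expressed as a bound on the SVM decision function (signed distance to the separating hyperplane).

    ```python
    from sklearn.svm import SVC

    def classifier_settings(safety_factor):
        """Map an examinee's safety factor to concealed-object classifier settings.

        High factor -> linear kernel, small penalty C, higher alarm threshold (looser check).
        Low factor  -> non-linear kernel, large penalty C, lower alarm threshold (stricter check).
        """
        if safety_factor > 0.8:
            return {'kernel': 'linear', 'C': 0.5, 'alarm_threshold': 0.5}
        if safety_factor < 0.4:
            return {'kernel': 'rbf', 'C': 100.0, 'alarm_threshold': -0.2}
        return {'kernel': 'rbf', 'C': 10.0, 'alarm_threshold': 0.0}

    def is_suspicious(clf: SVC, code, alarm_threshold):
        """Alarm when the sample's signed distance toward the suspect class
        exceeds the per-examinee alarm threshold (lower threshold = stricter)."""
        return clf.decision_function(code)[0] > alarm_threshold
    ```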
  • the system 100 may further include a video device 113 such as a camera for capturing an image of the subject in real time.
  • the video device 113 is illustrated as being disposed on the scanning device 103, but the disclosure is not limited thereto.
  • the video device 113 can be placed at any other location in the security channel, the number of which is not limited to one, but can be more.
  • The video device(s) 113 can be arranged so that their shooting or monitoring range covers the security check channel 101, so that the examinee can be monitored in real time without blind spots while passing through the channel.
  • the security channel can also be defined by the monitoring range of the video device 113.
  • the system 100 may further include an abnormal behavior determining device for determining whether the subject has an abnormal behavior based on the examinee image taken by the video device 113.
  • the abnormal behavior determining device can be implemented, for example, by the control device 105 running program instructions.
  • the video device 113 can be connected to the control device 105 by wire or wirelessly.
  • the abnormal behavior determination device can also be implemented by a separate entity such as a computing device.
  • The control device 105 can use computer vision techniques to extract the examinee's silhouette from the video images collected by the video device 113 and track it dynamically, and can use a crowd behavior model pre-trained with machine learning methods to monitor for abnormal behavior; once abnormal behavior is detected, law-enforcement personnel are notified in a timely manner (a sketch follows below).
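  • A rough sketch of such monitoring with OpenCV; the pre-trained crowd/behavior model is represented only by a placeholder object with an assumed `is_abnormal` method, since its concrete form is not given here, and tracking a single person via the largest foreground contour is a simplifying assumption.

    ```python
    import cv2

    def monitor_channel(video_source, behavior_model, notify, window=30):
        """Extract and track the examinee's silhouette and flag abnormal behaviour."""
        cap = cv2.VideoCapture(video_source)
        subtractor = cv2.createBackgroundSubtractorMOG2()
        track = []                                            # bounding boxes over time
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = subtractor.apply(frame)                    # foreground silhouette
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            if contours:
                largest = max(contours, key=cv2.contourArea)  # assume one examinee in the channel
                track.append(cv2.boundingRect(largest))       # (x, y, w, h)
            if len(track) >= window and behavior_model.is_abnormal(track[-window:]):
                notify("abnormal behaviour detected in the security channel")
        cap.release()
    ```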
  • The system 100 may further include an identity verification device for extracting the examinee's face image from the images captured by the video device 113 and verifying the examinee's identity by comparing the extracted face image with the examinee's registered photo.
  • In this example the identity verification device can be implemented, for example, by the control device 105 running program instructions, but the present disclosure is not limited thereto.
  • the authentication device can also be implemented by a separate entity, such as a computing device.
  • The registered photo of the examinee is, for example, the registered photo input through the identity information entry device 107, or a registered photo obtained from a relevant trusted database as described above. If the person does not match the document, the system can issue an alarm prompting security staff to verify the examinee manually, and the intensity of opened-luggage inspection of the carried baggage can be increased.
  • the authentication device can also compare the extracted face image with the face image in the untrusted crowd database.
  • The untrusted-person database may include, for example, a face-image database of fugitive criminal suspects provided by the Ministry of Public Security, or a face-image database, published by courts, of persons restricted from entering or leaving the country or from leaving a certain place. If a close match is found, relevant law-enforcement personnel can be notified to conduct a security review (a sketch of such a comparison follows below).
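  • A simple sketch of both comparisons using a generic face-embedding function; `embed` and the distance threshold are hypothetical placeholders, since the disclosure does not name a particular face-recognition model.

    ```python
    import numpy as np

    def verify_identity(face_image, registered_photo, untrusted_db, embed,
                        match_threshold=0.6):
        """Compare the face extracted from live video with the registered photo
        and with an untrusted-person database.

        embed        : any function mapping a face image to a fixed-length vector
        untrusted_db : iterable of (person_id, embedding) pairs
        """
        live = embed(face_image)

        # 1) person-vs-document check against the registered photo
        same_person = np.linalg.norm(live - embed(registered_photo)) < match_threshold

        # 2) watch-list check against the untrusted-person database
        hits = [pid for pid, emb in untrusted_db
                if np.linalg.norm(live - emb) < match_threshold]

        return same_person, hits
    ```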
  • The control device 105 can configure the security check device with the parameters determined by the parameter determination device, for example by configuring the parameters applied in the concealed-object recognition algorithm (the type of classifier, the parameters of the classifier and/or the alarm threshold). The examinee's image obtained by the scanning device 103 is then processed with the algorithm so configured to identify whether the examinee is concealing any items, and the inspection result can be output to security staff.
  • a complementary display manner of the remote end and the device end can be employed.
  • At the security check device (e.g., at the control device 105), only a mannequin figure reflecting the contour of the human body, the corresponding alarm information and the identity information of the examinee may be displayed.
  • the remote control terminal can be provided in a closed monitoring room at a certain distance from the security inspection device.
  • the actual scanned image is displayed only on the remote console, and the displayed scanned image blurs the face.
  • the security inspector at the remote console can view and process the scanned image of the current examinee, but does not know any personal information related to the identity of the current examinee.
  • After a scan and the corresponding inspection are completed at the remote console, the examinee's scanned image can be deleted immediately; since the mannequin figure displayed at the security check device carries no body-surface information of the scanned person, the examinee's personal privacy can be fully protected.
  • In other words, at the remote end and the device end, the security check results can be displayed in a complementary, non-overlapping manner: on the one hand the examinee's personal privacy is protected, and on the other hand the inspection results can be viewed clearly.
  • the centralized control device 105 is shown, but the present disclosure is not limited thereto, and a distributed control device may also be employed.
  • one or more components in system 100 may have corresponding control devices, while other components may have other control devices.
  • the safety factor determination device is incorporated in the security system 100.
  • the present disclosure is not limited thereto.
  • multiple security systems can share the same safety factor determination device.
  • For example, at an airport, security check systems located at different security check channels can share a server that acts as the safety factor determination device.
  • the safety factor determination device can be provided by a professional data analysis company.
  • The security check system 100 can send the entered identity of the examinee to the data analysis company's server, which retrieves the relevant user information and performs data analysis to obtain the examinee's safety factor. The safety factor can then be sent back to the security check system 100, and the parameter determination device can determine the security check parameters for the examinee based on the received safety factor.
  • model training can be performed to establish a relationship model between user data and safety factors. This may be done in the security system 100 (e.g., control device 105) described above, or may be performed outside of the security system 100, such as by the professional data analysis company described above.
  • Specifically, as shown at 311, multiple databases (e.g., 111-1, 111-2, ..., 111-n as shown in FIG. 1) can each receive massive data shared by cooperating third-party organizations, including all kinds of user-related data such as name data, age data, native-place data, residence data, document-number data, interpersonal-relationship data, communication data, transaction-detail data, credit data, social-activity data and social-network data.
  • These massive data are cleaned and preprocessed and imported into a large distributed database.
  • the user data can be stored hierarchically according to key information such as user age, region and occupation, so that the data can be read later.
  • For example, data such as text, images, audio and video related to users can be obtained from departments with rich and reliable source data, such as the Ministry of Public Security, telecom operators, banks, e-commerce and B2C companies, and Internet companies; after format cleaning, these massive data can be imported into a large distributed database for storage.
  • Then, as shown at 313, the massive data storing user information can be analyzed and processed by methods such as data mining, machine learning and statistical analysis to establish a relationship model between user data and the safety factor.
  • user data may be clustered into several categories, and each category is assigned a different safety factor value.
  • this model can be built as follows.
  • Suppose the massive user data extracted from the databases are S = {X_n, n = 1, ..., N}, where N is the total number of users, X_n is the data of the n-th user and |X_n| is its dimensionality; x_n^j, the j-th dimension of the n-th user's data, may be structured data such as numbers, or unstructured data such as text, images, audio, video or web pages. Referring to FIG. 4, these user data can be represented as points in a data space.
  • The user data can be clustered into several classes; for example, a K-means clustering algorithm can be used to divide the input user data into K classes according to their characteristics, with the class centers denoted C = {C_i, i = 0, ..., K-1}.
  • the classification result is schematically shown by a dotted circle in Fig. 4.
  • A different safety factor value is assigned to each class according to the level of safety exhibited by the users in that class.
  • As shown in FIG. 4, class 0 has the highest safety, class 1 the second highest, and so on, with class (K-1) having the lowest safety.
  • Here, the class center C_i represents the i-th security level; that is, users in the same class share the same security level (C_i), and the larger i is, the lower the safety of the users in the corresponding class (a training sketch follows below).
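  • A compact sketch of training such a relationship model with K-means; how the clusters are ranked by safety and how the per-class values in (0, 1] are chosen are illustrative assumptions (the text only requires that each class receive a different value). The centers and factors produced here are what a nearest-center lookup, as sketched earlier, would consume.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def train_relationship_model(user_features, class_safety_rank, K=5):
        """Cluster encoded user data into K classes and attach a safety factor to each.

        user_features     : (N, d) numeric matrix of per-user features
        class_safety_rank : callable scoring how safe a cluster is (placeholder,
                            e.g. based on historical alarm statistics)
        """
        km = KMeans(n_clusters=K, n_init=10).fit(user_features)
        centers = km.cluster_centers_                          # C_0 .. C_{K-1}

        # Order the classes from safest to least safe, then assign evenly
        # spaced safety factors: class 0 -> highest value, class K-1 -> lowest.
        order = np.argsort([-class_safety_rank(c) for c in centers])
        factors = np.linspace(1.0, 1.0 / K, K)                 # values stay within (0, 1]
        return centers[order], list(factors)
    ```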
  • The established model may be stored at the safety factor determination device (which, as noted above, may be inside or outside the security check system 100), or it may be stored elsewhere, in which case the safety factor determination device can request the model from the device on which it is stored.
  • the identity of the user is first entered by the identity information entry device 107 (optionally, other identity information such as the above registered photo is also entered).
  • If a registered photo has been entered, identity verification can be performed at 321.
  • identity verification can be performed by comparing the registered photo with the face image in the image captured by the video device 113.
  • the security factor determination module may obtain user data based on the entered user identity and determine the security factor for the user using the model established as described above, as indicated at 323.
  • the control device 105 can also send the entered user identity to the safety factor determination module outside the system and receive its determined safety factor from the external safety factor determination module.
  • After the user's safety factor has been determined, the security check device can be configured accordingly. For example, as shown at 325, the parameters of the security check device, such as those applied in the concealed-object recognition algorithm (classifier type, classifier parameters, alarm threshold, etc.), can be generated from the safety factor, for example using a self-learning algorithm. These parameters can be represented in the form of a parameter vector.
  • the security measures and security strength can be adjusted accordingly.
  • As mentioned above, for users with a large safety factor value (for example, D > 0.8), the inspection intensity can be appropriately reduced, the experience of such relatively safe persons with the security equipment can be improved, and their pass rate can be accelerated.
  • For users with a small safety factor value (for example, D < 0.4), the inspection intensity can be strengthened, and information about the user's safety level can be provided to the staff through the system as an early warning, prompting them to take appropriate protective measures.
  • the security device can receive security parameters, for example, sent as parameter vectors, configure the received parameters, and perform security checks on the user. For example, a millimeter wave can be emitted to illuminate the user, the user is imaged based on millimeter waves from the user, and the image is processed using a hidden object recognition algorithm, as shown at 327. The results of the inspection can be displayed as shown in 329. As mentioned above, this display can be done in a complementary manner between the remote end and the device side.
  • the data may also be updated as indicated at 330.
  • the data of the current inspected person can be updated.
  • the information involved in the security check process can be collected in real time and stored in the corresponding database, as shown in 331.
  • the data update includes two parts of the security inspection process information and the personal information of the inspected personnel.
  • The security check process information includes the number and location of the current security check device, information on the security inspector on duty, and so on.
  • The personal information of the inspected person includes the document information, the travel location, the face image, the video data recorded in the security check channel, the imaging data and the result of this security check.
  • the large-scale data provided by the third-party platform can be crawled and updated in real time to ensure the real-time and reliability of the security inspection work.
  • the user behavior can be accurately predicted and the risk or potential danger of the user can be evaluated, and a more accurate security check solution can be provided.

Abstract

A security check system and a method of configuring a security check device. The security check system includes: an identity information entry device for entering an identity of an examinee; a parameter determination device for determining, based on a safety factor of the examinee determined from user data corresponding to the identity of the examinee, parameters for performing a security check on the examinee; and a security check device for performing the security check on the examinee based on the determined parameters. By analyzing and mining comprehensive user data, the scheme can accurately predict user behavior and assess the user's danger or potential danger, providing a more accurate security check solution.

Description

安检系统及配置安检设备的方法
相关申请的引用
本申请要求于2016年10月17日递交的题为“安检系统及配置安检设备的方法”的中国专利申请201610901778.9的优先权,其内容一并于此用作参考。
技术领域
本发明涉及安检技术领域,尤其涉及一种能够基于大数据实现定制化安检策略的安检系统及配置安检设备的方法。
背景技术
近年来,恐怖袭击在全球范围内逐步升级扩大,严重威胁到人身安全乃至国家安全。传统的人体安检采用金属探测或人工搜身等方式,不仅耗时,而且侵犯个人隐私。随着社会科技的发展进步,人们对安检的要求日益增加,传统的低效率的人工安检手段已不能满足新时代下的人们对安检的需求。人体安全检查设备通过非接触方式对人体进行扫描,能够对藏匿于衣服内的物体进行探测,不仅可探测金属物质,还可以探测人身上携带的非金属物质,如陶瓷、塑料、粉末、液体、胶体等。该类设备检测方式为非接触式检查,检测结果不显示任何身体特征细节,能够充分保护被检人员的个人隐私。此外,该方式的检测手段更加高效,且可以连续不间断工作,远远高于传统的人工安检效率。该类设备可以采用自动判读的方式,能够精确定位可疑物体的位置,有效降低了人为因素的影响,减轻了安检人员的劳动强度。
人体安检设备以其独特优势广泛应用于安检领域,但现有的设备在对用户潜在危险的判断方式上较为单一,仅通过隐匿物识别算法对扫描图像进行分析,确定被检人员的可能威胁。同时,目前人体安检设备中的隐匿物识别算法采用通用的算法框架,并未考虑用户的身份及不同人群的差异,对所有的被检人员均使用同一类算法框架进行分析处理,这些算法都建立在所有用户携带危险品的概率相同的假设下。因此,识别算法时有漏报与误报等现象的发生,影响用 户的使用体验效果。
发明内容
鉴于上述问题,本公开的目的至少部分地在于提供一种能够基于用户数据对不同用户定制安检策略的安检系统及配置安检设备的方法。
根据本公开的一个方面,提供了一种安检系统,包括:身份信息录入设备,用于录入受检人的身份标识;参数确定设备,用于基于根据与受检人的身份标识相对应的用户数据而确定的受检人的安全系数,确定针对受检人进行安检的参数;以及安检设备,用于基于所确定的参数,对受检人进行安检。
根据本公开的实施例,该安检系统还可以包括:安全系数确定设备,用于基于受检人的身份标识,获取与该受检人相关的用户数据,并根据所获取的用户数据确定受检人的安全系数。例如,安全系数确定设备可以通过将受检人的用户数据代入用户数据与安全系数间的关系模型中,确定受检人的安全系数,其中,在所述关系模型中,用户数据被归类为若干类,并且每类被赋予不同的安全系数。
根据本公开的实施例,身份信息录入设备还可以用于获得受检人的登记照片。
根据本公开的实施例,该安检系统还可以包括:视频设备,用于实时拍摄受检人的图像。
根据本公开的实施例,该安检系统还可以包括:身份验证设备,用于从视频设备拍摄的图像中提取受检人的脸部图像,并通过将提取的脸部图像与登记照片和/或非受信人群数据库中的脸部图像相比较,来验证受检人的身份。
根据本公开的实施例,该安检系统还可以包括:异常行为确定设备,用于根据视频设备拍摄的受检人图像,确定受检人是否存在异常行为。
根据本公开的实施例,用户数据可以包括受检人的个人数据、信用、社交关系和历史行为中的一项或多项。
根据本公开的实施例,参数确定设备可以基于受检人的安全系数,确定安检设备所采用的隐匿物识别算法中应用的参数。
根据本公开的实施例,该安检系统还可以包括:显示设备,用于在安检过 程中显示受检人的对应安全等级。
根据本公开的另一方面,提供了一种配置上述安检系统中的安检设备的方法,包括:通过身份信息录入设备,获取受检人的身份标识;基于受检人的身份标识,获取与该受检人相关的用户数据,并根据所获取的用户数据确定受检人的安全系数;通过参数确定设备,基于受检人的安全系数,确定针对受检人进行安检的参数;以及基于所确定的参数,配置安检设备。
根据本公开的实施例,所述参数可以包括安检设备所采用的隐匿物识别算法中应用的参数。例如,所述参数可以包括分类器的类型、分类器的参数以及报警阈值中的一项或多项。
根据本公开的实施例,通过数据挖掘等分析手段,可以从具有低价值密度特性的海量数据中提取出能表示用户行为特征等的高价值信息,对这些数据建立用户个人安全系数的模型,并与安检设备的安检参数设置进行结合,可对不同的用户定制特有的安检策略,使得安检手段更加人性化及个性化,实现精准安检的目标。
附图说明
通过以下参照附图对本公开实施例的描述,本公开的上述以及其他目的、特征和优点将更为清楚,在附图中:
图1是示出了根据本公开实施例的安检系统的示意图;
图2是示出了各种用户数据的示意图;
图3是示出了根据本公开实施例的安检系统的操作流程的示意图;
图4是示出了对用户数据的示例聚类的示意图。
具体实施方式
以下,将参照附图来描述本公开的实施例。但是应该理解,这些描述只是示例性的,而并非要限制本公开的范围。此外,在以下说明中,省略了对公知结构和技术的描述,以避免不必要地混淆本公开的概念。
在此使用的术语仅仅是为了描述具体实施例,而并非意在限制本公开。这里使用的词语“一”、“一个(种)”和“该”等也应包括“多个”、“多种”的 意思,除非上下文另外明确指出。此外,在此使用的术语“包括”、“包含”等表明了所述特征、步骤、操作和/或部件的存在,但是并不排除存在或添加一个或多个其他特征、步骤、操作或部件。
在此使用的所有术语(包括技术和科学术语)具有本领域技术人员通常所理解的含义,除非另外定义。应注意,这里使用的术语应解释为具有与本说明书的上下文相一致的含义,而不应以理想化或过于刻板的方式来解释。
附图中示出了一些方框图和/或流程图。应理解,方框图和/或流程图中的一些方框或其组合可以由计算机程序指令来实现。这些计算机程序指令可以提供给通用计算机、专用计算机或其他可编程数据处理装置的处理器,从而这些指令在由该处理器执行时可以创建用于实现这些方框图和/或流程图中所说明的功能/操作的装置。
因此,本公开的技术可以硬件和/或软件(包括固件、微代码等)的形式来实现。另外,本公开的技术可以采取存储有指令的计算机可读介质上的计算机程序产品的形式,该计算机程序产品可供指令执行系统使用或者结合指令执行系统使用。在本公开的上下文中,计算机可读介质可以是能够包含、存储、传送、传播或传输指令的任意介质。例如,计算机可读介质可以包括但不限于电、磁、光、电磁、红外或半导体系统、装置、器件或传播介质。计算机可读介质的具体示例包括:磁存储装置,如磁带或硬盘(HDD);光存储装置,如光盘(CD-ROM);存储器,如随机存取存储器(RAM)或闪存;和/或有线/无线通信链路。
图1是示出了根据本公开实施例的安检系统的示意图。
如图1所示,根据该实施例的安检系统100包括设置于安检通道101上的安检设备(103,105)。
安检通道101可以设于任何需要安全访问的地点如机场、车站、体育馆、博物馆等的入口处。安检通道101可以是由围挡等阻挡物围起来的一条实际通道,或者是由例如视频设备限定的虚拟通道。例如,受检人可以在视频设备的监控范围内行进,该监控范围可以视为“通道”。
受检人例如如图中箭头所示方向穿过安检通道101,同时接受安检设备进行的安全检查。如果通过了安检设备的检查,则受检人可被允许进入要访问的 地点。
安检设备可以包括扫描装置103和控制装置105。对于人体安检而言,扫描装置103可以是非接触式的。例如,扫描装置103可以在控制装置105的控制下通过射线(例如,毫米波)对受检人进行照射,并收集来自受检人(例如,通过散射)的射线。扫描装置103接收到的数据可以通过有线链路(例如,缆线)或无线链路(例如,WiFi)送到控制装置105中,在控制装置105中可以通过重建算法来重建受检人的(体表)图像。此外,控制装置105还可以通过隐匿物识别算法来识别受检人可能隐藏于所穿戴衣物内的物品。成像结果以及识别结果可以通过输出设备如显示器输出,以下将对此进一步描述。控制装置105可以包括各种计算设备,例如计算机、服务器等。
根据本公开的实施例,对人体扫描图像的隐匿物识别算法可以包括分类器训练及在线识别两个部分。这两个部分都基于分区图像的基础,可以先将人体扫描图像根据身体关键点的位置分成若干个区域,如手臂、躯干、腿等。在一示例中,分类器训练可以如下进行:1)建立各分区图像的正负样本数据库;2)特征提取:利用稠密尺度不变特征变换(Scale-invariant feature transform,SIFT)特征对各分区图像进行特征描述;3)利用稀疏编码原理训练得到稠密SIFT特征的超完备字典;4)将2)中的特征描述子在字典中进行投影,得到编码向量;5)利用支持向量机(Support Vector Machine,SVM)训练得到分类器。在线识别可以如下进行:1)首先提取待识别分区图像的稠密SIFT特征;2)计算前述特征在字典中的投影向量;3)将投影向量输入训练好的SVM分类器中进行类别分类。
当然,除了SVM分类器之外,也可以使用其他分类器如线性空间金字塔匹配(LSPM,Linea Spatial Pyramid Matching)或位置受限线性编码(LLC,Locality-constrained Linear coding)或者它们的组合。本领域存在多种其他的隐匿物识别算法,在此不一一描述。
在该示例中,将扫描装置103示出为通过门式结构,但是本公开不限于此。扫描装置103可以具有不同的形式,例如旋转扫描式。在本领域存在各类安检设备,当前常用的人体安检设备例如基于毫米波全息成像技术。
另外,在该示例中,仅示出了针对人体的扫描装置103。根据本公开的实 施例,还可以包括用于物品如受检人所携带行李的扫描装置(未示出)如X光扫描装置(可以与人体扫描装置103共用相同的控制装置,或具有单独的控制装置)。例如,当受检人进入安检通道101时,可以将其所携带行李放置于传送带上以便通过X光扫描装置,而其自己则穿行通过扫描装置103。
如上所述,在常规技术中,针对不同受检人,安检设备采用相同的安检策略,具体地,采用相同的参数(包括扫描装置103中所采用的参数以及控制装置105中所采用的参数)。然而,在实际的安检过程中,不同人群间携带危险品的可能性大不相同。若在对被检人员的安检过程中仍然遵循“人人平等”的原则,则会影响安检结果的准确性。
根据本公开的实施例,可以针对不同受检人,采用不同的安检策略。具体地,可以针对每个个体的身份和行为等差异建立专属于该个体的安检模型。由此,可以大幅提高安检效率,减少安检设备对安全人群的误报,同时可以加强对可疑人员的安检强度,将这些人群可能采取的不法行为在安检过程中及时侦查与辨别,用以提示执法人员采取相应的措施。
为此,系统100可以包括身份信息录入设备107,用于录入受检人的身份标识。身份信息录入设备107可以包括各类信息输入设备。例如,身份信息录入设备107可以是键盘和/或触摸屏。安检人员可以查验受检人的身份证件如身份证或护照,并通过身份信息录入设备107手动输入受检人的身份标识,该身份标识能够唯一地标识受检人的身份,例如受检人的身份证件的号码如身份证号、护照号等(一般不使用姓名,因为存在重名的可能性较大)。备选地,身份信息录入设备107可以是光学或电子读取设备。例如,身份信息录入设备107可以读取身份证件上存在的一维码(例如,条形码)或二维码,或者可以无线地读取身份证件上的信息,例如基于射频识别(RFID)或近场通信(NFC)技术。
当然,不一定使用身份证件,也可以使用其他与用户身份相关联的证件。例如,在机场的情况下,可以使用登机牌。可以通过扫描登机牌上的条形码并查询航空公司的数据库,获知用户的身份以及相应的身份标识如身份证号或护照号。再如,在车站的情况下,如果是实名制购票,则可以通过扫描车票上的二维码并查询火车站的票务数据库,获知用户的身份以及相应的身份标识。
当然,身份信息录入设备107不限于录入受检人的身份标识,还可以输入其他身份信息。例如,如以下实施例中所述,身份信息录入设备107还可以获取受检人的登记照片。在此,所谓“登记照片”,是指受检人在受信机构登记身份时所使用的能够清晰地反应其相貌的照片。例如,这种登记照片可以是受检人在公安机关办理身份证或护照时所使用的照片。
身份信息录入设备107可以直接从带照片的身份证件中获得上述登记照片(例如,通过图像识别),或者可以基于所录入的身份标识,从相关受信数据库(例如,公安机关的数据库)中获得上述登记照片。
身份信息录入设备107可以设置于安检通道101的入口处。当受检人进入安检通道101时,其身份标识可以由身份信息录入设备107录入,并通过有线或无线链路发送到控制装置105。控制装置105可以通过网络109从数据库111-1、111-2、...、111-n中获得与该身份标识所标识的受检人相关的用户数据。在此,用户数据可以包括能够体现用户特征的各种数据,例如与受检人的身份及行为相关的数据,包括但不限于受检人的个人数据、信用、社交关系和历史行为等中的一项或多项。在此,用户数据可以体现用户的社会学特征,而非生物学特征(例如,身高、体重、血型、指纹、虹膜等)。
随着互联网、物联网以及移动互联网的飞速发展,近年来各类数据呈现大幅度地增长,且有研究表明,未来数年数据量将会呈现指数增长。在大数据时代,用户数据广泛分布在各个领域。图2是示出了各种用户数据的示意图。例如,通过公安部的个人信息查询系统,可以获取用户的个人基本信息,包括姓名、性别、年龄、曾用名、出生地、联系方式、工作单位、居住地址、婚史以及亲属关系等多方面的信息。通过运营商可以获取用户的通话记录、通讯记录以及上网记录。对这些数据进行分析,可以获得用户的联系人信息、通话时长、通话频率、每次访问网络的时间、位置以及访问的网站IP地址等。通过银行可以获得用户的银行卡信息,衣食住行等方面的交易信息,信用情况及借贷信息等。通过电商及B2C企业可以获取用户的个人消费明细信息、商品浏览类目、支付习惯、网站的交易明细及用户的联系方式等。通过互联网则可以追踪用户在互联网中留下的一切“蛛丝马迹”:用户经常访问的网站及其主题、内容,访问的日志信息,网站浏览记录,在包括微信、微博等在内的社交网站中 的聊天信息、分享的资讯及评论信息等。
用户数据中蕴藏着巨大的价值,数据间或多或少都存在着某种联系。通过对以上各方面数据的分析可以推测出用户的兴趣爱好、性格特点、思想波动、情绪变化、家庭关系、人脉关系、工作经历、经济状况、信用情况、借贷情况、生活习惯、行为习惯、消费习惯、出行轨迹等全方位的信息。通过用户的家庭关系、工作状况、人际关系、近期家庭或周遭是否发生重大变故等数据,可以预测用户的潜在危险性。从用户的消费数据中可以获得用户近期的消费记录,通过分析消费记录,能够获得用户是否有购买违禁品、危险品以及可用于制作危险品的材料等信息。同时通过将用户的近期消费记录与以往的消费习惯进行一致性分析,可以预测该用户采取危害社会等的异常行为的可能性。通过用户经常浏览的网页信息、转发的消息及发表的言论等,可以知道用户存在反社会言行的可能性。通过用户在社交网络中的活动数据,以该用户为中心,绘制其人际关系图谱,并通过对与其互动频繁的其他用户的社会活动等数据进行综合分析,可根据用户常用联系人的安全性间接预测该用户的安全程度。
根据本公开的实施例,系统100可以包括安全系数确定设备,用于基于受检人的身份标识,获取与该受检人相关的用户数据,并根据所获取的用户数据确定受检人的安全系数。在该示例中,安全系数确定设备例如可以通过控制装置105运行程序指令来实现。但是,本公开不限于此。也可以通过单独的实体例如计算设备来实现安全系数确定设备。例如,控制装置105可以从海量的数据中,例如通过数据挖掘技术,提取与人的身份及行为相关的数据信息并据此将人群进行分类,对于不同类别的人群可以应用不同级别的安检策略。
例如,可以建立相应的评分机制,利用安全系数对受检人的安全程度进行刻画。安全系数越高,代表受检人的各项指标越正常。可以将受检人相应的安全系数作为安检设备中对该受检人执行安检时所采用的参数(例如,隐匿物识别算法中应用的参数)的指导,实现对不同受检人的安检差异化及个性化。现实情况下,可能某些受检人的数据会有缺失或无法获得完整数据,对这类受检人为了防止安检过程中对危险物品的漏报产生的影响,对其安全系数赋予较低的值,也即加强对该类人群的安检力度。在以下,将进一步详细描述安全系数的确定。
另外,用户数据也可以包括历史安检数据,包括过往的出发地和目的地地点、历次安检(包括被检人及同行人员的人体安全检查、携带行李检查等)情况等。在建立用户安全系数模型时,加大对数据中包含某些特殊地点及有过不良记录等信息的权重因子,即增加这些数据对安全系数的影响。
根据本公开的实施例,系统100可以包括参数确定设备,用于基于所确定的受检人的安全系数,确定针对受检人进行安检的参数。在该示例中,参数确定设备例如可以通过控制装置105运行程序指令来实现。但是,本公开不限于此。也可以通过单独的实体例如计算设备来实现参数确定设备。参数确定的基本原则是:对于安全系数值相对较高的受检人,可以设置相对宽松的参数,适当降低对该类人群的安检强度,提升这类相对安全人员对安检设备的使用体验,并加快对该类人群的通过率;对于安全系数值相对较低的用户,可以设置相对严格的参数,加强对这类用户的安检强度。
这些参数可以是系统中与安检强度有关的任何参数,例如涉及硬件的参数(例如,扫描装置103中的扫描参数)和/或涉及软件(例如,系统中所采用的各种算法)的参数。当然,针对受检人不断改变硬件参数是不便的,因此优选地可以改变软件参数。例如,这些参数可以是安检设备所采用的隐匿物识别算法中应用的参数,包括分类器的类型、分类器的参数以及报警阈值中的一项或多项。例如,可以使用SVM分类器。在这种情况下,可以通过调整核函数的方式来控制分类器的类型。分类器的参数可以包括惩罚系数等。报警阈值是指分类器对应的分类面与最优分割超平面的距离。分类面与嫌疑物类别的距离越近,其报警阈值越小。对于安全系数较高的受检人,可以采用泛化能力较强的线性分类器及较小的惩罚系数对应的SVM分类器,同时增加报警阈值;而对于安全系数偏低的受检人,可以采用非线性分类器及较大的惩罚系数对应的SVM分类器,增加惩罚系数的值可以提高对危险品的检出率,同时降低其报警阈值,避免因危险品漏检的情况带来的安全损失。
根据本公开的实施例,系统100还可以包括视频设备113例如摄像头,用于实时拍摄受检人的图像。在该示例中,将视频设备113示出为设置于扫描装置103上,但是本公开不限于此。视频设备113可以设置于安检通道中的任何其他位置处,其数目也不限于一,而是可以更多。视频设备113可以设置为使 得它或它们的拍摄或监控范围能够覆盖安检通道101,从而在受检人通过安检通道101时可以无死角地对受检人进行实时监控。如上所述,也可以通过视频设备113的监控范围来限定安检通道。
在设置了视频设备113的情况下,系统100还可以包括异常行为确定设备,用于根据视频设备113拍摄的受检人图像,确定受检人是否存在异常行为。在该示例中,异常行为确定设备例如可以通过控制装置105运行程序指令来实现。在该情况下,视频设备113可以有线或无线连接到控制装置105。但是,本公开不限于此。也可以通过单独的实体例如计算设备来实现异常行为确定设备。控制装置105可以利用计算机视觉技术,从视频设备113采集的视频图像中提取出受检人的轮廓图像并进行动态跟踪,使用采用机器学习方法预先训练好的人群建模模型进行异常行为的监测,一旦发现受检人存在异常行为,及时通知执法人员。
另外,系统100还可以包括身份验证设备,用于从视频设备113拍摄的图像中提取受检人的脸部图像,并通过将提取的脸部图像与受检人的登记照片相比较,来验证受检人的身份。在该示例中,身份验证设备例如可以通过控制装置105运行程序指令来实现。但是,本公开不限于此。也可以通过单独的实体例如计算设备来实现身份验证设备。在此,受检人的登记照片例如是身份信息录入设备107所输入的登记照片,或者是如上所述从相关受信数据库中获得的登记照片。若存在人证不一的情况,系统可以发出警报提示安检人员对该受检人进行人工核查工作,同时可以加大对其携带行李开箱检查的力度。
另一方面,身份验证设备还可以将提取的脸部图像与非受信人群数据库中的脸部图像相比较。例如,非受信人群数据库可以包括公安部提供的在逃的犯罪嫌疑人的人脸图像数据库、法院公布的限制出入境或者离开某地的受限人群的人脸图像数据库等。如果存在匹配度较高的情况,可以通知相关执法人员进行安全审查。
控制装置105可以利用参数确定设备所确定的参数,来配置安检设备,例如配置安检设备所采用的隐匿物识别算法中应用的参数(配置其中的分类器的类型、分类器的参数和/或报警阈值)。利用如此配置的算法对通过扫描装置103获得的受检人图像进行处理,可以识别受检人是否隐匿了物品。可以向安检人 员输出检查结果。
根据本公开的实施例,可以采用远程端和设备端的互补显示方式。具体地,在安检设备处(例如,在控制装置105处),可以仅显示反映人体轮廓的人偶图与对应的报警信息以及受检人的身份信息。另一方面,可以在距安检设备一定距离的封闭监控室中设置远程控制端。实际的扫描图像仅在远程控制端显示,显示的扫描图像对面部进行模糊处理。远程控制端的安检员能够对当前受检人的扫描图像进行查看和处理,但并不知道任何涉及到当前受检人的身份等个人信息。在远程控制端,完成一次扫描及相应的检查工作后,可以立即删除受检人的扫描图像,同时由于安检设备处显示的人偶图不具有被扫描人的任何体表信息,因而能够充分保护受检人的个人隐私。
也就是说,在远程端和设备端,可以按相互补充且互无交集的方式显示安检结果。这样,一方面可以保护受检人的个人隐私,另一方面可以清楚地查看受检人的检查结果。
在以上实施例中,示出了集中式的控制装置105,但是本公开不限于此,也可以采用分布式控制装置。例如,系统100中的一个或多个部件可以具有相应的控制装置,而其他部件可以具有其他控制装置。
在以上示例中,安全系数确定设备结合在安检系统100中。但是,本公开不限于此。例如,多个安检系统可以共享相同的安全系数确定设备。例如,在机场,分设于不同安检通道处的安检系统可以共用用作安全系数确定设备的服务器。备选地,安全系数确定设备可以由专业的数据分析公司提供。安检系统100可以将所录入的受检人的身份标识发送到数据分析公司的服务器,由数据分析公司的服务器调取相关用户信息并进行数据分析,以得到受检人的安全系数。随后,可以将安全系数发送给安全系统100,然后参数确定设备可以根据接收到的安全系数来确定针对受检人的安检参数。
以下,将参照图3,描述上述安检系统的操作流程。
首先,如310所示,可以进行模型训练,以便建立用户数据与安全系数间的关系模型。这可以在上述安检系统100(例如,控制装置105)中进行,或者也可以在安检系统100之外进行,例如由上述专业数据分析公司进行。
具体地,如311所示,可以利用多个数据库(例如,如图1中所示的111-1、 111-2、...111-n)分别接收来自建立合作关系的第三方机构的共享的海量数据,包括姓名数据、年龄数据、籍贯数据、居住地数据、证件号数据、人际关系数据、通讯数据、交易明细数据、信用数据、社会活动数据及社交网络数据等各类与用户有关的数据。将这些海量数据进行清洗及预处理等操作后导入到大型分布式数据库中。可以根据用户年龄、地域及职业等关键信息对用户数据进行层次存储,便于后期对于数据的读取。例如,可以从公安部、运营商、银行、电商、B2C企业及互联网公司等拥有丰富且可靠来源数据的部门获取与用户有关的诸如文本、图像、音频、视频等数据,将这些海量数据进行格式整理后导入大型分布式数据库中存储。
之后,如313所示,可以通过例如数据挖掘、机器学习及统计分析等方法,对存储用户信息的海量数据进程分析处理,建立用户数据与安全系数之间的关系模型。根据本公开的实施例,可以将用户数据聚类为若干类别,并对每一类别赋予不同的安全系数值。例如,这种模型可以如下建立。
假设从数据库中提取出的海量用户数据为S={X_n, n=1,...,N}，其中X_n={x_n^j, j=1,...,|X_n|}，N表示用户数据总数目，X_n表示第n个用户的数据，|X_n|表示第n个用户的数据维度，x_n^j表示第n个用户的第j维的数据信息，可以是数字等结构化数据，也可以是文本、图像、音频、视频、网页等非结构化的数据。参见图4，这些用户数据可以表示为数据空间中的点。
可以将用户数据聚类为若干类别。例如,可以采用K-means聚类算法将输入的用户数据根据其特性分成K个类别,类别中心用C={Ci,i=0,...,K-1}表示。图4中用虚线圈示意性示出了分类结果。根据本公开的实施例,根据各类别所对应的用户呈现出的安全性,为每类赋予不同的安全系数值。如图4所示,第0类的安全性最高,第1类的安全性次高,...,第(K-1)类的安全性最低。在此,用类别中心Ci表示第i个安全级别。也即,同一类用户具有相同的安全级别(Ci)。对于Ci而言,i越大,其对应类别用户的安全程度越低。
然后，建立用户数据与安全系数间的关系模型。例如，用户数据x与其安全系数间的关系可以用D=f(x)表示，其中k∈{0,...,K-1}为x所属的类别（根据x与各类别中心C_i之间的欧氏距离确定，例如k=argmin_i ||x−C_i||，即x属于与其最接近的类别中心所代表的类别），||·||表示欧氏距离。
不同的用户由于其数据间的差异,安全系数不同,安全系数的取值范围为(0,1]。安全系数值越大,表示用户的安全程度越高。由用户数据与安全系数间的关系表达式D=f(x)可以知道,对于属于同一个类别的数据,用户数据与其类别中心距离越远,其对应的安全系数D越大。但是,这仅仅是一个示例。可以建立其他的模型,例如对于同一类别的数据,可以赋予实质上相同的安全系数。
所建立的模型可以存储于安全系数确定设备(如上所述,其可以在安检系统100之内或之外)处,或者也可以存储于安全系数确定设备之外(这种情况下,安全系数确定设备可以向存储有模型的设备请求该模型)。
在实际的安检过程320中,首先由身份信息录入设备107录入用户的身份标识(可选地,还录入其他身份信息如上述登记照片)。在录入登记照片的情况下,可以在321处进行身份验证。例如,如上所述,可以通过比较登记照片与视频设备113所捕获图像中的人脸图像,来进行身份验证。
安全系数确定模块(例如,控制装置105)可以基于所录入的用户身份标识,获取用户数据,并利用如上所述所建立的模型,确定该用户的安全系数,如323所示。当然,控制装置105也可以将所录入的用户身份标识发送至系统外部的安全系数确定模块,并从外部安全系数确定模块接收其确定的安全系数。在确定了用户的安全系数之后,可以相应地配置安检设备。例如,如325所示,可以基于安全系数,例如利用自学习算法,生成安检设备的参数,例如其中所使用的隐匿物识别算法中应用的参数,包括分类器类型、分类器参数、报警阈值等。这些参数可以参数向量形式表示。
具体地,可以计算当前用户数据x与K个类别中心间的关系,确定用户的类别中心Ck,然后将用户数据x代入上述模型D=f(x),由此计算当前被检人员的安全系数。还可以将用户对应的安全等级Ck输出至设备端及远程端用以提示安检人员该用户的安全程度,提示安检人员做好相应的准备措施,可提高设备的通过率以及降低危险事件的发生率。
通过对用户的安全系数的分析,可以相应地调整安检手段及安检强度。如上所述,对于安全系数值较大(例如,D>0.8)的用户,可以适当降低对该类人群的安检强度,提升这类相对安全人员对安检设备的使用体验,并加快对该类人群的通过率;而对于安全系数值偏小(例如,D<0.4)的用户,可以加强对这类用户的安检强度,同时通过系统将该类用户的安全程度等相关信息提供给工作人员用于预警,提示工作人员做好相应的保护措施。
安检设备可以接收例如以参数向量形式发送的安检参数,并以所接收到的参数进行配置,并对用户进行安检。例如,可以发射毫米波对用户进行照射,根据来自用户的毫米波对用户进行成像,并利用隐匿物识别算法对图像进行处理,如327所示。检查结果可以进行显示,如329所示。如上所述,这种显示可以采用远程端和设备端互补显示的方式。
根据本公开的实施例,还可以对数据进行更新,如330所示。
在每次安检结束后,可以对当前被检人员的数据进行更新。例如,可将安检过程中涉及的各项信息进行实时采集并存入对应的数据库中,如331所示。数据更新包括此次安检过程信息及被检人员的个人信息两个部分。例如,安检过程信息包括当前的安检设备编号及所处地点、值班安检员信息等,被检人员的个人信息部分包括证件信息、出行地点、人脸图像及安检通道中记录的视频数据、成像数据及此次安检检查结果等。
另外,在大数据时代,用户数据时刻都在更新,而用户的安全系数随着数据的变化而不断更改。因而,如333所示,可以对第三方平台提供的大规模数据进行实时的抓取与更新,以保障安检工作的实时性与可靠性。
根据本公开的实施例,通过分析和挖掘用户全方位的数据,可以准确地预测用户行为及评估用户的危险性或潜在的危险性,提供更准确的安检方案。
以上对本公开的实施例进行了描述。但是,这些实施例仅仅是为了说明的目的,而并非为了限制本公开的范围。尽管在以上分别描述了各实施例,但是这并不意味着各个实施例中的措施不能有利地结合使用。本公开的范围由所附权利要求及其等价物限定。不脱离本公开的范围,本领域技术人员可以做出多种替代和修改,这些替代和修改都应落在本公开的范围之内。

Claims (13)

  1. A security check system, comprising:
    an identity information entry device for entering an identity of an examinee;
    a parameter determination device for determining, based on a safety factor of the examinee determined from user data corresponding to the identity of the examinee, parameters for performing a security check on the examinee; and
    a security check device for performing the security check on the examinee based on the determined parameters.
  2. The security check system according to claim 1, further comprising:
    a safety factor determination device for acquiring, based on the identity of the examinee, user data related to the examinee, and determining the safety factor of the examinee from the acquired user data.
  3. The security check system according to claim 2, wherein the safety factor determination device determines the safety factor of the examinee by substituting the user data of the examinee into a relationship model between user data and safety factors, wherein in the relationship model the user data are classified into several classes and each class is assigned a different safety factor.
  4. The security check system according to any one of the preceding claims, wherein the identity information entry device is further used to obtain a registered photo of the examinee.
  5. The security check system according to any one of the preceding claims, further comprising:
    a video device for capturing images of the examinee in real time.
  6. The security check system according to claim 5, further comprising:
    an abnormal behavior determination device for determining, from the images of the examinee captured by the video device, whether the examinee exhibits abnormal behavior.
  7. The security check system according to claim 5 when dependent on claim 4, further comprising:
    an identity verification device for extracting a face image of the examinee from the images captured by the video device, and verifying the identity of the examinee by comparing the extracted face image with the registered photo and/or with face images in an untrusted-person database.
  8. The security check system according to any one of the preceding claims, wherein the user data include one or more of personal data, credit, social relationships and historical behavior of the examinee.
  9. The security check system according to any one of the preceding claims, wherein the parameter determination device determines, based on the safety factor of the examinee, parameters applied in a concealed-object recognition algorithm employed by the security check device.
  10. The security check system according to any one of the preceding claims, further comprising:
    a display device for displaying the corresponding security level of the examinee during the security check.
  11. A method of configuring the security check device in the security check system according to any one of claims 1-10, comprising:
    obtaining an identity of an examinee through the identity information entry device;
    acquiring, based on the identity of the examinee, user data related to the examinee, and determining a safety factor of the examinee from the acquired user data;
    determining, by the parameter determination device and based on the safety factor of the examinee, parameters for performing a security check on the examinee; and
    configuring the security check device based on the determined parameters.
  12. The method according to claim 11, wherein the parameters include parameters applied in a concealed-object recognition algorithm employed by the security check device.
  13. The method according to claim 12, wherein the parameters include one or more of a type of a classifier, parameters of the classifier, and an alarm threshold.
PCT/CN2017/095118 2016-10-17 2017-07-31 Security check system and method for configuring security check device WO2018072520A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2019507210A JP7178343B2 (ja) Safety inspection system and method for setting a safety inspection device
US16/321,251 US11631152B2 (en) 2016-10-17 2017-07-31 Security check system and method for configuring security check device
EP17861967.2A EP3480776A4 (en) 2016-10-17 2017-07-31 SECURITY CHECK SYSTEM AND SECURITY CHECK DEVICE CONFIGURATION METHOD

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610901778.9 2016-10-17
CN201610901778.9A CN107958435A (zh) Security check system and method for configuring security check device

Publications (1)

Publication Number Publication Date
WO2018072520A1 true WO2018072520A1 (zh) 2018-04-26

Family

ID=61954169

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/095118 WO2018072520A1 (zh) Security check system and method for configuring security check device

Country Status (5)

Country Link
US (1) US11631152B2 (zh)
EP (1) EP3480776A4 (zh)
JP (1) JP7178343B2 (zh)
CN (1) CN107958435A (zh)
WO (1) WO2018072520A1 (zh)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10681073B2 (en) * 2018-01-02 2020-06-09 International Business Machines Corporation Detecting unauthorized user actions
US10956606B2 (en) * 2018-03-22 2021-03-23 International Business Machines Corporation Masking of sensitive personal information based on anomaly detection
CN108627880B * 2018-06-29 2021-03-19 深圳市重投华讯太赫兹科技有限公司 A security inspection system and security inspection method
CN109785446B * 2019-02-28 2024-02-02 清华大学 Image recognition system and method therefor
CN110009277B * 2019-03-22 2021-09-07 深圳市轱辘车联数据技术有限公司 Consignment method based on a shared trunk, server and storage medium
CN110059671A * 2019-04-29 2019-07-26 深圳市轱辘汽车维修技术有限公司 A security inspection method, system, apparatus and computer-readable storage medium
CN110221355A * 2019-05-31 2019-09-10 张学志 A method and apparatus for efficient security inspection
CN112562105A * 2019-09-06 2021-03-26 北京国双科技有限公司 Security inspection method and apparatus, storage medium and electronic device
CN112485842A * 2019-09-12 2021-03-12 杭州海康威视数字技术股份有限公司 Security inspection system
CN112612066B * 2019-09-18 2023-06-30 同方威视技术股份有限公司 Personnel security inspection method and personnel security inspection system
CN110619696B * 2019-09-18 2022-01-11 深圳市元征科技股份有限公司 A vehicle door unlocking method, apparatus, device and medium
CN111160610A * 2019-11-29 2020-05-15 国政通科技有限公司 An intelligent security inspection method based on big data
CN111080005B * 2019-12-12 2022-05-17 华中科技大学 A public security risk early-warning method and system based on support vector machines
CN111209868B * 2020-01-08 2023-05-09 中国铁道科学研究院集团有限公司电子计算技术研究所 A method and apparatus for associating passenger and baggage information at a passenger station
EP4113427A4 (en) * 2020-02-28 2023-04-12 NEC Corporation INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND RECORDING MEDIUM
US11676368B2 (en) * 2020-06-30 2023-06-13 Optum Services (Ireland) Limited Identifying anomalous activity from thermal images
CN112883061B * 2020-12-07 2022-08-16 浙江大华技术股份有限公司 Dangerous goods detection method, apparatus, system and computer device
CN112561387A * 2020-12-24 2021-03-26 中交第四航务工程局有限公司 A work-efficiency analysis method and system based on visualization and personnel management
CN112651352B * 2020-12-30 2022-07-19 深圳市商汤科技有限公司 Image processing method and apparatus, electronic device and storage medium
US20220262185A1 (en) * 2021-02-16 2022-08-18 Evolv Technologies, Inc. Identity Determination Using Biometric Data
CN115359622A * 2022-08-22 2022-11-18 江苏安防科技有限公司 An artificial-intelligence-based rail transit security integration system and method
CN115713262A * 2022-11-21 2023-02-24 河南飙风信息科技有限公司 A data quality audit platform
CN115984781B * 2023-03-17 2023-05-12 北京智芯微电子科技有限公司 Fault monitoring method, system and terminal device for distribution line monitoring equipment
CN116401290B * 2023-03-28 2023-09-29 北京声迅电子股份有限公司 Personnel security inspection method based on metal-carrying amount data

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7028899B2 (en) * 1999-06-07 2006-04-18 Metrologic Instruments, Inc. Method of speckle-noise pattern reduction and apparatus therefore based on reducing the temporal-coherence of the planar laser illumination beam before it illuminates the target object by applying temporal phase modulation techniques during the transmission of the plib towards the target
US6088423A (en) * 1998-06-05 2000-07-11 Vivid Technologies, Inc. Multiview x-ray based system for detecting contraband such as in baggage
JP2001306659A * 2000-04-20 2001-11-02 Hitachi Ltd Security management method and system
US7365672B2 (en) * 2001-03-16 2008-04-29 Battelle Memorial Institute Detection of a concealed object
AU2003219926A1 (en) * 2002-02-26 2003-09-09 Canesta, Inc. Method and apparatus for recognizing objects
US6952163B2 (en) * 2003-06-11 2005-10-04 Quantum Magnetics, Inc. Combined systems user interface for centralized monitoring of a screening checkpoint for passengers and baggage
SE528068C2 * 2004-08-19 2006-08-22 Jan Erik Solem Med Jsolutions Recognition of 3D objects
US20060262902A1 (en) * 2005-05-19 2006-11-23 The Regents Of The University Of California Security X-ray screening system
US20070235652A1 (en) * 2006-04-10 2007-10-11 Smith Steven W Weapon detection processing
US7796733B2 (en) * 2007-02-01 2010-09-14 Rapiscan Systems, Inc. Personnel security screening system with enhanced privacy
US7873182B2 (en) * 2007-08-08 2011-01-18 Brijot Imaging Systems, Inc. Multiple camera imaging method and system for detecting concealed objects
WO2009043145A1 (en) * 2007-10-01 2009-04-09 Optosecurity Inc. Method and devices for assessing the threat status of an article at a security check point
US8600149B2 (en) * 2008-08-25 2013-12-03 Telesecurity Sciences, Inc. Method and system for electronic inspection of baggage and cargo
WO2011011894A1 (en) * 2009-07-31 2011-02-03 Optosecurity Inc. Method and system for identifying a liquid product in luggage or other receptacle
US9773288B2 (en) * 2009-11-17 2017-09-26 Endera Systems, Llc Radial data visualization system
US20110205359A1 (en) * 2010-02-19 2011-08-25 Panasonic Corporation Video surveillance system
WO2012119216A1 (en) * 2011-03-09 2012-09-13 Optosecurity Inc. Method and apparatus for performing a security scan on a person
US9129277B2 (en) * 2011-08-30 2015-09-08 Digimarc Corporation Methods and arrangements for identifying objects
US8917913B2 (en) * 2011-09-22 2014-12-23 International Business Machines Corporation Searching with face recognition and social networking profiles
US20130266925A1 (en) * 2012-01-30 2013-10-10 Arizona Board Of Regents On Behalf Of The University Of Arizona Embedded Conversational Agent-Based Kiosk for Automated Interviewing
US9697710B2 (en) * 2012-06-20 2017-07-04 Apstec Systems Usa Llc Multi-threat detection system
US10289917B1 (en) * 2013-11-12 2019-05-14 Kuna Systems Corporation Sensor to characterize the behavior of a visitor or a notable event
US9230250B1 (en) * 2012-08-31 2016-01-05 Amazon Technologies, Inc. Selective high-resolution video monitoring in a materials handling facility
CN104933075A * 2014-03-20 2015-09-23 百度在线网络技术(北京)有限公司 User attribute prediction platform and method
DE102014205447A1 * 2014-03-24 2015-09-24 Smiths Heimann Gmbh Detection of items in an object
DE102014225592A1 * 2014-12-11 2016-06-16 Smiths Heimann Gmbh Person identification for multi-stage person checks
CN104464058B * 2014-12-17 2017-04-12 同方威视技术股份有限公司 Article security inspection method and system based on article identification tags
CN104965235B * 2015-06-12 2017-07-28 同方威视技术股份有限公司 A security inspection system and method
CN205562838U * 2015-11-25 2016-09-07 广东兵工安检设备有限公司 Intelligent monitoring and temperature-measuring security gate
US20170236232A1 (en) * 2016-01-19 2017-08-17 Rapiscan Systems, Inc. Integrated Security Inspection System
CN105738965A * 2016-01-31 2016-07-06 安徽泷汇安全科技有限公司 A human-body security detection method
EP3420563A4 (en) * 2016-02-22 2020-03-11 Rapiscan Systems, Inc. SYSTEMS AND METHODS FOR THREAT DETECTION AND SMUGGLING IN CARGO
WO2018023190A1 (en) * 2016-08-01 2018-02-08 Optosecurity Inc. Security checkpoint screening system and related methods

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101196505A * 2007-09-29 2008-06-11 深圳市赛时特科技有限公司 Entrance/exit security inspection terminal system and security inspection method using the same
CN201138488Y * 2007-09-29 2008-10-22 深圳市赛时特科技有限公司 An entrance/exit security inspection device
CN103310511A * 2013-05-30 2013-09-18 公安部第一研究所 A passenger identity verification security inspection system and operation method thereof
CN203350918U * 2013-05-30 2013-12-18 公安部第一研究所 A passenger identity verification security inspection system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3480776A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109448139A * 2018-10-18 2019-03-08 深圳晟道科技有限公司 A gate passage method and system

Also Published As

Publication number Publication date
EP3480776A4 (en) 2019-12-04
US11631152B2 (en) 2023-04-18
US20190180398A1 (en) 2019-06-13
JP7178343B2 (ja) 2022-11-25
JP2019534491A (ja) 2019-11-28
EP3480776A1 (en) 2019-05-08
CN107958435A (zh) 2018-04-24

Similar Documents

Publication Publication Date Title
WO2018072520A1 (zh) Security check system and method for configuring security check device
US11259718B1 (en) Systems and methods for automated body mass index calculation to determine value
CN109214274B (zh) 一种机场安保管理系统
Finn et al. Seven types of privacy
US10748217B1 (en) Systems and methods for automated body mass index calculation
Eastwood et al. Biometric-enabled authentication machines: A survey of open-set real-world applications
Perrinet et al. Edge co-occurrences can account for rapid categorization of natural versus animal images
CN114692593B (zh) 一种网络信息安全监测预警方法
Naseri et al. Optimized face detector-based intelligent face mask detection model in IoT using deep learning approach
Twitchell et al. Advancing the assessment of automated deception detection systems: Incorporating base rate and cost into system evaluation
Montasari The application of big data predictive analytics and surveillance technologies in the field of policing
Păvăloaia et al. Tracking unauthorized access using machine learning and PCA for face recognition developments
Lai et al. Mass evidence accumulation and traveler risk scoring engine in e-border infrastructure
Moll et al. Fiber orientation measurement of fiber injection molded nonwovens by image analysis
Wambura et al. Deep and confident image analysis for disease detection
JP7388532B2 (ja) 処理システム、処理方法及びプログラム
Andronikou et al. Biometric profiling: Opportunities and risks
Kazemian et al. Comparisons of machine learning techniques for detecting fraudulent criminal identities
Lin et al. Establishment of Biometric Verification System Based on Design Science Research Methodology and Sensing System for Smart Border Control.
Nethravathi et al. Acne Vulgaris Severity Analysis Application
Zhang et al. An effective method for the abnormal monitoring of stage performance based on visual sensor network
Khan et al. Rapid Face Mask Detection and Person Identification Model Based on Deep Neural Networks
Pandey et al. Measure the Performance by Analysis of Different Boosting algorithms on Various Patterns of Phishing datasets
CN115063940B (zh) 一种风险监控方法及装置、存储介质及电子设备
Rudnichenko et al. Hybrid Intelligent System for Recognizing Biometric Personal Data.

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17861967

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017861967

Country of ref document: EP

Effective date: 20190130

Ref document number: 2019507210

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE