WO2020230766A1 - Security system - Google Patents

Security system

Info

Publication number
WO2020230766A1
Authority
WO
WIPO (PCT)
Prior art keywords
data processing
processing unit
image
space
security system
Prior art date
Application number
PCT/JP2020/018874
Other languages
French (fr)
Japanese (ja)
Inventor
泉 井川
砂川 隆一
政志 三木
和憲 芳賀
Original Assignee
Taiyo Yuden Co., Ltd. (太陽誘電株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taiyo Yuden Co., Ltd. (太陽誘電株式会社)
Priority to JP2021519428A priority Critical patent/JP7467434B2/en
Publication of WO2020230766A1 publication Critical patent/WO2020230766A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V3/00Electric or magnetic prospecting or detecting; Measuring magnetic field characteristics of the earth, e.g. declination, deviation
    • G01V3/12Electric or magnetic prospecting or detecting; Measuring magnetic field characteristics of the earth, e.g. declination, deviation operating with electromagnetic waves
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • An embodiment of the present invention relates to a security system.
  • Since the walk-through type dangerous goods detection device inspects people by having them pass through a gate one by one, there is a concern that congestion will occur due to waiting in line, even if the inspection time for each person is shortened. Further, the walk-through type device makes the person passing through the gate aware that an inspection is being performed, which gives a feeling of oppression.
  • The present invention has been made in view of such circumstances, and one of its purposes is to provide a security system that can inspect many people at the same time without making them aware that an inspection is being performed, without giving a feeling of oppression, and that can alleviate the congestion associated with inspections.
  • The security system of the embodiment includes a plurality of millimeter-wave sensors, a sensor data processing unit, a camera, a video data processing unit, and a display device.
  • The plurality of millimeter-wave sensors are arranged at intervals behind a structure that partitions a space through which a plurality of people can pass.
  • The sensor data processing unit detects the position of an inspection object existing in the three-dimensional space based on composite data obtained by synthesizing the output data of the plurality of millimeter-wave sensors.
  • The camera captures the space from a predetermined position.
  • The video data processing unit generates a video in which the position on the image captured by the camera that corresponds to the position of the inspection object detected by the sensor data processing unit is marked.
  • The display device displays the video generated by the video data processing unit.
  • FIG. 1 is a diagram for explaining a usage example of the security system of the embodiment, and is a plan view of a facility in which the security system is introduced.
  • FIG. 2 is a diagram for explaining a usage example of the security system of the embodiment, and is a vertical cross-sectional view taken along line A-A of FIG. 1.
  • FIG. 3 is a block diagram showing a configuration example of the security system of the embodiment.
  • FIG. 4 is a block diagram showing a configuration example of the sensor data processing unit.
  • FIG. 5 is a block diagram showing a configuration example of the video data processing unit.
  • FIG. 6 is a diagram showing an example of a surveillance image.
  • FIG. 7-1 is a diagram showing an example of the surveillance video.
  • FIG. 7-2 is a diagram showing an example of the surveillance video.
  • FIG. 8 is a block diagram showing another configuration example of the video data processing unit.
  • FIG. 9 is a diagram showing an example of a surveillance image.
  • FIG. 10 is a flowchart showing a processing procedure repeatedly executed by the sensor data processing unit and the video data processing unit.
  • The security system of the present embodiment is a system that monitors inspection objects using millimeter wave sensors in a space through which a plurality of people can pass.
  • The space is, for example, a three-dimensional space.
  • The three-dimensional space to be monitored (hereinafter referred to as the "monitored space") is set in, for example, a facility such as an airport that can be a target of terrorism.
  • For example, an area through which an unspecified number of people pass, such as an area connecting the airport lobby to a security zone with a gate, is set as the monitored space.
  • The space monitored by the security system of this embodiment is not limited to airports; it may be set in a facility where many people gather, such as a shopping mall, a concert venue, a school, or a church. Since these facilities are also easily targeted by terrorism, introducing the security system of this embodiment there is highly useful.
  • The objects to be inspected are, for example, firearms, knives (cutlery), explosives, and the like.
  • The security system of this embodiment is effective not only as a countermeasure against terrorism but also as a countermeasure against information leakage, for example in office buildings where corporate activities are carried out. That is, in such office buildings, bringing in camera-equipped mobile phones, smartphones, and the like may be prohibited in order to prevent leakage of confidential information. If baggage inspection or the like were performed on everyone visiting the office building, congestion could result. Therefore, it is effective to introduce the security system of the present embodiment into such an office building and to monitor the carrying-in of camera-equipped mobile phones and smartphones, with the entrance of the building as the monitored space.
  • The security system of the present embodiment is not intended to pinpoint the inspection object itself, but to identify a person suspected of possessing the inspection object and inform a guard or the like of that person. If the guards can be informed of a suspected person, they can guide that person to a place where a detailed inspection such as a visual physical examination, baggage inspection, or X-ray inspection can be performed. In this case, freedom of movement is restricted only for the person suspected of possessing the inspection object, and everyone else can, for example, enter the security zone of the airport as they are.
  • In this way, a simple inspection is performed on the plurality of people existing in the monitored space, a person suspected of possessing the inspection object is identified, and only that person is guided to a detailed inspection, so that the congestion associated with inspections, which is a concern in the prior art, can be alleviated.
  • The security system of the present embodiment is configured to monitor the monitored space using the output data of a plurality of millimeter-wave sensors discretely arranged on the back side of a structure, such as a wall panel or a ceiling panel, that partitions the monitored space. Therefore, a person in the monitored space is not made aware that an inspection is being performed and is not given a feeling of oppression.
  • Hereinafter, the security system of the present embodiment will be described in detail assuming an example in which, in a facility having a security zone such as an airport, an area connected to the security zone is set as the monitored space.
  • FIGS. 1 and 2 are views for explaining a usage example of the security system of the present embodiment; FIG. 1 is a plan view of a facility in which the security system is introduced, and FIG. 2 is a vertical sectional view taken along line A-A of FIG. 1.
  • The monitored space 100 is set in an area connected to the security zone 110 of the facility.
  • The security zone 110 is a section in which the guard 200 stays to ensure safety.
  • A plurality of millimeter wave sensors 10 are discretely arranged on the back side of a structure such as the wall panel 101 or the ceiling panel 102 that forms the boundary of the monitored space 100.
  • The millimeter wave sensor 10 is a sensor that irradiates radio waves in a frequency band of, for example, 24 GHz to 100 GHz forward and, based on the result of receiving the reflected waves from objects in front, acquires information such as the distance to the object, the angle, and the reflection intensity.
  • Strictly speaking, millimeter waves are electromagnetic waves in the 30 GHz to 300 GHz frequency band, but in the present embodiment the millimeter wave sensor 10 is also taken to include sensors using quasi-millimeter waves (24 GHz and above) close to the millimeter wave band.
  • There is also a passive type millimeter wave sensor 10 that receives millimeter waves radiated from an object without irradiating radio waves, and such a sensor may be used; in the present embodiment, however, an active type millimeter wave sensor 10 that irradiates radio waves and receives the reflected waves is assumed.
  • The plurality of millimeter-wave sensors 10 are discretely arranged on the back side of structures such as the wall panel 101 and the ceiling panel 102 that form the boundary of the monitored space 100, so that a large detection range covering the entire monitored space 100 is formed by integrating the detection ranges of the individual sensors. Each individual millimeter wave sensor 10 is installed on the back side of the wall panel 101 or the ceiling panel 102 with its detection direction facing the monitored space 100, so that these millimeter wave sensors 10 cannot be visually recognized from within the monitored space 100.
  • The broken-line squares in FIG. 1 indicate the positions at which the millimeter wave sensors 10 installed on the back side of the ceiling panel 102 project onto the floor surface.
  • The number of millimeter-wave sensors 10 installed on the back side of the wall panel 101 and the ceiling panel 102 may be any number such that, given the relationship between the size of the monitored space 100 and the detection range of each millimeter-wave sensor 10, the integrated detection ranges cover the entire monitored space 100.
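As a rough illustration only (not from the publication), the required sensor count for a ceiling array can be estimated from the room dimensions and each sensor's detection footprint. All numbers below are hypothetical.

```python
import math

def sensors_needed(room_w_m, room_d_m, mount_h_m, half_angle_deg):
    """Estimate how many ceiling-mounted sensors cover a rectangular floor.

    Assumption: each sensor sees a circular footprint on the floor of
    radius h * tan(half_angle). A square grid with pitch r * sqrt(2)
    lets the squares inscribed in each circular footprint tile the floor.
    """
    r = mount_h_m * math.tan(math.radians(half_angle_deg))
    pitch = r * math.sqrt(2)  # side of the square inscribed in the footprint
    cols = math.ceil(room_w_m / pitch)
    rows = math.ceil(room_d_m / pitch)
    return rows * cols

# Hypothetical example: 20 m x 10 m area, 3 m ceiling, 45-degree half-angle
count = sensors_needed(20, 10, 3, 45)
```

In practice the count would also depend on sensor overlap requirements and obstructions, which this sketch ignores.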
  • FIG. 1 shows an example in which two surveillance cameras 20 and two display devices 30 are installed, but the number of surveillance cameras 20 and display devices 30 is arbitrary.
  • The surveillance video displayed on the display device 30 is referred to by the guard 200 staying in the security zone 110.
  • When a person suspected of possessing the inspection object is identified, the guard 200 guides the person to the detailed inspection area 120 before the person enters the security zone 110. Then, in the detailed inspection area 120, a visual physical examination, baggage inspection, X-ray inspection, and the like are performed.
  • FIG. 3 is a block diagram showing a configuration example of the security system of the present embodiment.
  • In addition to the plurality of millimeter wave sensors 10, the surveillance camera 20, and the display device 30 described above, the security system of the present embodiment includes a sensor data processing unit 40 that processes the output data of the plurality of millimeter wave sensors 10 and a video data processing unit 50 that processes the video data of the surveillance camera 20.
  • Each of the plurality of millimeter wave sensors 10 transmits its output data to the sensor data processing unit 40 by optical wireless communication. That is, each of the plurality of millimeter-wave sensors 10 and the processor are communicably connected by optical wireless communication.
  • Similarly, the surveillance camera 20 transmits video data to the video data processing unit 50 by optical wireless communication. That is, the surveillance camera 20 and the processor are communicably connected by optical wireless communication.
  • The sensor data processing unit 40 and the video data processing unit 50 are functional units realized by a processor.
  • These units can be realized by a general-purpose processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit) executing a predetermined program.
  • Alternatively, they may be realized using a dedicated processor such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • The sensor data processing unit 40 detects the position of the inspection object existing in the monitored space 100 based on composite data obtained by synthesizing the output data of the plurality of millimeter wave sensors 10 described above. In the present embodiment, the case where the sensor data processing unit 40 detects the three-dimensional position of the inspection object will be described as an example.
  • FIG. 4 shows a configuration example of the sensor data processing unit 40. As shown in FIG. 4, for example, the sensor data processing unit 40 includes a data synthesis unit 41 and a detector 42.
  • The data synthesis unit 41 synthesizes the output data of the plurality of millimeter wave sensors 10 described above and generates composite data corresponding to sensing data for the entire monitored space 100.
  • The output data of each of the plurality of millimeter-wave sensors 10 is sensing data obtained by sensing that sensor's detection range in the monitored space 100 from its installation position.
  • Since the installation position of each millimeter wave sensor 10 is known, the detection range of each sensor in the monitored space 100 can be specified, and it can be determined which part of the monitored space 100 each sensor's output data senses. Therefore, by synthesizing the output data of the millimeter-wave sensors 10 based on their respective installation positions, composite data corresponding to sensing data for the entire monitored space 100 can be generated.
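The synthesis step described above can be sketched as follows. This is an assumption-laden illustration, not the publication's implementation: each sensor is assumed to report detected points in its own local frame, and knowing each sensor's mounting position and yaw lets all points be merged into one room-wide point cloud.

```python
import math

def to_global(sensor_pose, local_points):
    """Transform a sensor's local detections into the room frame.

    sensor_pose: (x, y, z, yaw_rad) of the sensor installation (assumed known).
    local_points: [(x, y, z), ...] in the sensor's local frame.
    """
    sx, sy, sz, yaw = sensor_pose
    c, s = math.cos(yaw), math.sin(yaw)
    merged = []
    for lx, ly, lz in local_points:
        # rotate about the vertical axis, then translate to the mount position
        merged.append((sx + c * lx - s * ly, sy + s * lx + c * ly, sz + lz))
    return merged

def synthesize(sensor_outputs):
    """sensor_outputs: list of (pose, points). Returns one merged cloud."""
    cloud = []
    for pose, points in sensor_outputs:
        cloud.extend(to_global(pose, points))
    return cloud
```

A real system would also have to handle overlapping detection ranges and sensor noise, which this sketch omits.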
  • The detector 42 takes the composite data generated by the data synthesis unit 41 as input, obtains for each object existing in the monitored space 100 a likelihood indicating the certainty that it is an inspection object, and outputs the three-dimensional position of any object whose likelihood exceeds a reference value.
  • The detector 42 is configured by, for example, a DNN (Deep Neural Network).
  • A DNN is a neural network having multiple intermediate layers between an input layer and an output layer.
  • The DNN constituting the detector 42 is trained by sequentially updating the network parameters (the weights and biases in each layer) by the back-propagation method using supervised training data, so as to output, among the objects existing in the monitored space 100, the three-dimensional position of any object whose likelihood of being an inspection object exceeds the reference value.
  • The training data includes composite data generated by the data synthesis unit 41 when an inspection object exists in the monitored space 100, labeled with the three-dimensional position of the inspection object, and a large number of composite data generated when no inspection object exists, labeled as indicating no output; both are used for training the DNN constituting the detector 42.
  • This training data can be created, for example, by operating the security system of the present embodiment on a trial basis before actual operation and generating composite data in various situations.
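The thresholding logic around the detector can be sketched as follows. This is illustrative only: the publication's detector 42 is a trained DNN, so the `score_fn` here is a hypothetical stand-in for the network's forward pass.

```python
def detect(objects, score_fn, reference=0.5):
    """Return positions of objects whose likelihood exceeds the reference.

    objects: iterable of (position, features); score_fn maps features to a
    likelihood in [0, 1]. In the real system this would be the DNN output.
    """
    hits = []
    for position, features in objects:
        likelihood = score_fn(features)  # DNN forward pass in the real system
        if likelihood > reference:
            hits.append(position)
    return hits
```

The reference value (0.5 here) is a hypothetical choice; in practice it would be tuned against the trade-off between missed detections and false alarms.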
  • Information on the three-dimensional position of an object estimated to be an inspection target in the monitoring target space 100 output by the detector 42 is input to the video data processing unit 50.
  • The video data processing unit 50 marks, on the video of the monitored space 100 captured by the surveillance camera 20, the position corresponding to the three-dimensional position output from the detector 42 of the sensor data processing unit 40, that is, the three-dimensional position of the inspection object in the monitored space 100, and generates the surveillance video to be displayed on the display device 30.
  • FIG. 5 shows a configuration example of the video data processing unit 50.
  • As shown in FIG. 5, the video data processing unit 50 includes a coordinate conversion unit 51, a marker generation unit 52, and a surveillance video generation unit 53.
  • The coordinate conversion unit 51 converts the three-dimensional position output from the detector 42 of the sensor data processing unit 40, that is, the three-dimensional position of the object estimated to be the inspection object in the monitored space 100, into a position (two-dimensional position) in the two-dimensional coordinate system of the image of the monitored space 100 captured by the surveillance camera 20. Since the position, orientation, angle of view, and so on of the surveillance camera 20 are fixed, the correspondence between a three-dimensional position in the monitored space 100 and each pixel position (two-dimensional position) on the image of the surveillance camera 20 is uniquely determined, and this correspondence can be expressed by a coordinate conversion formula. Using this formula, the coordinate conversion unit 51 converts the three-dimensional position of the object estimated to be the inspection object into the two-dimensional position on the image of the surveillance camera 20.
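One common way to realize such a coordinate conversion formula is a pinhole projection with a fixed camera pose; this is a hedged sketch, and the intrinsic parameter values below are hypothetical, not taken from the publication.

```python
def project(point_cam, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Project a 3D point onto the image plane of a fixed camera.

    point_cam: (x, y, z) already expressed in the camera frame, with z the
    distance along the optical axis (must be positive). fx, fy are focal
    lengths in pixels; (cx, cy) is the principal point. Returns (u, v).
    """
    x, y, z = point_cam
    if z <= 0:
        raise ValueError("point must lie in front of the camera")
    return (fx * x / z + cx, fy * y / z + cy)
```

If the detector outputs positions in the room frame, a rigid transform into the camera frame (using the camera's known installation position and orientation) would precede this projection.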
  • The marker generation unit 52 calculates the distance from the surveillance camera 20 to the three-dimensional position of the object estimated to be the inspection object in the monitored space 100, based on the three-dimensional position output from the detector 42 of the sensor data processing unit 40 and the known installation position of the surveillance camera 20. The marker generation unit 52 then generates a marker whose size corresponds to the calculated distance, that is, a marker that becomes larger as the three-dimensional position of the object is closer to the installation position of the surveillance camera 20.
  • The marker may be generated in any form that stands out when superimposed on the image of the surveillance camera 20; its shape, color, brightness, and the like are arbitrary.
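The distance-dependent marker sizing can be sketched with a simple inverse-distance rule. This scaling law and all constants are assumptions for illustration; the publication only states that the marker grows as the object approaches the camera.

```python
def marker_radius_px(distance_m, base_px=200.0, min_px=8.0, max_px=120.0):
    """Marker radius in pixels: larger when the object is nearer the camera.

    base_px, min_px, max_px are hypothetical tuning constants; clamping
    keeps the marker visible at long range and bounded at short range.
    """
    radius = base_px / max(distance_m, 0.1)  # inverse to distance
    return max(min_px, min(max_px, radius))
```

An inverse-distance rule matches how apparent object size behaves under perspective projection, which keeps the marker roughly proportional to the person's size on screen.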
  • The surveillance video generation unit 53 generates the surveillance video by superimposing the marker generated by the marker generation unit 52 on the image of the monitored space 100 captured by the surveillance camera 20, at the position obtained by the coordinate conversion of the coordinate conversion unit 51.
  • The surveillance video generated by the surveillance video generation unit 53 is displayed on the display device 30. If no inspection object exists in the monitored space 100 and no three-dimensional position is output from the detector 42 of the sensor data processing unit 40, the surveillance video generation unit 53 outputs the image of the monitored space 100 captured by the surveillance camera 20 to the display device 30 as it is. In this case, the image of the monitored space 100 captured by the surveillance camera 20 is displayed unmodified on the display device 30.
  • The sensor data processing unit 40 and the video data processing unit 50 in the security system of the present embodiment repeatedly execute the above processing at a predetermined cycle based on the frame cycle of the surveillance camera 20, for example, a cycle matching the frame cycle, 1/2 of the frame cycle, or 1/3 of the frame cycle. As a result, a surveillance video in which the marker moves in accordance with the movement of the image in the monitored space 100 captured by the surveillance camera 20 is generated continuously and displayed on the display device 30.
  • FIG. 6 is a diagram showing an example of the surveillance image 60 displayed on the display device 30.
  • As shown in FIG. 6, the marker 70 is superimposed on the surveillance video 60 displayed on the display device 30 at the position corresponding to the three-dimensional position of the object presumed to be the inspection object. Therefore, the guard 200 who refers to the surveillance video 60 can identify the person 90 suspected of possessing the inspection object from among the persons in the monitored space 100, guide the person 90 to the detailed inspection area 120, and perform a visual physical examination, baggage inspection, X-ray inspection, and the like.
  • When one person is behind another as seen from the surveillance camera 20, the person behind is blocked by the person in front and cannot be clearly seen on the image.
  • Even in this case, as shown in FIG. 7-1, the marker 70 is superimposed on the position on the image corresponding to the three-dimensional position of the object presumed to be the inspection object, so the marker 70 may appear to be superimposed on the person in front. However, since the size of the marker 70 corresponds to the distance from the surveillance camera 20, by comparing the size of the marker 70 with the size of the person on the image, it is possible to accurately determine whether or not the person on which the marker 70 appears superimposed is the person 90 suspected of possessing the inspection object. Further, when the person who was shielded by the person in front in the surveillance video 60 of FIG. 7-1 later approaches the position of the surveillance camera 20, the surveillance video 60 in which a comparatively large marker 70 is superimposed on that person is displayed as shown in FIG. 7-2, so this person can be identified as the person 90 suspected of possessing the inspection object.
  • Instead of superimposing the marker 70, the person suspected of possessing the inspection object may be highlighted on the video; FIG. 8 shows a configuration example of the video data processing unit 50 in this case.
  • The video data processing unit 50 shown in FIG. 8 includes a person detection and tracking unit 54 in place of the marker generation unit 52 described above.
  • When the three-dimensional position of the object estimated to be the inspection object is output from the detector 42 of the sensor data processing unit 40, the person detection and tracking unit 54 detects persons by analyzing the image of the monitored space 100 captured by the surveillance camera 20, and identifies the person appearing at the position obtained by the coordinate conversion of the coordinate conversion unit 51 as the person to be emphasized. Since a person shielded by a person in front may exist at the position obtained by the coordinate conversion, the image of the surveillance camera 20 is analyzed repeatedly.
  • For the person detection, a known person detection algorithm using image features such as facial features may be used.
  • Further, once the person to be emphasized is identified, that person may be tracked on the video by a known object tracking algorithm that tracks an object moving between frames. In this case, the processing after identifying the person to be emphasized can be simplified.
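The matching step alone, between the converted pixel position and the detected persons, can be sketched as follows. This is an assumed implementation detail: person detection itself would come from an external detector, represented here only by its bounding-box output.

```python
def person_at(pixel, person_boxes):
    """Find which detected person contains the converted pixel position.

    pixel: (u, v) from the coordinate conversion.
    person_boxes: list of (left, top, right, bottom) bounding boxes from a
    person detector (hypothetical input format).
    Returns the index of the first box containing the pixel, or None.
    """
    u, v = pixel
    for i, (left, top, right, bottom) in enumerate(person_boxes):
        if left <= u <= right and top <= v <= bottom:
            return i
    return None
```

Returning None corresponds to the shielded-person case described above, where the system keeps re-analyzing subsequent frames until a person appears at the converted position.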
  • The surveillance video generation unit 53 applies highlight processing to the person identified by the person detection and tracking unit 54 on the image of the monitored space 100 captured by the surveillance camera 20, generates the surveillance video 60 in which the person is emphasized, and displays it on the display device 30.
  • An example of the surveillance video 60 in this case is shown in FIG. 9.
  • The guard 200 who refers to this surveillance video 60 can identify the person 90 suspected of possessing the inspection object from among the persons in the monitored space 100, guide the person 90 to the detailed inspection area 120, and perform a visual physical examination, baggage inspection, X-ray inspection, and the like.
  • The highlight processing 80 may be any processing that can emphasize, on the surveillance video 60, the person 90 suspected of possessing the inspection object. For example, any method may be used, such as increasing the brightness of the area in which the person 90 appears to make the person 90 stand out, giving a blinking effect by alternately increasing and decreasing the brightness of that area, or coloring the area with a predetermined color.
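A minimal sketch of the brightness-based variant of highlight processing 80 follows. It operates on a grayscale frame represented as a list of rows for self-containment; a real system would process camera frames with an image library, and the gain value is a hypothetical choice.

```python
def highlight(frame, box, gain=1.5):
    """Brighten the rectangular region in which the person appears.

    frame: grayscale image as a list of rows of 0-255 ints.
    box: (left, top, right, bottom) region to emphasize (right/bottom exclusive).
    Returns a new frame; the input frame is left unmodified.
    """
    left, top, right, bottom = box
    out = [row[:] for row in frame]
    for y in range(top, bottom):
        for x in range(left, right):
            out[y][x] = min(255, int(out[y][x] * gain))  # clamp to 8-bit range
    return out
```

The blinking effect mentioned above could be obtained by alternating the gain above and below 1.0 on successive frames.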
  • FIG. 10 is a flowchart showing a processing procedure repeatedly executed by the sensor data processing unit 40 and the video data processing unit 50 at a predetermined cycle in the security system of the present embodiment.
  • First, the data synthesis unit 41 of the sensor data processing unit 40 synthesizes the output data of the plurality of millimeter-wave sensors 10 discretely arranged behind the wall panel 101 and the ceiling panel 102, which form the boundary of the monitored space 100, to generate the composite data (step S101).
  • Next, the composite data generated in step S101 is input to the detector 42 configured by a DNN or the like (step S102).
  • As described above, the detector 42 has been trained to take the composite data as input, obtain for each object existing in the monitored space 100 a likelihood indicating the certainty that it is the inspection object, and output the three-dimensional position of any object whose likelihood exceeds the reference value.
  • Then, it is confirmed whether or not the three-dimensional position of an object presumed to be the inspection object has been output from the detector 42 to which the composite data was input in step S102 (step S103).
  • If no three-dimensional position is output (step S103: No), the surveillance video generation unit 53 (53') of the video data processing unit 50 displays the image of the monitored space 100 captured by the surveillance camera 20 on the display device 30 as it is (step S104).
  • On the other hand, if the three-dimensional position is output (step S103: Yes), the coordinate conversion unit 51 of the video data processing unit 50 converts the three-dimensional position output from the detector 42 into a position (two-dimensional position) in the two-dimensional coordinate system of the image of the surveillance camera 20 (step S105).
  • Then, the surveillance video generation unit 53 generates the surveillance video 60 in which the position obtained by the coordinate conversion of the coordinate conversion unit 51 on the image of the surveillance camera 20, that is, the position where the object presumed to be the inspection object exists, is marked, and displays the surveillance video 60 on the display device 30 (step S106).
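The flow of steps S101 to S106 can be sketched as a single per-frame pass. All of the component functions here are hypothetical placeholders standing in for units 41 to 53, injected as parameters so the control flow itself is what the sketch shows.

```python
def process_frame(sensor_outputs, camera_frame,
                  synthesize, detect, to_pixel, draw_marker):
    """One cycle of the monitoring loop (steps S101-S106, sketched).

    synthesize, detect, to_pixel, draw_marker are stand-ins for the data
    synthesis unit 41, detector 42, coordinate conversion unit 51, and
    surveillance video generation unit 53, respectively.
    """
    composite = synthesize(sensor_outputs)   # S101: merge sensor data
    positions = detect(composite)            # S102-S103: positions, if any
    if not positions:
        return camera_frame                  # S104: pass video through as-is
    marked = camera_frame
    for pos3d in positions:
        pixel = to_pixel(pos3d)              # S105: 3D -> 2D conversion
        marked = draw_marker(marked, pixel)  # S106: superimpose marker
    return marked
```

In the real system this pass would repeat at a cycle derived from the camera's frame period, as the description notes.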
  • By referring to this surveillance video 60, the guard 200 can identify the person 90 suspected of possessing the inspection object from among the persons in the monitored space 100, guide the person 90 to the detailed inspection area 120, and perform a visual physical examination, baggage inspection, X-ray inspection, and the like.
  • As described above, in the security system of the present embodiment, a space through which a plurality of people can pass is defined as the monitored space 100, and a plurality of millimeter wave sensors 10 are arranged at intervals on the back side of a structure such as the wall panel 101 or the ceiling panel 102 that partitions the monitored space 100. Then, the output data of the plurality of millimeter-wave sensors 10 are combined to generate composite data, and the position of the inspection object existing in the monitored space 100 is detected based on the composite data.
  • Further, the surveillance camera 20 captures the monitored space 100, and when the position of an inspection object existing in the monitored space 100 is detected, the surveillance video 60 is generated by marking, with a predetermined mark, the position on the image of the monitored space 100 captured by the surveillance camera 20 that corresponds to the position of the inspection object. By referring to this surveillance video 60, the guard 200 or the like can identify a person suspected of possessing the inspection object.
  • That is, instead of inspecting each person in turn as in the conventional walk-through type dangerous substance detection device, the security system of the present embodiment identifies a person suspected of possessing the inspection object from among the plurality of people in the monitored space 100 by using the plurality of millimeter wave sensors 10 arranged at intervals on the back side of the structure that partitions the monitored space 100. Therefore, according to the security system of the present embodiment, many people can be inspected at the same time without being made aware that an inspection is being performed, no feeling of oppression is given, and the congestion associated with inspections can be alleviated.

Abstract

A security system of the present embodiment is provided with: a plurality of millimetre wave sensors (10) which are disposed at intervals at the back of a structure partitioning a space through which a plurality of people can pass; a sensor data processing unit (40) for detecting the position of an inspection object which is present within the space, on the basis of synthesized data obtained by synthesising output data of the plurality of millimetre wave sensors (10); cameras (20) for imaging the space from predetermined positions; image data processing units (50) for generating images which are the result of creating a mark at a position in the images captured by the cameras (20), such position corresponding to the position of the inspection object detected by the sensor data processing unit (40); and display devices (30) for displaying the images generated by the image data processing units (50).

Description

Security system
An embodiment of the present invention relates to a security system.
In recent years, security measures at hard targets such as airports have been strengthened for purposes such as preventing terrorism. Conventional security measures consist mainly of X-ray inspection and visual baggage inspection. The problem is that X-ray inspection and visual baggage inspection take time for each person, causing people to accumulate and crowd. Therefore, as one approach to easing the accumulation of people caused by inspection, walk-through type dangerous-substance detection devices such as those shown in Patent Document 1 and Non-Patent Document 1 have been proposed. In particular, since the dangerous-substance detection device of Non-Patent Document 1 is mobile, it is expected to be an effective means of strengthening security measures for soft targets with comparatively light security.
Japanese National-Phase Publication No. 2001-521157 (特表2001-521157号公報)
However, since a walk-through type dangerous-substance detection device inspects people by passing them through a gate one at a time, even if each person's inspection time is shortened, there is a concern that congestion will arise from waiting in line. Moreover, a walk-through type dangerous-substance detection device makes people passing through the gate aware that they are being inspected, which gives a feeling of oppression.
The present invention has been made in view of these circumstances, and one of its objects is to provide a security system that can inspect many people at the same time without making them aware that an inspection is taking place, that gives no feeling of oppression, and that alleviates the congestion accompanying inspection.
The security system of the embodiment includes a plurality of millimeter-wave sensors, a sensor data processing unit, a camera, a video data processing unit, and a display device. The plurality of millimeter-wave sensors are arranged at intervals behind a structure that partitions a space through which a plurality of people can pass. The sensor data processing unit detects the position of an inspection object present within the three-dimensional space on the basis of composite data obtained by synthesizing the output data of the plurality of millimeter-wave sensors. The camera captures the space from a predetermined position. The video data processing unit generates a video in which a mark is applied at the position, on the video captured by the camera, that corresponds to the position of the inspection object detected by the sensor data processing unit. The display device displays the video generated by the video data processing unit.
FIG. 1 is a diagram illustrating a usage example of the security system of the embodiment, and is a plan view of a facility into which the security system has been introduced. FIG. 2 is a diagram illustrating a usage example of the security system of the embodiment, and is a vertical cross-sectional view taken along line A-A of FIG. 1. FIG. 3 is a block diagram showing a configuration example of the security system of the embodiment. FIG. 4 is a block diagram showing a configuration example of the sensor data processing unit. FIG. 5 is a block diagram showing a configuration example of the video data processing unit. FIG. 6 is a diagram showing an example of a surveillance video. FIG. 7-1 is a diagram showing an example of a surveillance video. FIG. 7-2 is a diagram showing an example of a surveillance video. FIG. 8 is a block diagram showing another configuration example of the video data processing unit. FIG. 9 is a diagram showing an example of a surveillance video. FIG. 10 is a flowchart illustrating a processing procedure of the security system of the embodiment.
Hereinafter, specific embodiments of the security system according to the present invention will be described in detail with reference to the accompanying drawings. The security system of the present embodiment monitors for inspection objects using millimeter-wave sensors in a space through which a plurality of people can pass. The space is, for example, a three-dimensional space, and the following description takes the case of a three-dimensional space as an example. The three-dimensional space to be monitored (hereinafter called the "monitored space") is set in a facility that could be a target of terrorism, such as an airport. For example, when the security system of the present embodiment is introduced into an airport, it is effective to set as the monitored space an area through which an unspecified large number of people pass, such as the area connecting the airport lobby to the security zone containing the gates.
The space monitored by the security system of the present embodiment is not limited to an airport; it may be set in any facility where many people gather, such as a shopping mall, concert venue, school, or church. Since these facilities are also easy targets for terrorism, introducing the security system of the present embodiment there is highly useful. When the security system of the present embodiment is introduced for anti-terrorism purposes, the inspection objects are guns, firearms, knives, explosives, and the like.
In addition, the security system of the present embodiment is effective not only against terrorism but also, for example, as a measure against information leakage in office buildings where corporate activities take place. That is, in such office buildings, bringing in camera-equipped mobile phones, smartphones, and the like is sometimes prohibited to prevent leakage of confidential information. Inspecting the baggage of every visitor to the building, however, risks causing congestion. It is therefore effective to introduce the security system of the present embodiment into an office building where corporate activities take place and to monitor for camera-equipped mobile phones, smartphones, and the like being brought in, using the building's entrance or the like as the monitored space.
The security system of the present embodiment is not intended to find inspection objects with complete accuracy; its purpose is to identify a person suspected of possessing an inspection object and to report that person to a security guard or the like. If a person suspected of possessing an inspection object can be reported to a security guard, the guard can lead that person to a place where a visual body search, baggage inspection, X-ray inspection, or the like is performed, and examine in detail whether the person actually possesses an inspection object. In this case, nobody other than the person suspected of possessing an inspection object is detained, and everyone else can proceed directly into, for example, the airport's security zone. In this way, the security system of the present embodiment performs a simple inspection of the multiple people present in the monitored space, identifies a person suspected of possessing an inspection object, and directs only that person to a detailed inspection; it can therefore ease the congestion accompanying inspection, which is a concern with the prior art.
Furthermore, in the security system of the present embodiment, a plurality of millimeter-wave sensors are discretely arranged on the back side of structures such as the wall panels and ceiling panels that partition the monitored space, and the monitored space is monitored using the output data of these sensors. People in the monitored space are therefore not made aware that an inspection is taking place and feel no oppression. In the following, the security system of the present embodiment is described in detail assuming an example in which, at a facility having a security zone such as an airport, the area leading to the security zone is set as the monitored space.
FIGS. 1 and 2 illustrate a usage example of the security system of the present embodiment: FIG. 1 is a plan view of a facility into which the security system has been introduced, and FIG. 2 is a vertical cross-sectional view taken along line A-A of FIG. 1. In the example shown in FIGS. 1 and 2, the monitored space 100 is set in the area leading to the facility's security zone 110. The security zone 110 is a zone in which security guards 200 are stationed to ensure safety.
A plurality of millimeter-wave sensors 10 are discretely arranged on the back side of structures such as the wall panels 101 and ceiling panels 102 that form the boundary of the monitored space 100. A millimeter-wave sensor 10 emits radio waves in a frequency band of, for example, 24 GHz to 100 GHz toward the space in front of it, and acquires information such as the distance, angle, and reflection intensity of an object ahead based on the reflected waves returned by that object. Strictly speaking, millimeter waves are electromagnetic waves in the 30 GHz to 300 GHz band, but in the present embodiment devices handling near-millimeter microwaves (24 GHz and above) are also called millimeter-wave sensors 10. There are also passive millimeter-wave sensors 10 that receive millimeter waves radiated from objects without emitting radio waves, and such sensors may be used, but the present embodiment assumes active millimeter-wave sensors 10 that emit radio waves and receive the reflections.
The plurality of millimeter-wave sensors 10 are discretely arranged on the back side of the wall panels 101, ceiling panels 102, and other structures bounding the monitored space 100 such that integrating the individual sensors' detection ranges yields one large detection range covering the entire monitored space 100. Each millimeter-wave sensor 10 is installed behind a wall panel 101 or ceiling panel 102 with its detection direction facing the monitored space 100, so the sensors cannot be seen from inside the monitored space 100. The broken-line squares in FIG. 1 indicate the floor projections of the millimeter-wave sensors 10 installed behind the ceiling panels 102.
The number of millimeter-wave sensors 10 installed behind the wall panels 101 and ceiling panels 102 may be chosen, from the relationship between the size of the monitored space 100 and each sensor's detection range, so that the integrated detection ranges cover the entire monitored space 100.
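The sensor count described above can be estimated from the area to be covered and one sensor's floor footprint. The sketch below is purely illustrative: the patent gives no coverage figures, so the circular-footprint approximation, the field-of-view angle, and the overlap margin are all assumptions.

```python
import math

def sensors_needed(space_w_m, space_d_m, sensor_range_m, fov_deg, overlap=1.2):
    """Rough count of ceiling-mounted sensors needed to cover a rectangular
    monitored space. One sensor's floor footprint is approximated as a circle
    of radius range * tan(fov/2); 'overlap' (> 1) adds margin so that adjacent
    footprints overlap rather than merely tile the floor."""
    footprint_r = sensor_range_m * math.tan(math.radians(fov_deg / 2.0))
    footprint_area = math.pi * footprint_r ** 2
    space_area = space_w_m * space_d_m
    return math.ceil(overlap * space_area / footprint_area)
```

For example, a 20 m by 10 m corridor with sensors of 3 m range and a 90-degree field of view would need on the order of nine ceiling sensors under these assumptions.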
As shown in FIG. 1, between the monitored space 100 and the security zone 110 are installed surveillance cameras 20 that capture the monitored space 100 and display devices 30, such as liquid crystal displays, that show the surveillance video (described later) generated from the cameras' footage. The surveillance camera 20 is an example of a camera, and the surveillance video is an example of a video. FIG. 1 shows an example with two surveillance cameras 20 and two display devices 30, but the numbers of surveillance cameras 20 and display devices 30 are arbitrary.
The surveillance video displayed on the display devices 30 is viewed by the security guards 200 stationed in the security zone 110. When a guard 200 viewing this surveillance video finds a person suspected of possessing an inspection object, the guard leads that person to the detailed inspection area 120 before the person enters the security zone 110. In the detailed inspection area 120, a visual body search, baggage inspection, X-ray inspection, and the like are then performed.
FIG. 3 is a block diagram showing a configuration example of the security system of the present embodiment. As shown in FIG. 3, in addition to the plurality of millimeter-wave sensors 10, surveillance cameras 20, and display devices 30 described above, the security system of the present embodiment includes a sensor data processing unit 40 that processes the output data of the millimeter-wave sensors 10 and a video data processing unit 50 that processes the video data of the surveillance cameras 20. Each of the millimeter-wave sensors 10 transmits its output data to the sensor data processing unit 40 by optical wireless communication; that is, each of the sensors and the processor are communicably connected by optical wireless communication. Likewise, the surveillance camera 20 transmits its video data to the video data processing unit 50 by optical wireless communication; that is, the camera and the processor are communicably connected by optical wireless communication.
The sensor data processing unit 40 and the video data processing unit 50 are functional units realized by a processor. For example, they can be realized by a general-purpose processor such as a CPU (Central Processing Unit) or GPU (Graphics Processing Unit) executing a predetermined program, or by a dedicated processor such as an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array).
The sensor data processing unit 40 detects the position of an inspection object present in the monitored space 100 based on composite data obtained by synthesizing the output data of the plurality of millimeter-wave sensors 10 described above. The present embodiment is described taking as an example the case where the sensor data processing unit 40 detects the three-dimensional position of the inspection object. FIG. 4 shows a configuration example of the sensor data processing unit 40, which comprises a data synthesis unit 41 and a detector 42, as shown in FIG. 4.
The data synthesis unit 41 synthesizes the output data of the plurality of millimeter-wave sensors 10 to generate composite data corresponding to sensing data for the entire monitored space 100. Each sensor's output data is sensing data obtained by sensing its detection range within the monitored space 100 from its installation position. Since the installation position of each millimeter-wave sensor 10 is known, its detection range within the monitored space 100 can be identified from that position, and it is known which part of the monitored space 100 each sensor's output data senses. Therefore, by combining the output data of the sensors based on their respective installation positions, composite data corresponding to sensing data for the entire monitored space 100 can be generated.
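The synthesis step relies only on each sensor's known mounting pose. The patent does not specify the data format, so the following is a hedged sketch: per-sensor point detections (position plus reflection intensity) are assumed, and each is transformed into a shared room frame using the sensor's known installation position and an assumed yaw angle.

```python
import math

def sensor_to_room(point_local, sensor_pos, sensor_yaw_deg):
    """Rotate an (x, y, z) point from a sensor's local frame by the sensor's
    yaw angle, then translate by the sensor's known installation position."""
    x, y, z = point_local
    th = math.radians(sensor_yaw_deg)
    xr = x * math.cos(th) - y * math.sin(th)
    yr = x * math.sin(th) + y * math.cos(th)
    sx, sy, sz = sensor_pos
    return (xr + sx, yr + sy, z + sz)

def synthesize(detections_per_sensor, sensor_poses):
    """Merge per-sensor detections into one list in room coordinates.

    detections_per_sensor: {sensor_id: [(x, y, z, reflection_intensity), ...]}
    sensor_poses: {sensor_id: ((x, y, z), yaw_deg)} -- fixed at install time
    """
    merged = []
    for sid, points in detections_per_sensor.items():
        pos, yaw = sensor_poses[sid]
        for (x, y, z, intensity) in points:
            merged.append(sensor_to_room((x, y, z), pos, yaw) + (intensity,))
    return merged
```

Because the poses are fixed at installation, the merged list directly corresponds to sensing data for the whole monitored space, as the paragraph above describes.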
The detector 42 takes the composite data generated by the data synthesis unit 41 as input, computes for each object present in the monitored space 100 a likelihood indicating how probable it is that the object is an inspection object, and outputs the three-dimensional positions of objects whose likelihood exceeds a reference value. For this detector 42, a DNN (Deep Neural Network) trained by deep learning can be used, for example. A DNN is a neural network with multiple intermediate layers between its input and output layers. The DNN constituting the detector 42 is trained, by backpropagation on supervised training data that sequentially updates the network parameters (the weights and biases of each layer), to output the three-dimensional positions of those objects in the monitored space 100 whose likelihood of being an inspection object exceeds the reference value.
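The detector's output contract — emit a three-dimensional position only when the likelihood exceeds the reference value — can be expressed independently of the network itself. A minimal sketch follows; the scored pairs stand in for the DNN's raw outputs, and the 0.8 reference value is an illustrative assumption, not a figure from the patent.

```python
def positions_over_threshold(scored_objects, reference_value=0.8):
    """Given (likelihood, (x, y, z)) pairs produced by the detector network,
    return only the 3-D positions whose likelihood exceeds the reference
    value; all other objects produce no output."""
    return [pos for (likelihood, pos) in scored_objects
            if likelihood > reference_value]
```

Raising or lowering the reference value trades false alarms against missed detections, which is a tuning decision left open by the description.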
In this case, the training data consists of many examples of two kinds: composite data generated by the data synthesis unit 41 while an inspection object is present in the monitored space 100, labeled with the inspection object's three-dimensional position; and composite data generated while no inspection object is present, labeled as producing no output. These data are used to train the DNN constituting the detector 42. The training data can be created, for example, by running the security system of the present embodiment on a trial basis before actual operation and generating composite data in various situations. The three-dimensional positions, output by the detector 42, of objects estimated to be inspection objects in the monitored space 100 are input to the video data processing unit 50.
The video data processing unit 50 generates the surveillance video to be displayed on the display device 30 by applying a marking, on the video of the monitored space 100 captured by the surveillance camera 20, at the position corresponding to the three-dimensional position output by the detector 42 of the sensor data processing unit 40, i.e., the three-dimensional position of the inspection object in the monitored space 100. FIG. 5 shows a configuration example of the video data processing unit 50, which comprises a coordinate conversion unit 51, a marker generation unit 52, and a surveillance video generation unit 53, as shown in FIG. 5.
The coordinate conversion unit 51 converts the three-dimensional position output by the detector 42 of the sensor data processing unit 40, i.e., the three-dimensional position of the object estimated to be an inspection object in the monitored space 100, into a position (two-dimensional position) in the two-dimensional coordinate system of the video of the monitored space 100 captured by the surveillance camera 20. Since the position, orientation, angle of view, and so on of the surveillance camera 20 capturing the monitored space 100 are fixed, the correspondence between three-dimensional positions in the monitored space 100 and pixel positions (two-dimensional positions) in the camera's video is uniquely determined and can be expressed by a coordinate conversion formula. Using this coordinate conversion formula, the coordinate conversion unit 51 converts the three-dimensional position of the object estimated to be an inspection object in the monitored space 100 into a two-dimensional position in the video of the surveillance camera 20.
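The patent only states that such a coordinate conversion formula exists; one common concrete choice is a pinhole camera projection. The sketch below assumes a simplified camera with no rotation, looking along the +x axis, with a focal length expressed in pixels — all illustrative assumptions rather than the patent's actual formula.

```python
def project_to_pixel(point_room, cam_pos, focal_px, cx, cy):
    """Project a room-frame (x, y, z) point to image pixel coordinates for a
    camera at cam_pos whose optical axis is the +x direction (simplified
    pinhole model: no rotation; (cx, cy) is the image center in pixels)."""
    X = point_room[0] - cam_pos[0]   # depth along the optical axis
    Y = point_room[1] - cam_pos[1]   # horizontal offset
    Z = point_room[2] - cam_pos[2]   # vertical offset
    if X <= 0:
        return None                  # behind the camera: not visible
    u = cx + focal_px * Y / X        # pixel column
    v = cy - focal_px * Z / X        # pixel row (image y grows downward)
    return (u, v)
```

A point on the optical axis lands at the image center, and lateral offsets shrink with depth, reproducing the fixed one-to-one correspondence the paragraph above describes.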
The marker generation unit 52 calculates the distance from the surveillance camera 20 to the three-dimensional position of the object estimated to be an inspection object in the monitored space 100, based on the three-dimensional position output by the detector 42 of the sensor data processing unit 40 and the known installation position of the surveillance camera 20. The marker generation unit 52 then generates a marker sized according to the calculated distance, i.e., a marker that becomes larger the closer the object's three-dimensional position is to the camera's installation position. The marker need only be generated in a form that stands out when superimposed on the video of the surveillance camera 20; its shape, color, brightness, and so on are arbitrary.
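The distance-dependent sizing rule can be sketched with a simple inverse-distance law. The base radius, reference distance, and minimum-distance clamp below are illustrative tuning constants assumed for this sketch; the patent fixes only the qualitative behavior (closer object, larger marker).

```python
import math

def marker_radius(object_pos, camera_pos, base_radius_px=40.0, ref_dist_m=5.0):
    """Marker radius in pixels that grows as the detected object approaches
    the camera: base_radius_px at ref_dist_m, scaled inversely with distance.
    The distance is clamped to 0.5 m to avoid an unbounded radius."""
    dx = object_pos[0] - camera_pos[0]
    dy = object_pos[1] - camera_pos[1]
    dz = object_pos[2] - camera_pos[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    return base_radius_px * ref_dist_m / max(dist, 0.5)
```

Inverse scaling matches how apparent object size varies with depth in a fixed camera, which is what lets the guard judge whether the marker sits on the near or far person in the occlusion cases discussed below.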
The surveillance video generation unit 53 generates the surveillance video by superimposing the marker generated by the marker generation unit 52 onto the video of the monitored space 100 captured by the surveillance camera 20, at the position obtained by the coordinate conversion in the coordinate conversion unit 51. The surveillance video generated by the surveillance video generation unit 53 is displayed on the display device 30. When no inspection object is present in the monitored space 100 and the detector 42 of the sensor data processing unit 40 outputs no three-dimensional position, the surveillance video generation unit 53 outputs the camera's video of the monitored space 100 to the display device 30 as-is, and the display device 30 displays it unchanged.
The sensor data processing unit 40 and the video data processing unit 50 of the security system of the present embodiment repeatedly execute the above processing at a predetermined cycle based on the frame period of the surveillance camera 20: for example, a cycle equal to the frame period, or 1/2 or 1/3 of it. As a result, a surveillance video in which the marker moves along with the motion in the camera's video of the monitored space 100 is generated continually and displayed on the display device 30.
FIG. 6 shows an example of the surveillance video 60 displayed on the display device 30. When an object estimated to be an inspection object is present in the monitored space 100, the marker 70 is superimposed on the surveillance video 60 at the position corresponding to that object's three-dimensional position, as shown in FIG. 6. A security guard 200 viewing this surveillance video 60 can therefore identify, among the people in the monitored space 100, the person 90 suspected of possessing an inspection object, and lead that person to the detailed inspection area 120 for a visual body search, baggage inspection, X-ray inspection, and the like.
In the video of the monitored space 100 captured by the surveillance camera 20, when several people overlap in the depth direction as seen from the camera's installation position, a person behind is occluded by a person in front and cannot be clearly seen in the video. If the person behind possesses an inspection object, superimposing the marker 70 at the video position corresponding to the three-dimensional position of the object estimated to be an inspection object may, as shown in FIG. 7-1, place the marker 70 over the person in front.
However, in the present embodiment the marker 70 is superimposed at a size that depends on the distance between the surveillance camera 20 and the three-dimensional position of the object estimated to be an inspection object, so comparing the size of the marker 70 with the size of the person in the video makes it possible to judge accurately whether the person under the marker 70 is in fact the person 90 suspected of possessing an inspection object. Moreover, when the person behind, occluded by the person in front in the surveillance video 60 of FIG. 7-1, subsequently approaches the camera 20, the surveillance video 60 shows a comparatively large marker 70 superimposed on that person, as shown in FIG. 7-2, so that person can be identified as the person 90 suspected of possessing an inspection object.
 The above describes, as one example of marking, superimposing the marker 70 on the video of the monitored space 100 captured by the surveillance camera 20; the marking method, however, is not limited to this. For example, the person appearing at the position on the video corresponding to the three-dimensional position of the object presumed to be the inspection object may instead be highlighted. FIG. 8 shows a configuration example of the video data processing unit 50 for this case. The video data processing unit 50 shown in FIG. 8 includes a person detection/tracking unit 54 in place of the marker generation unit 52 described above.
 When the detector 42 of the sensor data processing unit 40 outputs the three-dimensional position of an object presumed to be the inspection object, the person detection/tracking unit 54 analyzes the video of the monitored space 100 captured by the surveillance camera 20, detects people, and identifies the person appearing at the position obtained by the coordinate conversion of the coordinate conversion unit 51 as the person to be highlighted. Since, as described above, a person occluded by a person in front may exist at the position obtained by the coordinate conversion, it is desirable to analyze the video of the surveillance camera 20 repeatedly and identify a person as the target of highlighting only when it can be determined that the same person appears at the position obtained by the coordinate conversion of the coordinate conversion unit 51.
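The repeated-analysis safeguard described here — confirming the highlight target only once the same person has been seen at the converted position over several analyses — could be sketched like this. The patent leaves the confirmation criterion unspecified, so a simple consecutive-frame counter with an assumed threshold is used for illustration.

```python
class HighlightConfirmer:
    """Confirm a person ID as the highlight target only after it has
    been observed at the detection position for `required` consecutive
    frames (assumed criterion)."""

    def __init__(self, required=3):
        self.required = required
        self._last_id = None
        self._count = 0

    def observe(self, person_id):
        """Feed the person ID found at the converted position in the
        current frame (or None if nobody was detected there).
        Returns the confirmed ID once stable, otherwise None."""
        if person_id is not None and person_id == self._last_id:
            self._count += 1
        else:
            self._count = 1 if person_id is not None else 0
        self._last_id = person_id
        return person_id if self._count >= self.required else None
```

A momentary false hit at the converted position — for example, the front person briefly covering it — would then not trigger highlighting, matching the intent of the repeated analysis.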
 Any known person detection algorithm using image features such as facial features may be used to detect people in the video. Alternatively, after the person to be highlighted has been identified from the video of the monitored space 100 captured by the surveillance camera 20, that person may be tracked in the video using a known object tracking algorithm that follows moving objects across video frames. In that case, the processing after the person to be highlighted has been identified can be simplified.
 The surveillance video generation unit 53' applies highlight processing to the person identified by the person detection/tracking unit 54 on the video of the monitored space 100 captured by the surveillance camera 20, generates a surveillance video 60 in which that person is emphasized, and displays it on the display device 30. FIG. 9 shows an example of the surveillance video 60 in this case. In this surveillance video 60, the person present at the position corresponding to the three-dimensional position of the object presumed to be the inspection object is emphasized by the highlight processing 80. A security guard 200 viewing this surveillance video 60 can therefore single out, from among the people in the monitored space 100, the person 90 suspected of carrying the inspection object, and can guide that person to the detailed inspection area 120 for a visual body check, baggage inspection, X-ray inspection, or the like.
 The highlight processing 80 may be any processing that emphasizes, on the surveillance video 60, the person 90 suspected of carrying the inspection object. For example, the region in which the person 90 appears may be made to stand out by raising its brightness, given a blinking effect by rapidly raising and lowering its brightness, or tinted with a predetermined color; any such method may be used.
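As one concrete illustration of the brightness-raising variant of the highlight processing 80, a rectangular region of a grayscale frame could be brightened as follows. This is a minimal sketch on a plain nested-list image with an assumed gain value; an actual system would operate on camera frames, typically via an image-processing library.

```python
def brighten_region(frame, top, left, height, width, gain=1.5):
    """Return a copy of `frame` (rows of 0-255 grayscale values) with
    the given rectangle multiplied by `gain`, clipped to 255, so the
    region containing the suspected person stands out."""
    out = [row[:] for row in frame]
    for y in range(top, min(top + height, len(frame))):
        for x in range(left, min(left + width, len(frame[0]))):
            out[y][x] = min(int(out[y][x] * gain), 255)
    return out
```

Alternating this call on and off between frames would yield the blinking effect mentioned above; replacing the multiplication with a color overlay would yield the tinting variant.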
 Next, the operation flow of the security system of the present embodiment will be described with reference to FIG. 10. FIG. 10 is a flowchart showing the processing procedure repeatedly executed at a predetermined cycle by the sensor data processing unit 40 and the video data processing unit 50 in the security system of the present embodiment.
 When the processing starts, the data synthesis unit 41 of the sensor data processing unit 40 first combines the output data of the plurality of millimeter-wave sensors 10 discretely arranged behind the wall panels 101 and ceiling panels 102 that bound the monitored space 100, and generates synthesized data (step S101).
 Next, the synthesized data generated in step S101 is input to the detector 42, which is configured with a DNN or the like (step S102). The detector 42 has been trained to take the synthesized data as input, obtain for each object present in the monitored space 100 a likelihood indicating how probable it is that the object is an inspection object, and output the three-dimensional position of any object whose likelihood exceeds a reference value.
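The output rule described for step S102 — report the three-dimensional positions of only those objects whose inspection-object likelihood exceeds the reference value — amounts to a filter like the following. This is a hypothetical sketch: the DNN itself and the reference value are not specified in the patent, so a placeholder threshold of 0.8 is assumed.

```python
def positions_over_threshold(detections, reference=0.8):
    """From (likelihood, (x, y, z)) pairs produced by the detector,
    return the 3-D positions of objects whose likelihood exceeds the
    reference value (assumed here to be 0.8)."""
    return [pos for likelihood, pos in detections if likelihood > reference]
```

An empty result corresponds to the "step S103: No" branch below, in which the camera video is displayed unmodified.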
 Next, it is checked whether the detector 42, to which the synthesized data was input in step S102, has output the three-dimensional position of an object presumed to be the inspection object (step S103). If no such three-dimensional position is output from the detector 42 (step S103: No), the surveillance video generation unit 53 (53') of the video data processing unit 50 displays the video of the monitored space 100 captured by the surveillance camera 20 on the display device 30 as-is (step S104).
 If, on the other hand, the detector 42 outputs the three-dimensional position of an object presumed to be the inspection object (step S103: Yes), the coordinate conversion unit 51 of the video data processing unit 50 converts that three-dimensional position into a position in the two-dimensional coordinate system of the video of the surveillance camera 20 (a two-dimensional position) (step S105). The surveillance video generation unit 53 (53') then generates a surveillance video 60 in which marking is applied at the position obtained by the coordinate conversion of the coordinate conversion unit 51 on the video of the surveillance camera 20, that is, at the position where the object presumed to be the inspection object exists, and displays the surveillance video 60 on the display device 30 (step S106). A security guard 200 viewing this surveillance video 60 can thereby single out, from among the people in the monitored space 100, the person 90 suspected of carrying the inspection object, and can guide that person 90 to the detailed inspection area 120 for a visual body check, baggage inspection, X-ray inspection, or the like.
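The coordinate conversion of step S105 — mapping a three-dimensional detected position into the camera's two-dimensional image coordinates — can be sketched with a pinhole-camera model. This is a deliberate simplification under assumed intrinsic parameters: a real system would also apply the camera's extrinsic pose (to move the point into the camera frame) and correct for lens distortion.

```python
def project_to_image(point_cam, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Project a 3-D point (x, y, z), already expressed in the camera
    coordinate frame with z along the optical axis, onto the image
    plane using assumed pinhole intrinsics (focal lengths fx, fy and
    principal point cx, cy in pixels). Returns pixel (u, v), or None
    if the point lies behind the camera."""
    x, y, z = point_cam
    if z <= 0:
        return None
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (u, v)
```

The returned (u, v) is the image position at which the marking (marker 70 or highlight processing 80) would be applied; the z value can simultaneously drive the distance-dependent marker sizing described earlier.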
 As described above in detail with specific examples, in the security system of the present embodiment, a space through which multiple people can pass is taken as the monitored space 100, and a plurality of millimeter-wave sensors 10 are arranged at intervals behind the structures, such as the wall panels 101 and ceiling panels 102, that partition the monitored space 100. The output data of the plurality of millimeter-wave sensors 10 are combined to generate synthesized data, and the position of any inspection object present in the monitored space 100 is detected on the basis of the synthesized data. The monitored space 100 is also photographed by the surveillance camera 20, and when the position of an inspection object present in the monitored space 100 is detected, a surveillance video 60 is generated by applying predetermined marking, on the video of the monitored space 100 captured by the surveillance camera 20, at the position on the video corresponding to the position of the inspection object. By displaying this surveillance video 60 on the display device 30, a security guard 200 or the like viewing the surveillance video 60 can identify a person suspected of carrying the inspection object.
 In this way, the security system of the present embodiment does not inspect people one at a time, as conventional walk-through dangerous-object detection devices do. Instead, using the plurality of millimeter-wave sensors 10 arranged at intervals behind the structures partitioning the monitored space 100 through which multiple people can pass, it singles out, from among the multiple people present in the monitored space 100, a person suspected of carrying an inspection object. The security system of the present embodiment can therefore inspect many people simultaneously without their being aware that an inspection is taking place, giving them no feeling of oppression and easing the congestion that inspections cause.
 Specific embodiments of the present invention have been described above, but the described embodiment is only one application example of the present invention. The present invention is not limited to the embodiment as described; at the implementation stage it can be embodied with various omissions, substitutions, and changes without departing from its gist. These embodiments and their modifications are included within the scope and gist of the invention, and within the invention recited in the claims and its equivalents.
 10 millimeter-wave sensor
 20 camera
 30 display device
 40 sensor data processing unit
 41 data synthesis unit
 42 detector
 50 video data processing unit
 51 coordinate conversion unit
 52 marker generation unit
 53 (53') surveillance video generation unit
 54 person detection/tracking unit
 60 surveillance video
 70 marker
 80 highlight processing

Claims (5)

  1.  A security system comprising:
     a plurality of millimeter-wave sensors arranged at intervals behind a structure partitioning a space through which a plurality of people can pass;
     a sensor data processing unit that detects the position of an inspection object present in the space on the basis of synthesized data obtained by combining output data of the plurality of millimeter-wave sensors;
     a camera that photographs the space from a predetermined position;
     a video data processing unit that generates a video in which marking is applied at a position, on the video captured by the camera, corresponding to the position of the inspection object detected by the sensor data processing unit; and
     a display device that displays the video generated by the video data processing unit.
  2.  The security system according to claim 1, wherein the video data processing unit generates the video by superimposing, on the video captured by the camera, a marker whose size increases as the position of the inspection object detected by the sensor data processing unit approaches the installation position of the camera.
  3.  The security system according to claim 1, wherein the video data processing unit identifies, on the video captured by the camera, a person appearing at the position on the video corresponding to the position of the inspection object detected by the sensor data processing unit, and generates the video in which the person is emphasized.
  4.  The security system according to claim 1, wherein the sensor data processing unit detects the position of the inspection object present in the space using a detector trained to take the synthesized data as input, obtain for each object present in the space a likelihood indicating how probable it is that the object is the inspection object, and output the position of any object whose likelihood exceeds a reference value.
  5.  The security system according to claim 1, wherein the sensor data processing unit receives the output data of each of the plurality of millimeter-wave sensors by optical wireless communication, and the video data processing unit receives video data of the video captured by the camera by optical wireless communication.
PCT/JP2020/018874 2019-05-14 2020-05-11 Security system WO2020230766A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021519428A JP7467434B2 (en) 2019-05-14 2020-05-11 Security Systems

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-091697 2019-05-14
JP2019091697 2019-05-14

Publications (1)

Publication Number Publication Date
WO2020230766A1 true WO2020230766A1 (en) 2020-11-19

Family

ID=73289924

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/018874 WO2020230766A1 (en) 2019-05-14 2020-05-11 Security system

Country Status (2)

Country Link
JP (1) JP7467434B2 (en)
WO (1) WO2020230766A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7411539B2 (en) 2020-12-24 2024-01-11 株式会社日立エルジーデータストレージ Ranging system and its coordinate calibration method

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10282247A (en) * 1997-04-10 1998-10-23 Mitsubishi Electric Corp Metal detecting device
JPH1183996A (en) * 1997-09-03 1999-03-26 Omron Corp Millimetric wave detector
JPH11203569A (en) * 1998-01-12 1999-07-30 Hitachi Zosen Corp Passerby detecting method and detection system
JP2007517275A (en) * 2003-10-30 2007-06-28 バッテル メモリアル インスティチュート Hidden object detection
JP2011503597A (en) * 2007-11-13 2011-01-27 シェクロン、クロード Device for detecting objects, especially dangerous materials
CN102565794A (en) * 2011-12-30 2012-07-11 北京华航无线电测量研究所 Microwave security inspection system for automatically detecting dangerous object hidden in human body
US20160291148A1 (en) * 2015-04-03 2016-10-06 Evolv Technologies, Inc. Modular Imaging System
JP2018013448A (en) * 2016-07-22 2018-01-25 日本信号株式会社 Portable object detection device
JP2018146257A (en) * 2017-03-01 2018-09-20 株式会社東芝 Dangerous object detector

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ATE553397T1 (en) 2008-12-30 2012-04-15 Sony Corp CAMERA-ASSISTED SCANNING IMAGING SYSTEM AND MULTI-ASPECT IMAGING SYSTEM
US9928425B2 (en) 2012-06-20 2018-03-27 Apstec Systems Usa Llc Methods and systems for non-cooperative automatic security screening in crowded areas
JP2018156586A (en) 2017-03-21 2018-10-04 株式会社東芝 Monitoring system



Also Published As

Publication number Publication date
JPWO2020230766A1 (en) 2020-11-19
JP7467434B2 (en) 2024-04-15


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20804907

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021519428

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20804907

Country of ref document: EP

Kind code of ref document: A1