WO2023162723A1 - Monitoring device, monitoring system, and monitoring method - Google Patents

Monitoring device, monitoring system, and monitoring method

Info

Publication number
WO2023162723A1
Authority
WO
WIPO (PCT)
Prior art keywords
range
information
processor
detection
radar
Prior art date
Application number
PCT/JP2023/004584
Other languages
English (en)
Japanese (ja)
Inventor
雅士 古賀
Original Assignee
i-PRO株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by i-PRO株式会社 filed Critical i-PRO株式会社
Publication of WO2023162723A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/117 Identification of persons
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 Systems of measurement based on relative movement of target
    • G01S13/52 Discriminating between fixed and moving objects or between objects moving at different speeds
    • G01S13/56 Discriminating between fixed and moving objects or between objects moving at different speeds for presence detection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/87 Combinations of radar systems, e.g. primary radar and secondary radar
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/04 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using a single signalling line, e.g. in a closed loop

Definitions

  • the present disclosure relates to a monitoring device, a monitoring system and a monitoring method.
  • Patent Document 1 discloses a contactless vital signs monitoring system having a radar installed near a monitored object.
  • the non-contact vital signs monitoring system stores the radar output signals sampled sequentially within a predetermined period, determines the state of the monitored subject based on those output signals, and determines the subject's vital signs based on the output signals acquired while the subject is stationary.
  • the non-contact vital signs monitoring system uses radar to detect and measure the monitored subject's health status and signs of illness, such as body temperature, blood pressure, heart rate, and respiratory rate, and determines whether such signs are present.
  • the surveillance radar distinguishes a person from background objects such as buildings and trees by detecting moving objects.
  • the present disclosure has been devised in view of the conventional circumstances described above, and aims to improve the accuracy with which radar detects living things.
  • the present disclosure provides a monitoring device including: a first transceiver that transmits a first radio wave over a first range and receives a reflected wave of the first radio wave; a first detection processing unit that, based on the reflected wave received by the first transceiver, detects the presence or absence of an object in the first range and acquires first information about the object detected in the first range; a second transceiver that transmits a second radio wave over a second range wider than the first range and receives a reflected wave of the second radio wave; and a second detection processing unit that, based on the reflected wave received by the second transceiver, detects the presence or absence of an object in the second range and acquires second information about the object detected in the second range, wherein the first information includes identification information with which it can be determined whether the object is a living thing.
  • the present disclosure also provides a monitoring system including a monitoring device and an information processing device communicably connected to the monitoring device. The monitoring device includes: a first transmission/reception unit that transmits a first radio wave over a first range and receives a reflected wave of the first radio wave; a first detection processing unit that, based on the reflected wave received by the first transmission/reception unit, detects the presence or absence of an object in the first range and acquires first information about the object detected in the first range; and a second transmission/reception unit that transmits a second radio wave over a second range wider than the first range and receives a reflected wave of the second radio wave. The first information includes identification information with which it can be determined whether the object is a living thing.
  • the present disclosure further provides a monitoring method including: transmitting a first radio wave over a first range, receiving a reflected wave of the first radio wave, and detecting the presence or absence of an object in the first range based on the reflected wave; acquiring first information about an object detected in the first range, the first information including identification information with which it can be determined whether the object is a living thing; transmitting a second radio wave over a second range wider than the first range, receiving a reflected wave of the second radio wave, and detecting the presence or absence of an object in the second range based on the reflected wave; and acquiring second information about an object detected in the second range.
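The claimed two-range method can be sketched in code. This is a minimal illustration only: the names `Detection`, `monitor`, and the radar objects with `scan()` and `classify()` methods are assumptions for the sketch, not part of the publication.

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    position: tuple          # position of the detected object
    is_living: bool = False  # identification info: living thing or not
    vitals: dict = field(default_factory=dict)

def monitor(short_range_radar, long_range_radar):
    """One monitoring cycle over the two ranges (hypothetical API)."""
    # First range: detect objects and acquire first information,
    # including the living/non-living identification information.
    first_info = [
        Detection(o.position, is_living=short_range_radar.classify(o))
        for o in short_range_radar.scan()
    ]
    # Second, wider range: detect objects and acquire second information.
    second_info = [Detection(o.position) for o in long_range_radar.scan()]
    return first_info, second_info
```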
  • FIG. 1 is a block diagram showing a system configuration example of a detection system according to Embodiment 1.
  • FIG. 2 is a block diagram showing an internal configuration example of the surveillance radar device according to the first embodiment.
  • FIG. 3 is a diagram illustrating an example of detection areas of the long-distance detection radar and the short-distance detection radar.
  • FIG. 4 is a diagram illustrating an example of detection areas of the long-distance detection radar and the short-distance detection radar.
  • FIG. 5 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Embodiment 1.
  • FIG. 6 is a flowchart showing an example of an operation procedure of the surveillance radar device in Modification 1 of Embodiment 1.
  • FIG. 7 is a flowchart illustrating an example of an operation procedure of a surveillance radar device according to Modification 2 of Embodiment 1.
  • FIG. 8 is a flowchart showing an example of an operation procedure of the surveillance radar device in Modification 2 of Embodiment 1.
  • FIG. 9 is a flowchart illustrating an example of an operation procedure of a surveillance radar device according to Modification 3 of Embodiment 1.
  • FIG. 10 is a flowchart showing an example of an operation procedure of the surveillance radar device in Modification 3 of Embodiment 1.
  • FIG. 11 is a diagram showing an example of a detection screen.
  • FIG. 1 is a diagram showing a system configuration example of a detection system 100 according to Embodiment 1.
  • the detection system 100 includes one or more surveillance radar devices RD1, a server S1, and a network NW.
  • the detection system 100 may also include a camera C1, a security drone DR1, and a security guard terminal TP1. Note that the camera C1, the security drone DR1, and the security guard terminal TP1 are not essential elements of the detection system 100 and may be omitted.
  • the detection system 100 is an example of a surveillance system, and uses at least one surveillance radar device RD1 installed indoors and outdoors to detect objects (for example, humans, animals, buildings, etc.).
  • the detection system 100 uses the surveillance radar device RD1 to analyze the vital information of the detected object.
  • the detection system 100 generates a detection screen SC1 (an example of notification information and superimposed map information; see FIG. 11) in which the position and vital information of the detected object (for example, a person) are superimposed on a map (an example of map information) corresponding to the detection area AR0 of the surveillance radar device RD1.
  • the detection system 100 displays the detection screen generated by the surveillance radar device RD1 on the monitor MN of the server S1.
  • the detection area AR0 corresponds to an area in which an object can be detected by the long-distance detection radar RA (see FIG. 2) and the short-distance detection radar RB (see FIG. 2).
  • the surveillance radar device RD1 is connected to the server S1 and the camera C1 via the network NW so as to be capable of wired communication or wireless communication.
  • Wireless communication is, for example, short-range wireless communication such as Bluetooth (registered trademark) or NFC (registered trademark), or communication via a wireless LAN (Local Area Network) such as Wi-Fi (registered trademark).
  • the surveillance radar device RD1 detects the position, moving speed, etc. of an object from within the second detection area AR2 (see FIG. 3) using the long-distance detection radar RA. Then, the surveillance radar device RD1 acquires vital information of an object located within the first detection area AR1 (see FIG. 3) detected by the short-range detection radar RB.
  • the surveillance radar device RD1 generates a detection screen SC1 (see FIG. 11) and transmits it to the server S1.
  • the detection screen SC1 is a screen in which the position of an object detected by the long-distance detection radar RA, or the position and vital information of an object detected by the short-distance detection radar RB, is superimposed on a map corresponding to the detection area AR0. Note that although FIG. 1 shows one surveillance radar device RD1, there may be two or more.
  • the surveillance radar device RD1 may acquire a captured image from the camera C1 (described later) when connected to it so that data can be transmitted and received. In that case, the surveillance radar device RD1 may generate a detection screen in which the position of the object detected by the long-range detection radar RA, or the position and vital information of the object detected by the short-range detection radar RB, is superimposed on the image captured by the camera C1.
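The superimposition step above amounts to mapping detected positions onto map (or captured-image) pixel coordinates and attaching any vital information. The sketch below is illustrative: the scale (`px_per_m`), origin, and dictionary layout are assumptions, not details from the publication.

```python
def to_pixel(pos_m, origin_px=(0, 0), px_per_m=10):
    """Convert a position in metres (x, y) to pixel coordinates."""
    return (origin_px[0] + int(pos_m[0] * px_per_m),
            origin_px[1] + int(pos_m[1] * px_per_m))

def build_detection_screen(detections, origin_px=(0, 0), px_per_m=10):
    """Return overlay items to draw on the map: pixel position plus vitals."""
    return [{"px": to_pixel(d["pos"], origin_px, px_per_m),
             "vitals": d.get("vitals", {})} for d in detections]
```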
  • the camera C1 is connected to the surveillance radar device RD1, the server S1, or the security guard terminal TP1 via the network NW so as to be capable of wired or wireless communication.
  • the camera C1 is, for example, a monitoring camera such as a security camera.
  • the camera C1 captures an image of at least a part of the detection area AR0, and transmits the captured image to the monitoring radar device RD1, the server S1, or the security guard terminal TP1. Note that the number of cameras C1 shown in FIG. 1 is one, but may be two or more.
  • the camera C1 may be a camera equipped with artificial intelligence (AI).
  • the camera C1 uses a learned AI model to detect an object appearing in the captured image, and acquires the position of the object.
  • the camera C1 also acquires the detection result including the vital information of the object (for example, information such as breathing rate, heart rate, blood pressure, breathing interval, or heartbeat interval) transmitted from the surveillance radar device RD1.
  • the camera C1 may generate a detection screen in which the acquired position of the object and the vital information of the object are superimposed on the captured image, and transmit the detection screen to the server S1.
  • the server S1 is connected to the surveillance radar device RD1, the camera C1, the security drone DR1, or the security guard terminal TP1 via the network NW so as to be capable of wired communication or wireless communication.
  • the server S1 is connected to an operation unit 23 that can be operated by a user (for example, an employee of a management company who monitors the monitored area, a security guard, or a manager) and to a monitor MN that the user can view, so that data can be transmitted and received.
  • the server S1 may be realized by an information processing device such as a PC (Personal Computer), a notebook PC, a tablet terminal, a smart phone, or the like.
  • the server S1 outputs the detection screen SC1 (see FIG. 11) and the like transmitted from the surveillance radar device RD1 or the camera C1 to the monitor MN for display. Further, when the user performs an operation to report to a predetermined destination (for example, a management company, a security company, an insurance company, a security guard, the security drone DR1, or the security guard terminal TP1), the server S1 sends a predetermined report to that destination. Based on the user's operation, the server S1 may also generate a control command that directs the security drone DR1 to the position where the object was detected to threaten or warn the object, and transmit the control command to the security drone DR1.
  • the server S1 includes at least a communication unit 20, a processor 21, and a memory 22.
  • the database DB2 may be mounted in an information processing apparatus different from the server S1 and connected to the server S1 so that data can be transmitted and received.
  • the communication unit 20 is configured using a communication interface circuit for executing data transmission/reception with the surveillance radar device RD1, the camera C1, the security drone DR1, or the security guard terminal TP1 via the network NW.
  • the communication unit 20 outputs the detection screen SC1 transmitted from the surveillance radar device RD1 or the camera C1 to the processor 21 through a wireless communication network or a wired communication network. Also, the communication unit 20 transmits a control command corresponding to the user operation output from the processor 21 to the corresponding device (for example, the security drone DR1, the security guard terminal TP1, etc.).
  • the processor 21 is, for example, a computing device such as a CPU (Central Processing Unit) or an FPGA (Field Programmable Gate Array).
  • the processor 21 cooperates with the memory 22 to perform various types of processing and control. Specifically, the processor 21 implements various functions by loading and executing programs and data held in the memory 22 .
  • based on the electrical signal output from the operation unit 23, the processor 21 accepts a preset report destination (for example, a management company, a security company, an insurance company, or the security guard terminal TP1) and acquires its contact information (for example, mail address and telephone number). Based on the electrical signal output from the operation unit 23, the processor 21 also generates a control command that moves the security drone DR1 to the position of the detected object to threaten or warn the object.
  • the memory 22 includes, for example, a RAM (Random Access Memory) used as a work memory when the processor 21 executes each process, and a ROM (Read Only Memory) that stores programs and data defining the operation of the processor 21.
  • the memory 22 may further include a storage device such as an SSD (Solid State Drive) or HDD (Hard Disk Drive). Data or information generated or obtained by the processor 21 is temporarily stored in the RAM.
  • a program that defines the operation of the processor 21 is written in the ROM.
  • the memory 22 stores report-destination information (for example, the mail address and telephone number of the management company, security company, insurance company, or security guard terminal TP1), information about the security drone DR1, and the like.
  • the database DB2 is, for example, a storage device such as an HDD or SSD.
  • in the database DB2, the detection area AR0, information on the surveillance radar device RD1 that monitors the detection area AR0 (for example, serial number, ID, etc.), and information on persons permitted to enter the detection area AR0 (for example, name, face image, vital information, etc.) are registered (stored) in association with one another.
  • the operation unit 23 is a user interface configured using, for example, a touch panel, buttons, keyboard, and the like.
  • the operating unit 23 converts the accepted user's operation into an electrical signal (control command) and outputs the electrical signal to the processor 21 .
  • when the operation unit 23 is a touch panel, it is configured integrally with the monitor MN.
  • the monitor MN is a display such as LCD (Liquid Crystal Display) or organic EL (Electroluminescence).
  • the monitor MN displays the detection screen SC1 transmitted from the surveillance radar device RD1 or camera C1.
  • the security guard terminal TP1 is connected to the server S1 via the network NW so that data can be sent and received.
  • the security guard terminal TP1 is used by security guards guarding the detection area AR0 monitored by the surveillance radar device RD1, building employees, and the like, and is realized by, for example, a notebook PC, a tablet terminal, a smartphone, and the like.
  • the security guard terminal TP1 displays the detection screen SC1 or the alarm information transmitted from the server S1. Note that although FIG. 1 shows one security guard terminal TP1, there may be two or more.
  • the security drone DR1 is connected to the surveillance radar device RD1 or the server S1 via the network NW so that data can be sent and received.
  • the security drone DR1 is equipped with a speaker, lighting, and the like, and threatens or warns an object by voice or illumination light.
  • the security drone DR1 flies to the position where the object is detected based on the control command transmitted from the surveillance radar device RD1 or the server S1.
  • when the security drone DR1 arrives at the position where the object was detected, it threatens or warns the object.
  • the security drone DR1 may include a camera, capture an image of an object, and transmit the captured image (live video) to the server S1.
  • FIG. 2 is a block diagram showing an internal configuration example of the surveillance radar device RD1.
  • FIG. 3 is a diagram illustrating an example of detection areas of the long-distance detection radar RA and the short-distance detection radar RB.
  • FIG. 4 is a diagram illustrating an example of detection areas of the long-distance detection radar RA and the short-distance detection radar RB.
  • the surveillance radar device RD1 includes a communication unit 10, a processor 11, a memory 12, a long-range detection radar RA, a short-range detection radar RB, and a database DB11.
  • the database DB11 may be configured separately from the surveillance radar device RD1.
  • the database DB11 is not an essential component and may be omitted.
  • the communication unit 10 is configured using a communication interface circuit for executing data transmission/reception with the server S1 and the camera C1 via the network NW.
  • the communication unit 10 transmits the detection screen SC1 (see FIG. 11) generated by the processor 11 to the server S1 through a wireless communication network or a wired communication network.
  • the communication unit 10 also outputs the captured image transmitted from the camera C1 to the processor 11.
  • the processor 11 is configured using, for example, a CPU or FPGA, and cooperates with the memory 12 to perform various types of processing and control. Specifically, the processor 11 refers to the programs and data held in the memory 12 and executes the programs, thereby realizing various functions of the AI processing unit 13 and the like.
  • the processor 11 controls the long-range detection radar RA and the short-range detection radar RB.
  • the processor 11 may control the long-range detection radar RA and the short-range detection radar RB independently. Based on the detection result of the long-distance detection radar RA (an example of the first information) and the detection result of the short-distance detection radar RB (an example of the second information), the processor 11 may instead control the long-distance detection radar RA and the short-distance detection radar RB cooperatively.
  • the processor 11 generates a detection screen SC1 (see FIG. 11) in which the detection result of the long-distance detection radar RA or the detection result of the short-distance detection radar RB is superimposed on the map, and outputs it to the communication unit 10.
  • the communication unit 10 transmits the detection screen SC1 output from the processor 11 to the server S1.
  • based on the vital information of a person who has entered the first detection area AR1, the processor 11 performs person authentication to determine whether that person is permitted to enter the first detection area AR1. If the processor 11 determines that the person is not permitted to enter the first detection area AR1, it generates a detection screen SC1 in which the person's position is superimposed on a map and transmits it to the server S1.
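The person-authentication idea can be sketched as a comparison of measured vital information against registered entries. Everything here is an assumption made for illustration: the publication does not specify the matching rule, and the registered values, tolerances, and identifiers are invented.

```python
# Hypothetical registered entries, analogous to database DB11:
# identifier -> reference vital information of a permitted person.
REGISTERED = {
    "id-001": {"name": "guard A", "heart_rate": 62, "breathing_rate": 14},
}

def authenticate(vitals, registered=REGISTERED, hr_tol=10, br_tol=4):
    """Return the matching identifier, or None if entry is not permitted.

    A person matches a registered entry when both heart rate and
    breathing rate fall within assumed tolerances of the reference.
    """
    for person_id, ref in registered.items():
        if (abs(vitals["heart_rate"] - ref["heart_rate"]) <= hr_tol and
                abs(vitals["breathing_rate"] - ref["breathing_rate"]) <= br_tol):
            return person_id
    return None
```

An unmatched result (`None`) would correspond to the case where the processor 11 superimposes the person's position on the map and notifies the server S1.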
  • the memory 12 has, for example, a RAM as a work memory that is used when executing each process of the processor 11, and a ROM that stores programs and data that define the operation of the processor 11.
  • the memory 12 may have a storage device including either a storage device such as an SSD or an HDD.
  • the RAM temporarily stores data or information generated or acquired by the processor 11 .
  • a program that defines the operation of the processor 11 is written in the ROM.
  • the memory 12 stores map data corresponding to the detection area AR0 (that is, the first detection area AR1 and the second detection area AR2) of the surveillance radar device RD1.
  • the map may be two-dimensional map data or three-dimensional map data.
  • the first detection area AR1 is an area in which an object can be detected by the short-range detection radar RB.
  • the second detection area AR2 is an area in which an object can be detected by the long-distance detection radar RA.
  • the first detection area AR1 has a horizontal distance X1 from the installation position PS0 of the surveillance radar device RD1, and is a range in which the short-range detection radar RB can transmit and receive radio waves.
  • the second detection area AR2 has a horizontal distance X2 from the installation position PS0 of the surveillance radar device RD1, and is a range in which the long-distance detection radar RA can transmit and receive radio waves.
  • the distance X1 is a horizontal distance of 10 to 20 m from the installation position PS0 of the surveillance radar device RD1.
  • the distance X2 is greater than or equal to the distance X1, and the horizontal distance from the installation position PS0 of the surveillance radar device RD1 is, for example, 20 to 30 m.
  • the first detection area AR1 is smaller than the second detection area AR2 and is located closer to the installation position PS0 of the surveillance radar device RD1 than the second detection area AR2 (see FIG. 3). Also, the first detection area AR1 and the second detection area AR2 may partially overlap.
  • the above-described distances X1 and X2 are examples and are not limiting. Further, the distance X2 from the installation position PS0 of the surveillance radar device RD1 in the second detection area AR2 is variable within the range in which the long-distance detection radar RA can transmit and receive radio waves, provided that it remains equal to or greater than the distance X1.
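The AR1/AR2 geometry described above reduces to comparing an object's horizontal distance from PS0 against X1 and X2. A minimal sketch, using the example values X1 = 20 m and X2 = 30 m from the text (the function name and coordinate convention are assumptions):

```python
import math

X1, X2 = 20.0, 30.0  # metres; example values from the description

def detection_areas(obj_xy, ps0_xy=(0.0, 0.0)):
    """Return which detection areas cover the object (AR1 may overlap AR2)."""
    d = math.dist(obj_xy, ps0_xy)  # horizontal distance from installation position PS0
    areas = []
    if d <= X1:
        areas.append("AR1")  # reachable by the short-range detection radar RB
    if d <= X2:
        areas.append("AR2")  # reachable by the long-distance detection radar RA
    return areas
```

An object at 10 m falls in both areas (the overlap case), one at 25 m only in AR2, and one at 40 m in neither.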
  • the AI processing unit 13 forms a neural network based on at least one trained AI model.
  • the AI processing unit 13 forms a neural network corresponding to each trained AI model stored in the learning model memory 15 .
  • the AI processing unit 13 uses the formed neural network to perform signal analysis processing on the signals output from each of the long-distance detection radar RA and the short-distance detection radar RB.
  • AI processing unit 13 includes AI arithmetic processing unit 14 and learning model memory 15 .
  • the AI arithmetic processing unit 14 may use the learned AI model stored in the learning model memory 15 to execute type determination processing for a detected object based on the signals output from the long-distance detection radar RA and the short-distance detection radar RB.
  • the type determination processing includes processing for determining whether or not a detected object is a living thing (examples of first determination processing and fourth determination processing) and processing for determining whether or not a detected object is a human (examples of third determination processing and sixth determination processing).
  • the AI arithmetic processing unit 14 may use the learned AI model stored in the learning model memory 15 to execute processing for acquiring vital information of a detected object based on the signal output from the short-range detection radar RB.
  • the learning model memory 15 is composed of memories such as RAM, ROM, and flash memory.
  • the learning model memory 15 stores a learned AI model created in advance by learning processing.
  • the AI arithmetic processing unit 14 forms a neural network corresponding to the trained AI model stored in the learning model memory 15 and executes desired signal analysis processing.
  • the learning model memory 15 stores a trained AI model capable of determining the type of the detected object and a trained AI model capable of acquiring vital information of the detected object based on the signal output from the radar.
  • the learning model memory 15 may store a learned AI model with which the surveillance radar device RD1 can perform person authentication. Note that the above-described trained AI model is an example, and is not limited to this.
  • the learning model memory 15 may store other trained AI models used for other purposes.
  • the long-distance detection radar RA includes M (M: an integer equal to or greater than 1) radar ICs (Integrated Circuits) RA1, ..., RAM (an example of a second detection processing unit). Each radar IC is connected to transmitting antenna units RAT1, ..., RATM and receiving antenna units RAR1, ..., RARM. The transmitting antenna units RAT1, ..., RATM transmit radio waves, and the receiving antenna units RAR1, ..., RARM receive reflected waves.
  • the long-distance detection radar RA detects objects within the second detection area AR2 (for example, vehicles, two-wheeled vehicles, etc.).
  • the second predetermined distance is, for example, several meters to 100 meters, and is a distance larger than the first predetermined distance at which the short-range detection radar RB can detect an object.
  • the second predetermined distance may be a distance that partially includes the first predetermined distance (see FIG. 3). That is, the second detection area AR2 may partially include the first detection area AR1 of the short-range detection radar RB.
  • each of the receiving antenna units RAR1, ..., RARM converts the received reflected wave into an analog signal and outputs it to the corresponding radar IC.
  • each of the M radar ICs RA1, ..., RAM executes, based on the analog signal output from the corresponding receiving antenna unit, processing for detecting the position and moving speed of an object (an example of detection processing), object type determination processing, and the like.
  • the short-range detection radar RB includes N (N: an integer equal to or greater than 1) radar ICs RB1, ..., RBN (an example of a first detection processing unit). Each of the radar ICs RB1, ..., RBN is connected to transmitting antenna units RBT1, ..., RBTN and receiving antenna units RBR1, ..., RBRN.
  • the short-range detection radar RB detects objects (for example, humans, animals, vehicles, motorcycles, etc.) within the first detection area AR1 near the installation position PS0 of the surveillance radar device RD1.
  • the first predetermined distance referred to here is, for example, several meters to 10 meters.
  • the transmission antenna units RAT1, ..., RATM transmit radio waves over a wider range than the transmission antenna units RBT1, ..., RBTN.
  • the receiving antenna units RAR1, ..., RARM receive radio waves over a wider range than the receiving antenna units RBR1, ..., RBRN.
  • the transmitting antenna units RBT1, ..., RBTN transmit radio waves, and the receiving antenna units RBR1, ..., RBRN receive reflected waves (including reflected waves from objects within the first detection area AR1).
  • the transmitting antenna units RAT1, ..., RATM transmit radio waves, and the receiving antenna units RAR1, ..., RARM receive reflected waves (including reflected waves from objects within the second detection area AR2). That is, the first detection area AR1 is closer to the installation position PS0 of the surveillance radar device RD1 than the second detection area AR2, and has a smaller object-detectable range than the second detection area AR2.
  • each of the N radar ICs RB1, ..., RBN receives, via the receiving antenna units RBR1, ..., RBRN, reflected waves reflected by objects (for example, buildings, plants, humans, animals, etc.) existing in the irradiation direction of each antenna.
  • Each of the receiving antenna units RBR1, . . . , RBRN further converts the received reflected wave into an analog signal and outputs it to the corresponding radar IC.
  • Each of the N radar ICs RB1, ..., RBN executes acquisition processing of vital information of the object (for example, information such as respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval).
  • the number of radar ICs provided in the surveillance radar device RD1 is only an example and is not limited to this. Further, the number M of radar ICs provided in the long-range detection radar RA, the number N of radar ICs provided in the short-range detection radar RB, and the number of antennas connected to each radar IC need not be the same and may differ.
  • the database DB11 is configured using a storage device such as an HDD, SSD, or NAND.
  • the database DB11 registers (stores) information (for example, name, face image, vital information, etc.) about an object permitted to enter the first detection area AR1.
  • In the database DB11, the name, face image, vital information, etc. of a person who is permitted to enter the first detection area AR1 are registered in association with an identifier that identifies the person.
  • FIG. 5 is a flowchart showing an operation procedure example of the surveillance radar device RD1 according to the first embodiment. Note that the respective processes of steps St102 to St103 and steps St202 to St203 may be executed by the radar IC or by the processor 11.
  • Each of the radar ICs RA1, ..., RAM determines whether the object is human.
  • Each of the radar ICs RA1, ..., RAM generates a detection result of the object and outputs it to the processor 11.
  • Each of the radar ICs RB1, ..., RBN acquires reflected waves received by the receiving antenna units RBR1, ..., RBRN.
  • Each of the radar ICs RB1, . . . , RBN performs clustering processing on the obtained point cloud data (step St203).
  • Each of the radar ICs RB1, ..., RBN determines whether or not the object is a living thing based on the size of the object indicated by the set of clustered point clouds, information on the calculated moving speed of the point cloud, etc. (step St203).
  • the size of the object indicated by the set of clustered point clouds, information on the calculated moving speed of the point cloud, and the like are examples of the identification information of the present embodiment.
  • the process of step St203 is an example of the first determination process or the third determination process.
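The first determination process described above, clustering the point cloud and then judging "living thing" from the cluster's size and moving speed, can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the greedy single-linkage clustering, the 0.2 to 1.5 m size band, and the 3 m/s speed limit are all assumed values.

```python
import math

def cluster_points(points, eps=0.5):
    """Greedy single-linkage clustering of 2-D radar points (a stand-in
    for the clustering processing of step St203; DBSCAN would also work)."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(math.dist(p, q) <= eps for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

def is_living_thing(cluster, speed_m_s):
    """Toy first-determination rule: a living thing is assumed to span
    roughly 0.2-1.5 m and to move slower than about 3 m/s.
    The thresholds are illustrative, not taken from the patent."""
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    size = max(max(xs) - min(xs), max(ys) - min(ys))
    return 0.2 <= size <= 1.5 and speed_m_s < 3.0
```

A cluster spanning about half a metre and moving at walking speed would be classified as a living thing; a fast or very large cluster would not.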
  • each of the radar ICs RB1, ..., RBN determines whether an object (for example, an object determined to be a living thing) is a human, analyzes the signal from the object determined to be human among the set of clustered point clouds, and calculates the vital information of the object (for example, information such as respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval) (step St203).
  • the surveillance radar device RD1 can obtain the vital information of the object as described above.
  • Vital information corresponds to respiratory rate, heart rate, blood pressure, breathing interval, heartbeat interval, or the like, that is, information that can determine whether an object is a living thing. Therefore, vital information is an example of identification information in this embodiment.
  • the respiratory rate, heart rate, blood pressure, breathing interval, heartbeat interval, etc. derived from living organisms fluctuate according to predetermined cycles and climatic conditions.
  • Vital information in which variation according to a predetermined period or climatic conditions is observed is an example of the biological information of the present embodiment.
  • the biological information is information, such as respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval, that is neither extremely high nor extremely low and that exhibits time-series variation.
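The vital-information acquisition described above can be illustrated with a minimal sketch: given a chest-displacement time series recovered from the radar signal, the respiratory rate is taken as the dominant frequency in the typical respiration band. The band limits (0.1 to 0.5 Hz, i.e. 6 to 30 breaths/min), the sampling setup, and the scan step are illustrative assumptions, not values from the patent.

```python
import math

def breathing_rate_bpm(displacement, fs):
    """Estimate respiratory rate from a chest-displacement time series:
    scan the assumed respiration band (0.1-0.5 Hz) and pick the frequency
    with the largest DFT magnitude."""
    n = len(displacement)
    mean = sum(displacement) / n
    x = [v - mean for v in displacement]
    best_f, best_mag = 0.0, -1.0
    f = 0.1
    while f <= 0.5:
        re = sum(v * math.cos(2 * math.pi * f * i / fs) for i, v in enumerate(x))
        im = sum(v * math.sin(2 * math.pi * f * i / fs) for i, v in enumerate(x))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_f, best_mag = f, mag
        f += 0.01  # 0.01 Hz scan step (illustrative)
    return best_f * 60.0

# Example: 0.25 Hz (= 15 breaths/min) sinusoid sampled at 10 Hz for 60 s
fs = 10.0
sig = [math.sin(2 * math.pi * 0.25 * i / fs) for i in range(600)]
print(round(breathing_rate_bpm(sig, fs)))  # → 15
```

Heart rate could be estimated the same way over a higher band (roughly 0.8 to 3 Hz), which is why the same displacement signal can yield several of the vital quantities listed above.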
  • Each of the radar ICs RB1, ..., RBN acquires, from the clustered point cloud set, information such as the position (azimuth, distance) and moving speed of the object indicated by the point cloud set determined to be human.
  • The processor 11 acquires the coordinate information of the point cloud corresponding to the object output from each of the radar ICs RA1, ..., RAM and the radar ICs RB1, ..., RBN. Using the acquired coordinate information of the point cloud, the processor 11 superimposes the detection points of the object detected by the two types of radar on the same map to generate a detection screen SC1 (see FIG. 11) (step St300). At this time, if the vital information is associated with the coordinate information of the object, the processor 11 generates a detection screen SC1 in which the detection points of the object and the vital information are superimposed on the same map. Thereby, the processor 11 can display the detection point (coordinate information) of the object and the vital information of the object together on the detection screen SC1.
  • the two types of radar referred to here are a long-distance detection radar RA and a short-distance detection radar RB.
  • the processor 11 outputs the generated detection screen SC1 to the communication unit 10 and causes it to be transmitted to the server S1 (step St301).
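The generation of the detection screen SC1 in step St300 can be sketched as follows: each radar reports a detection as (azimuth, distance), which is converted into shared map coordinates around the installation position PS0 so that both radars' detections, plus any associated vital information, land on one map. The `heading_deg` parameter and the index-keyed `vitals` mapping are assumptions introduced for the sketch.

```python
import math

def to_map_xy(azimuth_deg, distance_m, ps0=(0.0, 0.0), heading_deg=0.0):
    """Convert a radar detection given as (azimuth, distance) into map
    coordinates, so detections from both radars share one coordinate
    frame. heading_deg models the device's mounting orientation."""
    theta = math.radians(heading_deg + azimuth_deg)
    return (ps0[0] + distance_m * math.sin(theta),
            ps0[1] + distance_m * math.cos(theta))

def build_screen(ra_dets, rb_dets, vitals=None):
    """Merge both radars' detections into one list of plot points;
    attach vital info where available (keyed by detection index,
    an assumption for this sketch)."""
    points = []
    for i, (az, d) in enumerate(ra_dets + rb_dets):
        p = {'xy': to_map_xy(az, d)}
        if vitals and i in vitals:
            p['vitals'] = vitals[i]
        points.append(p)
    return points
```

The resulting point list is what a rendering layer would draw over the 2-D or 3-D map of the detection area AR0.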
  • the surveillance radar device RD1 repeatedly executes the above-described processing to detect an object positioned within the detection area AR0. At the timing when an object detected by the long-range detection radar RA approaches the first detection area AR1, the surveillance radar device RD1 may begin the object detection processing by the short-range detection radar RB and the acquisition processing of the vital information of the object.
  • the surveillance radar device RD1 may perform person authentication based on the vital information acquired by the short-range detection radar RB.
  • the surveillance radar device RD1 uses different radars for long range (that is, second detection area AR2) and short range (first detection area AR1). By doing so, the surveillance radar device RD1 can simultaneously perform detection of an object positioned at a long distance, detection of an object positioned at a short distance, and vital sensing.
  • The surveillance radar device RD1 in Embodiment 1 described above executes detection of an object positioned at a long distance (second detection area AR2), detection of an object positioned at a short distance (first detection area AR1), and vital sensing. An example in which the surveillance radar device RD1 in Modification 1 of Embodiment 1 determines whether or not a stationary object is a living thing will now be described.
  • the configuration of the surveillance radar device RD1 according to Modification 1 has the same internal configuration as that of the surveillance radar device RD1 according to the first embodiment.
  • functions realized by each internal configuration of the surveillance radar device RD1 in Modification 1 will be described.
  • the surveillance radar device RD1 executes the second determination process or the fifth determination process for determining whether or not the detected object is stationary (that is, whether or not it is moving).
  • When the surveillance radar device RD1 determines, as a result of the second determination process or the fifth determination process, that the detected object is not moving (that is, is stationary), it determines whether or not the object determined not to be moving is a living thing.
  • a second determination process example and a fifth determination process example will be described below.
  • Each of the radar ICs RB1, ..., RBN of the short-range detection radar RB determines whether or not the detected object is stationary.
  • When an object is moving, the wavelength of the reflected wave changes due to the Doppler phenomenon, so whether or not the object is moving can be determined based on the change in the wavelength of the reflected wave. That is, when the object is moving, the point cloud data includes information on the moving speed of the object in the direction toward the surveillance radar device RD1. Therefore, each of the radar ICs RB1, ..., RBN determines whether or not the detected object is stationary based on the moving-speed information included in the point cloud data. Subsequently, each of the radar ICs RB1, ..., RBN analyzes the signal corresponding to the object determined to be stationary and calculates the vital information of the object.
  • Alternatively, each of the radar ICs RB1, ..., RBN may compare the positions of the point cloud data acquired at different times to determine whether the detected object is stationary.
  • For example, each of the radar ICs RB1, ..., RBN determines whether or not the detected object is stationary based on the positions of the point cloud data at a plurality of times (for example, times t1, t2, t3, and t4 (t1 < t2 < t3 < t4)). In such a case, each of the radar ICs RB1, ..., RBN determines that the object is stationary when the positions of the point cloud data at the respective times are substantially unchanged.
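The stationary determination described in the bullets above, combining the Doppler-derived radial speed with the drift of the point-cloud position over times t1 to t4, can be sketched as follows. The speed and position-drift thresholds are illustrative assumptions.

```python
def is_stationary(radial_speeds, positions, speed_eps=0.1, pos_eps=0.2):
    """Hedged sketch of the second determination process: an object is
    treated as stationary when both its Doppler-derived radial speeds
    and the drift of its point-cloud position across successive times
    stay below small thresholds (values here are illustrative)."""
    if any(abs(v) > speed_eps for v in radial_speeds):
        return False
    x0, y0 = positions[0]
    return all(abs(x - x0) <= pos_eps and abs(y - y0) <= pos_eps
               for x, y in positions[1:])
```

An object whose reported speeds are near zero and whose cluster centre barely moves between samples is judged stationary, and its signal would then be passed on to the vital-information calculation.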
  • Each of the radar ICs RB1, ..., RBN determines whether the object is a stationary organism (human, animal, etc.) based on the calculated vital information of the object.
  • the processor 11 acquires information on the coordinates of the point cloud determined to be a stationary organism output from each of the radar ICs RB1, . . . , RBN, and vital information.
  • the processor 11 superimposes the acquired position of the stationary organism and the vital information on the map to generate a detection screen SC1 (see FIG. 11).
  • the processor 11 transmits the generated detection screen SC1 to the server S1.
  • FIG. 6 is a flowchart showing an operation procedure example of the surveillance radar device RD1 according to Modification 1 of Embodiment 1. Note that the respective processes of steps St104 to St105 and steps St204 to St205 may be executed by the radar IC or by the processor 11.
  • steps St100 to St101 and steps St200 to St202 are the same as those of FIG. 5, so description thereof will be omitted.
  • each of the radar ICs RA1, ..., RAM applies a background removal algorithm that removes the background from the generated point cloud data.
  • the background removal algorithm here is specifically CFAR (Constant False Alarm Rate) or the like.
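A minimal sketch of cell-averaging CFAR, one common form of the background-removal algorithm named above: each cell's noise level is estimated from training cells around a guard band, and the cell counts as a detection only if it exceeds a multiple of that estimate. The guard/training window sizes and the scale factor are illustrative.

```python
def ca_cfar(signal, guard=2, train=4, scale=3.0):
    """Cell-averaging CFAR over a 1-D magnitude profile. For each cell,
    average the training cells outside the guard band; flag the cell if
    it exceeds scale times that local noise estimate."""
    n = len(signal)
    hits = []
    for i in range(n):
        cells = []
        for j in range(i - guard - train, i + guard + train + 1):
            if 0 <= j < n and abs(j - i) > guard:
                cells.append(signal[j])
        noise = sum(cells) / len(cells)
        if signal[i] > scale * noise:
            hits.append(i)
    return hits
```

Because the threshold tracks the local noise estimate, the false-alarm rate stays roughly constant even when clutter levels vary across the scene, which is the point of using CFAR for background removal.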
  • Each of the radar ICs RA1, ..., RAM determines whether the object is a person.
  • Each of the radar ICs RB1, ..., RBN acquires reflected waves received by the receiving antenna units RBR1, ..., RBRN.
  • Each of the radar ICs RB1, . . . , RBN performs clustering processing on the obtained point cloud data (step St205).
  • each of the radar ICs RB1, ..., RBN further determines whether an object determined as a stationary object among the object type determination results is a living thing such as a human being or an animal.
  • Each of the radar ICs RB1, ..., RBN analyzes the signal from the object determined to be a living thing and calculates the vital information of the object (for example, information such as respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval) (step St205).
  • Each of the radar ICs RB1, ..., RBN associates, for each living thing, information such as the position (azimuth, distance) and moving speed of the creature indicated by the set of point clouds determined to be a stationary creature among the clustered point clouds with the calculated vital information, and generates a detection result of the living thing.
  • Each of the radar ICs RB1, . . . , RBN outputs to the processor 11 the generated organism detection results.
  • The processor 11 acquires the information on the coordinates of the point cloud corresponding to the creature output from each of the radar ICs RA1, ..., RAM and the radar ICs RB1, ..., RBN, and the vital information.
  • the processor 11 compares the coordinates of the organisms detected in the overlap area where the second detection area AR2 of the long-range detection radar RA and the first detection area AR1 of the short-range detection radar RB overlap (step St302). It should be noted that the processor 11 may omit the process of comparing the information regarding the organisms detected in the overlapping area when no overlapping area exists.
  • When the processor 11 determines that the same living thing is detected by each of the long-range detection radar RA and the short-range detection radar RB among the living things detected in the overlapping area, the processor 11 integrates the detection results (for example, position, distance, etc.) obtained by the two radars for that living thing into a single detection result.
  • the processor 11 generates a detection screen SC1 in which the detection result of the long-range detection radar RA and the detection result of the short-range detection radar RB are superimposed on the map, and transmits the detection screen SC1 to the server S1 (step St303).
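The overlap-area handling of steps St302 to St303 can be sketched as a nearest-neighbour merge of the two radars' detection lists. The 1 m same-object threshold and the preference for the short-range result are assumptions introduced for the sketch, not taken from the patent.

```python
import math

def merge_overlap_detections(ra_dets, rb_dets, same_obj_eps=1.0):
    """Hedged sketch of the overlap-area comparison: a long-range (RA)
    detection and a short-range (RB) detection closer than same_obj_eps
    metres are treated as the same living thing, and the short-range
    result (assumed more precise near the device) is kept; all other
    detections pass through unchanged."""
    merged = list(rb_dets)
    for ra in ra_dets:
        if all(math.dist(ra, rb) > same_obj_eps for rb in rb_dets):
            merged.append(ra)
    return merged
```

The merged list is what would then be superimposed on the map for the detection screen SC1, so the same creature is not drawn twice.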
  • the surveillance radar device RD1 may detect both stationary and moving creatures. In such a case, the surveillance radar device RD1 generates a detection screen SC1 by superimposing the detection result of a stationary creature and the detection result of a non-stationary object on a map, and transmits the detection screen SC1 to the server S1.
  • the surveillance radar device RD1 can determine whether the stationary object detected by the short-range detection radar RB is a mere object or a creature such as a human being or an animal. In other words, the surveillance radar device RD1 can improve the detection accuracy of stationary living things.
  • The surveillance radar device RD1 in Embodiment 1 described above executes detection of an object positioned at a long distance (second detection area AR2), detection of an object positioned at a short distance (first detection area AR1), and vital sensing. An example in which the surveillance radar device RD1 according to Modification 2 of Embodiment 1 determines whether a detected object is a human or an animal will now be described.
  • the configuration of the surveillance radar device RD1 in Modification 2 of Embodiment 1 has the same internal configuration as that of the surveillance radar device RD1 in Embodiment 1.
  • functions realized by each internal configuration of the surveillance radar device RD1 in Modification 2 of Embodiment 1 will be described.
  • the surveillance radar device RD1 determines whether the detected object is human or animal.
  • the learning model memory 15 may further store a learned AI model that can determine whether an object is human or animal.
  • Each of the radar ICs RB1, ..., RBN outputs the coordinate information of the point cloud determined to be human and the vital information to the processor 11.
  • the processor 11 acquires the coordinate information and the vital information of the point cloud determined to be human output from each of the radar ICs RB1, . . . , RBN.
  • the processor 11 superimposes the acquired human position and vital information on the map to generate a detection screen SC1 (see FIG. 11).
  • the processor 11 transmits the generated detection screen SC1 to the server S1.
  • the memory 12 stores information on areas where human entry is prohibited (hereinafter referred to as "no-entry areas") or lines through which human passage is prohibited (hereinafter referred to as "no-entry lines").
  • the entry prohibition area AR10 (see FIG. 4) and the entry prohibition line LN (see FIG. 4) may each be set in advance by the administrator, and a plurality of them may be set within the detection area AR0.
  • the entry prohibition line LN may include information on the direction in which entry is prohibited.
  • the entry prohibition line LN may be set to prohibit the human TG from passing through the entry prohibition line LN in the direction of the arrow.
  • entry prohibition area AR10 and entry prohibition line LN may be set so as to prohibit entry of animals.
  • the database DB11 may also store information on the no-entry area AR10 and information on persons permitted to enter the no-entry area AR10 (for example, face images, vital information, etc. of persons permitted to enter the no-entry area AR10) in association with each other.
  • the information on the entry prohibition line LN may be stored in association with information on persons permitted to pass through the entry prohibition line LN (for example, face images, vital information, etc. of such persons).
  • the processor 11 determines whether or not a person has entered the no-entry area AR10 based on the detected position of the person.
  • When the processor 11 determines that a person has entered the no-entry area AR10, the processor 11 generates an alarm that includes the position information of the person who has entered the no-entry area AR10 and notifies that the person's entry into the no-entry area AR10 has been detected. The processor 11 transmits the generated alarm to the server S1.
  • the warning may be the detection screen SC1.
  • the processor 11 may generate an alarm including a captured image obtained by cutting out at least a partial area in which a person who has entered the no-entry area AR10 is captured from the captured image transmitted from the camera C1.
  • the processor 11 acquires the person's flow line information (movement trajectory information) based on the time-series change in the detected position of the person. Based on the flow line information of the person, the processor 11 determines whether the detected person has passed through the entry prohibition line LN, or whether the detected person has passed through the entry prohibition line LN from a predetermined direction.
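The entry-prohibition-line judgment, including the direction condition, can be sketched with 2-D cross products over two consecutive flow-line positions. The function name and the 'forward'/'backward' labels are illustrative.

```python
def crossed_line(p_prev, p_next, a, b):
    """Did the person move from p_prev to p_next across the no-entry
    line segment a-b, and from which side? Returns None if the segment
    was not crossed, otherwise 'forward' or 'backward' depending on
    which side of a->b the movement started from."""
    def cross(o, u, v):
        return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])
    s1, s2 = cross(a, b, p_prev), cross(a, b, p_next)
    s3, s4 = cross(p_prev, p_next, a), cross(p_prev, p_next, b)
    if s1 * s2 < 0 and s3 * s4 < 0:  # the two segments properly intersect
        return 'forward' if s1 > 0 else 'backward'
    return None
```

With this shape, the "prohibited direction" of the entry prohibition line LN maps naturally onto accepting only one of the two return labels.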
  • When the processor 11 determines that the person has passed through the entry prohibition line LN, the processor 11 generates an alarm that includes the position information of the person who has passed through the entry prohibition line LN and notifies that the person has passed through the entry prohibition line LN.
  • Processor 11 transmits the generated alert to server S1.
  • the warning may be the detection screen SC1.
  • the processor 11 may generate an alarm including a captured image obtained by cutting out at least a partial region in which a person who has passed through the no-entry line LN is captured from the captured image transmitted from the camera C1.
  • When the set no-entry area AR10 or entry prohibition line LN is within the first detection area AR1 detectable by the short-range detection radar RB, the processor 11 may determine whether or not the person who has entered the no-entry area AR10 or passed through the entry prohibition line LN is a person registered in the database DB11. The processor 11 collates the vital information of the person detected to have entered the no-entry area AR10 or passed through the entry prohibition line LN with the vital information of each of the plurality of persons registered in the database DB11.
  • When the processor 11 determines that the vital information of each of the plurality of persons registered in the database DB11 includes vital information identical or similar to the detected person's vital information, the processor 11 omits the alarm generation process. On the other hand, when the processor 11 determines that the vital information registered in the database DB11 does not include vital information identical or similar to the detected person's vital information, the processor 11 generates an alarm and transmits it to the server S1.
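The collation of detected vital information against the database DB11 can be sketched as follows. Representing a person's vitals as a small dict and treating "similar" as every field within a 10 % relative tolerance are assumptions of this sketch, not the patent's matching rule.

```python
def is_registered(detected_vitals, db_vitals, tol=0.1):
    """Hedged sketch of the collation step: compare the detected
    person's vitals (e.g. {'resp': breaths/min, 'hr': beats/min}) with
    each registered entry; 'identical or similar' is modelled as every
    field within a relative tolerance (10 % here, illustrative)."""
    def similar(a, b):
        return all(abs(a[k] - b[k]) <= tol * b[k] for k in b)
    return any(similar(detected_vitals, entry) for entry in db_vitals)
```

Per the logic above, a True result (a registered person) would suppress the alarm, while a False result would let the alarm be generated and sent to the server S1.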
  • FIG. 7 is a flow chart showing an operation procedure example of the surveillance radar device RD1 according to the second modification of the first embodiment.
  • FIG. 8 is a flow chart showing an operation procedure example of the surveillance radar device RD1 according to the second modification of the first embodiment.
  • the processing of steps St102 to St103 and steps St203 to St206 may be executed by the radar IC or by the processor 11.
  • steps St100 to St103 and steps St200 to St203 are the same as those of FIG. 5, so description thereof will be omitted.
  • the process of step St206 is an example of the third determination process or the sixth determination process.
  • each of the radar ICs RB1, ..., RBN may determine whether the object is a human or an animal based on the size of the clustered point cloud, the radio wave reflection intensity, and the like, as described above.
  • each of the radar ICs RB1, ..., RBN acquires the image analysis results transmitted from the camera C1.
  • Each of the radar ICs RB1, ..., RBN generates an object (human) detection result in which information such as the position (azimuth, distance) and moving speed of an object determined to be human is associated with the human's vital information for each person, and outputs it to the processor 11.
  • the processor 11 acquires the object (for example, person, animal, etc.) detection result output from the long-range detection radar RA and the object (human) detection result output from the short-range detection radar RB.
  • the processor 11 generates a detection screen SC1 (see FIG. 11) in which these acquired detection results are superimposed on the map (step St304).
  • the processor 11 determines whether or not the entire set no-entry area AR10 is within the first detection area AR1 by the short-range detection radar RB (step St305).
  • When the processor 11 determines in the process of step St305 that the entire set no-entry area AR10 is within the first detection area AR1 by the short-range detection radar RB (step St305, YES), the processor 11 determines whether or not the object detected by the short-range detection radar RB has entered the no-entry area AR10 (step St306).
  • When the processor 11 determines in the process of step St306 that the detected object has entered the no-entry area AR10 (step St306, YES), the processor 11 determines whether or not the detected object is a human (step St307).
  • the process of step St307 may be omitted when the process of step St206 is executed before the process of step St307.
  • When the processor 11 determines in the process of step St306 that the detected object has not entered the no-entry area AR10 (step St306, NO), the processor 11 transmits the detection screen SC1 generated in the process of step St304 to the server S1 (step St311).
  • When the processor 11 determines in the process of step St307 that the detected object is a human (step St307, YES), it generates an alarm notifying that a human has entered the no-entry area AR10 and transmits it to the server S1 (step St308). The processor 11 also transmits the detection screen SC1 generated in the process of step St304 to the server S1 (step St311).
  • When the processor 11 determines in the process of step St307 that the detected object is not a human (step St307, NO), the process proceeds to step St311.
  • When the processor 11 determines in the process of step St305 that the entire set no-entry area AR10 is not within the first detection area AR1 by the short-range detection radar RB (step St305, NO), the processor 11 determines whether or not an object detected by the long-range detection radar RA or the short-range detection radar RB has entered the no-entry area AR10 (step St309).
  • When the processor 11 determines in the process of step St309 that the detected object has entered the no-entry area AR10 (step St309, YES), it generates an alarm notifying that the entry of the object into the no-entry area AR10 has been detected, and transmits (issues) it to the server S1 (step St308).
  • Objects detected here include moving objects such as humans and animals for which vital information can be obtained, and vehicles and motorcycles for which vital information cannot be obtained.
  • When the processor 11 determines in the process of step St309 that the detected object has not entered the no-entry area AR10 (step St309, NO), it proceeds to the process of step St311.
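The branch of steps St305 to St309 can be sketched as follows, modelling the first detection area as a circle of radius r1 around PS0 and the no-entry area AR10 as a convex polygon. These modelling choices, and the omission of the human check of step St307, are assumptions made for brevity.

```python
import math

def area_fully_in_range(corners, r1, ps0=(0.0, 0.0)):
    """St305: is the whole no-entry area inside the first detection
    area? Modelled as every corner of a convex polygon lying within
    distance r1 of the installation point PS0 (an assumption)."""
    return all(math.dist(c, ps0) <= r1 for c in corners)

def in_area(p, corners):
    """Point-in-convex-polygon test via consistent cross-product signs."""
    signs = []
    for i in range(len(corners)):
        a, b = corners[i], corners[(i + 1) % len(corners)]
        signs.append((b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]))
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

def should_alarm(corners, r1, rb_dets, ra_dets):
    """St306/St309 branch: when AR10 is fully covered by the short-range
    radar only its detections are checked; otherwise an object seen by
    either radar inside AR10 raises the alarm."""
    dets = rb_dets if area_fully_in_range(corners, r1) else rb_dets + ra_dets
    return any(in_area(p, corners) for p in dets)
```

In the full flow, a True result would trigger the alarm of step St308 before the detection screen SC1 is sent in step St311.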
  • the surveillance radar device RD1 can detect an object entering the no-entry area AR10 when the no-entry area AR10 is set within the detection area AR0.
  • the monitoring radar device RD1 can detect an object passing through the entry prohibition line LN when the entry prohibition line LN or the like is set.
  • When the surveillance radar device RD1 detects an object entering the no-entry area AR10 or passing through the entry prohibition line LN, it generates an alarm different from the detection screen SC1 and transmits it to the server S1. This allows the administrator to intuitively understand, based on the alarm, that an object entering the no-entry area AR10 or an object passing through the entry prohibition line LN has been detected.
  • In Embodiment 1 and Modification 1 of Embodiment 1, the entry prohibition area AR10 or the entry prohibition line LN may be set according to the example described in Modification 2 of Embodiment 1.
  • the surveillance radar device RD1 in Embodiment 1 may execute the processes of steps St305 to St311 after the process of step St300. Note that the surveillance radar device RD1 may omit the process of step St307. As a result, the surveillance radar device RD1 can detect an object entering the prohibited entry area AR10 or an object passing through the prohibited entry line LN.
  • the surveillance radar device RD1 in Modification 1 of Embodiment 1 may execute the processes of steps St305 to St311 after the process of step St302.
  • the surveillance radar device RD1 may omit the process of step St307.
  • the surveillance radar device RD1 can detect an object entering the prohibited entry area AR10 or an object passing through the prohibited entry line LN.
  • The surveillance radar device RD1 in Embodiment 1 described above executes detection of an object positioned at a long distance (second detection area AR2), detection of an object positioned at a short distance (first detection area AR1), and vital sensing.
  • An example in which the surveillance radar device RD1 in Modification 3 of Embodiment 1 determines, based on an emotion index of the object calculated using the acquired vital information, whether or not the detected object is an object about which the administrator should be warned will be described.
  • the configuration of the surveillance radar device RD1 in Modification 3 of Embodiment 1 has the same internal configuration as that of the surveillance radar device RD1 in Embodiment 1.
  • functions realized by each internal configuration of the surveillance radar device RD1 in Modification 3 of Embodiment 1 will be described.
  • the surveillance radar device RD1 calculates a person's emotion index based on the chronological change in vital information. The surveillance radar device RD1 determines whether or not a warning determination is necessary based on the calculated emotion index of the person.
  • the short-range detection radar RB calculates an emotion index that estimates the emotion of each object (for example, a person detected by the short-range detection radar RB) based on the chronological change in the acquired vital information.
  • the short-range detection radar RB evaluates an attention index for determining whether to generate an alarm based on the calculated human emotion index.
  • When the calculated human emotion index indicates that caution is required, the short-range detection radar RB evaluates the attention index highly. Also, the short-range detection radar RB may evaluate the attention index based on the human emotion index and the acquired vital information. The short-range detection radar RB determines whether to generate an alarm based on whether the evaluated caution index is greater than or equal to a threshold. When the short-range detection radar RB determines that the evaluated caution index is equal to or greater than the threshold, it generates an alarm (an example of a first notification) that includes the position information of the person who passed through the entry prohibition line LN and notifies that the person detected by the short-range detection radar RB has been detected as a person to whom attention should be paid, and outputs the alarm to the processor 11.
  • the processor 11 may perform each of the processing of calculating the emotion index, the processing of evaluating the attention index, and the processing of determining whether to generate an alarm. Functions realized by the processor 11 and the learning model memory 15 when each of these processes is executed by the processor 11 will be described below.
  • the learning model memory 15 further stores a learned AI model capable of calculating a person's emotional index based on changes in the person's vital information over time.
  • the processor 11 calculates an emotion index that estimates the emotion of each object based on the chronological change in the acquired vital information. Processor 11 evaluates an attention index to determine whether to generate an alert based on the calculated human emotion index.
  • the processor 11 determines whether to generate an alert based on whether the evaluated caution index is equal to or greater than the threshold.
  • When the processor 11 determines that the evaluated caution index is equal to or greater than the threshold, the processor 11 generates an alarm (an example of a first notification) that includes the position information of the person who passed through the entry prohibition line LN and notifies that the person detected by the short-range detection radar RB has been detected as a person to whom attention should be paid.
  • Processor 11 transmits the generated alert to server S1. Note that the warning may be the detection screen SC1. Further, the processor 11 may generate an alarm including a captured image obtained by cutting out at least a partial region of a person whose attention index is determined to be equal to or greater than the threshold from the captured image transmitted from the camera C1.
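The caution-index evaluation and alarm decision can be sketched as follows. Combining the model-derived emotion index with heart-rate elevation, and the particular weights, baseline, and threshold, are illustrative stand-ins for whatever the learned AI model in the learning model memory 15 would actually provide.

```python
def evaluate_caution(emotion_index, hr_now, hr_baseline=70.0):
    """Hedged sketch of the attention-index evaluation: blend the
    emotion index (assumed 0-1 from the AI model) with how far the
    current heart rate sits above an assumed baseline."""
    hr_term = max(0.0, (hr_now - hr_baseline) / hr_baseline)
    return 0.7 * emotion_index + 0.3 * hr_term  # weights are illustrative

def should_generate_alarm(caution, threshold=0.5):
    """The alarm is generated when the caution index reaches the threshold."""
    return caution >= threshold
```

A person with a high emotion index and an elevated heart rate would exceed the threshold and trigger the first-notification alarm; a calm person with baseline vitals would not.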
  • FIG. 9 is a flow chart showing an operation procedure example of the surveillance radar device RD1 according to the third modification of the first embodiment.
  • FIG. 10 is a flow chart showing an operation procedure example of the surveillance radar device RD1 according to the third modification of the first embodiment.
  • the processing of steps St102 to St103 and steps St203 to St206 may be executed by the radar IC or by the processor 11.
  • steps St100 to St103 and steps St200 to St203 are the same as those of FIG. 5, so description thereof will be omitted.
  • Each of the radar ICs RB1, ..., RBN generates time series data in which the acquired vital information of the object is arranged in time series.
  • Each of the radar ICs RB1, ..., RBN associates, for each person, information as to whether or not the detected person is a person to be warned about, information such as the position (azimuth, distance) and moving speed of the person, and the vital information of the person, and outputs the associated result to the processor 11.
  • the processor 11 acquires the object (for example, human, animal, etc.) detection result output from the long-range detection radar RA and the object (human) detection result output from the short-range detection radar RB.
  • the processor 11 generates a detection screen SC1 (see FIG. 11) in which the acquired detection results of these objects are superimposed on the map (step St312).
  • the processor 11 determines whether or not the person detected by the short-range detection radar RB has entered the no-entry area AR10 (step St313).
  • When the processor 11 determines in the process of step St313 that the detected human has entered the no-entry area AR10 (step St313, YES), it calculates the caution index of the detected object. The processor 11 determines whether or not the calculated caution index is equal to or greater than the threshold (step St314).
  • When the processor 11 determines in the process of step St313 that the detected person has not entered the no-entry area AR10 (step St313, NO), it generates a detection screen SC1 by superimposing on a map the detection results of the objects detected by each of the long-range detection radar RA and the short-range detection radar RB, and transmits the generated detection screen SC1 to the server S1 (step St315).
  • When the processor 11 determines in the process of step St314 that the calculated caution index is equal to or greater than the threshold (step St314, YES), it generates an alarm notifying that a person to be warned of has been detected, and transmits it (issues a report) to the server S1 (step St316).
  • When the processor 11 determines in the process of step St314 that the calculated caution index is not equal to or greater than the threshold (step St314, NO), the process proceeds to step St315.
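The branch structure of steps St313 to St316 can be sketched as follows. This is a hypothetical Python sketch only: the threshold value, the rectangular area representation, and the way the caution index is derived from the emotion index are all assumptions, not details given in this document.

```python
from dataclasses import dataclass


CAUTION_THRESHOLD = 0.7  # assumed value; the document does not specify the threshold


@dataclass
class Person:
    position: tuple           # (x, y) coordinates on the map
    emotion_index: float      # assumed scale: 0.0 (calm) .. 1.0 (excited/aggressive)


@dataclass
class NoEntryArea:
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, position) -> bool:
        x, y = position
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


def handle_detection(person: Person, area: NoEntryArea) -> str:
    """Follow the branches of steps St313 (intrusion check) and St314 (threshold check)."""
    if area.contains(person.position):             # step St313
        caution_index = person.emotion_index       # caution index derived from the emotion index
        if caution_index >= CAUTION_THRESHOLD:     # step St314
            return "alarm"                         # step St316: report to the server S1
    return "screen"                                # step St315: send the detection screen SC1
```

A person inside the area with a high caution index triggers the alarm branch; all other cases fall through to sending the detection screen.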
  • In this way, the surveillance radar device RD1 calculates a caution index based on the emotion index of the detected person and, based on the calculated caution index, can notify the administrator of the detection of a person requiring caution (that is, monitoring or vigilance). As a result, the surveillance radar device RD1 can support the administrator in detecting and monitoring persons to whom attention should be paid.
  • FIG. 11 is a diagram showing an example of the detection screen SC1.
  • The detection screen SC1 is generated by the processor 11 of the surveillance radar device RD1 and displayed on the monitor MN by the server S1. Note that the detection screen SC1 may be generated by superimposing the detection result on either a two-dimensional map or a three-dimensional map.
  • The detection screen SC1 shown in FIG. 11 is an example generated by superimposing the detection result on a two-dimensional map.
  • On the detection screen SC1, the installation position PS0 of the surveillance radar device RD1 and the respective positions of the objects PS1, PS2, and PS3 detected by the surveillance radar device RD1 are superimposed on a map including the detection area AR0 of the surveillance radar device RD1.
  • When it is determined, as a result of the first determination process or the fourth determination process, that the object PS3 detected by the short-range detection radar RB is a living thing (a person, an animal, etc.), the vital information INF is further superimposed on the detection screen SC1.
  • The vital information INF shown in FIG. 11 includes, as an example, heart rate information “heartbeat: ◯◯” and respiration rate information “breathing: XXX” of the object PS3, but is not limited to this.
  • The vital information INF may include the result of the third determination process or the sixth determination process for determining whether or not the object detected in the first detection area AR1 is a human.
  • The processor 11 may generate vital information INF (an example of first notification information) including information notifying that the object is human, or vital information INF (an example of second notification information) including information notifying that the object is not human.
  • The vital information INF (an example of the first notification information) shown in FIG. 11 includes collation result information "unregistered person" indicating that the object PS3 is a human but not a person pre-registered in the database DB11. If the object PS3 is a human and is a person pre-registered in the database DB11, the vital information INF may instead include collation result information indicating that the person is pre-registered in the database DB11 (for example, the person's name, face image, employee number, etc. associated with the vital information stored in the database DB11).
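Assembling the vital information INF with its collation result can be illustrated as a small dictionary-building step. The field names, the database layout, and the matching rule below are illustrative assumptions, not details from this document.

```python
def build_vital_info(heart_rate: int, respiration_rate: int,
                     database_db11: dict, person_id=None) -> dict:
    """Assemble a vital-information overlay with a collation result,
    mirroring the "unregistered person" / registered-person cases."""
    info = {"heartbeat": heart_rate, "breathing": respiration_rate}
    record = database_db11.get(person_id) if person_id is not None else None
    if record is None:
        # the detected person does not match any pre-registered entry
        info["collation"] = "unregistered person"
    else:
        # e.g. name / employee number associated with the stored vital information
        info["collation"] = record["name"]
    return info
```

With a matching entry in the stand-in database the overlay carries the registered name; otherwise it carries the "unregistered person" label shown in FIG. 11.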
  • The server S1 executes enlargement, reduction, rotation, etc. of the detection screen SC1 displayed on the monitor MN based on corresponding operations by the administrator via the operation unit 23, and displays the result on the monitor MN.
  • The surveillance radar device RD1 has a transmission antenna unit that transmits a first radio wave in a first detection area AR1 (an example of a first range).
  • The surveillance radar device RD1 also includes radar ICs RB1, ..., RBN (an example of a first detection processing unit) that detect the presence or absence of an object in the first detection area AR1 based on the reflected wave and acquire a detection result (an example of first information) regarding the object detected in the first detection area AR1, and transmission antenna units RAT1, ...
  • Since the surveillance radar device RD1 according to the first embodiment and modifications 1 to 3 of the first embodiment covers both a long range (that is, the second detection area AR2) and a short range (the first detection area AR1), it can simultaneously perform detection of an object located at a long distance, detection of an object located at a short distance, and vital sensing.
  • Because the surveillance radar device RD1 according to Embodiment 1 uses different radars for the long range (that is, the second detection area AR2) and for the short range (the first detection area AR1), it can monitor a wide range with a single unit while determining whether an object at a short distance is a living thing.
  • The detection result regarding the object detected in the first detection area AR1 further includes a position (an example of first coordinate information) indicating the coordinates of the object detected in the first detection area AR1.
  • The detection result regarding the object detected in the second detection area AR2 includes second coordinate information indicating the coordinates of the object detected in the second range.
  • The monitoring radar device RD1 according to the first embodiment and the first to third modifications of the first embodiment includes a processor 11 that generates and outputs a detection screen SC1 (an example of notification information) relating to the detected object, based on the detection results acquired by the radar ICs RB1, ..., RBN and the detection results (an example of second information) acquired by the radar ICs RA1, ..., RAM.
  • The surveillance radar device RD1 according to the first embodiment and modifications 1 to 3 of the first embodiment outputs the detection result of the object detected in the detection area AR0, and can thereby support the monitoring work performed by administrators, security guards, and the like.
  • The radar ICs RB1, ..., RBN of the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 perform a first determination process of determining, based on the identification information, whether the object detected in the first detection area AR1 is a living thing.
  • The processor 11 generates the detection screen SC1 by superimposing, on a map including at least the first detection area AR1, information (for example, vital information of the object) indicating that an object determined to be a living thing by the first determination process is a living thing, together with position information (an example of first coordinate information) indicating the coordinates (that is, azimuth and distance) corresponding to that object.
  • Thereby, the surveillance radar device RD1 can detect an object (living thing) using the short-range detection radar RB and, by outputting the position at which the object was detected, can support monitoring work performed by administrators, security guards, and the like.
  • The radar ICs RB1, ..., RBN of the surveillance radar device RD1 according to the first modification of the first embodiment perform a second determination process of determining whether or not the object detected in the first detection area AR1 is moving; if the identification information corresponding to an object determined not to be moving in the second determination process includes biological information indicating that the object is a living thing, that object is determined to be a living thing. Accordingly, the surveillance radar device RD1 according to Modification 1 of Embodiment 1 can determine whether or not the detected object is a living thing.
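The second determination process above can be sketched as follows. The motion threshold and the field names of the identification information are assumptions made for illustration; the document only states the rule that a stationary object with biological information is judged to be a living thing.

```python
def second_determination(speed_mps: float, identification: dict):
    """Sketch of the second determination process: a non-moving object is judged
    to be a living thing if its identification information contains biological
    information (e.g. a respiration rate or a heart rate)."""
    MOTION_THRESHOLD = 0.1  # assumed; any measurable speed counts as moving
    if speed_mps > MOTION_THRESHOLD:
        return None  # moving objects are handled by the other determination processes
    has_bio = bool(identification.get("respiration_rate")
                   or identification.get("heart_rate"))
    return has_bio
```

A stationary object with a respiration or heart rate is classified as living; a stationary object with neither is not; a moving object is deferred to the other processes.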
  • The radar ICs RB1, ..., RBN of the surveillance radar device RD1 according to Modification 2 of Embodiment 1 perform a third determination process of determining whether the object determined to be a living thing in the first determination process is a human, based on at least one of the identification information, the size of the object, and the intensity of the reflected wave of the first radio wave received by the receiving antenna units RBR1, ..., RBRN.
  • The surveillance radar device RD1 according to the second modification of the first embodiment can more accurately determine whether the detected object is a person or an animal.
  • The processor 11 of the surveillance radar device RD1 performs a fourth determination process of determining, based on the identification information, whether or not the object detected in the first detection area AR1 is a living thing.
  • The processor 11 generates the detection screen SC1 by superimposing, on a map showing at least the first detection area AR1, information indicating that an object determined to be a living thing in the fourth determination process is a living thing, together with position information corresponding to that object.
  • The surveillance radar device RD1 can detect an object (living thing) using the short-range detection radar RB and, by outputting the position at which this object is detected, can support surveillance work performed by administrators, security guards, and the like.
  • The processor 11 of the surveillance radar device RD1 performs a fifth determination process of determining whether or not the object detected in the first detection area AR1 is moving; if the identification information corresponding to an object determined not to be moving in the fifth determination process includes biological information indicating that the object is a living thing, that object is determined to be a living thing.
  • The surveillance radar device RD1 according to Modification 1 of Embodiment 1 can determine whether or not a stationary object is a living thing.
  • The processor 11 of the surveillance radar device RD1 according to the second modification of the first embodiment performs a sixth determination process of determining whether the object determined to be a living thing in the fourth determination process is a human, based on at least one of the identification information, the size of the object, and the intensity of the reflected waves of the first radio wave received by the receiving antenna units RBR1, ..., RBRN.
  • The surveillance radar device RD1 according to the second modification of the first embodiment can more accurately determine whether the detected object is a person or an animal.
  • The processor 11 of the surveillance radar device RD1 according to the second modification of the first embodiment generates first notification information for a first notification regarding an object determined to be human by the radar ICs RB1, ..., RBN or the processor 11, and second notification information for a second notification, different from the first notification, regarding an object not determined to be human by the radar ICs RB1, ..., RBN or the processor 11.
  • The surveillance radar device RD1 according to the second modification of the first embodiment can notify the administrator, security guards, etc. whether or not the detected living thing is a human (person).
  • When the processor 11 of the surveillance radar device RD1 according to the second modification of the first embodiment determines that an object determined to be human has entered the no-entry area AR10, into which humans are prohibited from entering, it issues, as the first notification, a notification that a person has entered the no-entry area AR10.
  • The surveillance radar device RD1 according to Modification 2 of Embodiment 1 can support surveillance work by notifying the administrator, security guards, etc. of the detected intrusion of a person into the preset intrusion-prohibited area AR10.
  • The processor 11 of the surveillance radar device RD1 according to the third modification of the first embodiment obtains, based on the vital information of the object determined to be human by the radar ICs RB1, ..., RBN or the processor 11, the emotion index of that object, and notifies the emotion index relating to that object as the first notification.
  • The surveillance radar device RD1 according to the third modification of the first embodiment can support surveillance work by notifying the administrator, security guards, etc. of the detection of a person for whom an excited or aggressive emotion index has been calculated.
  • When an object determined by the radar ICs RB1, ..., RBN or the processor 11 to require attention has entered the no-entry area, the processor 11 of the surveillance radar device RD1 according to the third modification of the first embodiment issues, as the first notification, a notification that an object requiring attention has entered the no-entry area.
  • The surveillance radar device RD1 according to the third modification of the first embodiment can support monitoring work by notifying the administrator, security guards, etc. when a person for whom an excited or aggressive emotion index has been calculated enters the preset no-entry area AR10.
  • When the biometric information of the object determined to be human by the radar ICs RB1, ..., RBN or the processor 11 is similar to biometric information registered in advance in the database DB11, the processor 11 of the surveillance radar device RD1 according to the second modification of the first embodiment issues, as the first notification, the collation result for that person.
  • The surveillance radar device RD1 according to Modification 2 of Embodiment 1 can determine whether a person located within the first detection area AR1 is a person registered in advance in the database DB11, that is, a person authorized to enter the first detection area AR1.
  • The first radio wave and the second radio wave in the surveillance radar device RD1 according to Modifications 1 to 3 of Embodiment 1 are millimeter waves or microwaves.
  • The surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 emits radio waves in a wavelength band more suitable for acquiring vital information, so that objects (people, animals, etc.) in the detection area AR0 can be detected with higher accuracy.
  • The detection results acquired by the radar ICs RB1, ..., RBN further include the distance from the installation position PS0 of the surveillance radar device RD1 to the object detected in the first detection area AR1, the azimuth from the installation position PS0 of the surveillance radar device RD1 toward the object detected in the first detection area AR1, and the height of the detected object.
  • The surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can determine the type of object (for example, person, animal, vehicle, etc.) based on the position, azimuth, and height of the object.
  • The detection results obtained by the radar ICs RB1, ..., RBN further include the moving speed of the object detected in the first detection area AR1.
  • The radar ICs RA1, ..., RAM obtain the moving speed of the object detected in the second detection area AR2.
  • The detection results obtained by the radar ICs RA1, ..., RAM further include the moving speed of the object detected in the second detection area AR2.
  • The surveillance radar device RD1 according to Modifications 1 to 3 of Embodiment 1 can determine the type of object (for example, person, animal, vehicle, etc.) with higher accuracy based on the calculated moving speed of the object.
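One way the fields carried in the detection results (height, moving speed, vital signs) could narrow the object type is sketched below. All thresholds and category rules are illustrative assumptions; the document does not specify a classification rule.

```python
def classify_object(height_m: float, speed_mps: float, has_vital_signs: bool) -> str:
    """Toy classifier over the fields carried in the detection results."""
    if not has_vital_signs:
        # no respiration/heartbeat detected: an inanimate object
        return "vehicle" if speed_mps > 5.0 else "stationary object"
    # living thing: distinguish person from animal by height (assumed cut-off)
    return "person" if height_m >= 1.0 else "animal"
```

Combining the moving speed with the height and vital-sign fields lets each category be separated with one comparison per field.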
  • The map used by the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 is two-dimensional or three-dimensional map data.
  • The monitoring radar device RD1 according to the first embodiment and modifications 1 to 3 of the first embodiment can support an administrator's monitoring work by means of the detection screen SC1 in which detection information is superimposed on a two-dimensional or three-dimensional map.
  • The identification information includes at least one of the respiration rate, the heart rate, the blood pressure, the respiration interval, and the heartbeat interval.
  • The surveillance radar device RD1 according to the first embodiment and modifications 1 to 3 of the first embodiment can determine the type of detected object (for example, person, animal, etc.) with higher accuracy based on the identification information.
  • The detection result regarding the object detected in the second detection area AR2 includes the identification information (for example, respiration rate, heart rate).
  • The surveillance radar device RD1 according to the first embodiment and modifications 1 to 3 of the first embodiment can determine the type of detected object (for example, a living thing, a stationary object, etc.) with higher accuracy based on the identification information.
  • The monitoring method executed by the detection system 100 (an example of a monitoring system) according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 includes: transmitting the first radio wave in the first detection area AR1 (an example of the first range); receiving a reflected wave of the first radio wave; detecting the presence or absence of an object in the first detection area AR1 based on the reflected wave of the first radio wave; acquiring first information that is information about the detected object and includes identification information by which it can be determined whether or not the object is a living thing; transmitting the second radio wave in the second detection area AR2 (an example of the second range), which is wider than the first detection area AR1; receiving a reflected wave of the second radio wave; detecting the presence or absence of an object in the second detection area AR2 based on the reflected wave of the second radio wave; and acquiring second information about the object detected in the second detection area AR2.
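The sequence of steps in the monitoring method above can be sketched as one cycle over the two radar ranges. The radar interface below is a hypothetical stand-in written only to show the ordering of the steps, not the device's actual API.

```python
class RadarStub:
    """Minimal stand-in for one radar front end (transmit / receive / detect)."""

    def __init__(self, objects):
        self._objects = objects

    def transmit(self):
        pass  # emit the radio wave into the assigned range

    def receive(self):
        return list(self._objects)  # pretend these are the received echoes

    def detect(self, echoes):
        return {"objects": echoes, "count": len(echoes)}


def monitoring_cycle(short_radar: RadarStub, long_radar: RadarStub):
    """One pass of the method: first radio wave in AR1, then the second in the wider AR2."""
    short_radar.transmit()
    first_info = short_radar.detect(short_radar.receive())   # includes identification info
    long_radar.transmit()
    second_info = long_radar.detect(long_radar.receive())    # objects in the wider range
    return first_info, second_info
```

Each cycle yields the first information from the short range and the second information from the wider range, which a caller could then merge onto the detection screen.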
  • Thereby, the detection system 100 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 covers both a long range (that is, the second detection area AR2) and a short range (detection area AR0), and can therefore simultaneously detect objects located at long and short distances and perform vital sensing.
  • Because the surveillance radar device RD1 according to the first embodiment uses different radars for the long range (that is, the second detection area AR2) and for the short range (detection area AR0), it can monitor a wide range with a single unit while determining whether an object at a short distance is a living thing.
  • The present disclosure is useful as a monitoring device, a monitoring system, and a monitoring method that improve the accuracy with which living things are detected by radar.
  • AI processing unit 14 AI arithmetic processing unit 15 learning model memory 100 detection system AR0 detection area AR1 first detection area AR2 second detection area C1 camera DB11, DB2 database DR1 security drone RA1, RAM, RB1, RBN radar IC RAT1, RATM, RBT1, RBTN Transmitting antenna unit RAR1, RARM, RBR1, RBRN Receiving antenna unit RD1 Monitoring radar device MN Monitor NW Network S1 Server TP1 Security guard terminal

Abstract

A monitoring device comprises: a first transmission/reception unit that transmits first radio waves in a first range and receives reflected waves of the first radio waves; a first detection processing unit that detects the presence or absence of an object in the first range and acquires first information about the object detected in the first range; a second transmission/reception unit that transmits second radio waves in a second range larger than the first range and receives reflected waves of the second radio waves; and a second detection processing unit that detects the presence or absence of an object in the second range and acquires second information about the object detected in the second range, wherein the first information includes identification information by which it can be determined whether or not the object is a living thing.
PCT/JP2023/004584 2022-02-28 2023-02-10 Dispositif de surveillance, système de surveillance et procédé de surveillance WO2023162723A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022029909 2022-02-28
JP2022-029909 2022-02-28

Publications (1)

Publication Number Publication Date
WO2023162723A1 true WO2023162723A1 (fr) 2023-08-31

Family

ID=87765798

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/004584 WO2023162723A1 (fr) 2022-02-28 2023-02-10 Dispositif de surveillance, système de surveillance et procédé de surveillance

Country Status (1)

Country Link
WO (1) WO2023162723A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006109771A1 (fr) * 2005-04-11 2006-10-19 Optex Co., Ltd. Capteur d’intrusion
JP2007178140A (ja) * 2005-12-27 2007-07-12 Hitachi Ltd 物体検知センサ
CN110779150A (zh) * 2019-11-14 2020-02-11 宁波奥克斯电气股份有限公司 一种基于毫米波的空调器控制方法、装置及空调器
JP2020024185A (ja) * 2018-06-22 2020-02-13 旭化成エレクトロニクス株式会社 センサ装置およびシステムならびに生体センシング方法およびシステム


Similar Documents

Publication Publication Date Title
US11164329B2 (en) Multi-channel spatial positioning system
US20210117585A1 (en) Method and apparatus for interacting with a tag in a cold storage area
US8884813B2 (en) Surveillance of stress conditions of persons using micro-impulse radar
US11514207B2 (en) Tracking safety conditions of an area
CN101221621B (zh) 警告监督用户受监视用户的行为的方法和系统
US11615620B2 (en) Systems and methods of enforcing distancing rules
Lin et al. WiAU: An accurate device-free authentication system with ResNet
US11727518B2 (en) Systems and methods for location fencing within a controlled environment
JPWO2007138811A1 (ja) 不審行動検知装置および方法、プログラムおよび記録媒体
AU2017376121A1 (en) Drone pre-surveillance
US11625510B2 (en) Method and apparatus for presentation of digital content
WO2021082112A1 (fr) Procédé d'entraînement de réseau neuronal, procédé de construction de diagramme de squelette et procédé et système de surveillance de comportement anormal
US11450197B2 (en) Apparatus and method of controlling a security system
JP2008204219A (ja) 防犯システム、それに用いる不審者検出装置及び防犯サーバ
CN105938648A (zh) 一种家庭安全综合管理系统及方法
US20110260859A1 (en) Indoor and outdoor security system and method of use
CN209312052U (zh) 一种人脸识别随身警务系统
EP3910539A1 (fr) Systèmes et procédés d'identification de personnes d'intérêt
Nadeem et al. A smart city application design for efficiently tracking missing person in large gatherings in Madinah using emerging IoT technologies
WO2023162723A1 (fr) Dispositif de surveillance, système de surveillance et procédé de surveillance
Johnson Jr et al. Social-distancing monitoring using portable electronic devices
CN111914050A (zh) 基于特定场所的可视化3d监控平台
CN108022411B (zh) 基于图像处理的监控系统
JP2019213116A (ja) 画像処理装置、画像処理方法およびプログラム
US11341741B2 (en) Arial based parolee tracking and pursuit

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23759727

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2024503019

Country of ref document: JP

Kind code of ref document: A