WO2023162723A1 - Monitoring device, monitoring system, and monitoring method - Google Patents


Info

Publication number
WO2023162723A1
Authority
WO
WIPO (PCT)
Prior art keywords
range
information
processor
detection
radar
Prior art date
Application number
PCT/JP2023/004584
Other languages
French (fr)
Japanese (ja)
Inventor
Masashi Koga (雅士 古賀)
Original Assignee
i-PRO Co., Ltd. (i-PRO株式会社)
Priority date
Filing date
Publication date
Application filed by i-PRO Co., Ltd. (i-PRO株式会社)
Publication of WO2023162723A1 publication Critical patent/WO2023162723A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/117: Identification of persons
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/02: Systems using reflection of radio waves, e.g. primary radar systems; analogous systems
    • G01S 13/50: Systems of measurement based on relative movement of target
    • G01S 13/52: Discriminating between fixed and moving objects or between objects moving at different speeds
    • G01S 13/56: Discriminating between fixed and moving objects or between objects moving at different speeds, for presence detection
    • G01S 13/87: Combinations of radar systems, e.g. primary radar and secondary radar
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 25/00: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B 25/01: Alarm systems characterised by the transmission medium
    • G08B 25/04: Alarm systems using a single signalling line, e.g. in a closed loop

Definitions

  • The present disclosure relates to a monitoring device, a monitoring system, and a monitoring method.
  • Patent Document 1 discloses a non-contact vital signs monitoring system having a radar installed near a monitored subject.
  • The non-contact vital signs monitoring system stores the radar output signals sampled sequentially within a predetermined period, determines the status of the monitored subject based on those output signals, and determines the subject's vital signs based on the output signals obtained while the subject is stationary.
  • The non-contact vital signs monitoring system uses the radar to detect and measure indicators of the monitored subject's health status and signs of illness, such as body temperature, blood pressure, heart rate, and respiratory rate.
  • A surveillance radar distinguishes background objects such as buildings and trees from a person by detecting moving objects.
  • The present disclosure has been devised in view of the conventional circumstances described above, and aims to improve the accuracy with which radar detects living things.
  • The present disclosure provides a monitoring device including: a first transceiver that transmits a first radio wave in a first range and receives a reflected wave of the first radio wave; a first detection processing unit that, based on the reflected wave received by the first transceiver, detects the presence or absence of an object in the first range and acquires first information about an object detected in the first range; a second transceiver that transmits a second radio wave in a second range wider than the first range and receives a reflected wave of the second radio wave; and a second detection processing unit that, based on the reflected wave received by the second transceiver, detects the presence or absence of an object in the second range and acquires second information about an object detected in the second range. The first information includes identification information that makes it possible to determine whether the object is a living thing.
  • The present disclosure also provides a monitoring system including a monitoring device and an information processing device communicably connected to the monitoring device. The monitoring device includes: a first transceiver that transmits a first radio wave in a first range and receives a reflected wave of the first radio wave; a first detection processing unit that, based on the reflected wave received by the first transceiver, detects the presence or absence of an object in the first range and acquires first information about an object detected in the first range; and a second transceiver that transmits a second radio wave in a second range wider than the first range and receives a reflected wave of the second radio wave.
  • The first information includes identification information that makes it possible to determine whether the object is a living thing.
  • The present disclosure further provides a monitoring method including: transmitting a first radio wave in a first range and receiving a reflected wave of the first radio wave; detecting the presence or absence of an object in the first range based on the reflected wave of the first radio wave; acquiring first information about an object detected in the first range, the first information including identification information that makes it possible to determine whether the object is a living thing; transmitting a second radio wave in a second range wider than the first range and receiving a reflected wave of the second radio wave; detecting the presence or absence of an object in the second range based on the reflected wave of the second radio wave; and acquiring second information about an object detected in the second range.
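As a plain-language illustration, the two-range method above can be sketched as a minimal simulation. The `MonitoringDevice` class and the `scan()` transceiver interfaces are hypothetical names introduced for this sketch only; they are not elements of the claims.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Detection:
    position: Tuple[float, float]      # (x, y) in metres from the installation point
    is_living: Optional[bool] = None   # identification information; None when not determined

class MonitoringDevice:
    """Sketch of two-range monitoring: the first (narrower-range) unit yields
    first information including living-thing identification; the second
    (wider-range) unit yields second information about detected objects."""

    def __init__(self, first_transceiver, second_transceiver):
        self.first = first_transceiver    # covers the first, narrower range
        self.second = second_transceiver  # covers the second, wider range

    def monitor(self) -> Tuple[List[Detection], List[Detection]]:
        # First range: presence detection plus identification information
        # that can determine whether the object is a living thing.
        first_info = [Detection(pos, living) for pos, living in self.first.scan()]
        # Second, wider range: presence detection and object information only.
        second_info = [Detection(pos) for pos in self.second.scan()]
        return first_info, second_info
```

The split mirrors the claim: only the first information carries the living-thing identification field.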
  • FIG. 1 is a block diagram showing a system configuration example of a detection system according to Embodiment 1.
  • FIG. 2 is a block diagram showing an internal configuration example of the surveillance radar device according to Embodiment 1.
  • FIG. 3 is a diagram illustrating an example of the detection areas of the long-distance detection radar and the short-distance detection radar.
  • FIG. 4 is a diagram illustrating an example of the detection areas of the long-distance detection radar and the short-distance detection radar.
  • FIG. 5 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Embodiment 1.
  • FIG. 6 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Modification 1 of Embodiment 1.
  • FIG. 7 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Modification 2 of Embodiment 1.
  • FIG. 8 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Modification 2 of Embodiment 1.
  • FIG. 9 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Modification 3 of Embodiment 1.
  • FIG. 10 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Modification 3 of Embodiment 1.
  • FIG. 11 is a diagram showing an example of a detection screen.
  • FIG. 1 is a diagram showing a system configuration example of a detection system 100 according to Embodiment 1.
  • the detection system 100 includes one or more surveillance radar devices RD1, a server S1, and a network NW.
  • the detection system 100 may also include a camera C1, a security drone DR1, and a security guard terminal TP1. Note that the camera C1, the security drone DR1, and the security guard terminal TP1 are not essential elements of the detection system 100 and may be omitted.
  • The detection system 100 is an example of a surveillance system, and uses at least one surveillance radar device RD1 installed indoors or outdoors to detect objects (for example, humans, animals, buildings, etc.).
  • the detection system 100 analyzes the vital information of the detected object by the surveillance radar device RD1.
  • The detection system 100 superimposes the position of the detected object (for example, a person) and its vital information on a map (an example of map information) corresponding to the detection area AR0 of the surveillance radar device RD1, and generates a detection screen SC1 (an example of notification information and superimposed map information) (see FIG. 11).
  • the detection system 100 displays the detection screen generated by the surveillance radar device RD1 on the monitor MN of the server S1.
  • the detection area AR0 corresponds to an area in which an object can be detected by the long-distance detection radar RA (see FIG. 2) and the short-distance detection radar RB (see FIG. 2).
  • the surveillance radar device RD1 is connected to the server S1 and the camera C1 via the network NW so as to be capable of wired communication or wireless communication.
  • Wireless communication is, for example, short-range wireless communication such as Bluetooth (registered trademark) or NFC (registered trademark), or communication via a wireless LAN (Local Area Network) such as Wi-Fi (registered trademark).
  • the surveillance radar device RD1 detects the position, moving speed, etc. of an object from within the second detection area AR2 (see FIG. 3) using the long-distance detection radar RA. Then, the surveillance radar device RD1 acquires vital information of an object located within the first detection area AR1 (see FIG. 3) detected by the short-range detection radar RB.
  • the surveillance radar device RD1 generates a detection screen SC1 (see FIG. 11) and transmits it to the server S1.
  • The detection screen SC1 is a screen in which the position of an object detected by the long-distance detection radar RA, or the position and vital information of an object detected by the short-distance detection radar RB, is superimposed on a map corresponding to the detection area AR0. Note that the number of surveillance radar devices RD1 shown in FIG. 1 is one, but it may be two or more.
  • When connected to the camera C1 (described later) so that data can be transmitted and received, the surveillance radar device RD1 may acquire a captured image from the camera C1. In such a case, the surveillance radar device RD1 may generate a detection screen by superimposing, on the image captured by the camera C1, the position of the object detected by the long-range detection radar RA or the position and vital information of the object detected by the short-range detection radar RB.
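The superimposition step can be illustrated with a minimal sketch that builds the annotation data behind a screen such as SC1. The function name and dictionary layout below are assumptions for illustration only; the publication does not specify a data format.

```python
from typing import Dict, List

def build_detection_screen(detections: List[Dict]) -> List[Dict]:
    """Build the annotation list behind a detection screen: each entry pairs
    a map position with a label, and vital information (when the object was
    detected in the near area) is folded into the label text."""
    annotations = []
    for d in detections:
        vitals = d.get("vitals")
        if vitals:
            label = (f"person: HR {vitals['heart_rate']} bpm, "
                     f"RR {vitals['respiration']} /min")
        else:
            label = "object"
        annotations.append({"position": d["position"], "label": label})
    return annotations
```

A far-area detection contributes only a position; a near-area detection also carries vitals, matching the two levels of information described above.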
  • the camera C1 is connected to the surveillance radar device RD1, the server S1, or the security guard terminal TP1 via the network NW so as to be capable of wired or wireless communication.
  • the camera C1 is, for example, a monitoring camera such as a security camera.
  • the camera C1 captures an image of at least a part of the detection area AR0, and transmits the captured image to the monitoring radar device RD1, the server S1, or the security guard terminal TP1. Note that the number of cameras C1 shown in FIG. 1 is one, but may be two or more.
  • the camera C1 may be a camera equipped with artificial intelligence (AI).
  • the camera C1 uses a learned AI model to detect an object appearing in the captured image, and acquires the position of the object.
  • the camera C1 also acquires the detection result including the vital information of the object (for example, information such as breathing rate, heart rate, blood pressure, breathing interval, or heartbeat interval) transmitted from the surveillance radar device RD1.
  • the camera C1 may generate a detection screen in which the acquired position of the object and the vital information of the object are superimposed on the captured image, and transmit the detection screen to the server S1.
  • the server S1 is connected to the surveillance radar device RD1, the camera C1, the security drone DR1, or the security guard terminal TP1 via the network NW so as to be capable of wired communication or wireless communication.
  • The server S1 is connected, so that data can be transmitted and received, to an operation unit 23 that can be operated by a user (for example, an employee of a management company who monitors the monitored area, a security guard, or a manager) and to a monitor MN that can be viewed by the user.
  • the server S1 may be realized by an information processing device such as a PC (Personal Computer), a notebook PC, a tablet terminal, a smart phone, or the like.
  • The server S1 outputs the detection screen SC1 (see FIG. 11) and the like transmitted from the surveillance radar device RD1 or the camera C1 to the monitor MN for display. Further, when the user performs an operation to report to a predetermined destination (for example, a management company, a security company, an insurance company, a security guard, the security drone DR1, or the security guard terminal TP1), the server S1 sends a report to that destination. Based on the user's operation, the server S1 may also generate a control command that directs the security drone DR1 to the position where the object was detected and causes it to threaten or warn the object, and transmit the control command to the security drone DR1.
  • the server S1 includes at least a communication unit 20, a processor 21, and a memory 22.
  • the database DB2 may be mounted in an information processing apparatus different from the server S1 and connected to the server S1 so that data can be transmitted and received.
  • the communication unit 20 is configured using a communication interface circuit for executing data transmission/reception with the surveillance radar device RD1, the camera C1, the security drone DR1, or the security guard terminal TP1 via the network NW.
  • The communication unit 20 receives, through a wireless or wired communication network, the detection screen SC1 transmitted from the surveillance radar device RD1 or the camera C1, and outputs it to the processor 21. The communication unit 20 also transmits a control command corresponding to a user operation, output from the processor 21, to the corresponding device (for example, the security drone DR1 or the security guard terminal TP1).
  • the processor 21 is, for example, a computing device such as a CPU (Central Processing Unit) or an FPGA (Field Programmable Gate Array).
  • The processor 21 cooperates with the memory 22 to perform various types of processing and control. Specifically, the processor 21 implements various functions by loading and executing programs and data held in the memory 22.
  • Based on the electrical signal output from the operation unit 23, the processor 21 acquires a preset report destination (for example, a management company, a security company, an insurance company, or the security guard terminal TP1) together with its contact information (for example, mail address and telephone number), and sends a report to that destination. Based on the electrical signal output from the operation unit 23, the processor 21 also generates a control command that moves the security drone DR1 to the position of the detected object and causes it to threaten or warn the object.
  • The memory 22 includes, for example, a RAM (Random Access Memory) used as a work memory when the processor 21 executes each process, and a ROM (Read Only Memory) that stores programs and data defining the operation of the processor 21.
  • The memory 22 may also include a storage device such as an SSD (Solid State Drive) or HDD (Hard Disk Drive). Data or information generated or acquired by the processor 21 is temporarily stored in the RAM.
  • a program that defines the operation of the processor 21 is written in the ROM.
  • The memory 22 stores report destination information (for example, the mail address and telephone number of a management company, a security company, an insurance company, or the security guard terminal TP1), information about the security drone DR1, and the like.
  • the database DB2 is, for example, a storage device such as an HDD or SSD.
  • In the database DB2, the detection area AR0, information (for example, serial number, ID) of the surveillance radar device RD1 that monitors the detection area AR0, and information (for example, name, face image, vital information) about objects permitted to enter the detection area AR0 are registered (stored) in association with one another.
  • the operation unit 23 is a user interface configured using, for example, a touch panel, buttons, keyboard, and the like.
  • The operation unit 23 converts the accepted user operation into an electrical signal (control command) and outputs the electrical signal to the processor 21.
  • When the operation unit 23 is a touch panel, it is configured integrally with the monitor MN.
  • the monitor MN is a display such as LCD (Liquid Crystal Display) or organic EL (Electroluminescence).
  • the monitor MN displays the detection screen SC1 transmitted from the surveillance radar device RD1 or camera C1.
  • the security guard terminal TP1 is connected to the server S1 via the network NW so that data can be sent and received.
  • the security guard terminal TP1 is used by security guards guarding the detection area AR0 monitored by the surveillance radar device RD1, building employees, and the like, and is realized by, for example, a notebook PC, a tablet terminal, a smartphone, and the like.
  • The security guard terminal TP1 displays the detection screen SC1 or the alarm information transmitted from the server S1. Note that the number of security guard terminals TP1 shown in FIG. 1 is one, but it may be two or more.
  • the security drone DR1 is connected to the surveillance radar device RD1 or the server S1 via the network NW so that data can be sent and received.
  • The security drone DR1 is equipped with a speaker, lighting, and the like, and threatens or warns an object by voice or illumination light.
  • The security drone DR1 flies to the position where the object is detected based on the control command transmitted from the surveillance radar device RD1 or the server S1.
  • After flying to the position where the object is detected, the security drone DR1 threatens or warns the object.
  • the security drone DR1 may include a camera, capture an image of an object, and transmit the captured image (live video) to the server S1.
  • FIG. 2 is a block diagram showing an internal configuration example of the surveillance radar device RD1.
  • FIG. 3 is a diagram illustrating an example of detection areas of the long-distance detection radar RA and the short-distance detection radar RB.
  • FIG. 4 is a diagram illustrating an example of detection areas of the long-distance detection radar RA and the short-distance detection radar RB.
  • the surveillance radar device RD1 includes a communication unit 10, a processor 11, a memory 12, a long-range detection radar RA, a short-range detection radar RB, and a database DB11.
  • the database DB11 may be configured separately from the surveillance radar device RD1.
  • the database DB11 is not an essential component and may be omitted.
  • the communication unit 10 is configured using a communication interface circuit for executing data transmission/reception with the server S1 and the camera C1 via the network NW.
  • the communication unit 10 transmits the detection screen SC1 (see FIG. 11) generated by the processor 11 to the server S1 through a wireless communication network or a wired communication network.
  • The communication unit 10 also outputs the captured image transmitted from the camera C1 to the processor 11.
  • the processor 11 is configured using, for example, a CPU or FPGA, and cooperates with the memory 12 to perform various types of processing and control. Specifically, the processor 11 refers to the programs and data held in the memory 12 and executes the programs, thereby realizing various functions of the AI processing unit 13 and the like.
  • the processor 11 controls the long-range detection radar RA and the short-range detection radar RB.
  • The processor 11 may control the long-range detection radar RA and the short-range detection radar RB independently. Alternatively, based on the detection result of the long-distance detection radar RA (an example of the first information) and the detection result of the short-distance detection radar RB (an example of the second information), the processor 11 may control the long-distance detection radar RA and the short-distance detection radar RB cooperatively.
  • The processor 11 generates a detection screen SC1 (see FIG. 11) in which the detection result of the long-distance detection radar RA or the detection result of the short-distance detection radar RB is superimposed on the map, and outputs it to the communication unit 10.
  • the communication unit 10 transmits the detection screen SC1 output from the processor 11 to the server S1.
  • Based on the vital information of a person who has entered the first detection area AR1, the processor 11 performs person authentication to determine whether that person is permitted to enter the first detection area AR1. If the processor 11 determines that the person is not permitted to enter the first detection area AR1, it generates a detection screen SC1 in which the position of the person is superimposed on a map, and transmits the detection screen SC1 to the server S1.
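A minimal sketch of such vital-information-based person authentication follows. The matching rule (relative tolerance against registered vital signatures) and all names are illustrative assumptions; the publication does not specify the matching algorithm.

```python
from typing import Dict, Optional

def authenticate_by_vitals(measured: Dict[str, float],
                           registered: Dict[str, Dict[str, float]],
                           tolerance: float = 0.1) -> Optional[str]:
    """Return the identifier of the registered person whose vital signature
    matches the measurement within a relative tolerance, or None when no
    registered person matches (i.e., entry is not permitted)."""
    for identifier, reference in registered.items():
        if all(abs(measured[key] - value) <= tolerance * value
               for key, value in reference.items()):
            return identifier
    return None
```

A `None` result corresponds to the case above where the detected person is not permitted to enter AR1 and the alarm screen is generated.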
  • the memory 12 has, for example, a RAM as a work memory that is used when executing each process of the processor 11, and a ROM that stores programs and data that define the operation of the processor 11.
  • the memory 12 may have a storage device including either a storage device such as an SSD or an HDD.
  • The RAM temporarily stores data or information generated or acquired by the processor 11.
  • a program that defines the operation of the processor 11 is written in the ROM.
  • the memory 12 stores map data corresponding to the detection area AR0 (that is, the first detection area AR1 and the second detection area AR2) of the surveillance radar device RD1.
  • the map may be two-dimensional map data or three-dimensional map data.
  • the first detection area AR1 is an area in which an object can be detected by the short-range detection radar RB.
  • the second detection area AR2 is an area in which an object can be detected by the long-distance detection radar RA.
  • the first detection area AR1 has a horizontal distance X1 from the installation position PS0 of the surveillance radar device RD1, and is a range in which the short-range detection radar RB can transmit and receive radio waves.
  • the second detection area AR2 has a horizontal distance X2 from the installation position PS0 of the surveillance radar device RD1, and is a range in which the long-distance detection radar RA can transmit and receive radio waves.
  • the distance X1 is a horizontal distance of 10 to 20 m from the installation position PS0 of the surveillance radar device RD1.
  • the distance X2 is greater than or equal to the distance X1, and the horizontal distance from the installation position PS0 of the surveillance radar device RD1 is, for example, 20 to 30 m.
  • The first detection area AR1 is smaller than the second detection area AR2 and is located closer to the installation position PS0 of the surveillance radar device RD1 than the second detection area AR2 (see FIG. 3). The first detection area AR1 and the second detection area AR2 may also partially overlap.
  • The distances X1 and X2 described above are examples and are not limited to these values. The distance X2 of the second detection area AR2 from the installation position PS0 is variable within the range in which the long-distance detection radar RA can transmit and receive radio waves, provided that it is equal to or greater than the distance X1.
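The relationship between the installation position PS0 and the two detection areas can be sketched as a simple horizontal-distance check. The concrete values of X1 and X2 follow the example ranges above, and this sketch assumes AR2 fully contains AR1 (in general the areas may only partially overlap).

```python
import math
from typing import List, Tuple

def detection_areas(point: Tuple[float, float],
                    ps0: Tuple[float, float] = (0.0, 0.0),
                    x1: float = 15.0, x2: float = 25.0) -> List[str]:
    """Classify a position by horizontal distance from the installation
    position PS0: inside AR1 when within X1 (short-range radar RB coverage),
    inside AR2 when within X2 (long-range radar RA coverage)."""
    distance = math.hypot(point[0] - ps0[0], point[1] - ps0[1])
    areas = []
    if distance <= x1:
        areas.append("AR1")
    if distance <= x2:
        areas.append("AR2")
    return areas
```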
  • the AI processing unit 13 forms a neural network based on at least one trained AI model.
  • The AI processing unit 13 forms a neural network corresponding to each trained AI model stored in the learning model memory 15.
  • the AI processing unit 13 uses the formed neural network to perform signal analysis processing on the signals output from each of the long-distance detection radar RA and the short-distance detection radar RB.
  • The AI processing unit 13 includes an AI arithmetic processing unit 14 and a learning model memory 15.
  • The AI arithmetic processing unit 14 may execute type determination processing of a detected object using the trained AI model stored in the learning model memory 15, based on the signals output from each of the long-distance detection radar RA and the short-distance detection radar RB.
  • The type determination processing includes processing for determining whether a detected object is a living thing (examples of first determination processing and fourth determination processing) and processing for determining whether a detected object is a human (examples of third determination processing and sixth determination processing).
  • The AI arithmetic processing unit 14 may also execute processing that acquires vital information of a detected object using the trained AI model stored in the learning model memory 15, based on the signal output from the short-range detection radar RB.
  • the learning model memory 15 is composed of memories such as RAM, ROM, and flash memory.
  • the learning model memory 15 stores a learned AI model created in advance by learning processing.
  • the AI arithmetic processing unit 14 forms a neural network corresponding to the trained AI model stored in the learning model memory 15 and executes desired signal analysis processing.
  • the learning model memory 15 stores a trained AI model capable of determining the type of the detected object and a trained AI model capable of acquiring vital information of the detected object based on the signal output from the radar.
  • the learning model memory 15 may store a learned AI model with which the surveillance radar device RD1 can perform person authentication. Note that the above-described trained AI model is an example, and is not limited to this.
  • the learning model memory 15 may store other trained AI models used for other purposes.
  • The long-distance detection radar RA includes M (M: an integer equal to or greater than 1) radar ICs (Integrated Circuits) RA1, ..., RAM (an example of a second detection processing unit). Each radar IC is connected to one of the transmitting antenna units RAT1, ..., RATM and one of the receiving antenna units RAR1, ..., RARM.
  • The long-distance detection radar RA detects objects (for example, vehicles, two-wheeled vehicles, etc.) located within a second predetermined distance, within the second detection area AR2.
  • the second predetermined distance is, for example, several meters to 100 meters, and is a distance larger than the first predetermined distance at which the short-range detection radar RB can detect an object.
  • the second predetermined distance may be a distance that partially includes the first predetermined distance (see FIG. 3). That is, the second detection area AR2 may partially include the first detection area AR1 of the short-range detection radar RB.
  • each of the receiving antenna units RAR1, . . . , RARM converts the received reflected wave into an analog signal and outputs it to the corresponding radar IC.
  • Each of the M radar ICs RA1, ..., RAM executes, based on the reflected waves received by the corresponding receiving antenna unit, processing for detecting the position and moving speed of an object (an example of detection processing), object type determination processing, and the like.
  • The short-range detection radar RB includes N (N: an integer equal to or greater than 1) radar ICs RB1, ..., RBN (an example of a first detection processing unit). The radar ICs RB1, ..., RBN are connected to transmitting antenna units RBT1, ..., RBTN and receiving antenna units RBR1, ..., RBRN.
  • The short-range detection radar RB detects objects (for example, humans, animals, vehicles, motorcycles, etc.) located within a first predetermined distance in the first detection area AR1 near the installation position PS0 of the surveillance radar device RD1.
  • the first predetermined distance referred to here is, for example, several meters to 10 meters.
  • the transmission antenna units RAT1, ..., RATM transmit radio waves over a wider range than the transmission antenna units RBT1, ..., RBTN.
  • the receiving antenna units RAR1, . . . , RARM receive radio waves in a wider range than the receiving antenna units RBR1, .
  • The first detection area AR1 is the range in which the transmitting antenna units RBT1, ..., RBTN transmit radio waves and the receiving antenna units RBR1, ..., RBRN receive radio waves (including reflected waves).
  • The second detection area AR2 is the range in which the transmitting antenna units RAT1, ..., RATM transmit radio waves and the receiving antenna units RAR1, ..., RARM receive radio waves (including reflected waves). That is, the first detection area AR1 is closer to the installation position PS0 of the surveillance radar device RD1 than the second detection area AR2, and has a smaller object-detectable range than the second detection area AR2.
  • Each of the N radar ICs RB1, ..., RBN acquires, via the receiving antenna units RBR1, ..., RBRN, the reflected waves reflected by objects (for example, buildings, plants, humans, animals, etc.) existing in the irradiation direction of each antenna.
  • Each of the receiving antenna units RBR1, . . . , RBRN further converts the received reflected wave into an analog signal and outputs it to the corresponding radar IC.
  • Each of the N radar ICs RB1, ..., RBN executes acquisition processing of vital information of the object (for example, information such as respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval).
  • the number of radar ICs provided in the surveillance radar device RD1 is only an example and is not limited to this. Further, the number M of radar ICs provided in the long-range detection radar RA, the number N of radar ICs provided in the short-range detection radar RB, and the number of antennas connected to each radar IC need not be the same and may be different.
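The vital-information acquisition described above typically works by tracking the small chest displacement that modulates the phase of the reflected wave. As a minimal illustrative sketch (not the patent's implementation; the 0.1-0.5 Hz band and the sampling values are assumptions), a respiratory rate can be estimated from the spectral peak of the phase signal:

```python
import numpy as np

def estimate_respiratory_rate(phase_signal, sample_rate_hz):
    """Estimate breaths per minute from a radar phase (chest displacement) signal.

    The dominant spectral peak in the 0.1-0.5 Hz band (roughly 6-30
    breaths/min) is taken as the respiratory frequency.
    """
    sig = phase_signal - np.mean(phase_signal)          # remove DC offset
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / sample_rate_hz)
    band = (freqs >= 0.1) & (freqs <= 0.5)              # plausible breathing band
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                             # Hz -> breaths per minute

# Example: a 0.25 Hz (15 breaths/min) chest motion sampled at 20 Hz for 60 s
t = np.arange(0, 60, 1 / 20)
rate = estimate_respiratory_rate(np.sin(2 * np.pi * 0.25 * t), 20)
```

Heart rate could be estimated the same way over a higher band (roughly 0.8-3 Hz), after suppressing the much stronger breathing component.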
  • the database DB11 is configured using a storage device such as an HDD, SSD, or NAND.
  • the database DB11 registers (stores) information (for example, name, face image, vital information, etc.) about an object permitted to enter the first detection area AR1.
  • the name, face image, vital information, etc. of a person who is permitted to enter the first detection area AR1 are registered in association with an identifier that identifies the person.
  • FIG. 5 is a flowchart showing an operation procedure example of the surveillance radar device RD1 according to the first embodiment. Note that the respective processes of steps St102 to St103 and steps St202 to St203 may be executed by the radar IC or by the processor 11.
  • Each of the radar ICs RA1, ..., RAM determines whether the object is human.
  • Each of the radar ICs RA1, ..., RAM generates a detection result in which information such as the position (azimuth, distance) and moving speed of the object determined to be human is associated with the object, and outputs it to the processor 11.
  • Each of the radar ICs RB1, ..., RBN acquires the reflected waves received by the receiving antenna units RBR1, ..., RBRN.
  • Each of the radar ICs RB1, . . . , RBN performs clustering processing on the obtained point cloud data (step St203).
  • Each of the radar ICs RB1, ..., RBN determines whether or not the object is a living thing based on the size of the object indicated by the set of clustered point clouds, information on the calculated moving speed of the point cloud, etc. (step St203).
  • the size of the object indicated by the set of clustered point clouds, information on the calculated moving speed of the point cloud, and the like are examples of the identification information of the present embodiment.
  • the process of step St203 is an example of the first determination process or the third determination process.
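The clustering and living-thing determination of steps St202 to St203 can be sketched as follows. The distance threshold `eps` and the size/speed thresholds are illustrative assumptions, not values from the patent:

```python
import numpy as np

def cluster_points(points, eps=1.0):
    """Greedy single-linkage clustering: points closer than `eps` to any
    member of a cluster join that cluster (a rough stand-in for DBSCAN)."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(np.linalg.norm(p - q) < eps for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

def classify_cluster(points_xy, speed_mps):
    """Very rough type guess from cluster extent and speed (illustrative thresholds)."""
    extent = np.ptp(np.array(points_xy), axis=0).max()   # largest side of bounding box
    if extent < 1.0 and speed_mps < 3.0:
        return "living thing"        # person/animal-sized, walking speed
    if extent >= 1.0 and speed_mps >= 3.0:
        return "vehicle"
    return "unknown"
```

A real radar IC would additionally use reflection intensity and height, but the size-plus-speed rule above captures the identification information the text names.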
  • each of the radar ICs RB1, ..., RBN determines whether an object (for example, an object determined to be a living thing) is a human, analyzes the signal from the object determined to be human among the set of clustered point clouds, and calculates the vital information of the object (for example, information such as respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval) (step St203).
  • the surveillance radar device RD1 can obtain the vital information of the object as described above.
  • Vital information corresponds to respiratory rate, heart rate, blood pressure, breathing interval, heartbeat interval, or the like, that is, information that can determine whether an object is a living thing. Therefore, vital information is an example of identification information in this embodiment.
  • the respiratory rate, heart rate, blood pressure, breathing interval, heartbeat interval, etc. derived from living organisms fluctuate according to predetermined cycles and climatic conditions.
  • the vital information in which variation according to a predetermined period or climate conditions is observed is an example of the biological information of the present embodiment.
  • the biological information is information such as a breathing rate, heart rate, blood pressure, breathing interval, or heartbeat interval that is neither extremely high nor extremely low, and that exhibits time-series variation.
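A plausibility check of this kind, that vital values both fall within a physiological band and show time-series variation, might be sketched like this (all thresholds are assumptions for illustration):

```python
def looks_biological(heart_rates_bpm, low=40.0, high=180.0, min_std=0.5):
    """Heuristic: values stay in a physiological band AND show time-series
    variation. A fixed machine vibration gives an almost constant "rate";
    living sources fluctuate. Thresholds are illustrative, not the patent's."""
    n = len(heart_rates_bpm)
    if n < 2:
        return False
    mean = sum(heart_rates_bpm) / n
    var = sum((x - mean) ** 2 for x in heart_rates_bpm) / n
    in_band = all(low <= x <= high for x in heart_rates_bpm)
    return in_band and var ** 0.5 >= min_std
```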
  • Each of the radar ICs RB1, ..., RBN acquires, from the clustered point cloud sets, information such as the position (azimuth, distance) and moving speed of the object indicated by the point cloud set determined to be human.
  • the processor 11 acquires the coordinate information of the point clouds corresponding to the objects output from each of the radar ICs RA1, ..., RAM and RB1, ..., RBN. Using the acquired coordinate information of the point clouds, the processor 11 superimposes the detection points of the objects detected by the two types of radar on the same map to generate a detection screen SC1 (see FIG. 11) (step St300). At this time, if vital information is associated with the coordinate information of an object, the processor 11 generates a detection screen SC1 in which the detection point of the object and the vital information are superimposed on the same map. Thereby, the processor 11 can display the detection point (coordinate information) of the object and the vital information of the object together in the detection screen SC1.
  • the two types of radar referred to here are a long-distance detection radar RA and a short-distance detection radar RB.
  • the processor 11 outputs the generated detection screen SC1 to the communication unit 10 and causes it to be transmitted to the server S1 (step St301).
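The superimposition in step St300 amounts to merging the two radars' detection lists into one screen model, keeping vital information where it is available. A minimal sketch (the data shapes are assumptions):

```python
def build_detection_screen(long_range_hits, short_range_hits):
    """Combine detections from both radars into one map-overlay record list.

    Each hit is (x, y) or (x, y, vitals_dict); a hit from the short-range
    radar may carry vital information, which is kept alongside the point.
    """
    screen = []
    for source, hits in (("RA", long_range_hits), ("RB", short_range_hits)):
        for hit in hits:
            x, y = hit[0], hit[1]
            vitals = hit[2] if len(hit) > 2 else None
            screen.append({"source": source, "pos": (x, y), "vitals": vitals})
    return screen
```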
  • the surveillance radar device RD1 repeatedly executes the above-described processing to detect an object positioned within the detection area AR0. At the timing when an object detected by the long-range detection radar RA approaches the first detection area AR1, the surveillance radar device RD1 may begin the object detection processing by the short-range detection radar RB and the vital information acquisition processing for the object.
  • the surveillance radar device RD1 may perform person authentication based on the vital information acquired by the short-range detection radar RB.
  • the surveillance radar device RD1 uses different radars for long range (that is, second detection area AR2) and short range (first detection area AR1). By doing so, the surveillance radar device RD1 can simultaneously perform detection of an object positioned at a long distance, detection of an object positioned at a short distance, and vital sensing.
  • the surveillance radar device RD1 in Embodiment 1 described above executes detection of an object positioned at a long distance (second detection area AR2), detection of an object positioned at a short distance (first detection area AR1), and vital sensing. Next, an example in which the surveillance radar device RD1 in Modification 1 of Embodiment 1 determines whether or not a stationary object is a living thing will be described.
  • the configuration of the surveillance radar device RD1 according to Modification 1 has the same internal configuration as that of the surveillance radar device RD1 according to the first embodiment.
  • functions realized by each internal configuration of the surveillance radar device RD1 in Modification 1 will be described.
  • the surveillance radar device RD1 executes the second determination process or the fifth determination process for determining whether or not the detected object is stationary (that is, whether or not it is moving).
  • when the surveillance radar device RD1 determines, as a result of the second determination process or the fifth determination process, that the detected object is not moving (that is, is stationary), it determines whether or not the object determined not to be moving is a living thing.
  • a second determination process example and a fifth determination process example will be described below.
  • Each of the radar ICs RB1, ..., RBN of the short-range detection radar RB determines whether or not the detected object is stationary.
  • when an object is moving, the wavelength of the reflected wave changes due to the Doppler phenomenon, so whether or not the object is moving can be determined based on the change in the wavelength of the reflected wave. That is, when the object is moving, the point cloud data includes information on the moving speed of the object in the direction toward the surveillance radar device RD1. Therefore, each of the radar ICs RB1, ..., RBN determines whether or not the detected object is stationary based on the moving speed information included in the point cloud data. Subsequently, each of the radar ICs RB1, ..., RBN analyzes the signal corresponding to the object determined to be stationary, and calculates the vital information of the object.
  • alternatively, each of the radar ICs RB1, ..., RBN may determine whether the detected object is stationary based on the time-series change in the position of the point cloud data.
  • for example, each of the radar ICs RB1, ..., RBN determines whether or not the detected object is stationary based on the positions of the point cloud data at a plurality of times (for example, times t1 to t4 (t1 < t2 < t3 < t4)). In such a case, each of the radar ICs RB1, ..., RBN determines that the object is stationary when the position of the point cloud data does not substantially change between these times.
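The two stationarity criteria above, near-zero Doppler speed and an unchanged point-cloud position across times t1 to t4, can be combined in a short sketch (the tolerances are illustrative assumptions):

```python
import numpy as np

def is_stationary(positions, radial_speeds, pos_tol=0.2, speed_tol=0.1):
    """Object is stationary if its Doppler radial speed stays near zero AND
    its point-cloud position barely moves across the sampled times.
    `positions` is a sequence of (x, y) at t1..tN; tolerances in m and m/s."""
    pos = np.asarray(positions, dtype=float)
    drift = np.linalg.norm(pos - pos[0], axis=1).max()   # max displacement from t1
    return drift <= pos_tol and max(abs(v) for v in radial_speeds) <= speed_tol
```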
  • Each of the radar ICs RB1, ..., RBN determines whether the object is a stationary organism (human, animal, etc.) based on the calculated vital information of the object.
  • the processor 11 acquires information on the coordinates of the point cloud determined to be a stationary organism output from each of the radar ICs RB1, . . . , RBN, and vital information.
  • the processor 11 superimposes the acquired position of the stationary organism and the vital information on the map to generate a detection screen SC1 (see FIG. 11).
  • the processor 11 transmits the generated detection screen SC1 to the server S1.
  • FIG. 6 is a flowchart showing an operation procedure example of the surveillance radar device RD1 according to Modification 1 of Embodiment 1. Note that the respective processes of steps St104 to St105 and steps St204 to St205 may be executed by the radar IC or by the processor 11.
  • steps St100 to St101 and steps St200 to St202 are the same as those of FIG. 5, so description thereof will be omitted.
  • each of the radar ICs RA1, ..., RAM applies a background removal algorithm that removes the background from the generated point cloud data.
  • the background removal algorithm here is specifically CFAR (Constant False Alarm Rate) or the like.
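A one-dimensional cell-averaging CFAR of the kind referred to here can be sketched as follows; the guard/training window sizes and the scale factor are illustrative, not values from the patent:

```python
import numpy as np

def ca_cfar(power, guard=2, train=8, scale=3.0):
    """1-D cell-averaging CFAR: a cell is a detection if its power exceeds
    `scale` times the mean of the surrounding training cells (guard cells
    excluded). Returns a boolean detection mask."""
    power = np.asarray(power, dtype=float)
    n = len(power)
    hits = np.zeros(n, dtype=bool)
    for i in range(n):
        lo, hi = max(0, i - guard - train), min(n, i + guard + train + 1)
        window = np.concatenate([power[lo:max(0, i - guard)],
                                 power[min(n, i + guard + 1):hi]])
        if window.size and power[i] > scale * window.mean():
            hits[i] = True
    return hits
```

Because the threshold adapts to the local noise estimate, the false-alarm rate stays roughly constant even when clutter levels vary across range, which is the property the name refers to.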
  • Each of the radar ICs RA1, ..., RAM determines whether the object is a person.
  • Each of the radar ICs RB1, ..., RBN acquires the reflected waves received by the receiving antenna units RBR1, ..., RBRN.
  • Each of the radar ICs RB1, . . . , RBN performs clustering processing on the obtained point cloud data (step St205).
  • each of the radar ICs RB1, ..., RBN further determines whether an object determined as a stationary object among the object type determination results is a living thing such as a human being or an animal.
  • Each of the radar ICs RB1, ..., RBN analyzes the signal corresponding to the object determined to be stationary and calculates the vital information of the object (for example, information such as respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval) (step St205).
  • Each of the radar ICs RB1, ..., RBN associates, for each living thing, information such as the position (azimuth, distance) and moving speed of the creature indicated by the set of point clouds determined to be a stationary creature among the clustered point clouds with the calculated vital information, and generates a detection result of the living thing.
  • Each of the radar ICs RB1, . . . , RBN outputs to the processor 11 the generated organism detection results.
  • the processor 11 acquires the information on the coordinates of the point clouds corresponding to the creatures output from each of the radar ICs RA1, ..., RAM and RB1, ..., RBN, and the vital information.
  • the processor 11 compares the coordinates of the organisms detected in the overlap area where the second detection area AR2 of the long-range detection radar RA and the first detection area AR1 of the short-range detection radar RB overlap (step St302). It should be noted that the processor 11 may omit the process of comparing the information regarding the organisms detected in the overlapping area when the overlapping area does not exist.
  • when the processor 11 determines that the same living thing has been detected by each of the long-range detection radar RA and the short-range detection radar RB among the living things detected in the overlapping area, it combines the detection results (for example, position, distance, etc.) of the two radars into a single detection result for that living thing.
  • the processor 11 generates a detection screen SC1 in which the detection result of the long-range detection radar RA and the detection result of the short-range detection radar RB are superimposed on the map, and transmits the detection screen SC1 to the server S1 (step St303).
  • the surveillance radar device RD1 may detect both stationary and moving creatures. In such a case, the surveillance radar device RD1 generates a detection screen SC1 by superimposing the detection result of a stationary creature and the detection result of a non-stationary object on a map, and transmits the detection screen SC1 to the server S1.
  • the surveillance radar device RD1 can determine whether the stationary object detected by the short-range detection radar RB is a mere object or a creature such as a human being or an animal. In other words, the surveillance radar device RD1 can improve the detection accuracy of stationary living things.
  • the surveillance radar device RD1 in Embodiment 1 described above executes detection of an object positioned at a long distance (second detection area AR2), detection of an object positioned at a short distance (first detection area AR1), and vital sensing. Next, an example in which the surveillance radar device RD1 according to Modification 2 of Embodiment 1 determines whether a detected object is a human or an animal will be described.
  • the configuration of the surveillance radar device RD1 in Modification 2 of Embodiment 1 has the same internal configuration as that of the surveillance radar device RD1 in Embodiment 1.
  • functions realized by each internal configuration of the surveillance radar device RD1 in Modification 2 of Embodiment 1 will be described.
  • the surveillance radar device RD1 determines whether the detected object is human or animal.
  • the learning model memory 15 may further store a learned AI model that can determine whether an object is human or animal.
  • Each of the radar ICs RB1, ..., RBN outputs, to the processor 11, the coordinate information of the point cloud determined to be human and the vital information of the person.
  • the processor 11 acquires the coordinate information and the vital information of the point cloud determined to be human output from each of the radar ICs RB1, . . . , RBN.
  • the processor 11 superimposes the acquired human position and vital information on the map to generate a detection screen SC1 (see FIG. 11).
  • the processor 11 transmits the generated detection screen SC1 to the server S1.
  • the memory 12 stores information on areas where human entry is prohibited (hereinafter referred to as "no-entry areas") and on lines where human passage is prohibited (hereinafter referred to as "no-entry lines").
  • the entry prohibition area AR10 (see FIG. 4) and the entry prohibition line LN (see FIG. 4) may each be set in advance by the administrator, and a plurality of them may be set within the detection area AR0.
  • the entry prohibition line LN may include information on the direction in which entry is prohibited.
  • the entry prohibition line LN may be set to prohibit the human TG from passing through the entry prohibition line LN in the direction of the arrow.
  • entry prohibition area AR10 and entry prohibition line LN may be set so as to prohibit entry of animals.
  • the database DB11 may also store information on the entry prohibition area AR10 and information on persons permitted to enter the entry prohibition area AR10 (for example, facial images, vital information, etc. of persons permitted to enter the entry prohibition area AR10) in association with each other.
  • similarly, the information on the entry prohibition line LN may be stored in association with information on persons permitted to pass through the entry prohibition line LN (for example, a face image, vital information, etc. of such a person).
  • the processor 11 determines whether or not a person has entered the no-entry area AR10 based on the detected position of the person.
  • when the processor 11 determines that a person has entered the intrusion prohibited area AR10, it generates an alarm that includes the position information of the person who has entered the intrusion prohibited area AR10 and notifies that the person's entry into the intrusion prohibited area AR10 has been detected. The processor 11 transmits the generated alarm to the server S1.
  • the warning may be the detection screen SC1.
  • the processor 11 may generate an alarm including a captured image obtained by cutting out at least a partial area in which a person who has entered the no-entry area AR10 is captured from the captured image transmitted from the camera C1.
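The entry determination, whether a detected position lies inside the no-entry area AR10, reduces to a point-in-polygon test when the area is given as a polygon. A standard even-odd-rule sketch (the polygon representation of AR10 is an assumption):

```python
def inside_area(point, polygon):
    """Ray-casting point-in-polygon test for a no-entry area given as a list
    of (x, y) vertices. Standard even-odd rule; illustrative only."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```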
  • the processor 11 acquires the person's flow line information (movement trajectory information) based on the time-series change in the detected position of the person. Based on the flow line information of the person, the processor 11 determines whether the detected person has passed the entry prohibition line LN, or whether the detected person has passed the entry prohibition line LN from a predetermined direction.
  • when the processor 11 determines that the person has passed through the entry prohibition line LN, it generates an alarm that includes the position information of the person who has passed through the entry prohibition line LN and notifies that the person has passed through the entry prohibition line LN. The processor 11 transmits the generated alarm to the server S1.
  • the warning may be the detection screen SC1.
  • the processor 11 may generate an alarm including a captured image obtained by cutting out at least a partial region in which a person who has passed through the no-entry line LN is captured from the captured image transmitted from the camera C1.
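The flow-line-based passage determination, including the prohibited direction of the entry prohibition line LN, can be sketched with segment-intersection geometry (the left/right convention is an assumption for illustration):

```python
def crossed_line(p_prev, p_curr, a, b, forbidden_from_left=True):
    """Detect passage of the movement segment p_prev->p_curr across the
    no-entry line segment a->b, optionally only when coming from the 'left'
    side of a->b (modelling a line with a prohibited direction)."""
    def side(p):  # >0: p lies left of line a->b, <0: right
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    s0, s1 = side(p_prev), side(p_curr)
    if s0 * s1 >= 0:                 # same side (or touching): no crossing
        return False

    def seg_side(p, q, r):           # orientation of r relative to segment p->q
        return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

    # endpoints a and b must also straddle the movement segment
    if seg_side(p_prev, p_curr, a) * seg_side(p_prev, p_curr, b) >= 0:
        return False
    return s0 > 0 if forbidden_from_left else s1 > 0
```

With `forbidden_from_left=True`, only passages that start on the left of the line trigger a detection, matching the arrow-direction setting described above.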
  • when the entry prohibition area AR10 or the entry prohibition line LN is set within the first detection area AR1 detectable by the short-range detection radar RB, the processor 11 may determine whether or not the person who has entered the entry prohibition area AR10 or passed through the entry prohibition line LN is a person registered in the database DB11. The processor 11 collates the vital information of the person detected to have entered the no-entry area AR10 or passed the no-entry line LN with the vital information of each of the plurality of persons registered in the database DB11.
  • when the processor 11 determines that the vital information of the plurality of persons registered in the database DB11 includes vital information identical or similar to the detected person's vital information, the processor 11 omits the alarm generation process. On the other hand, when the processor 11 determines that the vital information of the plurality of persons registered in the database DB11 does not contain vital information identical or similar to the detected person's vital information, the processor 11 generates an alarm and transmits it to the server S1.
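The collation against the database DB11 could, for illustration, be a tolerance comparison; the patent does not specify the similarity metric, so the fields and tolerances below are assumptions:

```python
def is_registered(detected_vitals, database, hr_tol=5.0, rr_tol=2.0):
    """Decide whether detected vitals match any registered person, within
    tolerances. Each record is a dict with 'hr' (heart rate, bpm) and
    'rr' (respiratory rate, breaths/min); both fields and tolerances are
    illustrative assumptions."""
    return any(abs(detected_vitals["hr"] - rec["hr"]) <= hr_tol and
               abs(detected_vitals["rr"] - rec["rr"]) <= rr_tol
               for rec in database)
```

If `is_registered(...)` returns True, the alarm generation is skipped; otherwise the alarm is generated and transmitted to the server S1.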
  • FIG. 7 is a flow chart showing an operation procedure example of the surveillance radar device RD1 according to the second modification of the first embodiment.
  • FIG. 8 is a flow chart showing an operation procedure example of the surveillance radar device RD1 according to the second modification of the first embodiment.
  • the processing of steps St102 to St103 and steps St203 to St206 may be executed by the radar IC or by the processor 11.
  • steps St100 to St103 and steps St200 to St203 are the same as those of FIG. 5, so description thereof will be omitted.
  • the process of step St206 is an example of the third determination process or the sixth determination process.
  • each of the radar ICs RB1, ..., RBN may determine whether the object is a human or an animal based on the size of the clustered point cloud, the radio wave reflection intensity, etc., as described above.
  • each of the radar ICs RB1, ..., RBN acquires the image analysis results transmitted from the camera C1.
  • Each of the radar ICs RB1, ..., RBN generates a detection result of the object (human) in which information such as the position (azimuth, distance) and moving speed of an object determined to be a human is associated with the human's vital information for each person, and outputs it to the processor 11.
  • the processor 11 acquires the object (for example, person, animal, etc.) detection result output from the long-range detection radar RA and the object (human) detection result output from the short-range detection radar RB.
  • the processor 11 generates a detection screen SC1 (see FIG. 11) in which these acquired detection results are superimposed on the map (step St304).
  • the processor 11 determines whether or not the entire set no-entry area AR10 is within the first detection area AR1 by the short-range detection radar RB (step St305).
  • when the processor 11 determines in the process of step St305 that the entire set no-entry area AR10 is within the first detection area AR1 of the short-range detection radar RB (step St305, YES), it determines whether or not the object detected by the short-range detection radar RB has entered the no-entry area AR10 (step St306).
  • when the processor 11 determines in the process of step St306 that the detected object has entered the no-entry area AR10 (step St306, YES), it determines whether or not the detected object is a human (step St307).
  • the process of step St307 may be omitted when the process of step St206 is executed before the process of step St307.
  • when the processor 11 determines in the process of step St306 that the detected object has not entered the no-entry area AR10 (step St306, NO), it transmits the detection screen SC1 generated in the process of step St304 to the server S1 (step St311).
  • when the processor 11 determines in the process of step St307 that the detected object is a human (step St307, YES), it generates an alarm notifying that a human has entered the no-entry area AR10 and transmits it to the server S1 (step St308). The processor 11 also transmits the detection screen SC1 generated in the process of step St304 to the server S1 (step St311).
  • when the processor 11 determines in the process of step St307 that the detected object is not a human (step St307, NO), the process proceeds to step St311.
  • when the processor 11 determines in the process of step St305 that the entire area of the set no-entry area AR10 is not within the first detection area AR1 of the short-range detection radar RB (step St305, NO), it determines whether or not an object detected by the long-range detection radar RA or the short-range detection radar RB has entered the no-entry area AR10 (step St309).
  • when the processor 11 determines in the process of step St309 that the detected object has entered the no-entry area AR10 (step St309, YES), it generates an alarm notifying that the entry of the object into the no-entry area AR10 has been detected, and transmits (issues) it to the server S1 (step St308).
  • Objects detected here include moving objects such as humans and animals for which vital information can be obtained, and vehicles and motorcycles for which vital information cannot be obtained.
  • when the processor 11 determines in the process of step St309 that the detected object has not entered the no-entry area AR10 (step St309, NO), the process proceeds to step St311.
  • the surveillance radar device RD1 can detect an object entering the no-entry area AR10 when the no-entry area AR10 is set within the detection area AR0.
  • the monitoring radar device RD1 can detect an object passing through the entry prohibition line LN when the entry prohibition line LN or the like is set.
  • when the surveillance radar device RD1 detects an object entering the no-entry area AR10 or passing through the no-entry line LN, it generates an alarm different from the detection screen SC1 and transmits it to the server S1. This allows the administrator to intuitively understand, based on the alarm, that an object entering the no-entry area AR10 or an object passing through the no-entry line LN has been detected.
  • the entry prohibition area AR10 or the entry prohibition line LN may also be set in Embodiment 1 and Modification 1 of Embodiment 1 described above.
  • the surveillance radar device RD1 in Embodiment 1 may execute the processes of steps St305 to St311 after the process of step St300. Note that the surveillance radar device RD1 may omit the process of step St307. As a result, the surveillance radar device RD1 can detect an object entering the prohibited entry area AR10 or an object passing through the prohibited entry line LN.
  • the surveillance radar device RD1 in Modification 1 of Embodiment 1 may execute the processes of steps St305 to St311 after the process of step St302.
  • the surveillance radar device RD1 may omit the process of step St307.
  • the surveillance radar device RD1 can detect an object entering the prohibited entry area AR10 or an object passing through the prohibited entry line LN.
  • the surveillance radar device RD1 in Embodiment 1 described above executes detection of an object positioned at a long distance (second detection area AR2), detection of an object positioned at a short distance (first detection area AR1), and vital sensing.
  • the surveillance radar device RD1 in Modification 3 of Embodiment 1 determines, based on an emotion index of the object calculated using the acquired vital information, whether or not the detected object is an object about which the administrator should be warned.
  • the configuration of the surveillance radar device RD1 in Modification 3 of Embodiment 1 has the same internal configuration as that of the surveillance radar device RD1 in Embodiment 1.
  • functions realized by each internal configuration of the surveillance radar device RD1 in Modification 3 of Embodiment 1 will be described.
  • the surveillance radar device RD1 calculates a person's emotion index based on the chronological change in vital information. The surveillance radar device RD1 determines whether or not a warning determination is necessary based on the calculated emotion index of the person.
  • the short-range detection radar RB calculates an emotion index that estimates the emotion of each object (for example, a person detected by the short-range detection radar RB) based on the chronological change in the acquired vital information.
  • the short-range detection radar RB evaluates an attention index for determining whether to generate an alarm based on the calculated human emotion index.
  • the short-range detection radar RB evaluates the attention index highly according to the calculated human emotion index. Also, the short-range detection radar RB may evaluate the attention index based on the human emotion index and the acquired vital information. The short-range detection radar RB determines whether to generate an alarm based on whether the evaluated caution index is greater than or equal to a threshold. When the short-range detection radar RB determines that the evaluated caution index is equal to or greater than the threshold, it generates an alarm (an example of a first notification) that includes the position information of the person who passed the entry prohibition line LN and notifies that a person to whom attention should be paid has been detected by the short-range detection radar RB, and outputs it to the processor 11.
  • the processor 11 may perform each of the processing of calculating the emotion index, the processing of evaluating the attention index, and the processing of determining whether to generate an alarm. Functions realized by the processor 11 and the learning model memory 15 when each of these processes is executed by the processor 11 will be described below.
  • the learning model memory 15 further stores a learned AI model capable of calculating a person's emotional index based on changes in the person's vital information over time.
  • the processor 11 calculates an emotion index that estimates the emotion of each object based on the chronological change in the acquired vital information. Processor 11 evaluates an attention index to determine whether to generate an alert based on the calculated human emotion index.
  • the processor 11 determines whether to generate an alert based on whether the evaluated caution index is equal to or greater than the threshold.
  • when the processor 11 determines that the evaluated caution index is equal to or greater than the threshold, it generates an alarm (an example of a first notification) that includes the position information of the person who passed the no-entry line LN and notifies that a person to whom attention should be paid has been detected by the short-range detection radar RB.
  • Processor 11 transmits the generated alert to server S1. Note that the warning may be the detection screen SC1. Further, the processor 11 may generate an alarm including a captured image obtained by cutting out at least a partial region of a person whose attention index is determined to be equal to or greater than the threshold from the captured image transmitted from the camera C1.
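The patent does not disclose a concrete formula for the emotion or attention index, so the following is only a toy sketch in which the index grows with heart-rate elevation and short-term variability over the time series, and an alarm is generated at a threshold; the formula, baseline, and threshold are all assumptions:

```python
def attention_index(heart_rates_bpm, baseline_bpm=70.0):
    """Toy attention index: mean elevation above a resting baseline plus
    mean sample-to-sample jitter. Purely illustrative; the patent only
    states the index is derived from time-series changes in vital info."""
    n = len(heart_rates_bpm)
    mean = sum(heart_rates_bpm) / n
    jitter = sum(abs(heart_rates_bpm[i] - heart_rates_bpm[i - 1])
                 for i in range(1, n)) / max(1, n - 1)
    return max(0.0, mean - baseline_bpm) + jitter

def should_alert(heart_rates_bpm, threshold=20.0):
    """Generate an alarm when the attention index reaches the threshold."""
    return attention_index(heart_rates_bpm) >= threshold
```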
  • FIG. 9 is a flow chart showing an operation procedure example of the surveillance radar device RD1 according to the third modification of the first embodiment.
  • FIG. 10 is a flow chart showing an operation procedure example of the surveillance radar device RD1 according to the third modification of the first embodiment.
  • the processing of steps St102 to St103 and steps St203 to St206 may be executed by the radar IC or by the processor 11.
  • steps St100 to St103 and steps St200 to St203 are the same as those of FIG. 5, so description thereof will be omitted.
  • Each of the radar ICs RB1, ..., RBN generates time series data in which the acquired vital information of the object is arranged in time series.
  • Each of the radar ICs RB1, ..., RBN associates, for each person, information as to whether or not the detected person is a person to whom attention should be paid, information such as the position (azimuth, distance) and moving speed of the person, and the vital information of the person, and outputs the result to the processor 11.
  • the processor 11 acquires the object (for example, human, animal, etc.) detection result output from the long-range detection radar RA and the object (human) detection result output from the short-range detection radar RB.
  • the processor 11 generates a detection screen SC1 (see FIG. 11) in which the acquired detection results of these objects are superimposed on the map (step St312).
  • the processor 11 determines whether or not the person detected by the short-range detection radar RB has entered the no-entry area AR10 (step St313).
  • when the processor 11 determines in the process of step St313 that the detected human has entered the no-entry area AR10 (step St313, YES), it calculates the caution index of the detected object. The processor 11 then determines whether or not the calculated caution index is greater than or equal to the threshold (step St314).
  • When the processor 11 determines in the process of step St313 that the detected human has not entered the no-entry area AR10 (step St313, NO), it generates a detection screen SC1 by superimposing the detection results of the objects detected by each of the long-distance detection radar RA and the short-distance detection radar RB on a map, and transmits the generated detection screen SC1 to the server S1 (step St315).
  • When the processor 11 determines in the process of step St314 that the calculated caution index is equal to or greater than the threshold (step St314, YES), it generates an alarm notifying that a person to be warned of has been detected, and transmits (issues) the alarm to the server S1 (step St316).
  • When the processor 11 determines in the process of step St314 that the calculated caution index is not equal to or greater than the threshold (step St314, NO), the process proceeds to step St315.
  • As described above, the surveillance radar device RD1 calculates a caution index based on the emotion index of the detected person and, based on the calculated caution index, can notify the administrator of the detection of a person requiring caution (that is, monitoring or vigilance). As a result, the surveillance radar device RD1 can support the administrator in detecting and monitoring a person to whom attention should be paid during surveillance work.
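The branching of steps St313 to St316 described above can be sketched as follows. This is a minimal illustration only: the function name, the shape of the detection result, and the caution-index threshold are assumptions, not values disclosed for the actual device.

```python
# Hypothetical sketch of the alarm branch in steps St313 to St316.
# The detection-result fields and the threshold value are assumptions for
# illustration; the actual device derives the caution index from the
# emotion index computed from vital information.

CAUTION_THRESHOLD = 0.7  # assumed threshold for the caution index


def process_detection(person: dict, in_no_entry_area: bool) -> str:
    """Return the action the processor 11 would take for one detected person.

    'alarm'  : person entered no-entry area AR10 and caution index >= threshold (St316)
    'screen' : otherwise, generate/send detection screen SC1 (St315)
    """
    if in_no_entry_area:  # St313
        if person.get("caution_index", 0.0) >= CAUTION_THRESHOLD:  # St314
            return "alarm"   # St316: notify server S1 of a person requiring caution
    return "screen"          # St315: superimpose detection results on the map


# Example: an agitated intruder triggers the alarm path.
intruder = {"caution_index": 0.9, "position": (12.0, 34.0)}
print(process_detection(intruder, in_no_entry_area=True))   # -> alarm
print(process_detection(intruder, in_no_entry_area=False))  # -> screen
```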
  • FIG. 11 is a diagram showing an example of the detection screen SC1.
  • the detection screen SC1 is generated by the processor 11 of the surveillance radar device RD1 and displayed on the monitor MN by the server S1. Note that the detection screen SC1 may be generated by superimposing the detection result on a two-dimensional map or a three-dimensional map.
  • a detection screen SC1 shown in FIG. 11 shows an example generated by superimposing a detection result on a two-dimensional map.
  • On the detection screen SC1, the installation position PS0 of the surveillance radar device RD1 and the respective positions of the objects PS1, PS2, and PS3 detected by the surveillance radar device RD1 are superimposed on a map including the detection area AR0 of the surveillance radar device RD1.
  • When it is determined as a result of the first determination process or the fourth determination process that the object PS3 detected by the short-range detection radar RB is a living thing (person, animal, etc.), the vital information INF is further superimposed on the detection screen SC1.
  • The vital information INF shown in FIG. 11 includes, as an example, heart rate information "heartbeat: ⁇" and respiration rate information "breathing: XXX" of the object PS3, but is not limited thereto.
  • the vital information INF may include the result of the third determination process or the sixth determination process for determining whether or not the object detected in the first detection area AR1 is a human.
  • The processor 11 may generate vital information INF (an example of first notification information) including information notifying that the object is human, or vital information INF (an example of second notification information) including information notifying that the object has not been determined to be human.
  • The vital information INF (an example of the first notification) shown in FIG. 11 includes collation result information "unregistered person" indicating that the object PS3 is a human but is not a person pre-registered in the database DB11. If the object PS3 is a human and is a person pre-registered in the database DB11, the vital information INF (an example of the first notification) may include collation result information indicating that the person is pre-registered in the database DB11 (for example, the person's name, face image, employee number, etc. associated with the vital information stored in the database DB11).
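As one way to picture the collation described above, the sketch below builds the vital information INF for a detected person and attaches either "unregistered person" or the matching record from the database DB11. The database layout, the field names, and the similarity rule (a simple tolerance on heart and respiration rates) are all illustrative assumptions.

```python
# Hypothetical sketch of composing vital information INF (first notification
# information) for the detection screen SC1. The database layout and the
# similarity rule are illustrative assumptions.

DB11 = [  # pre-registered persons with their stored vital information
    {"name": "A. Tanaka", "employee_no": "1234", "heart_rate": 62, "respiration_rate": 14},
]


def collate(heart_rate: float, respiration_rate: float, tol: float = 5.0):
    """Return the registered record whose vital information is similar, else None."""
    for rec in DB11:
        if (abs(rec["heart_rate"] - heart_rate) <= tol
                and abs(rec["respiration_rate"] - respiration_rate) <= tol):
            return rec
    return None


def build_inf(heart_rate: float, respiration_rate: float) -> dict:
    """Build the INF payload superimposed next to the detected person."""
    match = collate(heart_rate, respiration_rate)
    return {
        "heartbeat": heart_rate,
        "breathing": respiration_rate,
        "collation": match["name"] if match else "unregistered person",
    }


print(build_inf(61, 15))   # vitals similar to the stored record -> registered name
print(build_inf(95, 30))   # no similar record -> "unregistered person"
```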
  • The server S1 executes enlargement, reduction, rotation, etc. of the detection screen SC1 displayed on the monitor MN based on corresponding operations by the administrator via the operation unit 23, and displays the result on the monitor MN.
  • the surveillance radar device RD1 has a transmission antenna unit that transmits a first radio wave in a first detection area AR1 (an example of a first range).
  • The surveillance radar device RD1 also has radar ICs RB1, ..., RBN (an example of a first detection processing unit) that detect the presence or absence of an object in the first detection area AR1 based on the reflected wave and acquire the detection result (an example of the first information) regarding the object detected in the first detection area AR1, and transmission antenna units RAT1, ..., RATM that transmit a second radio wave in the second detection area AR2.
  • Since the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 uses different radars for the long range (that is, the second detection area AR2) and the short range (that is, the first detection area AR1), detection of an object located at a long distance, detection of an object located at a short distance, and vital sensing can be performed at the same time.
  • Since the surveillance radar device RD1 according to Embodiment 1 uses different radars for the long range (that is, the second detection area AR2) and the short range (that is, the first detection area AR1), it can monitor a wide range with a single unit while determining whether an object located at a short distance is a living thing.
  • The detection result regarding the object detected in the first detection area AR1 further includes a position (an example of the first coordinate information) indicating the coordinates of the object detected in the first detection area AR1.
  • the detection result regarding the object detected in the second detection area AR2 includes second coordinate information indicating the coordinates of the object detected in the second range.
  • The surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 includes a processor 11 that generates and outputs a detection screen SC1 (an example of notification information) related to the detected object, based on the detection results (an example of the first information) acquired by the radar ICs RB1, ..., RBN and the detection results (an example of the second information) acquired by the radar ICs RA1, ..., RAM.
  • As a result, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 outputs the detection result of the object detected in the detection area AR0, thereby making it possible to support the monitoring work performed by administrators, security guards, and the like.
  • The radar ICs RB1, ..., RBN of the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 perform a first determination process of determining, based on the identification information, whether or not the object detected in the first detection area AR1 is a living thing.
  • The processor 11 generates a detection screen SC1 by superimposing, on a map including at least the first detection area AR1, information (for example, vital information of the object) indicating that an object determined to be a living thing by the first determination process is a living thing, and position information (an example of the first coordinate information) indicating the coordinates (that is, azimuth and distance) of the object determined to be a living thing by the first determination process.
  • As a result, the surveillance radar device RD1 can detect an object (living thing) using the short-range detection radar RB, and by outputting the position at which this object is detected, it can support monitoring work performed by administrators, security guards, and the like.
  • The radar ICs RB1, ..., RBN of the surveillance radar device RD1 according to Modification 1 of Embodiment 1 perform a second determination process of determining whether or not the object detected in the first detection area AR1 is moving. If the identification information corresponding to an object determined not to be moving in the second determination process includes biological information indicating that the object is a living thing, that object is determined to be a living thing. Accordingly, the surveillance radar device RD1 according to Modification 1 of Embodiment 1 can determine whether or not the detected object is a living thing even when the object is stationary.
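The first and second determination processes summarized above can be sketched as a simple rule: a stationary object is judged to be a living thing only when its identification information contains biological information such as vital signs. The data shapes and function names below are illustrative assumptions, not the actual implementation of the radar ICs.

```python
# Hypothetical sketch of the first/second determination processes of the
# radar ICs RB1, ..., RBN. identification_info is assumed to be a dict that
# may carry vital signs (respiration rate, heart rate, etc.).

def has_biological_info(identification_info: dict) -> bool:
    """True if the identification information contains vital signs."""
    vital_keys = ("respiration_rate", "heart_rate", "blood_pressure")
    return any(k in identification_info for k in vital_keys)


def classify(is_moving: bool, identification_info: dict) -> str:
    """Second determination process (Modification 1): even a stationary
    object is judged to be a living thing when biological information is
    present; otherwise it is treated as background or a moving object."""
    if has_biological_info(identification_info):
        return "living thing"           # first determination process
    return "moving object" if is_moving else "stationary object"


print(classify(False, {"heart_rate": 60, "respiration_rate": 15}))  # -> living thing
print(classify(False, {}))                                          # -> stationary object
print(classify(True, {}))                                           # -> moving object
```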
  • The radar ICs RB1, ..., RBN of the surveillance radar device RD1 according to Modification 2 of Embodiment 1 perform a third determination process of determining whether or not the object determined to be a living thing in the first determination process is a human, based on at least one of the identification information, the size of the object, and the intensity of the reflected waves of the first radio waves received by the reception antenna units RBR1, ..., RBRN.
  • the surveillance radar device RD1 according to the second modification of the first embodiment can more accurately determine whether the detected object is a person or an animal.
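One way to realize the third (or sixth) determination process above is a threshold rule over the object size and the reflected-wave intensity, as sketched below. The threshold values, and the use of height as the size measure, are assumptions chosen only for illustration.

```python
# Hypothetical sketch of the third determination process: deciding whether an
# object already judged to be a living thing is a human, from at least one of
# the object size and the received reflected-wave intensity.
from typing import Optional

HUMAN_MIN_HEIGHT_M = 1.0         # assumed: typical animals in the area are shorter
HUMAN_MIN_REFLECTION_DB = -40.0  # assumed minimum echo strength for a human body


def is_human(height_m: Optional[float] = None,
             reflection_db: Optional[float] = None) -> bool:
    """Judge a living thing to be human when every available cue clears its
    threshold; with no cues available, make no positive determination."""
    cues = []
    if height_m is not None:
        cues.append(height_m >= HUMAN_MIN_HEIGHT_M)
    if reflection_db is not None:
        cues.append(reflection_db >= HUMAN_MIN_REFLECTION_DB)
    return bool(cues) and all(cues)


print(is_human(height_m=1.7, reflection_db=-35.0))  # -> True
print(is_human(height_m=0.4))                       # -> False (e.g. a small animal)
```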
  • The processor 11 of the surveillance radar device RD1 performs a fourth determination process of determining, based on the identification information, whether or not the object detected in the first detection area AR1 is a living thing. The processor 11 generates a detection screen SC1 by superimposing, on a map showing at least the first detection area AR1, information indicating that an object determined to be a living thing in the fourth determination process is a living thing, and position information corresponding to the object determined to be a living thing in the fourth determination process.
  • the surveillance radar device RD1 can detect an object (creature) using the short-range detection radar RB, and by outputting the position at which this object is detected, the management It can support surveillance work performed by personnel, security guards, etc.
  • The processor 11 of the surveillance radar device RD1 performs a fifth determination process of determining whether or not the object detected in the first detection area AR1 is moving. If the identification information corresponding to an object determined not to be moving in the fifth determination process includes biological information indicating that the object is a living thing, that object is determined to be a living thing.
  • the surveillance radar device RD1 according to Modification 1 of Embodiment 1 can determine whether or not a stationary object is a living thing.
  • The processor 11 of the surveillance radar device RD1 according to Modification 2 of Embodiment 1 performs a sixth determination process of determining whether or not the object determined to be a living thing in the fourth determination process is a human, based on at least one of the identification information, the size of the object, and the intensity of the reflected waves of the first radio waves received by the reception antenna units RBR1, ..., RBRN.
  • the surveillance radar device RD1 according to the second modification of the first embodiment can more accurately determine whether the detected object is a person or an animal.
  • The processor 11 of the surveillance radar device RD1 according to Modification 2 of Embodiment 1 generates first notification information for a first notification regarding an object determined to be human by the radar ICs RB1, ..., RBN or the processor 11, and second notification information for a second notification, different from the first notification, regarding an object not determined to be human by the radar ICs RB1, ..., RBN or the processor 11.
  • the surveillance radar device RD1 according to the second modification of the first embodiment can notify the administrator, security guard, etc. of whether or not the creature is a human (person).
  • When the processor 11 of the surveillance radar device RD1 according to Modification 2 of Embodiment 1 determines that an object determined to be a human has entered the no-entry area AR10 into which humans are prohibited from entering, the first notification is that a person has entered the no-entry area AR10.
  • As a result, the surveillance radar device RD1 according to Modification 2 of Embodiment 1 can support surveillance work by notifying the administrator, security guards, etc. of the detection of the intrusion of a person into the preset no-entry area AR10.
  • The processor 11 of the surveillance radar device RD1 according to Modification 3 of Embodiment 1 obtains the emotion index of the object determined to be human, based on the vital information of the object determined to be human by the radar ICs RB1, ..., RBN or the processor 11, and notifies the emotion index related to the object determined to be human as the first notification.
  • As a result, the surveillance radar device RD1 according to Modification 3 of Embodiment 1 can support surveillance work by notifying the administrator, the security guard, etc. of the detection of a person for whom an excited state or an aggressive emotion index has been calculated.
  • When the processor 11 of the surveillance radar device RD1 according to Modification 3 of Embodiment 1 determines that an object determined by the radar ICs RB1, ..., RBN or the processor 11 to require attention has entered the no-entry area, the first notification is that an object requiring attention has entered the no-entry area.
  • As a result, the surveillance radar device RD1 according to Modification 3 of Embodiment 1 can support monitoring work by notifying the administrator, a security guard, or the like that a person for whom an excited state or an aggressive emotion index has been calculated has entered the preset no-entry area AR10.
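A minimal sketch of how an emotion index could be derived from vital information and escalated into the caution decision described above is shown below. The resting baselines, the weighting, and the threshold are assumptions for illustration; the disclosure does not specify the actual formula.

```python
# Hypothetical sketch: emotion index from vital information, then a caution
# decision for a person inside the no-entry area AR10. The resting baselines,
# the weighting, and the threshold are assumptions.

REST_HEART_RATE = 65.0
REST_RESPIRATION_RATE = 14.0


def emotion_index(heart_rate: float, respiration_rate: float) -> float:
    """0.0 = calm; larger values suggest an excited or aggressive state."""
    hr_dev = max(0.0, heart_rate - REST_HEART_RATE) / REST_HEART_RATE
    rr_dev = max(0.0, respiration_rate - REST_RESPIRATION_RATE) / REST_RESPIRATION_RATE
    return 0.5 * hr_dev + 0.5 * rr_dev


def needs_caution(heart_rate: float, respiration_rate: float,
                  threshold: float = 0.3) -> bool:
    """First-notification condition: emotion index at or above the threshold."""
    return emotion_index(heart_rate, respiration_rate) >= threshold


print(needs_caution(110, 24))  # excited intruder -> True
print(needs_caution(66, 14))   # calm person -> False
```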
  • When the biological information of the object determined to be human by the radar ICs RB1, ..., RBN or the processor 11 is similar to biological information registered in advance in the database DB11, the processor 11 of the surveillance radar device RD1 according to Modification 2 of Embodiment 1 includes, as the first notification, the collation result information of the person registered in the database DB11.
  • As a result, the surveillance radar device RD1 according to Modification 2 of Embodiment 1 can determine whether or not a person located within the first detection area AR1 is registered in the database DB11 in advance, that is, whether or not the person is authorized to enter the first detection area AR1.
  • the first radio wave and the second radio wave in the surveillance radar device RD1 according to Modifications 1 to 3 of Embodiment 1 are millimeter waves or microwaves.
  • As a result, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can detect objects (people, animals, etc.) in the detection area AR0 with higher accuracy by emitting radio waves in a wavelength band more suitable for acquiring vital information.
  • The detection results acquired by the radar ICs RB1, ..., RBN further include the distance from the installation position PS0 of the surveillance radar device RD1 to the object detected in the first detection area AR1, the azimuth from the installation position PS0 of the surveillance radar device RD1 toward the object detected in the first detection area AR1, and the height of the detected object.
  • As a result, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can determine the type of object (for example, person, animal, vehicle, etc.) based on the distance and azimuth of the object and the height of the object.
  • the detection results obtained by the radar ICs RB1, . . . , RBN further include the moving speed of the object detected in the first detection area AR1.
  • the radar ICs RA1, . . . , RAM obtain the moving speed of the object detected in the second detection area AR2.
  • the detection results obtained by the radar ICs RA1, . . . , RAM further include the moving speed of the object detected in the second detection area AR2.
  • As a result, the surveillance radar device RD1 according to Modifications 1 to 3 of Embodiment 1 can determine the type of object (for example, person, animal, vehicle, etc.) with higher accuracy based on the calculated moving speed of the object.
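The type determination from position, height, and moving speed described above can be pictured as a rule table like the following; the boundary values are assumptions chosen only to show how height and speed narrow down the object type.

```python
# Hypothetical sketch of object-type determination from height and moving
# speed, as enabled by the detection results of the radar ICs. The boundary
# values are illustrative assumptions.

def object_type(height_m: float, speed_mps: float) -> str:
    if speed_mps > 8.0:                  # faster than a running person
        return "vehicle"
    if height_m >= 1.0:
        return "person"
    if speed_mps > 0.0:
        return "animal"
    return "stationary object"


print(object_type(1.7, 1.2))   # -> person
print(object_type(0.5, 2.0))   # -> animal
print(object_type(1.5, 15.0))  # -> vehicle
```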
  • the map of the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 is two-dimensional or three-dimensional map data.
  • As a result, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can support the monitoring work of an administrator by means of the detection screen SC1 in which detection information is superimposed on a two-dimensional or three-dimensional map.
  • The identification information includes at least one of the respiration rate, the heart rate, the blood pressure, the respiration interval, and the heartbeat interval.
  • As a result, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can determine the type of detected object (for example, person, animal, etc.) with higher accuracy based on the identification information.
  • The detection result regarding the object detected in the second detection area AR2 may include the identification information (for example, respiration rate, heart rate).
  • As a result, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can determine the type of detected object (for example, a living thing, a stationary object, etc.) with higher accuracy based on the identification information.
  • The detection system 100 (an example of a monitoring system) according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 performs: transmitting the first radio wave in the first detection area AR1 (an example of the first range); receiving a reflected wave of the first radio wave; detecting the presence or absence of an object in the first detection area AR1 based on the reflected wave of the first radio wave; acquiring first information that is information about the object detected in the first detection area AR1 and includes identification information capable of determining whether or not the object is a living thing; transmitting the second radio wave in the second detection area AR2 (an example of the second range), which is wider than the first detection area AR1; receiving a reflected wave of the second radio wave; detecting the presence or absence of an object in the second detection area AR2 based on the reflected wave of the second radio wave; and acquiring second information about the object detected in the second detection area AR2.
  • Since the detection system 100 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 uses different radars for the long range (that is, the second detection area AR2) and the short range (that is, the first detection area AR1), detection of an object located at a long distance, detection of an object located at a short distance, and vital sensing can be performed at the same time.
  • Since the surveillance radar device RD1 according to Embodiment 1 uses different radars for the long range (that is, the second detection area AR2) and the short range (that is, the first detection area AR1), it can monitor a wide range with a single unit while determining whether an object located at a short distance is a living thing.
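The division of labor between the two radars summarized above can be sketched as one monitoring cycle: the long-range side supplies wide-area positions and speeds, and the short-range side adds identification information (vital signs). The function name and record layout are assumptions standing in for the outputs of the radar ICs RA1, ..., RAM and RB1, ..., RBN.

```python
# Hypothetical sketch of one monitoring cycle of the detection system 100:
# the long-range radar (second detection area AR2) reports coarse detections
# over a wide area, while the short-range radar (first detection area AR1)
# adds identification information (vital signs) for nearby objects.

def monitoring_cycle(long_range_hits: list, short_range_hits: list) -> dict:
    """Merge both radars' results into the payload behind detection screen SC1."""
    return {
        "wide_area": [
            {"position": h["position"], "speed": h.get("speed")}
            for h in long_range_hits
        ],
        "near_area": [
            {
                "position": h["position"],
                "living_thing": "heart_rate" in h or "respiration_rate" in h,
                "vitals": {k: h[k] for k in ("heart_rate", "respiration_rate") if k in h},
            }
            for h in short_range_hits
        ],
    }


frame = monitoring_cycle(
    [{"position": (120.0, 30.0), "speed": 1.4}],
    [{"position": (8.0, 2.0), "heart_rate": 72, "respiration_rate": 16}],
)
print(frame["near_area"][0]["living_thing"])  # -> True
```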
  • the present disclosure is useful as a monitoring device, a monitoring system, and a monitoring method that improve the detection accuracy of living things by radar.
  • AI processing unit
14 AI arithmetic processing unit
15 learning model memory
100 detection system
AR0 detection area
AR1 first detection area
AR2 second detection area
C1 camera
DB11, DB2 database
DR1 security drone
RA1, RAM, RB1, RBN radar IC
RAT1, RATM, RBT1, RBTN transmitting antenna unit
RAR1, RARM, RBR1, RBRN receiving antenna unit
RD1 surveillance radar device
MN monitor
NW network
S1 server
TP1 security guard terminal

Abstract

This monitoring device comprises a first transmission/reception unit for transmitting first radio waves in a first range and receiving reflected waves of the first radio waves, a first sensing processing unit for sensing the presence/absence of an object in the first range and acquiring first information relating to the object sensed in the first range, a second transmission/reception unit for transmitting second radio waves in a second range that is larger than the first range and receiving reflected waves of the second radio waves, and a second sensing processing unit for sensing the presence/absence of an object in the second range and acquiring second information relating to the object sensed in the second range, the first information including identification information with which it is possible to determine whether the object is a living thing.

Description

MONITORING DEVICE, MONITORING SYSTEM, AND MONITORING METHOD
The present disclosure relates to a monitoring device, a monitoring system, and a monitoring method.
Patent Document 1 discloses a non-contact vital signs monitoring system having a radar installed near a monitored subject. The non-contact vital signs monitoring system stores the output signals of the radar sampled sequentially within a predetermined period, determines the status of the monitored subject based on the output signals, and determines the vital signs of the monitored subject based on the output signals in a stationary state. The non-contact vital signs monitoring system uses the radar to determine whether or not the monitored subject is stationary in order to detect and measure the body temperature, blood pressure, heart rate, respiratory rate, etc., which indicate the monitored subject's health condition and signs of illness.
Japanese Patent Application Laid-Open No. 2020-99661
In recent years, technologies have been disclosed that use radar to detect a person (for example, a pedestrian, a care recipient, etc.) and acquire vital information of the detected person. However, a surveillance radar distinguishes a person from a background such as buildings and trees by detecting moving objects, and therefore has the problem that it cannot detect a person while the person is stationary.
The present disclosure has been devised in view of the above-described conventional circumstances, and aims to improve the accuracy of detecting living things by radar.
The present disclosure provides a monitoring device including: a first transmission/reception unit that transmits a first radio wave in a first range and receives a reflected wave of the first radio wave; a first detection processing unit that detects the presence or absence of an object in the first range based on the reflected wave of the first radio wave received by the first transmission/reception unit, and acquires first information about the object detected in the first range; a second transmission/reception unit that transmits a second radio wave in a second range wider than the first range and receives a reflected wave of the second radio wave; and a second detection processing unit that detects the presence or absence of an object in the second range based on the reflected wave of the second radio wave received by the second transmission/reception unit, and acquires second information about the object detected in the second range, wherein the first information includes identification information capable of determining whether or not the object is a living thing.
The present disclosure also provides a monitoring system including a monitoring device and an information processing device communicably connected to the monitoring device, wherein the monitoring device includes: a first transmission/reception unit that transmits a first radio wave in a first range and receives a reflected wave of the first radio wave; a first detection processing unit that detects the presence or absence of an object in the first range based on the reflected wave of the first radio wave received by the first transmission/reception unit, and acquires first information about the object detected in the first range; a second transmission/reception unit that transmits a second radio wave in a second range wider than the first range and receives a reflected wave of the second radio wave; and a second detection processing unit that detects the presence or absence of an object in the second range based on the reflected wave of the second radio wave received by the second transmission/reception unit, and acquires second information about the object detected in the second range, and the first information includes identification information capable of determining whether or not the object is a living thing.
The present disclosure further provides a monitoring method including: transmitting a first radio wave in a first range; receiving a reflected wave of the first radio wave; detecting the presence or absence of an object in the first range based on the reflected wave of the first radio wave; acquiring first information that is information about the object detected in the first range and includes identification information capable of determining whether or not the object is a living thing; transmitting a second radio wave in a second range wider than the first range; receiving a reflected wave of the second radio wave; detecting the presence or absence of an object in the second range based on the reflected wave of the second radio wave; and acquiring second information about the object detected in the second range.
According to the present disclosure, it is possible to improve the accuracy of detecting living things by radar.
FIG. 1 is a block diagram showing a system configuration example of a detection system according to Embodiment 1.
FIG. 2 is a block diagram showing an internal configuration example of the surveillance radar device according to Embodiment 1.
FIG. 3 is a diagram illustrating an example of the detection areas of the long-distance detection radar and the short-distance detection radar.
FIG. 4 is a diagram illustrating an example of the detection areas of the long-distance detection radar and the short-distance detection radar.
FIG. 5 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Embodiment 1.
FIG. 6 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Modification 1 of Embodiment 1.
FIG. 7 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Modification 2 of Embodiment 1.
FIG. 8 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Modification 2 of Embodiment 1.
FIG. 9 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Modification 3 of Embodiment 1.
FIG. 10 is a flowchart showing an example of an operation procedure of the surveillance radar device according to Modification 3 of Embodiment 1.
FIG. 11 is a diagram showing an example of a detection screen.
Hereinafter, embodiments specifically disclosing a monitoring device, a monitoring system, and a monitoring method according to the present disclosure will be described in detail with reference to the accompanying drawings as appropriate. However, more detailed description than necessary may be omitted. For example, detailed descriptions of well-known matters and redundant descriptions of substantially the same configurations may be omitted. This is to avoid unnecessary verbosity in the following description and to facilitate understanding by those skilled in the art. It should be noted that the accompanying drawings and the following description are provided for a thorough understanding of the present disclosure by those skilled in the art and are not intended to limit the claimed subject matter.
(Embodiment 1)
FIG. 1 is a diagram showing a system configuration example of a detection system 100 according to Embodiment 1. As shown in FIG. 1, the detection system 100 includes one or more surveillance radar devices RD1, a server S1, and a network NW. The detection system 100 may also include a camera C1, a security drone DR1, and a security guard terminal TP1. Note that the camera C1, the security drone DR1, and the security guard terminal TP1 are not essential elements of the detection system 100 and may be omitted.
The detection system 100 is an example of a monitoring system, and uses at least one surveillance radar device RD1 installed indoors or outdoors to detect objects (for example, humans, animals, buildings, etc.) located within the detection area AR0 (see FIG. 4) of the surveillance radar device RD1. The detection system 100 analyzes the vital information of a detected object with the surveillance radar device RD1. The detection system 100 then generates a detection screen SC1 (an example of notification information and superimposed map information) (see FIG. 11) in which the position and vital information of the detected object (for example, a human) are superimposed on a map (an example of map information) corresponding to the detection area AR0 of the surveillance radar device RD1. Subsequently, the detection system 100 displays the detection screen generated by the surveillance radar device RD1 on the monitor MN of the server S1. The detection area AR0 corresponds to an area in which the long-distance detection radar RA (see FIG. 2) and the short-distance detection radar RB (see FIG. 2) can detect an object.
The surveillance radar device RD1 is connected to the server S1 and the camera C1 via the network NW so as to be capable of wired or wireless communication. Wireless communication is, for example, short-range wireless communication such as Bluetooth (registered trademark) or NFC (registered trademark), or communication via a wireless LAN (Local Area Network) such as Wi-Fi (registered trademark).
The surveillance radar device RD1 uses the long-range detection radar RA to detect the position, moving speed, and so on of an object within the second detection area AR2 (see FIG. 3). The surveillance radar device RD1 then acquires the vital information of an object located within the first detection area AR1 (see FIG. 3) detected by the short-range detection radar RB.
The surveillance radar device RD1 generates a detection screen SC1 (see FIG. 11) and transmits it to the server S1. The detection screen SC1 superimposes, on a map corresponding to the detection area AR0, the position of an object detected by the long-range detection radar RA, or the position and vital information of an object detected by the short-range detection radar RB. Although FIG. 1 shows one surveillance radar device RD1 as an example, there may be two or more.
When the surveillance radar device RD1 is connected to the camera C1 (described later) so that data can be exchanged, it may acquire a captured image taken by the camera C1. In such a case, the surveillance radar device RD1 may generate the detection screen by superimposing, on the image captured by the camera C1, the position of an object detected by the long-range detection radar RA, or the position and vital information of an object detected by the short-range detection radar RB.
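The superimposition step described above can be sketched as follows. This is a minimal illustration, not the embodiment's implementation: the pixel origin, scale, coordinate convention (radar boresight along the map's upward direction), and the dictionary field names are all assumptions made for the sketch.

```python
import math

def to_map_pixels(range_m, azimuth_deg, origin_px=(320, 480), px_per_m=8.0):
    """Convert a radar detection (range in metres, azimuth in degrees,
    0 deg = straight ahead) to pixel coordinates on a top-down map whose
    origin_px marks the radar installation position PS0."""
    dx = range_m * math.sin(math.radians(azimuth_deg))
    dy = range_m * math.cos(math.radians(azimuth_deg))
    # Screen y grows downward, so a detection "ahead" moves up the map.
    return (origin_px[0] + dx * px_per_m, origin_px[1] - dy * px_per_m)

def build_overlay(detections):
    """Produce the list of (pixel position, label) pairs that a renderer
    would draw over the map image to form the detection screen SC1."""
    overlay = []
    for det in detections:
        pos = to_map_pixels(det["range_m"], det["azimuth_deg"])
        label = det.get("vitals", "")  # vitals only for short-range hits
        overlay.append((pos, label))
    return overlay
```

For example, `build_overlay([{"range_m": 10.0, "azimuth_deg": 0.0, "vitals": "HR 72"}])` yields one marker 10 m ahead of PS0 with its vital-information label attached.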
The camera C1 is connected to the surveillance radar device RD1, the server S1, or the security guard terminal TP1 via the network NW so that wired or wireless communication is possible. The camera C1 is a monitoring camera such as a security camera. The camera C1 captures at least a part of the detection area AR0 and transmits the captured image to the surveillance radar device RD1, the server S1, or the security guard terminal TP1. Although FIG. 1 shows one camera C1 as an example, there may be two or more.
The camera C1 may be a camera equipped with artificial intelligence (AI). When equipped with artificial intelligence, the camera C1 detects an object appearing in a captured image using a trained AI model and acquires the position of the object. The camera C1 also acquires the detection result, including the vital information of the object (for example, information such as respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval), transmitted from the surveillance radar device RD1. The camera C1 may generate a detection screen in which the acquired position and vital information of the object are superimposed on the captured image, and transmit it to the server S1.
The server S1 is connected to the surveillance radar device RD1, the camera C1, the security drone DR1, or the security guard terminal TP1 via the network NW so that wired or wireless communication is possible. The server S1 is also connected, so that data can be exchanged, to an operation unit 23 operable by a user (for example, an employee, security guard, or manager of a management company that monitors the monitored area) and to a monitor MN viewable by the user. When the server S1 includes the operation unit 23 and the monitor MN, it may be realized by an information processing device such as a PC (Personal Computer), a notebook PC, a tablet terminal, or a smartphone.
The server S1 outputs the detection screen SC1 (see FIG. 11) and the like transmitted from the surveillance radar device RD1 or the camera C1 to the monitor MN for display. When the user performs an operation to report to a predetermined destination (for example, a management company, a security company, an insurance company, a security guard, the security drone DR1, or the security guard terminal TP1), the server S1 sends a report to that destination. Based on a user operation, the server S1 may generate a control command for directing the security drone DR1 to the position where the object was detected and having it threaten or warn the object, and may transmit the command to the security drone DR1.
The server S1 includes at least a communication unit 20, a processor 21, and a memory 22. The database DB2 may be implemented in an information processing device different from the server S1 and connected to the server S1 so that data can be exchanged.
The communication unit 20 is configured using a communication interface circuit for exchanging data with the surveillance radar device RD1, the camera C1, the security drone DR1, or the security guard terminal TP1 via the network NW. The communication unit 20 outputs the detection screen SC1 transmitted from the surveillance radar device RD1 or the camera C1 to the processor 21 through a wireless or wired communication network. The communication unit 20 also transmits a control command corresponding to a user operation, output from the processor 21, to the corresponding device (for example, the security drone DR1 or the security guard terminal TP1).
The processor 21 is an arithmetic device such as a CPU (Central Processing Unit) or an FPGA (Field Programmable Gate Array). The processor 21 cooperates with the memory 22 to perform various kinds of processing and control. Specifically, the processor 21 realizes various functions by loading and executing programs and data held in the memory 22.
Based on an electrical signal output from the operation unit 23, the processor 21 sends a report to a preset destination (for example, the e-mail address or telephone number of a management company, security company, insurance company, or the security guard terminal TP1). Based on an electrical signal output from the operation unit 23, the processor 21 generates a control command for moving the security drone DR1 to the position of the detected object and having it threaten or warn the object.
The memory 22 includes, for example, a RAM (Random Access Memory) as a work memory used when the processor 21 executes each process, and a ROM (Read Only Memory) that stores programs and data defining the operation of the processor 21. The memory 22 may further include a storage device such as an SSD (Solid State Drive) or HDD (Hard Disk Drive). Data or information generated or acquired by the processor 21 is temporarily stored in the RAM. A program defining the operation of the processor 21 is written in the ROM. The memory 22 stores report destination information (for example, the e-mail addresses and telephone numbers of a management company, security company, insurance company, or the security guard terminal TP1), information about the security drone DR1, and the like.
The database DB2 is a storage device such as an HDD or SSD. The database DB2 registers (stores) the detection area AR0, information on the surveillance radar device RD1 that monitors the detection area AR0 (for example, serial number or ID), and information on objects permitted to enter the first detection area AR1 (for example, name, face image, or vital information), in association with one another.
The operation unit 23 is a user interface configured using, for example, a touch panel, buttons, or a keyboard. The operation unit 23 converts a received user operation into an electrical signal (control command) and outputs it to the processor 21. When the operation unit 23 is a touch panel, it is configured integrally with the monitor MN.
The monitor MN is a display such as an LCD (Liquid Crystal Display) or an organic EL (Electroluminescence) display. The monitor MN displays the detection screen SC1 transmitted from the surveillance radar device RD1 or the camera C1.
The security guard terminal TP1 is connected to the server S1 via the network NW so that data can be exchanged. The security guard terminal TP1 is used by a security guard who guards the detection area AR0 monitored by the surveillance radar device RD1, a building employee, or the like, and is realized by, for example, a notebook PC, a tablet terminal, or a smartphone. The security guard terminal TP1 displays the detection screen SC1 or alarm information transmitted from the server S1. Although FIG. 1 shows one security guard terminal TP1 as an example, there may be two or more.
The security drone DR1 is connected to the surveillance radar device RD1 or the server S1 via the network NW so that data can be exchanged. The security drone DR1 is equipped with a speaker, lighting, and the like, and threatens or warns an object with sound or illumination light. The security drone DR1 flies to the position where an object was detected, based on a control command transmitted from the surveillance radar device RD1 or the server S1. When the security drone DR1 has flown to that position, it threatens or warns the object. The security drone DR1 may also include a camera, capture an image of the object, and transmit the captured image (live video) to the server S1.
Next, the surveillance radar device RD1 will be described with reference to FIGS. 2 to 4. FIG. 2 is a block diagram showing an example of the internal configuration of the surveillance radar device RD1. FIG. 3 is a diagram illustrating an example of the detection areas of the long-range detection radar RA and the short-range detection radar RB. FIG. 4 is a diagram illustrating an example of the detection areas of the long-range detection radar RA and the short-range detection radar RB.
The surveillance radar device RD1 includes a communication unit 10, a processor 11, a memory 12, a long-range detection radar RA, a short-range detection radar RB, and a database DB11. The database DB11 may be configured separately from the surveillance radar device RD1. The database DB11 is not an essential component and may be omitted.
The communication unit 10 is configured using a communication interface circuit for exchanging data with the server S1 and the camera C1 via the network NW. The communication unit 10 transmits the detection screen SC1 (see FIG. 11) generated by the processor 11 to the server S1 through a wireless or wired communication network. The communication unit 10 also outputs the captured image transmitted from the camera C1 to the processor 11.
The processor 11 is configured using, for example, a CPU or an FPGA, and cooperates with the memory 12 to perform various kinds of processing and control. Specifically, the processor 11 refers to the programs and data held in the memory 12 and executes the programs, thereby realizing various functions such as the AI processing unit 13.
The processor 11 controls the long-range detection radar RA and the short-range detection radar RB. The processor 11 may control the long-range detection radar RA and the short-range detection radar RB independently of each other. The processor 11 may also control the long-range detection radar RA and the short-range detection radar RB cooperatively, based on the detection result of the long-range detection radar RA (an example of first information) and the detection result of the short-range detection radar RB (an example of second information).
The processor 11 also generates the detection screen SC1 (see FIG. 11), in which the detection result of the long-range detection radar RA or the detection result of the short-range detection radar RB is superimposed on the map, and outputs it to the communication unit 10. The communication unit 10 transmits the detection screen SC1 output from the processor 11 to the server S1.
When the vital information of each of a plurality of persons is stored in the database DB11, the processor 11 performs person authentication based on the vital information of a person who has entered the first detection area AR1, determining whether that person is permitted to enter the first detection area AR1. When the processor 11 determines that the person is not permitted to enter the first detection area AR1, it generates the detection screen SC1 in which the position of the person is superimposed on the map, and transmits it to the server S1.
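One way such vital-information-based person authentication could work is a nearest-match search over the registered profiles. This is a hedged sketch only: the profile fields (`heart_rate`, `resp_rate`), the relative tolerance, and the matching rule are assumptions for illustration, not details disclosed by the embodiment.

```python
def authenticate_by_vitals(measured, registered, tol=0.15):
    """Return the ID of the registered person whose stored vital profile
    is closest to the measured one, or None if no profile matches within
    the relative tolerance tol (treated as 'not permitted to enter')."""
    best_id, best_err = None, float("inf")
    for person_id, profile in registered.items():
        # Relative error per vital; every vital must agree within tol.
        errs = [abs(measured[k] - profile[k]) / profile[k] for k in profile]
        err = max(errs)
        if err < best_err:
            best_id, best_err = person_id, err
    return best_id if best_err <= tol else None
```

A `None` result would correspond to the branch in which the processor 11 flags the person on the detection screen SC1.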
The memory 12 includes, for example, a RAM as a work memory used when the processor 11 executes each process, and a ROM that stores programs and data defining the operation of the processor 11. The memory 12 may further include a storage device such as an SSD or HDD. Data or information generated or acquired by the processor 11 is temporarily stored in the RAM. A program defining the operation of the processor 11 is written in the ROM.
The memory 12 stores map data corresponding to the detection area AR0 of the surveillance radar device RD1 (that is, the first detection area AR1 and the second detection area AR2). The map may be two-dimensional map data or three-dimensional map data.
The first detection area AR1 is the area in which an object can be detected by the short-range detection radar RB. The second detection area AR2 is the area in which an object can be detected by the long-range detection radar RA.
The first detection area AR1 is the range whose horizontal distance from the installation position PS0 of the surveillance radar device RD1 is within the distance X1 and in which the short-range detection radar RB can transmit and receive radio waves. The second detection area AR2 is the range whose horizontal distance from the installation position PS0 of the surveillance radar device RD1 is within the distance X2 and in which the long-range detection radar RA can transmit and receive radio waves. For example, the distance X1 is a horizontal distance of 10 to 20 m from the installation position PS0 of the surveillance radar device RD1. The distance X2 is greater than or equal to the distance X1; for example, the horizontal distance from the installation position PS0 is 20 to 30 m. That is, the first detection area AR1 is smaller than the second detection area AR2 and is located closer to the installation position PS0 of the surveillance radar device RD1 than the second detection area AR2 (see FIG. 3). The first detection area AR1 and the second detection area AR2 may partially overlap.
The above-described distances X1 and X2 are examples, and the present disclosure is not limited to them. The distance X2 of the second detection area AR2 from the installation position PS0 of the surveillance radar device RD1 is variable within the range in which the long-range detection radar RA can transmit and receive radio waves, as long as it is greater than or equal to the distance X1.
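The area geometry above can be sketched as a simple point classifier. The concrete values X1 = 15 m and X2 = 25 m fall within the example ranges stated in the text but are themselves assumptions, as is the coordinate convention (PS0 at the origin, radar boresight along +y, 180° horizontal viewing angle).

```python
import math

def detection_zone(x, y, x1=15.0, x2=25.0, view_deg=180.0):
    """Classify a point (x, y) in metres, with PS0 at the origin and the
    radar boresight along +y, into 'AR1' (short-range area), 'AR2'
    (long-range area only), or None (outside both)."""
    r = math.hypot(x, y)
    # Angle off boresight; must lie within +/- view_deg / 2.
    ang = abs(math.degrees(math.atan2(x, y)))
    if ang > view_deg / 2 or r > x2:
        return None
    return "AR1" if r <= x1 else "AR2"
```

Under these assumptions a point 10 m ahead of PS0 lies in AR1, a point 20 m ahead lies in AR2 only, and a point 30 m ahead (or anywhere behind the device) lies outside both areas.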
The AI processing unit 13 forms a neural network based on at least one trained AI model. The AI processing unit 13 forms a neural network corresponding to each trained AI model stored in the learning model memory 15. Using the formed neural network, the AI processing unit 13 executes signal analysis processing on the signals output from the long-range detection radar RA and the short-range detection radar RB. The AI processing unit 13 includes an AI arithmetic processing unit 14 and a learning model memory 15.
The AI arithmetic processing unit 14 may use a trained AI model stored in the learning model memory 15 to execute type determination processing of a detected object based on the signals output from the long-range detection radar RA or the short-range detection radar RB. Specifically, the type determination processing according to the present embodiment includes processing for determining whether a detected object is a living thing (examples of first determination processing and fourth determination processing), and processing for determining whether an object determined to be a living thing is a human (examples of third determination processing and sixth determination processing). The AI arithmetic processing unit 14 may also use a trained AI model stored in the learning model memory 15 to execute acquisition processing of the vital information of a detected object based on the signal output from the short-range detection radar RB.
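The two-stage type determination (living vs. non-living, then human vs. other living thing) can be expressed as a cascade. In this sketch the two model arguments are arbitrary callables returning a probability in [0, 1]; they stand in for the trained AI models, whose actual inputs and architecture are not specified here. The feature names in the usage below are likewise hypothetical.

```python
def classify_object(features, is_living_model, is_human_model):
    """Cascaded type determination: first decide living vs. non-living;
    only for living objects, decide human vs. other animal. Both model
    arguments are callables mapping a feature dict to a probability."""
    if is_living_model(features) < 0.5:
        return "non-living"
    return "human" if is_human_model(features) >= 0.5 else "animal"
```

For example, with stand-in models `living = lambda f: 0.9 if f["micro_motion"] > 0.1 else 0.1` and `human = lambda f: 0.8 if f["height_m"] > 1.2 else 0.2`, an object with strong micro-motion and a height of 1.7 m is classified as "human".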
The learning model memory 15 is configured using memories such as a RAM, a ROM, or a flash memory. The learning model memory 15 stores trained AI models created in advance by learning processing. The AI arithmetic processing unit 14 forms a neural network corresponding to a trained AI model stored in the learning model memory 15 and executes the desired signal analysis processing.
Specifically, the learning model memory 15 stores a trained AI model capable of determining the type of a detected object based on a signal output from a radar, and a trained AI model capable of acquiring the vital information of a detected object. The learning model memory 15 may also store a trained AI model with which the surveillance radar device RD1 can perform person authentication. The above-described trained AI models are examples, and the present disclosure is not limited to them. The learning model memory 15 may store other trained AI models used for other purposes.
The long-range detection radar RA includes M (M: an integer equal to or greater than 1) radar ICs (Integrated Circuits) RA1, ..., RAM (an example of a second detection processing unit). Each radar IC is connected to one of the transmitting antenna units RAT1, ..., RATM and one of the receiving antenna units RAR1, ..., RARM (an example of a second transmitting/receiving unit) (see FIG. 2). The transmitting antenna units RAT1, ..., RATM and the receiving antenna units RAR1, ..., RARM may each be configured using a plurality of antennas. The long-range detection radar RA detects objects (for example, vehicles or two-wheeled vehicles) within the second detection area AR2.
Specifically, the long-range detection radar RA detects an object located within a predetermined horizontal viewing angle (for example, 180°) centered on the installation position PS0 of the surveillance radar device RD1, and within a second predetermined distance from the installation position PS0 as the reference point (= 0 (zero) m). The second predetermined distance is, for example, several meters to 100 m, and is larger than the first predetermined distance within which the short-range detection radar RB can detect an object.
The second predetermined distance may partially include the first predetermined distance (see FIG. 3). That is, the second detection area AR2 may partially include the first detection area AR1 of the short-range detection radar RB.
Based on control by the processor 11, each of the M radar ICs RA1, ..., RAM radiates radio waves with a wavelength of 1 to 10 mm from the corresponding transmitting antenna unit RAT1, ..., RATM. Each of the radar ICs RA1, ..., RAM acquires, via the receiving antenna units RAR1, ..., RARM, reflected signals that have been reflected by an object (for example, a person) present in the irradiation direction of each antenna.
Each of the transmitting antenna units RAT1, ..., RATM converts an analog signal output from the corresponding radar IC into radio waves (an example of second radio waves) and radiates them.
Each of the receiving antenna units RAR1, ..., RARM receives reflected waves in which the radiated radio waves have been reflected by an object (for example, a person). Each of the receiving antenna units RAR1, ..., RARM further converts the received reflected waves into an analog signal and outputs it to the corresponding radar IC.
Based on the analog signal received by each antenna, each of the M radar ICs RA1, ..., RAM executes calculation processing of the distance and direction to the object and the relative speed of the object (examples of second determination processing and fifth determination processing), type determination processing of the object, and the like. Each of the M radar ICs RA1, ..., RAM outputs the information calculated by the radar IC, such as the distance and direction to the object, the relative speed information of the object, and the type information of the object, to the processor 11.
The short-range detection radar RB includes N (N: an integer equal to or greater than 1) radar ICs RB1, ..., RBN (an example of a first detection processing unit). Each of the radar ICs RB1, ..., RBN is connected to one of the transmitting antenna units RBT1, ..., RBTN and one of the receiving antenna units RBR1, ..., RBRN (an example of a first transmitting/receiving unit). The transmitting antenna units RBT1, ..., RBTN and the receiving antenna units RBR1, ..., RBRN may each be configured using a plurality of antennas. The short-range detection radar RB detects objects (for example, vehicles or two-wheeled vehicles) within the first detection area AR1 near the installation position PS0 of the surveillance radar device RD1.
Specifically, the short-range detection radar RB detects an object located within a predetermined horizontal viewing angle (for example, 180°) centered on the installation position PS0 of the surveillance radar device RD1, and within a first predetermined distance from the installation position PS0 as the reference point (= 0 (zero) m). The first predetermined distance here is, for example, several meters to 10 m.
In this way, the transmitting antenna units RAT1, ..., RATM transmit radio waves over a wider range than the transmitting antenna units RBT1, ..., RBTN. Likewise, the receiving antenna units RAR1, ..., RARM receive radio waves over a wider range than the receiving antenna units RBR1, ..., RBRN.
The first range in the present embodiment is the range in which the transmitting antenna units RBT1, ..., RBTN transmit radio waves and the receiving antenna units RBR1, ..., RBRN receive radio waves (including reflected waves of the radio waves transmitted by the transmitting antenna units RBT1, ..., RBTN). The second range in the present embodiment is the range in which the transmitting antenna units RAT1, ..., RATM transmit radio waves and the receiving antenna units RAR1, ..., RARM receive radio waves (including reflected waves of the radio waves transmitted by the transmitting antenna units RAT1, ..., RATM). That is, the first detection area AR1 is closer to the installation position PS0 of the surveillance radar device RD1 than the second detection area AR2, and its object-detectable range is smaller than that of the second detection area AR2.
Based on control from the processor 11, each of the N radar ICs RB1, ..., RBN radiates radio waves with a wavelength of 1 to 10 mm from the corresponding transmitting antenna unit RBT1, ..., RBTN. Each of the N radar ICs RB1, ..., RBN receives, via the receiving antenna units RBR1, ..., RBRN, reflected waves that have been reflected by objects (for example, buildings, vegetation, humans, or animals) present in the irradiation direction of each antenna.
Each of the transmitting antenna units RBT1, ..., RBTN converts an analog signal output from the corresponding radar IC into radio waves (an example of first radio waves) and radiates them.
Each of the receiving antenna units RBR1, ..., RBRN receives reflected waves in which the radiated radio waves have been reflected by an object. Each of the receiving antenna units RBR1, ..., RBRN further converts the received reflected waves into an analog signal and outputs it to the corresponding radar IC.
Based on the analog signal received by each antenna, each of the N radar ICs RB1, ..., RBN executes calculation processing of the distance and direction to the object and the relative speed of the object, type determination processing of the object, acquisition processing of the vital information of the object (for example, information such as respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval), and the like. Each of the N radar ICs RB1, ..., RBN outputs the information calculated by the radar IC, such as the distance and direction to the object, the relative speed information, the type information, and the vital information of the object, to the processor 11.
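A common way to derive a periodic vital sign such as the respiratory rate from radar data is to find the dominant frequency of the phase variation of the reflected signal, which tracks the millimetre-scale chest motion. The sketch below is an assumption-laden illustration of that idea (a direct discrete Fourier transform over a detrended phase signal), not the embodiment's actual processing, which may be AI-based as described above.

```python
import math

def dominant_rate_bpm(phase_samples, fs):
    """Estimate a periodic vital rate (breaths or beats per minute) as
    the dominant frequency of a detrended phase signal, via a discrete
    Fourier transform computed directly (no external libraries)."""
    n = len(phase_samples)
    mean = sum(phase_samples) / n
    x = [v - mean for v in phase_samples]          # remove DC offset
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):                     # positive bins only
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n * 60.0                  # bin -> Hz -> per minute
```

For a 0.25 Hz sinusoidal phase signal sampled at 10 Hz, this returns 15 breaths per minute.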
Needless to say, the numbers of radar ICs provided in the surveillance radar device RD1 are examples and are not limited to these. The number M of radar ICs in the long-range detection radar RA, the number N of radar ICs in the short-range detection radar RB, and the number of antennas connected to each radar IC need not be the same and may differ.
The database DB11 is configured using a storage device such as an HDD, SSD, or NAND flash. The database DB11 registers (stores) information on objects permitted to enter the first detection area AR1 (for example, name, face image, or vital information). For example, in the database DB11, the name, face image, vital information, and the like of a person permitted to enter the first detection area AR1 are registered in association with an identifier that identifies that person.
Next, an example of the operation procedure of the surveillance radar device RD1 will be described with reference to FIG. 5. FIG. 5 is a flowchart showing an example of the operation procedure of the surveillance radar device RD1 according to Embodiment 1. Note that the processes of steps St102 to St103 and steps St202 to St203 may be executed either by the radar ICs or by the processor 11.
First, the operation procedure executed by the long-range detection radar RA will be described.
Each of the transmitting antenna units RAT1, ..., RATM of the long-range detection radar RA radiates a radio wave with a wavelength of 1 to 10 mm (step St100). Each of the receiving antenna units RAR1, ..., RARM receives the reflected wave returned by an object (step St101).
Each of the radar ICs RA1, ..., RAM acquires the reflected waves received by the receiving antenna units RAR1, ..., RARM connected to it. Based on the acquired signal, each of the radar ICs RA1, ..., RAM calculates the distance to the reflection point at which the radio wave was reflected by the object (step St102). Each of the radar ICs RA1, ..., RAM also repeats the radiation of radio waves and the reception of their reflections at a predetermined period, and calculates the moving speed of the object based on the time-series change of the reflection points (step St102). Furthermore, each of the radar ICs RA1, ..., RAM calculates the azimuth to the reflection point based on the phase difference of the received signals between the plural receiving antennas connected to it (step St102).
Each of the radar ICs RA1, ..., RAM generates, from the calculated distance and azimuth of the reflection points, point cloud data indicating the coordinates of the detection points at which the object was detected (step St102).
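The conversion from a reflection point's calculated distance and azimuth to the map coordinates that make up the point cloud can be sketched as follows (an illustrative sketch only; the function name and the x/y convention are assumptions, not taken from the patent):

```python
import math

def detection_point(distance_m: float, azimuth_deg: float) -> tuple:
    """Convert a reflection point measured from the radar installation
    position (range in metres, azimuth in degrees) into x/y map coordinates."""
    theta = math.radians(azimuth_deg)
    # x: lateral offset from boresight, y: forward distance from the radar
    return (distance_m * math.sin(theta), distance_m * math.cos(theta))
```

With this convention, a target 10 m straight ahead maps to (0, 10), and a target 10 m away at 90 degrees maps to (10, 0).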
Each of the radar ICs RA1, ..., RAM performs clustering on the acquired point cloud data (step St103), and determines the type of the object based on information such as the size of the object indicated by the clustered set of points and the calculated moving speed of the points (step St103).
Each of the radar ICs RA1, ..., RAM determines whether the object is a human. For each object, each of the radar ICs RA1, ..., RAM generates a detection result that associates information such as the position (azimuth, distance) and moving speed of the object indicated by the set of points determined to be a human among the clustered point sets, and outputs the detection result to the processor 11.
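The clustering and size-based human judgment described above can be illustrated with a deliberately simplified sketch (greedy single-link clustering and invented thresholds; a production implementation would more likely use an established algorithm such as DBSCAN):

```python
def cluster_points(points, eps=0.5):
    """Greedy single-link clustering: points closer than `eps` metres
    end up in the same cluster (a simplified stand-in for e.g. DBSCAN)."""
    clusters = []
    for p in points:
        merged = None
        for c in clusters:
            if c and any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= eps ** 2 for q in c):
                if merged is None:
                    c.append(p)
                    merged = c
                else:
                    # p bridges two existing clusters: merge them
                    merged.extend(c)
                    c.clear()
        if merged is None:
            clusters.append([p])
    return [c for c in clusters if c]

def looks_like_person(cluster, min_points=3, max_extent_m=1.0):
    """Crude size-based type test: enough points, human-sized extent."""
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    extent = max(max(xs) - min(xs), max(ys) - min(ys))
    return len(cluster) >= min_points and extent <= max_extent_m
```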
Next, the operation procedure executed by the short-range detection radar RB will be described.
Each of the transmitting antenna units RBT1, ..., RBTN of the short-range detection radar RB radiates a radio wave with a wavelength of 1 to 10 mm (step St200). Each of the receiving antenna units RBR1, ..., RBRN receives the reflected wave returned by an object (step St201).
Each of the radar ICs RB1, ..., RBN acquires the reflected waves received by the receiving antenna units RBR1, ..., RBRN connected to it. Based on the acquired signal, each of the radar ICs RB1, ..., RBN calculates the distance to the reflection point at which the radio wave was reflected by the object (step St202). Each of the radar ICs RB1, ..., RBN also repeats the radiation of radio waves and the reception of their reflections at a predetermined period, and calculates the moving speed of the object based on the time-series change of the reflection points (step St202). Furthermore, each of the radar ICs RB1, ..., RBN calculates the azimuth to the reflection point based on the phase difference of the received signals between the plural receiving antennas connected to it (step St202).
Each of the radar ICs RB1, ..., RBN generates, from the calculated distance and azimuth of the reflection points, point cloud data indicating the coordinates of the detection points at which the object was detected (step St202).
Each of the radar ICs RB1, ..., RBN performs clustering on the acquired point cloud data (step St203), and determines whether the object is a living thing based on information such as the size of the object indicated by the clustered set of points and the calculated moving speed of the points (step St203). That is, the size of the object indicated by the clustered set of points, the information on the calculated moving speed of the points, and the like are examples of the identification information of the present embodiment. Note that the process of step St203 is an example of the first determination process or the third determination process.
Each of the radar ICs RB1, ..., RBN also determines whether an object (for example, an object determined to be a living thing) is a human, analyzes the signal from an object determined to be a human among the clustered point sets, and calculates that object's vital information (for example, respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval) (step St203).
By analyzing the signal from an object, the surveillance radar device RD1 can obtain the object's vital information as described above. Vital information corresponds to the respiratory rate, heart rate, blood pressure, breathing interval, heartbeat interval, or the like, that is, information from which it can be determined whether the object is a living thing. Therefore, vital information is an example of the identification information of the present embodiment.
Further, the respiratory rate, heart rate, blood pressure, breathing interval, heartbeat interval, and the like of a living organism fluctuate according to predetermined cycles and climatic conditions. That is, vital information in which variation according to such predetermined cycles or climatic conditions is observed is an example of the biological information of the present embodiment. Specifically, the biological information is information such as a respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval that is neither extremely high nor extremely low, and that exhibits time-series variation.
Each of the radar ICs RB1, ..., RBN acquires information such as the position (azimuth, distance) and moving speed of the object indicated by the set of points determined to be a human among the clustered point sets. Each of the radar ICs RB1, ..., RBN generates an object detection result that associates, for each object, the information on the object's position (azimuth, distance) and moving speed with the calculated vital information, and outputs the detection result to the processor 11.
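One common way to turn the radar return of a (near-)stationary chest into a respiration or heart rate is a spectral peak search over the displacement signal. The following is a naive sketch of that idea (the function name and the plain DFT peak search are illustrative assumptions; an actual vital-sensing pipeline would band-pass filter and track phase):

```python
import math

def dominant_rate_bpm(displacement, fs):
    """Estimate the dominant oscillation rate (breaths/beats per minute) of a
    chest-displacement signal sampled at `fs` Hz via a naive DFT peak search."""
    n = len(displacement)
    mean = sum(displacement) / n
    x = [v - mean for v in displacement]  # remove DC offset
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    # bin index -> frequency in Hz -> rate per minute
    return best_k * fs / n * 60.0
```

For example, a pure 0.25 Hz chest oscillation (a typical breathing rate) yields 15 breaths per minute.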
The processor 11 acquires the coordinate information of the point clouds corresponding to objects output from the radar ICs RA1, ..., RAM and from the radar ICs RB1, ..., RBN. Using the acquired coordinate information, the processor 11 superimposes the detection points of the objects detected by the two types of radar on the same map to generate a detection screen SC1 (see FIG. 11) (step St300). At this time, when vital information is associated with an object's coordinate information, the processor 11 generates the detection screen SC1 with both the object's detection point and its vital information superimposed on the same map. This allows the processor 11 to display the object's detection point (coordinate information) together with the object's vital information in the detection screen SC1. The two types of radar referred to here are the long-range detection radar RA and the short-range detection radar RB.
The processor 11 outputs the generated detection screen SC1 to the communication unit 10 and causes it to be transmitted to the server S1 (step St301).
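The merge performed above can be pictured as building a single payload from both radars' outputs; a minimal sketch (the dictionary layout and field names are assumptions for illustration, not from the patent):

```python
def build_detection_screen(far_hits, near_hits):
    """Combine detections from the long-range radar RA (position only) and the
    short-range radar RB (position plus optional vitals) into one payload."""
    screen = []
    for pos in far_hits:
        screen.append({"pos": pos, "source": "RA"})
    for pos, vitals in near_hits:
        entry = {"pos": pos, "source": "RB"}
        if vitals:  # vitals are attached only when they were measured
            entry["vitals"] = vitals
        screen.append(entry)
    return screen
```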
The surveillance radar device RD1 repeats the processing described above and detects objects positioned within the detection area AR0. Note that the surveillance radar device RD1 may start the object detection process by the short-range detection radar RB and the acquisition of the object's vital information at the timing when an object detected by the long-range detection radar RA approaches the first detection area AR1.
The surveillance radar device RD1 may also authenticate a person based on the vital information acquired by the short-range detection radar RB.
As described above, the surveillance radar device RD1 uses separate radars for the long range (that is, the second detection area AR2) and the short range (the first detection area AR1). In this way, the surveillance radar device RD1 can simultaneously perform detection of objects positioned at a long distance and detection plus vital sensing of objects positioned at a short distance.
(Modification 1 of Embodiment 1)
The surveillance radar device RD1 according to Embodiment 1 described above performs detection of objects positioned at a long distance (the second detection area AR2) and detection plus vital sensing of objects positioned at a short distance (the first detection area AR1). In Modification 1 of Embodiment 1, an example in which the surveillance radar device RD1 determines whether a stationary object is a living thing will be described.
The surveillance radar device RD1 according to Modification 1 has the same internal configuration as the surveillance radar device RD1 according to Embodiment 1. The functions realized by each internal component of the surveillance radar device RD1 according to Modification 1 will be described below.
The surveillance radar device RD1 according to Modification 1 executes a second determination process or a fifth determination process for determining whether a detected object is stationary (that is, whether it is not moving). When the surveillance radar device RD1 determines, as a result of the second or fifth determination process, that a detected object is not moving (that is, is stationary), it determines whether the object determined not to be moving is a living thing. Examples of the second and fifth determination processes are described below.
Each of the radar ICs RB1, ..., RBN of the short-range detection radar RB determines whether a detected object is stationary, analyzes the signal corresponding to a stationary object among the detected objects, and calculates the object's vital information.
When an object is moving, the wavelength of the reflected wave changes due to the Doppler effect, so whether the object is moving can be determined from the change in the wavelength of the reflected wave. That is, when the object is moving, the point cloud data contains information on the object's moving speed in the direction toward the surveillance radar device RD1. Therefore, when each of the radar ICs RB1, ..., RBN detects a change in the wavelength of the reflected waves acquired as point cloud data, it determines that the detected object is moving. Each of the radar ICs RB1, ..., RBN then analyzes the signal corresponding to an object determined to be stationary and calculates that object's vital information.
Alternatively, for example, each of the radar ICs RB1, ..., RBN determines whether a detected object is stationary based on the time-series change of the positions of the point cloud data indicating the coordinates of the detection points (for example, the positions of the point cloud data at times t1 and t2 (t1 < t2)). In this case, each of the radar ICs RB1, ..., RBN determines that an object whose point cloud positions do not change over time is stationary, analyzes the signal corresponding to that object, and calculates its vital information.
Alternatively, for example, each of the radar ICs RB1, ..., RBN clusters the point cloud data of a detected object and determines whether the object is stationary based on the time-series change of the region (range) of the clustered point cloud data (for example, the positions of the point cloud data at times t3 and t4 (t3 < t4)). In this case, each of the radar ICs RB1, ..., RBN determines that an object whose point cloud region (range) does not change over time is stationary, analyzes the signal corresponding to that object, and calculates its vital information.
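The stationarity tests described above (no Doppler shift, unchanged point positions or cluster region over time) can be combined roughly as follows (the thresholds and the track format are illustrative assumptions):

```python
def is_stationary(track, speed_eps=0.05, pos_eps=0.1):
    """Judge a tracked object stationary when both its Doppler-derived radial
    speed and its position stay (nearly) constant over the observation window.
    `track` is a time-ordered list of (x, y, radial_speed) samples."""
    # Doppler test: any significant radial speed means the object is moving
    if any(abs(s) > speed_eps for _, _, s in track):
        return False
    # position test: the detection points must not drift over time
    xs = [x for x, _, _ in track]
    ys = [y for _, y, _ in track]
    return max(xs) - min(xs) <= pos_eps and max(ys) - min(ys) <= pos_eps
```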
Each of the radar ICs RB1, ..., RBN determines, based on the calculated vital information, whether the object is a stationary living thing (a human, an animal, etc.). Each of the radar ICs RB1, ..., RBN associates the coordinate information of the points determined to be a stationary living thing with the vital information and outputs them to the processor 11.
The processor 11 acquires, from each of the radar ICs RB1, ..., RBN, the coordinate information of the points determined to be a stationary living thing and the vital information. The processor 11 superimposes the acquired position of the stationary living thing and the vital information on a map to generate the detection screen SC1 (see FIG. 11), and transmits the generated detection screen SC1 to the server S1.
Next, an example of the operation procedure of the surveillance radar device RD1 will be described with reference to FIG. 6. FIG. 6 is a flowchart showing an example of the operation procedure of the surveillance radar device RD1 according to Modification 1 of Embodiment 1. Note that the processes of steps St104 to St105 and steps St204 to St205 may be executed either by the radar ICs or by the processor 11.
In the description of the flowchart shown in FIG. 6, the processes of steps St100 to St101 and steps St200 to St202 are the same as those in FIG. 5, and their description is therefore omitted.
First, the operation procedure executed by the long-range detection radar RA will be described.
Each of the radar ICs RA1, ..., RAM acquires the reflected waves received by the receiving antenna units RAR1, ..., RARM connected to it. Based on the acquired signal, each of the radar ICs RA1, ..., RAM calculates the distance to the reflection point at which the radio wave was reflected by the object (step St104). Each of the radar ICs RA1, ..., RAM also repeats the radiation of radio waves and the reception of their reflections at a predetermined period, and calculates the moving speed of the object based on the time-series change of the reflection points (step St104). Furthermore, each of the radar ICs RA1, ..., RAM calculates the azimuth to the reflection point based on the phase difference of the received signals between the plural receiving antennas connected to it (step St104).
Each of the radar ICs RA1, ..., RAM generates, from the calculated distance and azimuth of the reflection points, point cloud data indicating the coordinates of the detection points at which the object was detected (step St104).
Each of the radar ICs RA1, ..., RAM also applies a background removal algorithm that removes the background from the generated point cloud data, thereby removing (filtering out) the points corresponding to objects in a stationary state (hereinafter referred to as "stationary objects") contained in the point cloud data (step St104). A specific example of the background removal algorithm here is CFAR (Constant False Alarm Rate).
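As a concrete illustration of the CFAR idea mentioned above, a cell-averaging (CA-)CFAR detector over a one-dimensional range profile can be sketched as follows (the guard/training cell counts and the threshold scale are illustrative parameter choices):

```python
def ca_cfar(power, guard=2, train=4, scale=3.0):
    """Cell-averaging CFAR: a range cell is a detection when its power exceeds
    `scale` times the mean of the surrounding training cells, with `guard`
    cells on each side of the cell under test excluded from the average."""
    hits = []
    n = len(power)
    for i in range(n):
        cells = []
        for j in range(i - guard - train, i + guard + train + 1):
            if 0 <= j < n and abs(j - i) > guard:
                cells.append(power[j])
        if cells and power[i] > scale * (sum(cells) / len(cells)):
            hits.append(i)
    return hits
```

Because the threshold adapts to the local noise estimate, a uniform clutter floor produces no detections while an isolated strong return does.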
Each of the radar ICs RA1, ..., RAM performs clustering on the filtered point cloud data (step St105), and determines the type of the object based on information such as the size of the object indicated by the clustered set of points and the calculated moving speed of the points (step St105).
Each of the radar ICs RA1, ..., RAM determines whether the object is a person, and outputs to the processor 11, for each object, information such as the position (azimuth, distance) and moving speed of the object indicated by the set of points determined to be a person among the clustered point sets.
Next, the operation procedure executed by the short-range detection radar RB will be described.
Each of the radar ICs RB1, ..., RBN acquires the reflected waves received by the receiving antenna units RBR1, ..., RBRN connected to it. Based on the acquired signal, each of the radar ICs RB1, ..., RBN calculates the distance to the reflection point at which the radio wave was reflected by the object (step St204). Each of the radar ICs RB1, ..., RBN also repeats the radiation of radio waves and the reception of their reflections at a predetermined period, and calculates the moving speed of the object based on the time-series change of the reflection points indicating the object's position (step St204). Furthermore, each of the radar ICs RB1, ..., RBN calculates the azimuth to the reflection point based on the phase difference of the received signals between the plural receiving antennas connected to it (step St204).
Each of the radar ICs RB1, ..., RBN generates, from the calculated distance and azimuth of the reflection points, point cloud data indicating the coordinates of the detection points at which the object was detected (step St204).
Each of the radar ICs RB1, ..., RBN performs clustering on the acquired point cloud data (step St205), and determines the type of the object based on information such as the size of the object indicated by the clustered set of points and the calculated moving speed of the points (step St205).
Each of the radar ICs RB1, ..., RBN further determines whether an object determined to be a stationary object in the type determination result is a living thing such as a human or an animal. Each of the radar ICs RB1, ..., RBN analyzes the signal from an object determined to be stationary and calculates the stationary object's vital information (for example, respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval) (step St205). Each of the radar ICs RB1, ..., RBN determines that a stationary object whose calculation result is judged to be vital information is a stationary living thing (for example, a human or an animal) (step St205). Note that the process of step St205 is an example of the first determination process and the fourth determination process.
Each of the radar ICs RB1, ..., RBN generates a living-thing detection result that associates, for each living thing, information such as the position (azimuth, distance) and moving speed of the living thing indicated by the set of points determined to be a stationary living thing among the clustered point sets with the calculated vital information. Each of the radar ICs RB1, ..., RBN outputs the generated living-thing detection result to the processor 11.
The processor 11 acquires the coordinate information of the point clouds corresponding to living things output from the radar ICs RA1, ..., RAM, and the coordinate information and vital information of the point clouds corresponding to living things output from the radar ICs RB1, ..., RBN. The processor 11 compares the coordinates of the living things detected in the overlap area where the second detection area AR2 of the long-range detection radar RA and the first detection area AR1 of the short-range detection radar RB overlap (step St302). Note that, when no overlap area exists, the processor 11 may omit the process of comparing the information on living things detected in the overlap area.
When the processor 11 determines, as a result of the comparison, that the same living thing was detected in the overlap area by both the long-range detection radar RA and the short-range detection radar RB, it further superimposes on the map the detection result obtained only by the short-range detection radar RB (for example, vital information) on top of the detection result of the living thing obtained by the long-range detection radar RA (for example, position, distance, etc.) (step St302). The processor 11 generates the detection screen SC1 in which the detection result of the long-range detection radar RA and the detection result of the short-range detection radar RB are superimposed on the map, and transmits it to the server S1 (step St303).
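The overlap-area comparison described above amounts to associating nearby detections from the two radars and attaching the short-range radar's vitals to the matched position. A minimal nearest-match sketch (the gate distance and the data layout are assumptions for illustration):

```python
def fuse_overlap(ra_hits, rb_hits, gate=1.0):
    """Associate detections from the two radars in their overlap area: an RB
    detection within `gate` metres of an RA detection is treated as the same
    object, and its vitals are attached to the RA position."""
    fused, used = [], set()
    for rx, ry in ra_hits:
        match = None
        for i, (bx, by, vitals) in enumerate(rb_hits):
            if i not in used and (rx - bx) ** 2 + (ry - by) ** 2 <= gate ** 2:
                match = (i, vitals)
                break
        if match:
            used.add(match[0])
            fused.append({"pos": (rx, ry), "vitals": match[1]})
        else:
            fused.append({"pos": (rx, ry), "vitals": None})
    # RB-only detections (e.g. outside the overlap area) are kept as-is
    for i, (bx, by, vitals) in enumerate(rb_hits):
        if i not in used:
            fused.append({"pos": (bx, by), "vitals": vitals})
    return fused
```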
Note that the surveillance radar device RD1 may detect both stationary and moving living things. In such a case, the surveillance radar device RD1 generates the detection screen SC1 by superimposing the detection results of stationary living things and the detection results of non-stationary objects on the map, and transmits it to the server S1.
As described above, the surveillance radar device RD1 can determine whether a stationary object detected by the short-range detection radar RB is a mere object or a living thing such as a human or an animal. In other words, the surveillance radar device RD1 can improve the detection accuracy for stationary living things.
(Modification 2 of Embodiment 1)
The surveillance radar device RD1 according to Embodiment 1 described above performs detection of objects positioned at a long distance (the second detection area AR2) and detection plus vital sensing of objects positioned at a short distance (the first detection area AR1). In Modification 2 of Embodiment 1, an example in which the surveillance radar device RD1 determines whether a detected object is a human or an animal will be described.
The surveillance radar device RD1 according to Modification 2 of Embodiment 1 has the same internal configuration as the surveillance radar device RD1 according to Embodiment 1. The functions realized by each internal component of the surveillance radar device RD1 according to Modification 2 of Embodiment 1 will be described below.
The surveillance radar device RD1 determines whether a detected object is a human or an animal.
The learning model memory 15 may further store a trained AI model capable of determining whether an object is a human or an animal.
Each of the radar ICs RB1, ..., RBN determines whether an object is a human or an animal based on the size of the clustered set of points, the analyzed vital information (for example, respiratory rate, heart rate, blood pressure, breathing interval, or heartbeat interval), the reflection intensity of the radio waves, or the like. Each of the radar ICs RB1, ..., RBN may also receive the result of image analysis performed on the captured image within the camera C1 and use that result to determine whether the object is a human or an animal. Note that these determination processes may be executed by the processor 11.
Each of the radar ICs RB1, ..., RBN outputs to the processor 11 information such as the position (azimuth, distance) and moving speed of the object indicated by the set of points determined to be a human among the clustered point sets, together with the vital information.
The processor 11 acquires the coordinate information of the points determined to be a human and the vital information output from each of the radar ICs RB1, ..., RBN. The processor 11 superimposes the acquired position of the human and the vital information on a map to generate the detection screen SC1 (see FIG. 11), and transmits the generated detection screen SC1 to the server S1.
The memory 12 stores areas into which human entry is prohibited (hereinafter referred to as "no-entry areas") or lines across which human entry is prohibited (hereinafter referred to as "no-entry lines"). The no-entry area AR10 (see FIG. 4) and the no-entry line LN (see FIG. 4) are each set in advance by an administrator, and a plurality of them may be set within the detection area AR0.
Note that the no-entry line LN may include information on the direction from which entry is prohibited. For example, in the example shown in FIG. 4, the no-entry line LN may be set so as to prohibit the human TG from crossing the no-entry line LN from the direction of the arrow. The no-entry area AR10 and the no-entry line LN may also be set so as to prohibit the entry of animals.
The database DB11 may also store the information on the no-entry area AR10 in association with information on persons permitted to enter the no-entry area AR10 (for example, the face images and vital information of humans permitted to enter the no-entry area AR10). Similarly, the database DB11 may store the information on the no-entry line LN in association with information on persons permitted to cross the no-entry line LN (for example, the face images and vital information of humans permitted to cross the no-entry line LN).
When the no-entry area AR10 is set, the processor 11 determines, based on the detected position of a human, whether the human has entered the no-entry area AR10.
When the processor 11 determines that a human has entered the no-entry area AR10, it generates an alarm that includes the position information of the human who entered the no-entry area AR10 and notifies that the entry of a person into the no-entry area AR10 has been detected, and transmits the generated alarm to the server S1. Note that the alarm may be the detection screen SC1. The processor 11 may also generate an alarm that includes a captured image in which at least the partial region showing the human who entered the no-entry area AR10 is cropped from the captured image transmitted from the camera C1.
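The entry judgment for the no-entry area reduces to a point-in-polygon test on the detected position. A standard ray-casting sketch (representing the area as a vertex list is an assumption; the patent does not specify the area's shape):

```python
def inside_area(point, polygon):
    """Ray-casting point-in-polygon test: cast a horizontal ray from `point`
    and count how many polygon edges it crosses (odd count = inside).
    `polygon` is a list of (x, y) vertices."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```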
When the no-entry line LN is set, the processor 11 acquires flow line information of the person (information on the movement trajectory) based on the time-series change of the detected position of the human. Based on the flow line information, the processor 11 determines whether the detected human has crossed the no-entry line LN, or whether the detected human has crossed the no-entry line LN from a predetermined direction.
When the processor 11 determines that a human has crossed the no-entry line LN, it generates an alarm that includes the position information of the human who crossed the no-entry line LN and notifies that the crossing of the no-entry line LN by a human has been detected, and transmits the generated alarm to the server S1. Note that the alarm may be the detection screen SC1. The processor 11 may also generate an alarm that includes a captured image in which at least the partial region showing the human who crossed the no-entry line LN is cropped from the captured image transmitted from the camera C1.
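Line-crossing detection from the flow line can be implemented as a segment-intersection test between the person's movement over two consecutive positions and the line segment; the sign of the side function before the crossing distinguishes the approach direction, which supports direction-restricted lines. A sketch (the coordinate conventions are assumptions):

```python
def crossed_line(prev_pos, cur_pos, line_a, line_b):
    """True when the movement from prev_pos to cur_pos crosses the no-entry
    line segment line_a-line_b."""
    def side(a, b, p):
        # 2D cross product: > 0 if p lies left of the directed segment a -> b
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
    s0 = side(line_a, line_b, prev_pos)
    s1 = side(line_a, line_b, cur_pos)
    if s0 == 0 or s1 == 0 or (s0 > 0) == (s1 > 0):
        return False  # stayed on one side (or touched the line exactly)
    # the movement segment must also pass between the line's two endpoints
    t0 = side(prev_pos, cur_pos, line_a)
    t1 = side(prev_pos, cur_pos, line_b)
    return (t0 > 0) != (t1 > 0)
```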
Note that, when the set no-entry area AR10 or no-entry line LN is located within the first detection area AR1 detectable by the short-range detection radar RB, the processor 11 may determine whether the human who entered the no-entry area AR10 or crossed the no-entry line LN is a person registered in the database DB11. The processor 11 collates the vital information of the human whose entry into the no-entry area AR10 or crossing of the no-entry line LN was detected against the vital information of each of the plural persons registered in the database DB11.
When the processor 11 determines that the vital information of one of the plural persons registered in the database DB11 is identical or similar to the vital information of the detected human, it omits the alarm generation process. On the other hand, when the processor 11 determines that none of the vital information of the plural persons registered in the database DB11 is identical or similar to the vital information of the detected human, it generates an alarm and transmits it to the server S1.
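The collation of observed vitals against registered vitals described above can be sketched as a per-field tolerance comparison (the 10% tolerance and the record format are illustrative assumptions; the patent does not define "similar"):

```python
def vitals_match(observed, registered, tol=0.1):
    """Treat two vital-sign records as 'identical or similar' when every field
    they share agrees within a relative tolerance `tol` (10% by default)."""
    keys = set(observed) & set(registered)
    if not keys:
        return False
    return all(abs(observed[k] - registered[k]) <= tol * registered[k] for k in keys)

def is_authorized(observed, database, tol=0.1):
    """The alarm is skipped when any registered person's vitals match."""
    return any(vitals_match(observed, rec, tol) for rec in database)
```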
 Next, an example of the operation procedure of the surveillance radar device RD1 will be described with reference to FIGS. 7 and 8. FIG. 7 is a flowchart showing an example of the operation procedure of the surveillance radar device RD1 according to Modification 2 of Embodiment 1. FIG. 8 is a flowchart showing an example of the operation procedure of the surveillance radar device RD1 according to Modification 2 of Embodiment 1. Note that the processes of steps St102 to St103 and steps St203 to St206 may be executed by the radar ICs or by the processor 11.
 In the description of the flowchart shown in FIG. 7, the processes of steps St100 to St103 and steps St200 to St203 are the same as those in FIG. 5, and therefore their description is omitted.
 The operation procedure executed by the short-range detection radar RB will now be described.
 Each of the radar ICs RB1, ..., RBN determines, based on the vital information of the object, whether the object is a human or an animal (that is, an animal other than a human) (step St206). The process of step St206 is an example of the third determination process or the sixth determination process. Note that, as described above, each of the radar ICs RB1, ..., RBN may also determine whether the object is a human or an animal based on the size of the clustered point cloud, the reflection intensity of the radio waves, and the like.
 Each of the radar ICs RB1, ..., RBN also acquires the image analysis result transmitted from the camera C1. Each of the radar ICs RB1, ..., RBN may determine whether the object is a human or an animal using the image analysis result acquired from the camera C1, which includes the position and type of the object.
 Each of the radar ICs RB1, ..., RBN generates an object (human) detection result in which information such as the position (azimuth, distance) and movement speed of each object determined to be human is associated, person by person, with that person's vital information, and outputs the detection result to the processor 11.
 The processor 11 acquires the detection results of objects (for example, persons, animals, etc.) output from the long-range detection radar RA and the detection results of objects (humans) output from the short-range detection radar RB. The processor 11 generates a detection screen SC1 (see FIG. 11) in which these acquired detection results are superimposed on a map (step St304).
 The processor 11 determines whether the entire set no-entry area AR10 is within the first detection area AR1 of the short-range detection radar RB (step St305).
 When the processor 11 determines in step St305 that the entire set no-entry area AR10 is within the first detection area AR1 of the short-range detection radar RB (step St305, YES), it determines whether an object detected by the short-range detection radar RB has entered the no-entry area AR10 (step St306).
 When the processor 11 determines in step St306 that the detected object has entered the no-entry area AR10 (step St306, YES), it determines whether the detected object is a human (step St307). Here, the process of step St307 may be omitted when the process of step St206 has been executed before step St307.
 On the other hand, when the processor 11 determines in step St306 that the detected object has not entered the no-entry area AR10 (step St306, NO), it transmits the detection screen SC1 generated in step St304 to the server S1 (step St311).
 When the processor 11 determines in step St307 that the detected object is a human (step St307, YES), it generates an alarm notifying that the entry of a human into the no-entry area AR10 has been detected, and transmits (issues) the alarm to the server S1 (step St308). The processor 11 also transmits the detection screen SC1 generated in step St304 to the server S1 (step St311).
 On the other hand, when the processor 11 determines in step St307 that the detected object is not a human (step St307, NO), the process proceeds to step St311.
 Further, when the processor 11 determines in step St305 that the entire set no-entry area AR10 is not within the first detection area AR1 of the short-range detection radar RB (step St305, NO), it determines whether an object detected by the long-range detection radar RA or the short-range detection radar RB has entered the no-entry area AR10 (step St309).
 When the processor 11 determines in step St309 that the detected object has entered the no-entry area AR10 (step St309, YES), it generates an alarm notifying that the entry of an object into the no-entry area AR10 has been detected, and transmits (issues) the alarm to the server S1 (step St308). Note that the objects detected here include humans and animals, whose vital information can be acquired, as well as moving bodies such as vehicles and motorcycles, whose vital information cannot be acquired.
 On the other hand, when the processor 11 determines in step St309 that the detected object has not entered the no-entry area AR10 (step St309, NO), the process proceeds to step St311.
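The branching in steps St305 to St311 can be summarized as code. The sketch below mirrors only the flowchart logic; the step numbers are kept as comments, and the helper predicates (`entirely_within_ar1`, `entered_ar10`, `is_human`) are placeholders standing in for the determinations described above, not names from the embodiment.

```python
def run_intrusion_check(objects, entirely_within_ar1, entered_ar10, is_human,
                        send_alarm, send_detection_screen):
    """Illustrative mirror of the St305-St311 decision flow."""
    if entirely_within_ar1():                  # St305: AR10 fully inside AR1?
        for obj in objects["short_range"]:     # only radar RB detections apply
            if entered_ar10(obj):              # St306: intrusion into AR10?
                if is_human(obj):              # St307: detected object is human?
                    send_alarm(obj)            # St308: human-intrusion alarm
    else:
        # AR10 extends beyond AR1: use detections from both radars (St309),
        # and alarm on any intruding object, human or not (St308).
        for obj in objects["short_range"] + objects["long_range"]:
            if entered_ar10(obj):
                send_alarm(obj)
    send_detection_screen()                    # St311: screen SC1 is always sent

# Minimal exercise with stubbed-in detections and callbacks.
detections = {"short_range": [{"entered": True, "human": True},
                              {"entered": True, "human": False}],
              "long_range": []}
alarms = []
run_intrusion_check(detections,
                    entirely_within_ar1=lambda: True,
                    entered_ar10=lambda o: o["entered"],
                    is_human=lambda o: o["human"],
                    send_alarm=alarms.append,
                    send_detection_screen=lambda: None)
print(len(alarms))  # only the human intruder triggers an alarm
```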
 As described above, when the no-entry area AR10 is set within the detection area AR0, the surveillance radar device RD1 can detect an object entering the no-entry area AR10. Similarly, when a no-entry line LN or the like is set, the surveillance radar device RD1 can detect an object passing through the no-entry line LN.
 At the timing when the surveillance radar device RD1 detects an object entering the no-entry area AR10 or passing through the no-entry line LN, it generates an alarm distinct from the detection screen SC1 and transmits it to the server S1. This allows the administrator to intuitively understand, based on the alarm, that an object entering the no-entry area AR10 or passing through the no-entry line LN has been detected.
 Note that the setting of the no-entry area AR10 or the no-entry line LN may be executed as in the examples described in Embodiment 1 and Modification 1 of Embodiment 1.
 For example, the surveillance radar device RD1 according to Embodiment 1 may execute the processes of steps St305 to St311 after the process of step St300. Note that the surveillance radar device RD1 may omit the process of step St307. This allows the surveillance radar device RD1 to detect an object entering the no-entry area AR10 or an object passing through the no-entry line LN.
 Also, for example, the surveillance radar device RD1 according to Modification 1 of Embodiment 1 may execute the processes of steps St305 to St311 after the process of step St302. Note that the surveillance radar device RD1 may omit the process of step St307. This allows the surveillance radar device RD1 to detect an object entering the no-entry area AR10 or an object passing through the no-entry line LN.
(Modification 3 of Embodiment 1)
 The surveillance radar device RD1 according to Embodiment 1 described above was shown detecting objects located at a long distance (the second detection area AR2) and, at a short distance (the first detection area AR1), performing both object detection and vital sensing. The surveillance radar device RD1 according to Modification 3 of Embodiment 1 determines, based on an emotion index of the object computed from the acquired vital information, whether a detected object is one about which the administrator should be alerted.
 The surveillance radar device RD1 according to Modification 3 of Embodiment 1 has the same internal configuration as the surveillance radar device RD1 according to Embodiment 1. Hereinafter, the functions realized by each internal component of the surveillance radar device RD1 according to Modification 3 of Embodiment 1 will be described.
 The surveillance radar device RD1 calculates a person's emotion index based on the time-series change in vital information. The surveillance radar device RD1 determines, based on the calculated emotion index of the person, whether an alarm determination is necessary.
 The short-range detection radar RB calculates an emotion index, which estimates the emotion of each object (for example, each human detected by the short-range detection radar RB), based on the time-series change in the acquired vital information. Based on the calculated human emotion index, the short-range detection radar RB evaluates a caution index for determining whether to generate an alarm.
 For example, when the calculated human emotion index indicates tension, anger, or the like, the short-range detection radar RB evaluates the caution index as high. The short-range detection radar RB may also evaluate the caution index based on both the human emotion index and the acquired vital information. The short-range detection radar RB determines whether to generate an alarm based on whether the evaluated caution index is equal to or greater than a threshold. When the short-range detection radar RB determines that the evaluated caution index is equal to or greater than the threshold, it generates an alarm (an example of a first notification) that includes the position information of the person who passed through the no-entry line LN and notifies that the human detected by the short-range detection radar RB has been identified as a person requiring caution, and outputs the alarm to the processor 11.
 Note that each of the processes described above for calculating the emotion index, evaluating the caution index, and determining whether to generate an alarm may instead be executed by the processor 11. The functions realized by the processor 11 and the learning model memory 15 when these processes are executed by the processor 11 are described below.
 The learning model memory 15 further stores a trained AI model capable of calculating a person's emotion index based on the time-series change in that person's vital information.
 The processor 11 calculates an emotion index, which estimates the emotion of each object, based on the time-series change in the acquired vital information. Based on the calculated human emotion index, the processor 11 evaluates a caution index for determining whether to generate an alarm.
 The processor 11 determines whether to generate an alarm based on whether the evaluated caution index is equal to or greater than the threshold. When the processor 11 determines that the evaluated caution index is equal to or greater than the threshold, it generates an alarm (an example of a first notification) that includes the position information of the person who passed through the no-entry line LN and notifies that the human detected by the short-range detection radar RB has been identified as a person requiring caution. The processor 11 transmits the generated alarm to the server S1. Note that the alarm may be the detection screen SC1. The processor 11 may also generate an alarm including a captured image in which at least a partial region showing the person whose caution index was determined to be equal to or greater than the threshold has been cropped from the captured image transmitted from the camera C1.
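The chain from vital-sign time series to emotion index to caution index to alarm decision might be sketched as follows. The mapping, the weights, and the threshold are all assumptions for illustration (the embodiment delegates the mapping to a trained AI model and does not specify a formula), and all names here are illustrative.

```python
def emotion_index(heart_rates, respiration_rates):
    """Hypothetical emotion index in [0, 1]: elevated and rising heart and
    respiration rates are taken as signs of tension or anger."""
    hr_level = max(0.0, (sum(heart_rates) / len(heart_rates) - 70.0) / 50.0)
    hr_trend = max(0.0, (heart_rates[-1] - heart_rates[0]) / 30.0)
    rr_level = max(0.0, (sum(respiration_rates) / len(respiration_rates) - 15.0) / 15.0)
    return min(1.0, hr_level + hr_trend + rr_level)

CAUTION_THRESHOLD = 0.6  # assumed threshold for step St314

def needs_alarm(heart_rates, respiration_rates):
    """Generate an alarm when the caution index reaches the threshold.

    Here the caution index is taken directly from the emotion index; the
    embodiment also allows combining it with the raw vital information.
    """
    caution_index = emotion_index(heart_rates, respiration_rates)
    return caution_index >= CAUTION_THRESHOLD

print(needs_alarm([100.0, 110.0, 120.0], [25.0, 26.0, 27.0]))  # agitated subject
print(needs_alarm([65.0, 66.0, 64.0], [14.0, 14.0, 14.0]))     # calm subject
```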
 Next, an example of the operation procedure of the surveillance radar device RD1 will be described with reference to FIGS. 9 and 10. FIG. 9 is a flowchart showing an example of the operation procedure of the surveillance radar device RD1 according to Modification 3 of Embodiment 1. FIG. 10 is a flowchart showing an example of the operation procedure of the surveillance radar device RD1 according to Modification 3 of Embodiment 1. Note that the processes of steps St102 to St103 and steps St203 to St206 may be executed by the radar ICs or by the processor 11.
 In the description of the flowchart shown in FIG. 9, the processes of steps St100 to St103 and steps St200 to St203 are the same as those in FIG. 5, and therefore their description is omitted.
 The operation procedure executed by the short-range detection radar RB will now be described.
 Each of the radar ICs RB1, ..., RBN generates time-series data in which the acquired vital information of the object is arranged in chronological order. Each of the radar ICs RB1, ..., RBN calculates a human emotion index based on the time-series data of the object's vital information (step St207). Each of the radar ICs RB1, ..., RBN then determines, based on the calculated emotion index of the person, whether the detected human is a person requiring caution (monitoring) (step St207).
 Each of the radar ICs RB1, ..., RBN generates a person detection result in which information on whether the detected human is a person requiring caution, information such as the person's position (azimuth, distance) and movement speed, and the person's vital information are associated for each person, and outputs the detection result to the processor 11.
 The processor 11 acquires the detection results of objects (for example, humans, animals, etc.) output from the long-range detection radar RA and the detection results of objects (humans) output from the short-range detection radar RB. The processor 11 generates a detection screen SC1 (see FIG. 11) in which these acquired object detection results are superimposed on a map (step St312).
 The processor 11 determines whether a human detected by the short-range detection radar RB has entered the no-entry area AR10 (step St313).
 When the processor 11 determines in step St313 that the detected human has entered the no-entry area AR10 (step St313, YES), it calculates the caution index of the detected object. The processor 11 determines whether the calculated caution index is equal to or greater than the threshold (step St314).
 On the other hand, when the processor 11 determines in step St313 that the detected human has not entered the no-entry area AR10 (step St313, NO), it generates a detection screen SC1 in which the detection results of the objects detected by the long-range detection radar RA and the short-range detection radar RB are superimposed on a map (step St315). The processor 11 transmits the generated detection screen SC1 to the server S1 (step St315).
 When the processor 11 determines in step St314 that the calculated caution index is equal to or greater than the threshold (step St314, YES), it generates an alarm notifying that a person requiring caution has been detected, and transmits (issues) the alarm to the server S1 (step St316).
 On the other hand, when the processor 11 determines in step St314 that the calculated caution index is not equal to or greater than the threshold (step St314, NO), the process proceeds to step St315.
 As described above, the surveillance radar device RD1 calculates a caution index based on the emotion index of a detected person and, based on the calculated caution index, can report to the administrator that a person requiring caution (that is, monitoring or vigilance) has been detected. This allows the surveillance radar device RD1 to support the detection and monitoring of persons requiring attention in the administrator's monitoring work.
 Next, the detection screen SC1 generated by the surveillance radar device RD1 will be described with reference to FIG. 11. FIG. 11 is a diagram showing an example of the detection screen SC1.
 The detection screen SC1 is generated by the processor 11 of the surveillance radar device RD1 and displayed on the monitor MN by the server S1. Note that the detection screen SC1 may be generated by superimposing the detection results on either a two-dimensional map or a three-dimensional map. The detection screen SC1 shown in FIG. 11 is an example generated by superimposing the detection results on a two-dimensional map.
 On the detection screen SC1, the installation position PS0 of the surveillance radar device RD1 and the respective positions of the objects PS1, PS2, and PS3 detected by the surveillance radar device RD1 are superimposed on a map including the detection area AR0 of the surveillance radar device RD1. Further, when it is determined as a result of the first determination process or the fourth determination process that the object PS3 detected by the short-range detection radar RB is a living thing (a person, an animal, etc.), vital information INF is additionally superimposed on the detection screen SC1. Note that the vital information INF shown in FIG. 11 is an example including heart rate information ("heart rate: OOO") and respiration rate information ("respiration: XXX") of the object PS3, but is not limited thereto.
 For example, the vital information INF may include the result of the third determination process or the sixth determination process, which determines whether the object detected in the first detection area AR1 is a human. For example, when the detected object is determined to be a human, the processor 11 may generate vital information INF (an example of first notification information) including information notifying that the object is a human. Also, for example, when the detected object is determined to be an animal, the processor 11 may generate vital information INF (an example of second notification information) including information notifying that the object is an animal.
 The vital information INF (an example of a first notification) shown in FIG. 11 also includes collation result information "unregistered person" indicating that the object PS3 is a human who is not a person registered in advance in the database DB11. Note that when the object PS3 is a human who is registered in advance in the database DB11, the vital information INF (an example of a first notification) may include collation result information indicating that the person is registered in advance in the database DB11 (for example, the person's name, face image, employee number, etc. associated with the vital information stored in the database DB11).
 Based on operations such as enlargement, reduction, and rotation performed by the administrator via the operation unit 23, the server S1 enlarges, reduces, or rotates the detection screen SC1 displayed on the monitor MN and displays the result on the monitor MN.
 As described above, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 includes: transmission antenna units RBT1, ..., RBTN that transmit a first radio wave in a first detection area AR1 (an example of a first range) and reception antenna units RBR1, ..., RBRN (an example of a first transmission/reception unit) that receive the first radio wave reflected by an object; radar ICs RB1, ..., RBN (an example of a first detection processing unit) that detect the presence or absence of an object in the first detection area AR1 based on the reflected wave of the first radio wave received by the reception antenna units RBR1, ..., RBRN and acquire a detection result (an example of first information) regarding the object detected in the first detection area AR1; transmission antenna units RAT1, ..., RATM that transmit a second radio wave in a second detection area AR2 (an example of a second range) wider than the first detection area AR1 and reception antenna units RAR1, ..., RARM (an example of a second transmission/reception unit) that receive the second radio wave reflected by an object; and radar ICs RA1, ..., RAM (an example of a second detection processing unit) that detect the presence or absence of an object in the second detection area AR2 based on the reflected wave of the second radio wave received by the reception antenna units RAR1, ..., RARM and acquire a detection result (an example of second information) regarding the object detected in the second detection area AR2. The detection result (an example of first information) acquired by the radar ICs RB1, ..., RBN includes identification information that makes it possible to determine whether the object detected in the first detection area AR1 is a living thing.
 Accordingly, by using separate radars for the long range (that is, the second detection area AR2) and the short range (the first detection area AR1), the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can simultaneously perform detection of objects located at a long distance and detection and vital sensing of objects located at a short distance. Furthermore, by using separate radars for the long range and the short range in this way, a single surveillance radar device RD1 can monitor a wide area while determining at short range whether an object is a living thing.
 Further, as described above, in the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1, the detection result regarding the object detected in the first detection area AR1 further includes the position indicating the coordinates of the object detected in the first detection area AR1 (an example of first coordinate information). The detection result regarding the object detected in the second detection area AR2 includes second coordinate information indicating the coordinates of the object detected in the second range. Accordingly, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can acquire the position information of objects detected in each of the first detection area AR1 and the second detection area AR2.
 The surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 also includes the processor 11, which generates and outputs a detection screen SC1 (an example of notification information) regarding detected objects based on the detection results acquired by the radar ICs RB1, ..., RBN and the detection results (an example of second information) acquired by the radar ICs RA1, ..., RAM. Accordingly, by outputting the detection results of objects detected within the detection area AR0, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can support the monitoring work performed by administrators, security guards, and the like.
 Further, the radar ICs RB1, ..., RBN of the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 perform a first determination process that determines, based on the identification information, whether the object detected in the first detection area AR1 is a living thing. The processor 11 generates a detection screen SC1 in which information indicating that an object determined to be a living thing in the first determination process is a living thing (for example, the object's vital information) and position information (an example of first coordinate information) indicating the coordinates (that is, azimuth and distance) corresponding to that object are superimposed on a map including at least the first detection area AR1. Accordingly, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can detect an object (living thing) using the short-range detection radar RB, and by outputting the position at which the object was detected, can support the monitoring work performed by administrators, security guards, and the like.
 Further, the radar ICs RB1, ..., RBN of the surveillance radar device RD1 according to Modification 1 of Embodiment 1 perform a second determination process that determines whether the object detected in the first detection area AR1 is moving, and, when the identification information corresponding to an object determined not to be moving in the second determination process includes biological information indicating that the object is a living thing, determine that the object determined not to be moving in the second determination process is a living thing. Accordingly, the surveillance radar device RD1 according to Modification 1 of Embodiment 1 can determine whether a detected object is a living thing.
 Further, as described above, the radar ICs RB1, ..., RBN of the surveillance radar device RD1 according to Modification 2 of Embodiment 1 perform a third determination process of determining whether an object determined to be a living thing in the first determination process is a human, based on at least one of the identification information, the size of the object, and the intensity of the reflected wave received by the receiving antenna units RBR1, ..., RBRN. Accordingly, the surveillance radar device RD1 according to Modification 2 of Embodiment 1 can determine with higher accuracy whether a detected object is a person or an animal.
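 One way the third determination process could combine the three cues is a simple vote, sketched below. The disclosure only states that at least one of the cues is used; the specific thresholds and the majority-vote rule are illustrative assumptions.

```python
def third_determination(identification_info, size_m, reflection_intensity, votes_needed=2):
    """Sketch of the third determination process: evaluate up to three cues
    (vital signature, object size, normalized reflection intensity) and judge
    'human' when enough cues agree. All numeric thresholds are assumptions."""
    cues = []
    rr = identification_info.get("respiration_rate")
    if rr is not None:
        cues.append(12 <= rr <= 20)              # typical resting human respiration rate
    if size_m is not None:
        cues.append(1.0 <= size_m <= 2.2)        # plausible human height range, in meters
    if reflection_intensity is not None:
        cues.append(reflection_intensity >= 0.5)  # assumed normalized intensity threshold
    return sum(cues) >= votes_needed
```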
 Further, as described above, the processor 11 of the surveillance radar device RD1 according to Modification 1 of Embodiment 1 performs a fourth determination process of determining, based on the identification information, whether an object detected in the first detection area AR1 is a living thing, and generates a detection screen SC1 in which information indicating that an object determined to be a living thing in the fourth determination process is a living thing and position information (an example of first coordinate information) corresponding to that object are superimposed on a map indicating at least the first detection area AR1. Accordingly, the surveillance radar device RD1 according to Modification 1 of Embodiment 1 can detect an object (a living thing) using the short-range detection radar RB and, by outputting the position at which the object was detected, can support the monitoring work performed by administrators, security guards, and the like.
 Further, as described above, the processor 11 of the surveillance radar device RD1 according to Modification 1 of Embodiment 1 performs a fifth determination process of determining whether an object detected in the first detection area AR1 is moving, and, when determining that the identification information corresponding to an object determined not to be moving in the fifth determination process includes biological information indicating that the object is a living thing, determines that the object determined not to be moving in the fifth determination process is a living thing. Accordingly, the surveillance radar device RD1 according to Modification 1 of Embodiment 1 can determine whether a stationary object is a living thing.
 Further, as described above, the processor 11 of the surveillance radar device RD1 according to Modification 2 of Embodiment 1 performs a sixth determination process of determining whether an object determined to be a living thing in the fourth determination process is a human, based on at least one of the identification information, the size of the object, and the intensity of the reflected wave of the first radio wave received by the receiving antenna units RBR1, ..., RBRN. Accordingly, the surveillance radar device RD1 according to Modification 2 of Embodiment 1 can determine with higher accuracy whether a detected object is a person or an animal.
 Further, as described above, the processor 11 of the surveillance radar device RD1 according to Modification 2 of Embodiment 1 generates, as the notification information, first notification information for giving a first notification regarding an object determined to be a human by the radar ICs RB1, ..., RBN or the processor 11, and second notification information for giving a second notification, different from the first notification, regarding an object not determined to be a human by the radar ICs RB1, ..., RBN or the processor 11. Accordingly, the surveillance radar device RD1 according to Modification 2 of Embodiment 1 can notify administrators, security guards, and the like of whether a living thing is a human (person).
 Further, as described above, the processor 11 of the surveillance radar device RD1 according to Modification 2 of Embodiment 1, when determining that an object determined to be a human has entered the intrusion-prohibited area AR10 into which humans are prohibited from entering, gives, as the first notification, a notification that a human has entered the intrusion-prohibited area AR10. Accordingly, the surveillance radar device RD1 according to Modification 2 of Embodiment 1 can support monitoring work by notifying administrators, security guards, and the like of the detection of a person entering the preset intrusion-prohibited area AR10.
 Further, as described above, the processor 11 of the surveillance radar device RD1 according to Modification 3 of Embodiment 1 obtains, based on the vital information of an object determined to be a human by the radar ICs RB1, ..., RBN or the processor 11, an emotion index of the object determined to be a human, and gives, as the first notification, a notification of the emotion index of the object determined to be a human. Accordingly, the surveillance radar device RD1 according to Modification 3 of Embodiment 1 can support monitoring work by notifying administrators, security guards, and the like of the detection of a person for whom an excited or aggressive emotion index has been calculated.
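 The disclosure does not specify how the emotion index is computed from the vital information. A common arousal-style formulation — elevated heart rate and suppressed heart-rate variability both raising the index — is sketched below purely as an assumption; the weighting, the RMSSD scale, and the 0-to-1 clamping are all illustrative.

```python
def emotion_index(heart_rate_bpm, rmssd_ms, resting_hr=65.0):
    """Sketch of an arousal-style emotion index from radar vitals.
    Relative heart-rate elevation and low heart-rate variability (RMSSD)
    each contribute half of the index; the result is clamped to [0, 1]."""
    hr_term = max(0.0, (heart_rate_bpm - resting_hr) / resting_hr)  # relative HR elevation
    hrv_term = max(0.0, 1.0 - rmssd_ms / 50.0)                      # low RMSSD -> high arousal
    return min(1.0, 0.5 * hr_term + 0.5 * hrv_term)

def needs_attention(index, threshold=0.6):
    """Flag an excited/aggressive state when the index exceeds an assumed threshold."""
    return index >= threshold
```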
 Further, as described above, the processor 11 of the surveillance radar device RD1 according to Modification 3 of Embodiment 1, when an object that has been determined to be a human by the radar ICs RB1, ..., RBN or the processor 11 and whose emotion index is in a predetermined state enters the intrusion-prohibited area, gives, as the first notification, a notification that an object requiring attention has entered the intrusion-prohibited area. Accordingly, the surveillance radar device RD1 according to Modification 3 of Embodiment 1 can support monitoring work by notifying administrators, security guards, and the like of the intrusion into the preset intrusion-prohibited area AR10 of a person for whom an excited or aggressive emotion index has been calculated.
 Further, as described above, the processor 11 of the surveillance radar device RD1 according to Modification 2 of Embodiment 1, when the biological information of an object determined to be a human by the radar ICs RB1, ..., RBN or the processor 11 is similar to registered biological information registered in advance, gives, as the first notification, a notification that the object determined to be a human by the radar ICs RB1, ..., RBN or the processor 11 is a person registered in advance. Accordingly, the surveillance radar device RD1 according to Modification 2 of Embodiment 1 can determine whether a person located within the first detection area AR1 is a person who has been registered in advance in the database DB11 and is permitted to enter the first detection area AR1.
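 The "similarity" test against registered biological information is left open by the disclosure. One plausible sketch, comparing each vital-sign feature against the registered value within a relative tolerance, is shown below; the feature set (e.g. respiration rate, heart rate, heartbeat interval), the tolerance, and the database shape are assumptions for illustration.

```python
def match_registered(observed, database, tolerance=0.05):
    """Return the ID of the first registered person all of whose stored vital
    features lie within a relative tolerance of the observed features, or None
    if nobody matches. `observed` and each database entry are equal-length
    sequences of vital-sign features (an illustrative assumption)."""
    for person_id, registered in database.items():
        if all(abs(o - r) <= tolerance * abs(r) for o, r in zip(observed, registered)):
            return person_id
    return None
```

A per-feature tolerance is used rather than a cosine similarity because vectors of positive physiological values are almost always near-parallel, which would make a cosine threshold nearly useless for discrimination.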
 Further, as described above, the first radio wave and the second radio wave in the surveillance radar device RD1 according to Modifications 1 to 3 of Embodiment 1 are millimeter waves or microwaves. Accordingly, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can detect objects (persons, animals, and the like) in the detection area AR0 with higher accuracy by emitting radio waves in a wavelength band better suited to the acquisition of vital information.
 Further, as described above, the detection results acquired by the radar ICs RB1, ..., RBN of the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 include at least one of the distance from the installation position PS0 of the surveillance radar device RD1 to an object detected in the first detection area AR1, the azimuth from the installation position PS0 toward that object, and the height of that object. The detection results acquired by the radar ICs RA1, ..., RAM include at least one of the distance from the installation position PS0 to an object detected in the second detection area AR2, the azimuth from the installation position PS0 toward that object, and the height of that object. Accordingly, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can determine the type of an object (for example, person, animal, vehicle, or the like) based on the position of the object derived from its distance and azimuth and on the height of the object.
 Further, as described above, the radar ICs RB1, ..., RBN of the surveillance radar device RD1 according to Modifications 1 to 3 of Embodiment 1 obtain the moving speed of an object detected in the first detection area AR1, and the detection results acquired by the radar ICs RB1, ..., RBN further include that moving speed. Likewise, the radar ICs RA1, ..., RAM obtain the moving speed of an object detected in the second detection area AR2, and the detection results acquired by the radar ICs RA1, ..., RAM further include that moving speed. Accordingly, the surveillance radar device RD1 according to Modifications 1 to 3 of Embodiment 1 can determine the type of an object (for example, person, animal, vehicle, or the like) with higher accuracy based on the calculated moving speed of the object.
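 How the moving speed refines the object type is not specified in the disclosure; a rule-based sketch with invented speed bands is shown below purely as an illustration of the idea.

```python
def classify_by_speed(speed_mps, is_living):
    """Sketch of speed-based type refinement. All speed bands are illustrative
    assumptions: non-living movers are treated as vehicles, slow living things
    as persons, and implausibly fast living things as animals."""
    if not is_living:
        return "vehicle" if speed_mps > 0.5 else "stationary object"
    if speed_mps <= 3.0:
        return "person"             # walking/jogging range
    if speed_mps <= 12.0:
        return "person or animal"   # sprinting humans overlap fast animals
    return "animal"
```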
 Further, as described above, the map of the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 is two-dimensional or three-dimensional map data. Accordingly, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can support the monitoring work of an administrator through the detection screen SC1 in which detection information is superimposed on a two-dimensional or three-dimensional map.
 Further, as described above, in the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1, the identification information includes at least one of a respiration rate, a heart rate, a blood pressure, a respiration interval, and a heartbeat interval. Accordingly, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can determine the type of a detected object (for example, person, animal, or the like) with higher accuracy based on the identification information.
 Further, as described above, in the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1, the detection result regarding an object detected in the second detection area AR2 includes the identification information (for example, respiration rate and heart rate). Accordingly, the surveillance radar device RD1 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can determine the type of a detected object (for example, living thing, stationary object, or the like) with higher accuracy based on the identification information.
 As described above, the detection system 100 (an example of a monitoring system) according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 includes: transmitting a first radio wave in the first detection area AR1 (an example of a first range); receiving a reflected wave of the first radio wave; detecting the presence or absence of an object in the first detection area AR1 based on the reflected wave of the first radio wave; acquiring first information that is information about an object detected in the first detection area AR1 and includes identification information capable of determining whether the object is a living thing; transmitting a second radio wave in the second detection area AR2 (an example of a second range) wider than the first detection area AR1; receiving a reflected wave of the second radio wave; detecting the presence or absence of an object in the second detection area AR2 based on the reflected wave of the second radio wave; and acquiring second information about an object detected in the second detection area AR2.
 Accordingly, the detection system 100 according to Embodiment 1 and Modifications 1 to 3 of Embodiment 1 can, by using separate radars for the long range (that is, the second detection area AR2) and the short range (the detection area AR0), simultaneously perform the detection of objects located at a long distance and the detection and vital sensing of objects located at a short distance. Likewise, by using separate radars for the long range and the short range, a single surveillance radar device RD1 according to Embodiment 1 can monitor a wide area while determining at short range whether an object is a living thing.
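 The dual-range monitoring cycle summarized above can be sketched as follows. The `detect()` interface, the dictionary shapes, and the rule that any identification information present implies a living thing are assumptions made only to make the flow concrete.

```python
def monitoring_cycle(short_range_radar, long_range_radar):
    """One cycle of the dual-radar monitoring method (sketch): the wide second
    range is scanned for object presence, while the narrower first range
    additionally yields identification information used for the living /
    non-living judgment. Each radar is assumed to expose a detect() method
    returning a list of detection dicts."""
    results = []
    for obj in long_range_radar.detect():       # second range: wide-area detection
        results.append({"range": "second", **obj})
    for obj in short_range_radar.detect():      # first range: detection + vital sensing
        obj["is_living"] = bool(obj.get("identification_info"))
        results.append({"range": "first", **obj})
    return results
```

Running both loops in one cycle mirrors the claimed benefit: long-distance presence detection and short-distance vital sensing proceed simultaneously within a single device.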
 Various embodiments have been described above with reference to the drawings, but it goes without saying that the present disclosure is not limited to these examples. It is evident that a person skilled in the art could conceive of various changes, corrections, substitutions, additions, deletions, and equivalents within the scope described in the claims, and it is understood that these naturally belong to the technical scope of the present disclosure. The constituent elements of the various embodiments described above may also be combined arbitrarily without departing from the spirit of the invention.
 This application is based on a Japanese patent application filed on February 28, 2022 (Japanese Patent Application No. 2022-029909), the content of which is incorporated herein by reference.
 The present disclosure is useful as a monitoring device, a monitoring system, and a monitoring method that improve the accuracy with which living things are detected by radar.
10, 20 Communication unit
11, 21 Processor
12, 22 Memory
13 AI processing unit
14 AI arithmetic processing unit
15 Learning model memory
100 Detection system
AR0 Detection area
AR1 First detection area
AR2 Second detection area
C1 Camera
DB11, DB2 Database
DR1 Security drone
RA1, RAM, RB1, RBN Radar IC
RAT1, RATM, RBT1, RBTN Transmitting antenna unit
RAR1, RARM, RBR1, RBRN Receiving antenna unit
RD1 Surveillance radar device
MN Monitor
NW Network
S1 Server
TP1 Security guard terminal

Claims (22)

  1.  A monitoring device comprising:
      a first transmission/reception unit that transmits a first radio wave in a first range and receives a reflected wave of the first radio wave;
      a first detection processing unit that detects the presence or absence of an object in the first range based on the reflected wave of the first radio wave received by the first transmission/reception unit, and acquires first information about an object detected in the first range;
      a second transmission/reception unit that transmits a second radio wave in a second range wider than the first range and receives a reflected wave of the second radio wave; and
      a second detection processing unit that detects the presence or absence of an object in the second range based on the reflected wave of the second radio wave received by the second transmission/reception unit, and acquires second information about an object detected in the second range,
      wherein the first information includes identification information capable of determining whether an object is a living thing.
  2.  The monitoring device according to claim 1, wherein
      the first information further includes first coordinate information indicating the coordinates of an object detected in the first range, and
      the second information includes second coordinate information indicating the coordinates of an object detected in the second range.
  3.  The monitoring device according to claim 2, further comprising
      a processor that generates and outputs, based on the first information and the second information, notification information about an object in the first range and an object in the second range.
  4.  The monitoring device according to claim 3, wherein
      the first detection processing unit performs a first determination process of determining, based on the identification information, whether an object detected in the first range is a living thing, and
      the processor generates superimposed map information in which information indicating that an object determined to be a living thing by the first determination process is a living thing and the first coordinate information corresponding to the object determined to be a living thing by the first determination process are superimposed on map information indicating at least the first range.
  5.  The monitoring device according to claim 4, wherein
      the first detection processing unit performs a second determination process of determining whether an object detected in the first range is moving, and, when the first information corresponding to an object determined not to be moving in the second determination process includes biological information indicating that the object is a living thing, determines that the object determined not to be moving in the second determination process is a living thing.
  6.  The monitoring device according to claim 4 or 5, wherein
      the first detection processing unit performs a third determination process of determining, based on at least one of the identification information, the size of the object, and the intensity of the reflected wave of the first radio wave received by the first transmission/reception unit, whether an object determined to be a living thing in the first determination process is a human.
  7.  The monitoring device according to claim 3, wherein
      the processor performs a fourth determination process of determining, based on the identification information, whether the object detected in the first range is a living thing, and generates superimposed map information in which information indicating that an object determined to be a living thing in the fourth determination process is a living thing and the first coordinate information corresponding to the object determined to be the living thing in the fourth determination process are superimposed on map information indicating at least the first range.
  8.  The monitoring device according to claim 7, wherein
      the processor performs a fifth determination process of determining whether an object detected in the first range is moving, and, when determining that the identification information corresponding to an object determined not to be moving in the fifth determination process includes biological information indicating that the object is a living thing, determines that the object determined not to be moving in the fifth determination process is the living thing.
  9.  The monitoring device according to claim 7 or 8, wherein
      the processor performs a sixth determination process of determining, based on at least one of the identification information, the size of the object, and the intensity of the reflected wave of the first radio wave received by the first transmission/reception unit, whether the object determined to be the living thing in the fourth determination process is a human.
  10.  The monitoring device according to claim 6 or 9, wherein
      the processor generates, as the notification information, first notification information for giving a first notification regarding the object determined to be a human by the first detection processing unit or the processor, and second notification information for giving a second notification, different from the first notification, regarding an object not determined to be a human by the first detection processing unit or the processor.
  11.  The monitoring device according to claim 10, wherein
      the processor, when determining that an object determined to be a human has entered an intrusion-prohibited area into which humans are prohibited from entering, gives, as the first notification, a notification that a human has entered the intrusion-prohibited area.
  12.  The monitoring device according to claim 10 or 11, wherein
      the processor obtains, based on biological information of an object determined to be a human by the first detection processing unit or the processor, an emotion index of the object determined to be a human, and gives, as the first notification, a notification of the emotion index of the object determined to be a human.
  13.  The monitoring device according to claim 12, wherein
      the processor, when an object that has been determined to be the human by the first detection processing unit or the processor and whose emotion index is in a predetermined state enters an intrusion-prohibited area, gives, as the first notification, a notification that an object requiring attention has entered the intrusion-prohibited area.
  14.  The monitoring device according to any one of claims 10 to 13, wherein
      the processor, when biological information of an object determined to be the human by the first detection processing unit or the processor is similar to registered biological information registered in advance, gives, as the first notification, a notification that the object determined to be the human by the first detection processing unit or the processor is a person registered in advance.
  15.  The monitoring device according to any one of claims 4 to 14, wherein
      the map information is two-dimensional or three-dimensional map data.
  16.  The monitoring device according to any one of claims 1 to 15, wherein
      the first radio wave and the second radio wave are millimeter waves or microwaves.
  17.  The monitoring device according to any one of claims 1 to 16, wherein
      the first information includes at least one of the distance from the installation position of the monitoring device to an object detected in the first range, the azimuth from the installation position of the monitoring device toward the object detected in the first range, and the height of the object detected in the first range, and
      the second information includes at least one of the distance from the installation position of the monitoring device to an object detected in the second range, the azimuth from the installation position of the monitoring device toward the object detected in the second range, and the height of the object detected in the second range.
  18.  The monitoring device according to any one of claims 1 to 17, wherein
      the first detection processing unit obtains the moving speed of the object detected in the first range,
      the first information further includes the moving speed of the object detected in the first range,
      the second detection processing unit obtains the moving speed of the object detected in the second range, and
      the second information further includes the moving speed of the object detected in the second range.
  19.  The monitoring device according to any one of claims 1 to 18, wherein
      the identification information includes at least one of a respiration rate, a heart rate, a blood pressure, a respiration interval, and a heartbeat interval.
  20.  The monitoring device according to any one of claims 1 to 19, wherein
      the second information includes the identification information.
  21.  A monitoring system comprising:
      the monitoring device according to any one of claims 1 to 20; and
      an information processing device communicably connected to the monitoring device.
  22.  A monitoring method comprising:
      transmitting a first radio wave in a first range;
      receiving a reflected wave of the first radio wave;
      detecting the presence or absence of an object in the first range based on the reflected wave of the first radio wave;
      acquiring first information that is information about an object detected in the first range and includes identification information capable of determining whether the object is a living thing;
      transmitting a second radio wave in a second range wider than the first range;
      receiving a reflected wave of the second radio wave;
      detecting the presence or absence of an object in the second range based on the reflected wave of the second radio wave; and
      acquiring second information about an object detected in the second range.
PCT/JP2023/004584 2022-02-28 2023-02-10 Monitoring device, monitoring system, and monitoring method WO2023162723A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022029909 2022-02-28
JP2022-029909 2022-02-28

Publications (1)

Publication Number Publication Date
WO2023162723A1 true WO2023162723A1 (en) 2023-08-31

Family

ID=87765798

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/004584 WO2023162723A1 (en) 2022-02-28 2023-02-10 Monitoring device, monitoring system, and monitoring method

Country Status (1)

Country Link
WO (1) WO2023162723A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006109771A1 (en) * 2005-04-11 2006-10-19 Optex Co., Ltd. Intrusion sensor
JP2007178140A (en) * 2005-12-27 2007-07-12 Hitachi Ltd Object detection sensor
CN110779150A (en) * 2019-11-14 2020-02-11 宁波奥克斯电气股份有限公司 Millimeter wave-based air conditioner control method and device and air conditioner
JP2020024185A (en) * 2018-06-22 2020-02-13 旭化成エレクトロニクス株式会社 Sensor device and system, and living body sensing method and system

Similar Documents

Publication Publication Date Title
US11164329B2 (en) Multi-channel spatial positioning system
US20210117585A1 (en) Method and apparatus for interacting with a tag in a cold storage area
US8884813B2 (en) Surveillance of stress conditions of persons using micro-impulse radar
US11514207B2 (en) Tracking safety conditions of an area
CN101221621B (en) Method and system for warning a monitored user about adverse behaviors
Lin et al. WiAU: An accurate device-free authentication system with ResNet
US11727518B2 (en) Systems and methods for location fencing within a controlled environment
US11615620B2 (en) Systems and methods of enforcing distancing rules
JPWO2007138811A1 (en) Suspicious behavior detection apparatus and method, program, and recording medium
AU2017376121A1 (en) Drone pre-surveillance
US11625510B2 (en) Method and apparatus for presentation of digital content
US11450197B2 (en) Apparatus and method of controlling a security system
JP2008204219A (en) Crime prevention system, suspicious person detection device using the same and crime prevention server
CN105938648A (en) Household safety integrated management system and method
US20110260859A1 (en) Indoor and outdoor security system and method of use
EP3910539A1 (en) Systems and methods of identifying persons-of-interest
Nadeem et al. A smart city application design for efficiently tracking missing person in large gatherings in Madinah using emerging IoT technologies
WO2023162723A1 (en) Monitoring device, monitoring system, and monitoring method
Johnson Jr et al. Social-distancing monitoring using portable electronic devices
CN209312052U (en) A kind of carry-on police system of recognition of face
CN111914050A (en) Visual 3D monitoring platform based on specific places
CN108022411B (en) Monitoring system based on image procossing
CN115953815A (en) Monitoring method and device for infrastructure site
JP2019213116A (en) Image processing device, image processing method, and program
CN109327681A (en) A kind of specific people identifies warning system and its method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23759727

Country of ref document: EP

Kind code of ref document: A1