WO2018020965A1 - Unmanned air vehicle detection system and unmanned air vehicle detection method - Google Patents

Unmanned air vehicle detection system and unmanned air vehicle detection method

Info

Publication number
WO2018020965A1
Authority
WO
WIPO (PCT)
Prior art keywords
air vehicle
unmanned air
detection
camera
sound
Prior art date
Application number
PCT/JP2017/024476
Other languages
English (en)
Japanese (ja)
Inventor
宏之 松本
信太郎 吉國
良一 湯下
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2017107170A (published as JP2018026792A)
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Priority to EP17833974.3A (published as EP3493554A4)
Priority to US16/316,940 (published as US20190228667A1)
Publication of WO2018020965A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0073 Surveillance aids
    • G08G5/0082 Surveillance aids for monitoring traffic from a ground station
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0017 Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G5/0026 Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located on the ground
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047 Navigation or guidance aids for a single aircraft
    • G08G5/0069 Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/188 Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00 Details of transducers, loudspeakers or microphones
    • H04R1/02 Casings; Cabinets; Supports therefor; Mountings therein
    • H04R1/028 Casings; Cabinets; Supports therefor; Mountings therein associated with devices performing functions other than acoustics, e.g. electric candles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R3/00 Circuits for transducers, loudspeakers or microphones
    • H04R3/005 Circuits for transducers, loudspeakers or microphones for combining the signals of two or more microphones
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2499/00 Aspects covered by H04R or H04S not otherwise provided for in their subgroups
    • H04R2499/10 General applications
    • H04R2499/13 Acoustic transducers and sound field adaptation in vehicles

Definitions

  • The present disclosure relates to an unmanned air vehicle detection system and an unmanned air vehicle detection method for detecting an unmanned air vehicle (UAV).
  • A flying-object monitoring device is known that can detect the presence of a flying object and its flight direction using a plurality of sound detection units, each of which detects the sound generated in the monitoring region for one direction (for example, Patent Document 1).
  • When the processing device of this flying-object monitoring apparatus detects a flying object and its flight direction by sound detection using a microphone, it points the monitoring camera in the direction of the flying object. The processing device then displays the video captured by the monitoring camera on the display device.
  • The present disclosure aims to make it easy to determine, from an image captured by a camera, the presence and position of an unmanned air vehicle that the user intends to detect.
  • An unmanned air vehicle detection system of the present disclosure includes a camera that captures an imaging area, a microphone array that collects sound of the imaging area, a display unit that displays a captured image of the imaging area captured by the camera, and a signal processing unit that detects an unmanned air vehicle appearing in the imaging area using the sound collected by the microphone array. The signal processing unit converts the detected unmanned air vehicle into first identification information, which is visual information in the captured image of the imaging area, and superimposes this first identification information on the captured image of the imaging area displayed on the display unit.
  • An unmanned air vehicle detection method of the present disclosure is a method for an unmanned air vehicle detection system: an imaging area is captured with a camera, sound of the imaging area is collected with a microphone array, an unmanned air vehicle appearing in the imaging area is detected using the sound collected by the microphone array, first identification information is generated by converting the unmanned air vehicle into visual information in the captured image of the imaging area, and the first identification information is superimposed on the captured image of the imaging area and displayed on the display unit.
  • FIG. 1 is a diagram illustrating an example of a schematic configuration of the unmanned air vehicle detection system according to the first embodiment.
  • FIG. 2 is a diagram illustrating an example of the appearance of the sound source detection unit.
  • FIG. 3 is a block diagram showing in detail an example of the internal configuration of the microphone array.
  • FIG. 4 is a block diagram showing in detail an example of the internal configuration of the omnidirectional camera.
  • FIG. 5 is a block diagram showing in detail an example of the internal configuration of the PTZ camera.
  • FIG. 6 is a block diagram illustrating an example of the internal configuration of the monitoring device in detail.
  • FIG. 7 is a timing chart showing an example of the pattern of the detection sound signal of the unmanned air vehicle registered in the memory.
  • FIG. 8 is a timing chart showing an example of the frequency change of the detected sound signal obtained as a result of the frequency analysis processing.
  • FIG. 9 is a sequence diagram illustrating an example of an unmanned air vehicle detection operation in the unmanned air vehicle detection system according to the first embodiment.
  • FIG. 10 is a flowchart showing an example of details of the unmanned air vehicle detection determination procedure in step T15 of FIG.
  • FIG. 11 is a diagram illustrating an example of a state in which the pointing direction is sequentially scanned in the monitoring area and the unmanned air vehicle is detected.
  • FIG. 12 is a diagram illustrating an example of a monitor display screen when no unmanned flying object is detected.
  • FIG. 13 is a diagram illustrating an example of a monitor display screen when an unmanned air vehicle is detected.
  • FIG. 14 is a diagram showing an example of a monitor display screen when an unmanned air vehicle is detected and the PTZ camera changes the optical axis direction in conjunction with the detection.
  • FIG. 15 is a diagram illustrating another example of a monitor display screen when an unmanned air vehicle is detected and the PTZ camera changes the optical axis direction in conjunction with the detection.
  • FIG. 16 is a diagram illustrating a schematic example of the configuration of the unmanned air vehicle detection system according to the first modification of the first embodiment.
  • FIG. 17 is a diagram illustrating a schematic example of the configuration of the unmanned air vehicle detection system according to the second modification of the first embodiment.
  • FIG. 18 is a diagram illustrating a schematic example of the configuration of the unmanned air vehicle detection system according to the third modification of the first embodiment.
  • FIG. 19 is a sequence diagram illustrating an example of an unmanned air vehicle detection operation of the unmanned air vehicle detection system according to the third modification of the first embodiment.
  • FIG. 20 is a diagram illustrating a schematic example of the configuration of the unmanned air vehicle detection system according to the fourth modification of the first embodiment.
  • FIG. 21 is a diagram illustrating a schematic example of a configuration of an unmanned air vehicle detection system according to Modification 5 of the first embodiment.
  • FIG. 22 is a diagram illustrating an example of a schematic configuration of an unmanned air vehicle detection system according to the second embodiment.
  • FIG. 23 is a block diagram showing in detail an example of the internal configuration of the distance calculation apparatus.
  • FIG. 24 is a flowchart illustrating an example of a detection operation procedure in the detection device.
  • FIG. 25 is a flowchart illustrating an example of a distance calculation procedure in the distance calculation apparatus.
  • FIG. 26 is an explanatory diagram of an example of a method for obtaining the distance from the sound source detection unit to the unmanned aerial vehicle using the technique of the triangulation method.
  • FIG. 27 is an explanatory diagram of a calculation example of the position of the unmanned air vehicle.
  • FIG. 28 is a diagram illustrating an example of a UI screen including the distance from the sound source detection unit to the unmanned air vehicle.
  • FIG. 29 is a flowchart showing an example of a section estimation procedure in a monitoring area where an unmanned air vehicle exists in the third embodiment.
  • FIG. 30 is a diagram illustrating a screen display example of a display on which a water purification plant including a block in which an unmanned air vehicle exists in the sky is displayed.
  • FIG. 31A is an explanatory diagram of correct / incorrect determination of unmanned air vehicle detection by two sound source detection units in the fourth embodiment.
  • FIG. 31B is an explanatory diagram of correct / incorrect determination of unmanned air vehicle detection by two sound source detection units in the fourth embodiment.
  • FIG. 31C is an explanatory diagram of correct / incorrect determination of unmanned air vehicle detection by two sound source detection units in the fourth embodiment.
  • FIG. 32 is a flowchart illustrating an example of a determination procedure for detection of an unmanned air vehicle by two sound source detection units.
  • FIG. 1 is a diagram illustrating an example of a schematic configuration of an unmanned air vehicle detection system 5 according to the first embodiment.
  • The unmanned air vehicle detection system 5 detects, as a detection target, an unmanned air vehicle dn (see, for example, FIG. 14) that the user intends to find.
  • The unmanned air vehicle dn is, for example, a multicopter such as a UAV (for example, a drone) that flies autonomously using a GPS (Global Positioning System) function, or a radio-controlled helicopter operated wirelessly by a third party.
  • Such an unmanned air vehicle dn is used, for example, for aerial photography of a target, spraying of a liquid such as an agrochemical, or transportation of goods.
  • In the present embodiment, a multicopter-type drone equipped with a plurality of rotors is taken as the example of the unmanned air vehicle dn.
  • In a multicopter-type drone, generally, when a rotor has two blades, sound is generated at twice a specific frequency together with harmonics at integer multiples of that frequency. Similarly, with three blades, sound is generated at three times the specific frequency and at its harmonics, and the same applies to four or more blades (see the sketch below).
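
As a rough numerical illustration of this harmonic structure, the following sketch lists the tones one might expect for a given rotor speed; the rotation rate and blade counts are hypothetical example values, not figures from this disclosure.

```python
# Sketch: expected harmonic series of a multicopter rotor tone.
# The rotation rate and blade counts below are hypothetical values.

def rotor_harmonics(rotation_hz: float, n_blades: int, n_harmonics: int = 4) -> list[float]:
    """First few harmonics of the blade-pass frequency.

    With n_blades blades, the blade-pass frequency is n_blades * rotation_hz;
    harmonics then appear at integer multiples of it.
    """
    blade_pass = n_blades * rotation_hz
    return [blade_pass * k for k in range(1, n_harmonics + 1)]

if __name__ == "__main__":
    print(rotor_harmonics(100.0, n_blades=2))  # [200.0, 400.0, 600.0, 800.0]
    print(rotor_harmonics(100.0, n_blades=3))  # [300.0, 600.0, 900.0, 1200.0]
```
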
  • The unmanned aerial vehicle detection system 5 includes a plurality of sound source detection units UD, a monitoring device 10, and a monitor 50.
  • The plurality of sound source detection units UD are connected to the monitoring apparatus 10 via the network NW.
  • Each sound source detection unit UD includes a microphone array MA, an omnidirectional camera CA, and a PTZ camera CZ.
  • In the following, the individual sound source detection units are referred to collectively as the sound source detection unit UD unless they particularly need to be distinguished; likewise, the individual microphone arrays, omnidirectional cameras, and PTZ cameras are referred to as the microphone array MA, the omnidirectional camera CA, and the PTZ camera CZ unless they particularly need to be distinguished.
  • The microphone array MA non-directionally collects sound from all directions in the sound collection area where it is installed.
  • The microphone array MA has a housing 15 (see FIG. 2) in which a cylindrical opening of a predetermined width is formed at the center.
  • The sounds to be picked up by the microphone array MA include, for example, mechanical operating sounds such as those of a drone, sounds emitted by humans, and other sounds; they are not limited to the audible frequency range (that is, 20 Hz to 23 kHz) and may also include low-frequency sound below the audible range or ultrasonic sound above it.
  • The microphone array MA includes a plurality of omnidirectional microphones M1 to Mn (see FIG. 3).
  • The microphones M1 to Mn are arranged concentrically at predetermined intervals (for example, uniform intervals) along the circumferential direction around the opening provided in the housing 15. For example, electret condenser microphones (ECMs) are used as the microphones M1 to Mn.
  • The microphone array MA transmits the sound data (see below) obtained by the microphones M1 to Mn to the monitoring device 10 via the network NW.
  • The arrangement of the microphones M1 to Mn described above is an example, and other arrangements may be used.
  • The analog signals output from the amplifiers are converted into digital signals by the A/D converters A1 to An (see FIG. 3), described later.
  • The number of microphones in the microphone array MA is not limited to 32 and may be another number (for example, 16, 64, or 128). A single-ring layout of this kind is sketched below.
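
As a minimal sketch of such a concentric, evenly spaced layout, the following computes (x, y) positions for one ring of microphones; the ring radius is an assumed illustration value.

```python
# Sketch: evenly spaced microphone positions on one ring around the
# central opening. The radius is an assumed value for illustration.
import math

def ring_positions(n_mics: int, radius_m: float) -> list[tuple[float, float]]:
    """(x, y) coordinates of n_mics microphones spaced uniformly on a circle."""
    return [
        (radius_m * math.cos(2 * math.pi * i / n_mics),
         radius_m * math.sin(2 * math.pi * i / n_mics))
        for i in range(n_mics)
    ]

if __name__ == "__main__":
    for x, y in ring_positions(n_mics=32, radius_m=0.15):
        print(f"x={x:+.3f} m, y={y:+.3f} m")
```
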
  • An omnidirectional camera CA sized to substantially fill the opening is accommodated inside the opening formed at the center of the housing 15 (see FIG. 2) of the microphone array MA. That is, the microphone array MA and the omnidirectional camera CA are arranged integrally (see FIG. 2).
  • The omnidirectional camera CA is a camera equipped with a fisheye lens capable of capturing an omnidirectional image of the imaging area, which is the sound collection space.
  • In the present embodiment, the sound collection area and the imaging area are both described as a common monitoring area, but their spatial sizes (for example, volumes) need not be the same; the volume of the sound collection area may be larger or smaller than that of the imaging area.
  • The omnidirectional camera CA functions as a monitoring camera capable of imaging the imaging area where the sound source detection unit UD is installed. That is, the omnidirectional camera CA has an angle of view of, for example, 180° vertically and 360° horizontally, and images the hemispherical monitoring area 8 (see FIG. 11) as its imaging area.
  • In each sound source detection unit UD, the omnidirectional camera CA and the microphone array MA are arranged coaxially by fitting the omnidirectional camera CA inside the opening of the housing 15.
  • Because the optical axis of the omnidirectional camera CA coincides with the central axis of the housing of the microphone array MA, the imaging area and the sound collection area are substantially identical around that axis (that is, in the horizontal direction), and the position of a subject in the image and the position of a sound source to be collected can be expressed in the same coordinate system (for example, coordinates given as (horizontal angle, vertical angle)).
  • Each sound source detection unit UD is attached so that, for example, the vertically upward direction becomes the sound collection surface and the imaging surface, in order to detect an unmanned air vehicle dn flying in from above (see FIG. 2).
  • Based on a user operation, the monitoring device 10 forms directivity (that is, performs beam forming) with an arbitrary direction as the main beam direction for the omnidirectional sound collected by the microphone array MA, so that the sound from that direction can be emphasized.
  • The directivity control processing that beam-forms the sound data collected by the microphone array MA is a known technique, as shown, for example, in Patent Documents 2 and 3.
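
Although the disclosure defers to prior art for the details, the core idea can be illustrated with a plain delay-and-sum beamformer. This is a minimal sketch under a far-field assumption; the array geometry, sample rate, and steering angle are assumptions for illustration, not parameters from the disclosure.

```python
# Sketch: delay-and-sum beamforming toward an azimuth, assuming a planar
# (far-field) wavefront and in-plane microphone positions.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def delay_and_sum(signals: np.ndarray, mic_xy: np.ndarray,
                  azimuth_rad: float, fs: int) -> np.ndarray:
    """Emphasize sound arriving from azimuth_rad.

    signals: (n_mics, n_samples) synchronized recordings
    mic_xy:  (n_mics, 2) microphone coordinates in metres
    """
    look = np.array([np.cos(azimuth_rad), np.sin(azimuth_rad)])
    # Far-field delay of each microphone: projection onto the look direction.
    delays_s = mic_xy @ look / SPEED_OF_SOUND
    shifts = np.round((delays_s - delays_s.min()) * fs).astype(int)
    n_keep = signals.shape[1] - int(shifts.max())
    aligned = np.stack([s[d:d + n_keep] for s, d in zip(signals, shifts)])
    return aligned.mean(axis=0)  # coherent sum boosts the look direction
```
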
  • The monitoring device 10 generates an omnidirectional image by processing the image captured by the omnidirectional camera CA (hereinafter sometimes abbreviated as "captured image"). Note that the omnidirectional image may be generated by the omnidirectional camera CA instead of the monitoring device 10.
  • The monitoring device 10 outputs to the monitor 50, and displays, various images (see FIG. 15) based on the sound pressure values calculated from the sound collected by the microphone array MA and on the captured images from the omnidirectional camera CA.
  • When the unmanned air vehicle dn is detected, the monitoring device 10 displays the omnidirectional image GZ1 together with the identification mark mk (see FIG. 13), which represents the detected unmanned air vehicle dn as visual information in the omnidirectional image GZ1.
  • The monitoring device 10 is configured using, for example, a PC (Personal Computer) or a server.
  • Here, visual information means information rendered in the omnidirectional image GZ1 clearly enough that, when the user views the image, it can be unmistakably distinguished from the other subjects.
  • The monitor 50 displays the omnidirectional image GZ1 captured by the omnidirectional camera CA.
  • The monitor 50 also generates and displays a composite image in which the identification mark mk is superimposed on the omnidirectional image GZ1.
  • The monitor 50 may be configured as an apparatus integrated with the monitoring device 10.
  • The plurality of sound source detection units UD and the monitoring device 10 each have a communication interface and are connected to each other via the network NW so that data communication is possible.
  • The network NW may be a wired network (for example, an intranet, the Internet, or a wired LAN (Local Area Network)) or a wireless network (for example, a wireless LAN).
  • The sound source detection units UD and the monitoring device 10 are thus connected via the network NW.
  • The monitoring device 10 and the monitor 50 are installed in a monitoring room RM in which a user such as an observer is resident.
  • FIG. 2 is a diagram showing the appearance of the sound source detection unit UD.
  • The sound source detection unit UD includes a support base 70 that mechanically supports the microphone array MA, the omnidirectional camera CA, and the PTZ camera CZ described above.
  • The support base 70 is a combined structure of a tripod 71, two rails 72 fixed to the top plate 71a of the tripod 71, and a first mounting plate 73 and a second mounting plate 74 attached to the two ends of the two rails 72.
  • The first mounting plate 73 and the second mounting plate 74 are mounted so as to straddle the two rails 72 and lie substantially in the same plane. They are movable along the two rails 72 and can be adjusted and fixed at positions farther from or closer to each other.
  • The first mounting plate 73 is a disk-shaped plate.
  • An opening 73a is formed at the center of the first mounting plate 73.
  • The housing 15 of the microphone array MA is accommodated and fixed in the opening 73a.
  • The second mounting plate 74 is a substantially rectangular plate.
  • An opening 74a is formed in a portion near the outer edge of the second mounting plate 74.
  • The PTZ camera CZ is accommodated and fixed in the opening 74a.
  • In the initial installation state, the optical axis L1 of the omnidirectional camera CA housed in the housing 15 of the microphone array MA and the optical axis L2 of the PTZ camera CZ attached to the second mounting plate 74 are set parallel to each other.
  • The tripod 71 is supported on the ground surface by three legs 71b; by manual operation, the position of the top plate 71a can be moved vertically with respect to the ground surface, and the orientation of the top plate 71a can be adjusted in the pan and tilt directions.
  • This allows the sound collection area of the microphone array MA (in other words, the imaging area of the omnidirectional camera CA) to be set in an arbitrary direction.
  • FIG. 3 is a block diagram showing an example of the internal configuration of the microphone array MA in detail.
  • The microphone array MA shown in FIG. 3 includes the plurality of microphones M1 to Mn, amplifiers PA1 to PAn that amplify the output signals of the respective microphones, A/D converters A1 to An that each convert the analog signal output from an amplifier into a digital signal, a compression processing unit 25, and a transmission unit 26.
  • The compression processing unit 25 generates packets of audio data based on the digital audio signals output from the A/D converters A1 to An.
  • The transmission unit 26 transmits the audio data packets generated by the compression processing unit 25 to the monitoring apparatus 10 via the network NW.
  • In other words, the microphone array MA amplifies the output signals of the microphones M1 to Mn with the amplifiers PA1 to PAn, converts them into digital audio signals with the A/D converters A1 to An, packetizes the audio data in the compression processing unit 25, and transmits the packets to the monitoring device 10 via the network NW.
  • FIG. 4 is a block diagram showing in detail an example of the internal configuration of the omnidirectional camera CA.
  • The omnidirectional camera CA shown in FIG. 4 includes a CPU 41, a communication unit 42, a power management unit 44, an image sensor 45, a memory 46, and a network connector 47.
  • The fisheye lens provided in front of the image sensor 45 (that is, on the right side in FIG. 4) is omitted from the illustration.
  • The CPU 41 performs signal processing for controlling the operation of each unit of the omnidirectional camera CA, data input/output processing with the other units, data calculation processing, and data storage processing.
  • Instead of the CPU 41, a processor such as an MPU (Micro Processing Unit) or a DSP (Digital Signal Processor) may be provided.
  • The CPU 41 generates cut-out image data by cutting out an image of a specific range (direction) from the omnidirectional image data in accordance with a designation by the user operating the monitoring device 10, and stores the cut-out image data in the memory 46.
  • The image sensor 45 is configured using, for example, a CMOS (complementary metal-oxide-semiconductor) sensor or a CCD (charge-coupled device) sensor, and acquires omnidirectional image data by imaging, on its light-receiving surface, the optical image of the reflected light from the imaging area collected by the fisheye lens (not shown).
  • The memory 46 includes a ROM 46z that stores a program defining the operation of the omnidirectional camera CA and setting-value data, a working memory that stores omnidirectional image data, cut-out image data of a partial range, and work data, and a memory card 46x that is detachably connected to the omnidirectional camera CA and stores various data.
  • The communication unit 42 is a network interface (I/F) that controls data communication over the network NW connected via the network connector 47.
  • The power management unit 44 supplies DC power to each unit of the omnidirectional camera CA, and may also supply DC power to devices connected to the network NW via the network connector 47.
  • The network connector 47 is a connector that can transmit omnidirectional image data or two-dimensional panoramic image data to the monitoring apparatus 10 via the network NW, and that can be supplied with power via a network cable.
  • FIG. 5 is a block diagram showing in detail an example of the internal configuration of the PTZ camera CZ.
  • The PTZ camera CZ is a camera whose optical axis direction (sometimes called the imaging direction) can be adjusted in accordance with a view-angle change instruction from the monitoring device 10.
  • Like the omnidirectional camera CA, the PTZ camera CZ includes a CPU 51, a communication unit 52, a power management unit 54, an image sensor 55, a memory 56, and a network connector 57, and additionally includes an imaging direction control unit 58 and a lens drive motor 59.
  • When there is a view-angle change instruction from the monitoring device 10, the CPU 51 notifies the imaging direction control unit 58 of the instruction.
  • The imaging direction control unit 58 controls the imaging direction of the PTZ camera CZ in at least one of the pan and tilt directions in accordance with the view-angle change instruction notified by the CPU 51, and, as necessary, outputs to the lens drive motor 59 a control signal for changing the zoom magnification.
  • The lens drive motor 59 drives the imaging lens according to this control signal, changing the imaging direction (the direction of the optical axis L2) and adjusting the focal length of the imaging lens to change the zoom magnification.
  • FIG. 6 is a block diagram showing in detail an example of the internal configuration of the monitoring device 10. The monitoring device 10 shown in FIG. 6 includes at least a communication unit 31, an operation unit 32, a signal processing unit 33, a speaker device 37, a memory 38, and a setting management unit 39.
  • The signal processing unit 33 is configured using, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor), and performs control processing that supervises the overall operation of each unit of the monitoring device 10, data input/output processing with the other units, data calculation processing, and data storage processing.
  • The signal processing unit 33 includes a directivity processing unit 63, a frequency analysis unit 64, an object detection unit 65, a detection result determination unit 66, a scanning control unit 67, a detection direction control unit 68, a sound source direction detection unit 34, and an output control unit 35.
  • The monitoring device 10 is connected to the monitor 50.
  • The sound source direction detection unit 34 estimates the position of the sound source using the sound data collected by the microphone array MA in the monitoring area 8, in accordance with, for example, the known whitened cross-correlation method (CSP (Cross-power Spectrum Phase analysis) method).
  • The sound source direction detection unit 34 divides the monitoring area 8 shown in FIG. 11 into a plurality of blocks and, when sound is picked up by the microphone array MA, determines for each block whether there is sound exceeding a threshold of, for example, sound pressure or volume; in this way, the rough position of the sound source in the monitoring area 8 can be estimated. The sketch below illustrates the CSP idea for a single microphone pair.
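
The following is a minimal sketch of the whitened cross-correlation (CSP, also known as GCC-PHAT) computation for two microphones; it returns the time difference of arrival, from which a coarse direction can be derived given the microphone spacing. The test signals and sample rate are assumptions for illustration.

```python
# Sketch: CSP / GCC-PHAT between two microphones. The peak of the
# whitened cross-correlation gives the time difference of arrival.
import numpy as np

def gcc_phat(sig_a: np.ndarray, sig_b: np.ndarray, fs: int) -> float:
    """Delay (seconds) of sig_b relative to sig_a; positive when sig_b lags."""
    n = len(sig_a) + len(sig_b)
    A = np.fft.rfft(sig_a, n=n)
    B = np.fft.rfft(sig_b, n=n)
    cross = A * np.conj(B)
    cross /= np.abs(cross) + 1e-12            # PHAT whitening
    corr = np.fft.irfft(cross, n=n)
    corr = np.roll(corr, n // 2)              # center zero lag
    return (n // 2 - int(np.argmax(np.abs(corr)))) / fs

if __name__ == "__main__":
    fs = 16000
    noise = np.random.default_rng(0).standard_normal(fs)
    delayed = np.roll(noise, 8)               # simulate an 8-sample lag
    print(gcc_phat(noise, delayed, fs) * fs)  # ~ 8 samples
```
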
  • The setting management unit 39 holds in advance a coordinate conversion formula relating to the coordinates of a position designated by the user on the screen of the monitor 50, on which the omnidirectional image data captured by the omnidirectional camera CA is displayed.
  • This coordinate conversion formula converts the coordinates (that is, (horizontal angle, vertical angle)) of the position designated by the user on the omnidirectional image data into coordinates of the direction viewed from the PTZ camera CZ, based on, for example, the physical distance between the installation position of the omnidirectional camera CA (see FIG. 2) and the installation position of the PTZ camera CZ (see FIG. 2).
  • Using the coordinate conversion formula held by the setting management unit 39, the signal processing unit 33 calculates the coordinates (θMAh, θMAv) indicating the directivity direction toward the actual sound source position corresponding to the position designated by the user, taking the installation position of the PTZ camera CZ (see FIG. 2) as the reference. θMAh is the horizontal angle, and θMAv the vertical angle, of the direction toward the actual sound source position corresponding to the designated position, as viewed from the installation position of the PTZ camera CZ.
  • The sound source position is the actual sound source position corresponding to the position designated from the operation unit 32 by the user's finger or a stylus pen on the video data displayed on the monitor 50.
  • As described above, the omnidirectional camera CA and the microphone array MA are arranged so that the optical axis of the omnidirectional camera CA and the central axis of the housing of the microphone array MA are coaxial. Therefore, the coordinates of the designated position derived by the omnidirectional camera CA in accordance with the user's designation on the monitor 50 can be regarded as the coordinates of the sound emphasis direction (also called the directivity direction) viewed from the microphone array MA.
  • The monitoring apparatus 10 transmits the coordinates of the designated position on the omnidirectional image data to the omnidirectional camera CA.
  • Using the coordinates of the designated position transmitted from the monitoring device 10, the omnidirectional camera CA calculates the coordinates (horizontal angle, vertical angle) indicating the direction of the sound source position corresponding to the designated position as viewed from the omnidirectional camera CA. Since this calculation process is a known technique, its description is omitted.
  • The omnidirectional camera CA transmits the calculation result of the coordinates indicating the direction of the sound source position to the monitoring device 10.
  • The monitoring device 10 can then use the coordinates (horizontal angle, vertical angle) calculated by the omnidirectional camera CA as the coordinates (horizontal angle, vertical angle) indicating the direction of the sound source position as viewed from the microphone array MA.
  • When the omnidirectional camera CA and the microphone array MA are not arranged coaxially, the setting management unit 39 needs to convert the coordinates derived by the omnidirectional camera CA into coordinates of the direction viewed from the microphone array MA, for example according to the method described in Japanese Patent Application Laid-Open No. 2015-029241. The parallax idea behind such conversions is sketched below.
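
The disclosure does not give the conversion formula itself, so the following is only a geometric sketch of the underlying parallax correction: the same target seen from a second, offset viewpoint subtends a slightly different angle, and the correction shrinks as the target distance grows. The baseline and target distance are hypothetical values.

```python
# Sketch: re-expressing a viewing direction from a second, offset
# viewpoint. Baseline and target distance are hypothetical; the actual
# conversion formula of the disclosure is not reproduced here.
import math

def shift_direction(h_deg: float, v_deg: float,
                    baseline_m: float, target_dist_m: float) -> tuple[float, float]:
    """(horizontal, vertical) angle of the same target viewed from a
    second device displaced sideways by baseline_m, assuming the target
    lies target_dist_m away along the original direction."""
    h, v = math.radians(h_deg), math.radians(v_deg)
    # Target position relative to the first device (x forward, y left, z up).
    x = target_dist_m * math.cos(v) * math.cos(h)
    y = target_dist_m * math.cos(v) * math.sin(h)
    z = target_dist_m * math.sin(v)
    y -= baseline_m                  # move the origin to the second device
    h2 = math.degrees(math.atan2(y, x))
    v2 = math.degrees(math.atan2(z, math.hypot(x, y)))
    return h2, v2

if __name__ == "__main__":
    # A 0.3 m baseline barely matters for a target 50 m away.
    print(shift_direction(30.0, 60.0, baseline_m=0.3, target_dist_m=50.0))
```
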
  • The setting management unit 39 holds a first threshold th1 and a second threshold th2 that are compared with the sound pressure p calculated for each pixel by the signal processing unit 33.
  • The sound pressure p is used as an example of a sound parameter relating to a sound source; it represents the loudness of the sound collected by the microphone array MA and is distinguished from the volume of the sound output from the speaker device 37.
  • The first threshold th1 and the second threshold th2 are values compared with the sound pressure of sound generated in the monitoring area 8, and are set to predetermined values for identifying, for example, the sound generated by the unmanned air vehicle dn. A plurality of thresholds can be set: here, two values are set, the first threshold th1 and the larger second threshold th2 (first threshold th1 < second threshold th2), though three or more thresholds may be set in the present embodiment.
  • The pixel region R1 (see FIG. 15) in which a sound pressure greater than the second threshold th2 is obtained is drawn, for example, in red on the monitor 50 on which the omnidirectional image data is displayed.
  • The pixel region B1 in which a sound pressure greater than the first threshold th1 and less than or equal to the second threshold th2 is obtained is drawn, for example, in blue on the monitor 50.
  • The region N1 of pixels whose sound pressure is equal to or less than the first threshold th1 is drawn colorlessly; that is, the display color of the omnidirectional image data is left unchanged. A minimal sketch of this three-way classification follows.
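
As a minimal sketch of the three display classes just described, the following maps a per-pixel sound-pressure array to class labels; the threshold values are placeholders, not values from the disclosure.

```python
# Sketch: classifying per-pixel sound pressure into the three display
# classes (colorless, blue, red). Threshold values are placeholders.
import numpy as np

TH1, TH2 = 40.0, 60.0  # placeholder thresholds, th1 < th2

def classify(pressure: np.ndarray) -> np.ndarray:
    """0 = colorless (p <= th1), 1 = blue (th1 < p <= th2), 2 = red (p > th2)."""
    classes = np.zeros(pressure.shape, dtype=np.uint8)
    classes[pressure > TH1] = 1
    classes[pressure > TH2] = 2
    return classes

if __name__ == "__main__":
    demo = np.array([[10.0, 45.0], [62.0, 80.0]])
    print(classify(demo))  # [[0 1] [2 2]]
```
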
  • The communication unit 31 receives the omnidirectional image data or cut-out video data transmitted from the omnidirectional camera CA and the audio data transmitted from the microphone array MA, and outputs them to the signal processing unit 33.
  • The operation unit 32 is a user interface (UI) for notifying the signal processing unit 33 of the content of the user's input operations, and is configured with pointing devices such as a mouse and a keyboard.
  • The operation unit 32 may also be configured using, for example, a touch panel or touch pad arranged over the screen of the monitor 50, allowing direct input by the user's finger or a stylus pen.
  • When a position is designated, the operation unit 32 acquires coordinate data indicating the designated position and outputs it to the signal processing unit 33.
  • The signal processing unit 33 reads from the memory 38 the sound data corresponding to the coordinate data of the designated position, forms directivity from the microphone array MA toward the sound source position corresponding to the designated position, and outputs the result from the speaker device 37. In this way, the user can clearly confirm the emphasized sound not only of the unmanned air vehicle dn but also of any other designated position.
  • The memory 38 is constituted by a ROM and a RAM.
  • The memory 38 holds, for example, various data including sound data of a given section, setting information, and programs.
  • The memory 38 has a pattern memory in which a sound pattern unique to each unmanned air vehicle dn is registered, and further stores the data of the sound pressure heat map MP.
  • In the memory 38, an identification mark mk (see FIG. 13) schematically representing the position of the unmanned air vehicle dn is also registered.
  • The identification mark mk used here is, for example, a star symbol.
  • The identification mark mk is not limited to a star shape; it may be a circle or a rectangle, or a symbol or character reminiscent of an unmanned air vehicle.
  • The display mode of the identification mark mk may be changed between daytime and nighttime: for example, a star shape during the day and, at night, a quadrangle that cannot be mistaken for a star.
  • The identification mark mk may also be changed dynamically; for example, the star symbol may blink or rotate, further alerting the user.
  • FIG. 7 is a timing chart showing an example of a detection sound pattern of the unmanned air vehicle dn registered in the memory 38.
  • The detection sound pattern shown in FIG. 7 is a combination of frequency patterns: sounds of four frequencies f1, f2, f3, and f4 generated by the rotation of the four rotors mounted on the multicopter-type unmanned air vehicle dn.
  • The signals of the respective frequencies are signals of different sound frequencies generated, for example, by the rotation of the plurality of blades pivotally supported on each rotor.
  • The frequency regions indicated by diagonal lines are the regions where the sound pressure is high.
  • The detection sound pattern may include not only a plurality of frequencies and their sound pressures but also other sound information.
  • For example, a sound pressure ratio representing the ratio of the sound pressures at the respective frequencies can be used.
  • In that case, detection of the unmanned air vehicle dn is determined based on whether the sound pressure of each frequency included in the detection sound pattern exceeds a threshold.
  • The directivity processing unit 63 performs the directivity forming process (beam forming) described above using the sound signals (also called sound data) collected by the non-directional microphones M1 to Mn, and extracts sound data with an arbitrary direction set as the directivity direction.
  • The directivity processing unit 63 can also extract sound data with a range of directions set as the directivity range.
  • Here, the directivity range is a range containing a plurality of adjacent directivity directions, and is intended to have a certain spatial extent compared with a single directivity direction.
  • The frequency analysis unit 64 performs frequency analysis processing on the sound data extracted in the directivity direction by the directivity processing unit 63. In this frequency analysis process, the frequencies contained in the sound data of the directivity direction and their sound pressures are detected; an FFT-based sketch follows.
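
A minimal sketch of such an analysis, using an FFT to report the strongest spectral peaks and their levels; the sample rate, test tones, and peak count are assumed illustration values.

```python
# Sketch: FFT-based frequency analysis reporting the strongest tones and
# their levels. Sample rate and test tones are illustrative values.
import numpy as np

def dominant_tones(signal: np.ndarray, fs: int, n_peaks: int = 4):
    """Return (frequency_hz, magnitude) for the n_peaks strongest bins."""
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    top = np.argsort(spectrum)[-n_peaks:][::-1]
    return [(float(freqs[i]), float(spectrum[i])) for i in top]

if __name__ == "__main__":
    fs = 16000
    t = np.arange(fs) / fs
    test = sum(np.sin(2 * np.pi * f * t) for f in (200.0, 400.0, 600.0, 800.0))
    for f, mag in dominant_tones(test, fs):
        print(f"{f:7.1f} Hz  level {mag:.1f}")
```
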
  • FIG. 8 is a timing chart showing an example of the frequency change of the detected sound signal obtained as a result of the frequency analysis process.
  • As the detected sound signal (that is, the detected sound data), four frequencies f11, f12, f13, and f14 and the sound pressure at each frequency are obtained.
  • The irregular fluctuation of each frequency arises, for example, from variations in the rotation of the rotors (rotary blades) as the unmanned air vehicle dn controls its own attitude.
  • The object detection unit 65 performs the detection processing of the unmanned air vehicle dn.
  • The object detection unit 65 compares the detected sound pattern (see FIG. 8, frequencies f11 to f14) obtained as a result of the frequency analysis process with the detection sound pattern (see FIG. 7, frequencies f1 to f4) registered in advance in the pattern memory of the memory 38.
  • The object detection unit 65 then determines whether the two detection sound patterns approximate each other.
  • Whether the two patterns approximate each other is determined, for example, as follows: if, among the four frequencies f1, f2, f3, and f4, the sound pressures of at least two frequencies contained in the detected sound data exceed the threshold, the object detection unit 65 judges the sound patterns to be approximate and the unmanned air vehicle dn is detected. The unmanned air vehicle dn may also be detected when other conditions are satisfied. A sketch of this rule appears below.
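
The following sketches that match rule: at least two of the four registered frequencies must be present with sound pressure above the threshold. The registered frequencies, tolerance, and threshold are illustrative assumptions.

```python
# Sketch: "at least two of four frequencies above threshold" match rule.
# Registered pattern, tolerance, and threshold are assumed values.

REGISTERED_HZ = [200.0, 400.0, 600.0, 800.0]  # detection sound pattern f1..f4
FREQ_TOL_HZ = 20.0                             # allowable frequency error
TH1 = 40.0                                     # first sound-pressure threshold

def drone_detected(measured: list[tuple[float, float]]) -> bool:
    """measured: (frequency_hz, sound_pressure) pairs from the analysis."""
    hits = 0
    for ref in REGISTERED_HZ:
        if any(abs(f - ref) <= FREQ_TOL_HZ and p > TH1 for f, p in measured):
            hits += 1
    return hits >= 2   # patterns approximate -> unmanned air vehicle present

if __name__ == "__main__":
    print(drone_detected([(198.0, 55.0), (405.0, 47.0)]))  # True
    print(drone_detected([(198.0, 10.0), (405.0, 47.0)]))  # False
```
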
  • The detection result determination unit 66 instructs the detection direction control unit 68 to shift to detection of the unmanned air vehicle dn in the next directivity direction.
  • When it is determined as a result of scanning a directivity direction that the unmanned air vehicle dn exists, the detection result determination unit 66 notifies the output control unit 35 of the detection result of the unmanned air vehicle dn.
  • The detection result includes information on the detected unmanned air vehicle dn.
  • The information on the unmanned air vehicle dn includes, for example, identification information of the unmanned air vehicle dn and position information (for example, direction information) of the unmanned air vehicle dn in the sound collection space.
  • The detection direction control unit 68 controls the direction in which the unmanned air vehicle dn is to be detected in the sound collection space, based on instructions from the detection result determination unit 66. For example, the detection direction control unit 68 sets as the detection direction an arbitrary direction within the directivity range BF1, containing the sound source position estimated by the sound source direction detection unit 34, out of the entire sound collection space.
  • The scanning control unit 67 instructs the directivity processing unit 63 to perform beam forming with the detection direction set by the detection direction control unit 68 as the directivity direction.
  • The directivity processing unit 63 performs beam forming in the directivity direction instructed by the scanning control unit 67.
  • The directivity processing unit 63 sets, as the directivity direction BF2, the initial position in the directivity range BF1 (see FIG. 11) containing the sound source position estimated by the sound source direction detection unit 34.
  • The directivity direction BF2 is then set sequentially within the directivity range BF1 by the detection direction control unit 68.
  • Based on the omnidirectional image data captured by the omnidirectional camera CA and the sound data collected by the microphone array MA, the output control unit 35 calculates the sound pressure for each pixel constituting the omnidirectional image data. Since this sound pressure calculation is a known technique, its detailed description is omitted. The output control unit 35 thereby generates a sound pressure heat map MP in which the calculated sound pressure value is assigned to the position of the corresponding pixel for every pixel of the omnidirectional image data, and then produces the sound pressure heat map MP shown in FIG. 15 by color-converting the sound pressure value of each pixel.
  • Although the output control unit 35 has been described as generating the sound pressure heat map MP by assigning the per-pixel sound pressure values to the corresponding pixel positions, it need not calculate the sound pressure for every pixel: it may instead compute the average of the sound pressure values over pixel blocks of a predetermined number of pixels (for example, four) and generate the sound pressure heat map MP by assigning that average to the corresponding pixels. A block-averaging sketch follows.
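
A minimal sketch of the block-averaging alternative: the per-pixel map is averaged over small tiles and the average is broadcast back over each tile. The 2x2 block size matches the "for example, four" pixels in the text; the demo array is an assumption.

```python
# Sketch: averaging sound pressure over pixel blocks (here 2x2 = four
# pixels) instead of keeping a distinct value per pixel.
import numpy as np

def block_average(pressure: np.ndarray, block: int = 2) -> np.ndarray:
    """Average over block x block tiles, then broadcast the average back
    to every pixel of its tile (output shape equals the trimmed input)."""
    h, w = pressure.shape
    h2, w2 = h - h % block, w - w % block      # drop ragged edges
    tiles = pressure[:h2, :w2].reshape(h2 // block, block, w2 // block, block)
    means = tiles.mean(axis=(1, 3))
    return np.repeat(np.repeat(means, block, axis=0), block, axis=1)

if __name__ == "__main__":
    demo = np.arange(16, dtype=float).reshape(4, 4)
    print(block_average(demo))
```
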
  • The output control unit 35 controls the operations of the monitor 50 and the speaker device 37, outputs the omnidirectional image data or cut-out video data transmitted from the omnidirectional camera CA to the monitor 50 for display, and outputs the audio data transmitted from the microphone array MA to the speaker device 37 as sound. Further, when the unmanned air vehicle dn is detected, the output control unit 35 outputs the identification mark mk representing the unmanned air vehicle dn to the monitor 50 so that it is displayed superimposed on the omnidirectional image.
  • The output control unit 35 forms directivity for the audio data collected by the microphone array MA, using that audio data and the coordinates, derived by the omnidirectional camera CA, indicating the direction of the sound source position.
  • This directivity forming process for audio data is a known technique described, for example, in Japanese Patent Application Laid-Open No. 2015-029241.
  • The speaker device 37 outputs the sound data collected by the microphone array MA, or the sound data collected by the microphone array MA and given directivity by the signal processing unit 33. The speaker device 37 may be configured as a device separate from the monitoring device 10.
  • FIG. 9 is a sequence diagram illustrating an example of an unmanned air vehicle detection operation in the unmanned air vehicle detection system 5 according to the first embodiment.
  • When the power of each device (for example, the monitor 50, the monitoring device 10, the PTZ camera CZ, the omnidirectional camera CA, and the microphone array MA) is turned on, the unmanned air vehicle detection system 5 starts operating.
  • The monitoring apparatus 10 issues an image transmission request to the PTZ camera CZ (T1).
  • The PTZ camera CZ starts imaging processing upon power-on (T2).
  • The monitoring apparatus 10 issues an image transmission request to the omnidirectional camera CA (T3).
  • The omnidirectional camera CA starts imaging processing upon power-on (T4).
  • The monitoring device 10 issues a sound transmission request to the microphone array MA (T5).
  • The microphone array MA starts sound collection processing upon power-on (T6).
  • The PTZ camera CZ transmits the data of captured images (for example, still images or moving images) obtained by imaging to the monitoring device 10 via the network NW (T7).
  • The monitoring device 10 converts the captured image data transmitted from the PTZ camera CZ into display data such as NTSC (T8) and outputs it to the monitor 50 (T9).
  • The monitor 50 displays the PTZ image GZ2 (see FIG. 12 and elsewhere) from the PTZ camera CZ on its screen.
  • The omnidirectional camera CA transmits the data of omnidirectional images (for example, still images or moving images) obtained by imaging to the monitoring device 10 via the network NW (T10).
  • The monitoring device 10 converts the omnidirectional image data transmitted from the omnidirectional camera CA into display data such as NTSC (T11) and outputs it to the monitor 50 (T12).
  • The monitor 50 displays the omnidirectional image GZ1 (see FIG. 12 and elsewhere) from the omnidirectional camera CA on its screen.
  • The microphone array MA encodes the sound data obtained by sound collection and transmits it to the monitoring device 10 via the network NW (T13).
  • In the monitoring device 10, the sound source direction detection unit 34 estimates the sound source position in the monitoring area 8 (T14). The estimated sound source position is used as the reference position of the directivity range BF1 (see FIG. 11) needed to set the initial directivity direction when the monitoring apparatus 10 detects the unmanned air vehicle dn.
  • The monitoring apparatus 10 performs the detection determination of the unmanned air vehicle dn (T15). Details of this detection determination process are described later.
  • When the unmanned air vehicle dn is determined to exist, the output control unit 35 in the monitoring device 10 superimposes the identification mark mk, representing the unmanned air vehicle dn present in the directivity direction determined in step T15, on the omnidirectional image GZ1 displayed on the screen of the monitor 50 (T16).
  • The output control unit 35 transmits the information on the directivity direction obtained in step T15 to the PTZ camera CZ, together with a request to change the imaging direction of the PTZ camera CZ to that directivity direction (in other words, a view-angle change instruction) (T17).
  • In the PTZ camera CZ, the imaging direction control unit 58 drives the lens drive motor 59 based on the directivity direction information and changes the optical axis L2 of the imaging lens of the PTZ camera CZ, thereby changing the imaging direction to the directivity direction (T18).
  • At this time, the imaging direction control unit 58 changes the zoom magnification of the imaging lens of the PTZ camera CZ to a preset value, a value corresponding to the proportion of the unmanned air vehicle dn in the captured image, or the like.
  • The process of the unmanned air vehicle detection system 5 then returns to step T7, and the same processing is repeated until a predetermined event such as a power-off operation is detected.
  • FIG. 10 is a flowchart showing in detail an example of the unmanned air vehicle detection determination in step T15 of FIG. 9.
  • First, the directivity processing unit 63 sets, as the initial position of the directivity direction BF2 (see FIG. 11), the directivity range BF1 (see FIG. 11) based on the sound source position estimated by the sound source direction detection unit 34 (S21).
  • FIG. 11 is a diagram illustrating an example of a state in which the directivity direction BF2 is scanned sequentially through the monitoring area 8 and the unmanned air vehicle dn is detected.
  • The initial position is not limited to the directivity range BF1 based on the sound source position of the monitoring area 8 estimated by the sound source direction detection unit 34; an arbitrary position designated by the user may be set as the initial position, and the monitoring area 8 scanned sequentially from there. Since the initial position is not fixed, even if the sound source contained in the directivity range BF1 based on the estimated sound source position is not an unmanned air vehicle, an unmanned air vehicle flying in another directivity direction can still be detected early.
  • The directivity processing unit 63 determines whether the sound data collected by the microphone array MA and converted into digital values by the A/D converters A1 to An is temporarily stored in the memory 38 (S22). If it is not stored, the processing of the directivity processing unit 63 returns to step S21.
  • When the sound data is stored, the directivity processing unit 63 performs beam forming corresponding to an arbitrary directivity direction BF2 in the directivity range BF1 of the monitoring area 8, and extracts the sound data of the directivity direction BF2 (S23).
  • The frequency analysis unit 64 detects the frequencies of the extracted sound data and their sound pressures (S24).
  • The object detection unit 65 compares the detection sound pattern registered in the pattern memory of the memory 38 with the detected sound pattern obtained as a result of the frequency analysis process, and performs detection of the unmanned air vehicle (S25).
  • The detection result determination unit 66 notifies the output control unit 35 of the comparison result, and notifies the detection direction control unit 68 of the shift of the detection direction (S26).
  • Specifically, the object detection unit 65 compares the pattern of the detected sound obtained from the frequency analysis with the four frequencies f1, f2, f3, and f4 registered in the pattern memory of the memory 38. If, as a result of the comparison, at least two frequencies are common to both detection sound patterns and the sound pressures of those frequencies are greater than the first threshold th1, the object detection unit 65 judges the two detection sound patterns to be approximate and determines that the unmanned air vehicle dn exists.
  • Alternatively, the object detection unit 65 may determine that the patterns are approximate when one frequency matches and the sound pressure at that frequency is greater than the first threshold th1.
  • The object detection unit 65 may also set an allowable error for each frequency and judge the approximation by treating frequencies within the error range as the same frequency.
  • The object detection unit 65 may further add to the determination condition that the sound pressure ratios of the sounds of the respective frequencies substantially match.
  • In this way, the sound source detection unit UD can easily identify the detected sound source as a pre-registered object (for example, the unmanned air vehicle dn, which is a moving object), and the detection accuracy of the unmanned air vehicle dn can be improved.
  • The detection result determination unit 66 determines whether the unmanned air vehicle dn exists based on the result of step S26 (S27).
  • When the unmanned air vehicle dn exists, the detection result determination unit 66 notifies the output control unit 35 of this fact (the detection result of the unmanned air vehicle dn) (S28).
  • The scanning control unit 67 then moves the directivity direction BF2 to be scanned in the monitoring area 8 to the next, different direction (S29).
  • The notification of the detection result of the unmanned air vehicle dn may be performed collectively after the omnidirectional scanning is completed, rather than at the moment the detection processing for one directivity direction is completed.
  • The order in which the directivity direction BF2 is moved through the monitoring area 8 may be, for example, a spiral going from the outer circumference toward the inner circumference, or from the inner circumference toward the outer circumference, within the directivity range BF1 or within the entire monitoring area 8.
  • Alternatively, rather than scanning the directivity direction continuously as in a single stroke, the detection direction control unit 68 may set positions in the monitoring area 8 in advance and move the directivity direction BF2 to each position in an arbitrary order. The monitoring apparatus 10 can thereby start the detection process from positions where the unmanned air vehicle dn is likely to enter, improving the detection process. Both orderings are sketched below.
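
The following sketches the two scan orderings just described: a regular sweep over a grid of (horizontal, vertical) directions and a preset list of priority positions visited first. The grid resolution and the priority positions are assumed values.

```python
# Sketch: two orderings for moving the directivity direction through the
# monitoring area. Grid resolution and priority positions are assumed.

def grid_sweep(h_steps: int = 8, v_steps: int = 4):
    """Yield (horizontal_deg, vertical_deg) over a hemispherical grid,
    low elevations (outer circumference) first."""
    for v in range(v_steps):              # 0 = horizon ... last = near zenith
        for h in range(h_steps):
            yield 360.0 * h / h_steps, 90.0 * v / v_steps

# Hypothetical likely entry points, scanned before the full sweep.
PRIORITY_DIRECTIONS = [(45.0, 10.0), (225.0, 10.0), (135.0, 30.0)]

if __name__ == "__main__":
    order = PRIORITY_DIRECTIONS + list(grid_sweep())
    for h, v in order[:6]:
        print(f"h = {h:6.1f} deg, v = {v:5.1f} deg")
```
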
  • the scanning control unit 67 determines whether or not scanning in all directions in the monitoring area 8 has been completed (S30). When the omnidirectional scanning is not completed (S30, NO), the process of the directivity processing unit 63 returns to step S23, and the same operation is performed. That is, the directivity processing unit 63 performs beamforming in the directivity direction BF2 at the position moved in step S29, and extracts sound data in the directivity direction BF2. Thereby, even if one unmanned air vehicle dn is detected, the sound source detection unit UD continues detection of the unmanned air vehicles dn that may exist, so that detection of a plurality of unmanned air vehicles dn is possible. Is possible.
  • When the omnidirectional scanning is completed in step S30 (S30, YES), the directivity processing unit 63 erases the sound data collected by the microphone array MA and temporarily stored in the memory 38 (S31).
  • the signal processing unit 33 determines whether or not to end the detection process of the unmanned air vehicle dn (S32).
  • The detection process of the unmanned air vehicle dn is terminated according to a predetermined event. For example, the number of times that the unmanned air vehicle dn was not detected in step S6 is held in the memory 38, and when this number exceeds a predetermined number, the detection process for the unmanned air vehicle dn may be terminated. Further, the signal processing unit 33 may end the detection process of the unmanned air vehicle dn based on a timer expiring or a user operation on a UI (User Interface), not shown, of the operation unit 32.
  • the frequency analysis unit 64 analyzes the frequency and also measures the sound pressure at that frequency.
  • The detection result determination unit 66 may determine that the unmanned air vehicle dn is approaching the sound source detection unit UD when the sound pressure level measured by the frequency analysis unit 64 gradually increases with time.
  • For example, when the sound pressure level of a predetermined frequency measured at time t11 is smaller than the sound pressure level of the same frequency measured at a later time t12, it may be determined that the sound pressure is increasing with time and that the unmanned air vehicle dn is approaching. The sound pressure level may also be measured three or more times, and the approach of the unmanned air vehicle dn determined from the transition of statistical values (for example, variance, average, maximum, or minimum), as in the sketch below.
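  • A minimal sketch of such a trend test follows. The two-sample case reduces to the t11 < t12 comparison above; the use of half-window averages is an assumption standing in for the statistical transitions mentioned in the text.

```python
from statistics import mean

def is_approaching(levels, margin=0.0):
    """Judge approach from chronological sound-pressure levels at one frequency.

    levels -- measurements ordered in time (e.g. at t11, t12, ...)
    margin -- minimum increase to count as 'approaching' (assumed parameter)
    """
    if len(levels) < 2:
        return False
    if len(levels) == 2:
        return levels[1] > levels[0] + margin     # the t11 < t12 comparison
    half = len(levels) // 2                       # three or more measurements:
    return mean(levels[half:]) > mean(levels[:half]) + margin
```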
  • When the measured sound pressure exceeds a third threshold th3, the detection result determination unit 66 may determine that the unmanned air vehicle dn has entered the warning area.
  • the third threshold th3 is larger than the second threshold th2, for example.
  • the warning area is, for example, the same area as the monitoring area 8 or an area included in the monitoring area 8 and narrower than the monitoring area 8.
  • The warning area is, for example, an area where the intrusion of the unmanned air vehicle dn is restricted.
  • the approach determination and the intrusion determination of the unmanned air vehicle dn may be executed by the detection result determination unit 66.
  • FIG. 12 is a diagram showing a display screen example of the monitor 50 when the unmanned air vehicle dn is not detected.
  • an omnidirectional image GZ1 from the omnidirectional camera CA and a PTZ image GZ2 from the PTZ camera CZ are displayed in comparison.
  • In the omnidirectional image GZ1, three buildings bL1, bL2, bL3 and a chimney pL are shown, but the unmanned air vehicle dn is not.
  • The omnidirectional image GZ1 and the PTZ image GZ2 are displayed side by side for comparison, but only one of them may be selected and displayed, or the two images may be switched and displayed at regular intervals.
  • FIG. 13 is a diagram showing a display screen example of the monitor 50 when the unmanned air vehicle dn is detected.
  • In the omnidirectional image GZ1 captured by the omnidirectional camera CA and displayed on the display screen of the monitor 50, in addition to the three buildings bL1, bL2, bL3 and the chimney pL, an identification mark mk representing the unmanned air vehicle dn flying above these buildings is drawn with a star symbol.
  • The PTZ image GZ2 obtained by the PTZ camera CZ still shows the three buildings bL1, bL2, bL3 and the chimney pL, but does not show the unmanned air vehicle dn. That is, FIG. 13 shows the PTZ image GZ2 in the state after the monitoring apparatus 10 has displayed the identification mark mk in step T16 of FIG. 9, but before the monitoring apparatus 10 requests an imaging direction of the PTZ camera CZ in step T17 and the PTZ camera CZ rotates its lens to change the optical axis direction and captures an image in step T18.
  • FIG. 14 is a diagram showing an example of a display screen of the monitor 50 when the unmanned air vehicle dn is detected and the PTZ camera CZ changes the optical axis direction in conjunction with the detection.
  • the PTZ image GZ2 obtained by the PTZ camera CZ is an image zoomed up toward the unmanned air vehicle dn.
  • The three buildings bL1, bL2, bL3 and the chimney pL have moved out of the angle of view and are no longer shown, and the unmanned air vehicle dn is shown zoomed up.
  • In other words, the monitoring apparatus 10 displays the PTZ image GZ2 in the state after the imaging lens of the PTZ camera CZ has been rotated to change the optical axis direction and the image has been further zoomed up in step T18 of FIG. 9.
  • Here, the identification mark mk is superimposed on the omnidirectional image GZ1 captured by the omnidirectional camera CA, while the unmanned air vehicle dn is shown as it is in the PTZ image GZ2 captured by the PTZ camera CZ. This is because, even if the image of the unmanned air vehicle dn appeared as it is in the omnidirectional image GZ1, it would be difficult to distinguish.
  • Since the PTZ image GZ2 captured by the PTZ camera CZ is a zoomed-up image, when the image of the unmanned air vehicle dn appears on the display screen, the unmanned air vehicle dn is displayed clearly.
  • the sound source detection unit UD can appropriately display the unmanned air vehicle dn in consideration of the visibility of the image displayed on the display screen of the monitor 50.
  • Alternatively, the omnidirectional image GZ1 may show the unmanned air vehicle dn as it is, without the identification mark mk, so that the omnidirectional image GZ1 and the PTZ image GZ2 are displayed in the same manner; the two images may also be displayed in different manners.
  • the identification mark mk may be superimposed on the PTZ image GZ2.
  • FIG. 15 is a diagram showing another example of the display screen of the monitor 50 when the unmanned air vehicle dn is detected and the PTZ camera CZ changes the optical axis direction in conjunction with the detection.
  • the display screen of the monitor 50 shown in FIG. 15 is displayed, for example, when the user instructs another display menu (not shown) via the operation unit 32 of the monitoring apparatus 10.
  • The display screen of FIG. 15 shows a case where there is another sound source whose sound pressure value, calculated for each pixel constituting the omnidirectional image data, is comparable to the sound pressure value of the unmanned air vehicle dn.
  • On the omnidirectional image GZ1, in addition to the identification mark mk representing the unmanned air vehicle dn, another identification mark mc representing the other sound source is superimposed.
  • The other identification mark mc is desirably drawn in a display form different from that of the identification mark mk, and is drawn with a circular symbol in FIG. 15.
  • Different display forms include symbols and characters such as ellipses, triangles, and question marks.
  • the other identification mark mc may be dynamically displayed in the same manner as the identification mark mk.
  • On the omnidirectional image GZ1, a sound pressure heat map MP is superimposed. The output control unit 35 generates the heat map by computing a sound pressure map representing the sound pressure for each pixel and performing color conversion processing on the regions where the calculated sound pressure exceeds a threshold value.
  • The region R1, where the sound pressure exceeds the second threshold th2, is drawn in red (the large dot group in the figure); the region B1, where the sound pressure is greater than the first threshold th1 and less than or equal to the second threshold th2, is drawn in blue (the small dot group in the figure); and the region N1, where the sound pressure is equal to or less than the first threshold th1, is rendered transparent (nothing is displayed in the figure). A sketch of this color conversion follows.
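  • The following is a minimal sketch of the threshold-based color conversion, assuming a NumPy array of per-pixel sound pressures; the semi-transparent alpha values are assumptions.

```python
import numpy as np

def sound_pressure_heatmap(pressure, th1, th2):
    """Convert a per-pixel sound-pressure map into an RGBA overlay image.

    pressure -- 2-D array, one sound-pressure value per omnidirectional-image pixel
    Pixels above th2 become red (region R1), pixels in (th1, th2] become blue
    (region B1), and pixels at or below th1 stay transparent (region N1).
    """
    height, width = pressure.shape
    rgba = np.zeros((height, width, 4), dtype=np.uint8)   # transparent by default
    red = pressure > th2
    blue = (pressure > th1) & ~red
    rgba[red] = (255, 0, 0, 128)                          # semi-transparent red
    rgba[blue] = (0, 0, 255, 128)                         # semi-transparent blue
    return rgba
```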
  • Since the other identification mark mc representing the position of the other sound source is drawn and the sound pressure heat map MP is drawn, the situation surrounding the unmanned air vehicle dn can be understood well. For example, when a sound source that has not yet been registered is flying as the unmanned air vehicle dn, the user designates, on the display screen of the monitor 50, the position of the sound source represented by the other identification mark mc, or designates the red region R1 of the sound pressure heat map MP.
  • the output control unit 35 of the monitoring device 10 causes the PTZ camera CZ to zoom in on the position of the sound source or the red region R1 to obtain the PTZ image GZ2 after zooming up, and displays the PTZ image GZ2 on the monitor 50. Therefore, unidentified sound sources can be confirmed quickly and accurately. Thereby, even if an unregistered unmanned air vehicle dn exists, the user can detect it.
  • the omnidirectional camera CA images the monitoring area 8 (imaging area).
  • the microphone array MA collects the sound in the monitoring area 8.
  • the monitoring device 10 detects the unmanned air vehicle dn appearing in the monitoring area 8 using the voice data collected by the microphone array MA.
  • The signal processing unit 33 in the monitoring device 10 superimposes the identification mark mk (first identification information), obtained by converting the unmanned air vehicle dn into visual information, on the captured image of the omnidirectional camera CA (that is, the omnidirectional image GZ1 of the monitoring area 8) and displays it on the monitor 50.
  • Thereby, the unmanned air vehicle detection system 5 can quickly and accurately determine the presence and position of the target unmanned air vehicle dn using the omnidirectional image GZ1 captured by the omnidirectional camera CA.
  • the PTZ camera CZ capable of adjusting the optical axis direction images the monitoring area 8.
  • the signal processing unit 33 outputs an instruction for adjusting the optical axis direction in a direction corresponding to the detection result of the unmanned air vehicle dn to the PTZ camera CZ.
  • the monitor 50 displays an image (that is, a PTZ image GZ2) captured by the PTZ camera CZ whose optical axis direction has been adjusted.
  • Thereby, the unmanned aerial vehicle detection system 5 can clearly recognize the unmanned aerial vehicle dn and accurately identify its model from the undistorted image captured by the PTZ camera CZ.
  • the monitor 50 displays the omnidirectional image GZ1 of the omnidirectional camera CA including the identification mark mk of the unmanned air vehicle dn and the captured image of the PTZ camera CZ (that is, the PTZ image GZ2) in contrast.
  • The signal processing unit 33 detects at least one other sound source in the monitoring area 8, converts the other sound source into visual information in the captured image of the omnidirectional camera, and displays it on the monitor 50 as second identification information different from the identification mark mk. Thereby, the supervisor who is a user can also grasp the presence of sound sources other than the unmanned air vehicle dn.
  • The signal processing unit 33 calculates a sound pressure value for each pixel in the captured image of the monitoring area 8, superimposes the per-pixel sound pressure values as the sound pressure heat map MP on the omnidirectional image data of the monitoring area 8, and displays the result on the monitor 50 so that the values can be distinguished by a plurality of different color gradations. Thereby, the user can compare the sound pressure of the sound generated by the unmanned air vehicle dn with the surrounding sound pressure, and can grasp the sound pressure of the unmanned air vehicle relatively and visually.
  • FIG. 16 is a diagram illustrating a schematic example of a configuration of an unmanned air vehicle detection system 5A according to Modification 1 of the first embodiment.
  • In the first embodiment, the camera arranged coaxially with the microphone array MA is the omnidirectional camera CA, whereas in Modification 1 (hereinafter referred to as "Modification 1") of the first embodiment, the camera arranged so that the central axis of the microphone array MA coincides with its own optical axis is a fixed camera CF.
  • the fixed camera CF is a camera having a predetermined angle of view in which the optical axis is fixed in a specific direction, and is installed in advance so that, for example, a space where the unmanned air vehicle dn is expected to fly can be imaged.
  • the angle of view agl of the fixed camera CF is set above the building group bLg.
  • As in the first embodiment, the camera that images the unmanned air vehicle dn as a subject by changing its imaging direction to the direction of the sound collected by the microphone array MA (that is, the direction from the microphone array MA toward the unmanned air vehicle dn) is the PTZ camera CZ.
  • In the unmanned air vehicle detection system 5A, the detection operation of the unmanned air vehicle dn is performed according to the same sequence as in the first embodiment, with the omnidirectional camera CA simply replaced by the fixed camera CF in the sequence diagram shown in FIG. 9.
  • the unmanned air vehicle detection system 5A detects an unmanned air vehicle dn, and when the unmanned air vehicle dn enters the angle of view agl monitored by the fixed camera CF, an image captured by the fixed camera CF is displayed. An identification mark mk representing the unmanned air vehicle dn is displayed on the monitor 50, and a zoom-up image captured by the PTZ camera CZ is displayed on the monitor 50. Further, when the unmanned aerial vehicle dn does not exist at the angle of view agl monitored by the fixed camera CF, imaging by the PTZ camera CZ in the imaging direction requested by the monitoring device 10 is not performed.
  • The pixel region R1, in which a sound pressure larger than the second threshold th2 is obtained, is drawn, for example, in red, as in the first embodiment.
  • a pixel region B1 where a sound pressure greater than the first threshold th1 and less than or equal to the second threshold th2 is obtained is drawn in blue, for example.
  • the pixel region R0 on which the identification mark mk is superimposed and the sound pressure greater than the third threshold th3 (> th2) is obtained is drawn in, for example, purple.
  • According to Modification 1, the detection process of the unmanned air vehicle dn is performed only for a limited area corresponding to the image captured by the fixed camera CF, for example an area where the unmanned air vehicle dn is expected to fly. The load of the detection process of the unmanned air vehicle dn can therefore be reduced, and the detection process can be speeded up.
  • the monitor 50 displays the captured image of the fixed camera CF including the identification mark mk of the unmanned air vehicle dn and the captured image of the PTZ camera CZ in a comparative manner.
  • The supervisor who is the user can, for example, alternately compare the captured image of the fixed camera CF and the captured image of the PTZ camera CZ, and thereby accurately grasp the model of the unmanned air vehicle dn and the surrounding situation in which the unmanned air vehicle dn exists.
  • FIG. 17 is a diagram illustrating a schematic example of the configuration of the unmanned air vehicle detection system 5B according to the second modification of the first embodiment.
  • In the first embodiment, the camera arranged coaxially with the microphone array MA was the omnidirectional camera CA, and in Modification 1 it was the fixed camera CF. In Modification 2 (hereinafter referred to as "Modification 2") of the first embodiment, the camera arranged so that the central axis of the microphone array MA coincides with its own optical axis is the PTZ camera CZ1.
  • The PTZ camera CZ1 is a camera that can capture images while changing the direction of its optical axis, and can image the hemispherical monitoring area 8 (see FIG. 11) stepwise in predetermined directions (preset directions).
  • The PTZ camera CZ1 changes its angle of view to ag2-1, ag2-2, ag2-3, ag2-4 (generically referred to as ag2 when there is no need to distinguish them) in four preset directions, imaging four areas of interest in the monitoring area 8.
  • the PTZ camera CZ1 may image the monitoring area 8 at each angle of view where the optical axis is continuously changed, instead of changing the angle of view stepwise.
  • As in the first embodiment, the camera that images the unmanned air vehicle dn as a subject by changing its imaging direction to the direction of the sound collected by the microphone array MA (that is, the direction from the microphone array MA toward the unmanned air vehicle dn) is the PTZ camera CZ.
  • In the unmanned aerial vehicle detection system 5B, as in Modification 1, the omnidirectional camera CA in the sequence diagram shown in FIG. 9 is changed to the PTZ camera CZ1, and the unmanned air vehicle dn is detected according to the same sequence as in the first embodiment.
  • When the PTZ camera CZ1 detects the unmanned air vehicle dn at a certain angle of view ag2, the unmanned air vehicle detection system 5B displays on the monitor 50 an identification mark mk representing the unmanned air vehicle dn in the image of that angle of view ag2 captured by the PTZ camera CZ1, and further displays on the monitor 50 a zoomed-up image captured by the PTZ camera CZ.
  • Further, when the unmanned air vehicle dn does not exist within the angles of view ag2 monitored by the PTZ camera CZ1, imaging in the imaging direction requested by the monitoring device 10 is not performed.
  • According to Modification 2, imaging with the PTZ camera CZ1 improves the visibility of the unmanned air vehicle dn in the captured image. That is, in an image captured by the omnidirectional camera CA, distortion at the periphery of the image may make it difficult to accurately identify the unmanned air vehicle dn, whereas in an image captured by the PTZ camera CZ1 the outline of the unmanned air vehicle dn can be captured correctly. Thereby, the detection accuracy of the unmanned air vehicle dn is improved.
  • the monitor 50 displays the captured image of the PTZ camera CZ1 including the identification mark mk of the unmanned air vehicle dn and the captured image of the PTZ camera CZ in a comparative manner.
  • The supervisor who is the user can, for example, alternately compare the captured image of the PTZ camera CZ1 (that is, a wide image of the monitoring area) and the zoomed-up captured image of the PTZ camera CZ (focused on the detected unmanned air vehicle dn), and thereby accurately grasp the model of the unmanned air vehicle dn and the surrounding situation in which the unmanned air vehicle dn exists.
  • FIG. 18 is a diagram illustrating a schematic example of a configuration of an unmanned air vehicle detection system 5C according to Modification 3 of the first embodiment.
  • In the first embodiment, the camera that captures an image in response to a request from the monitoring apparatus 10 is the PTZ camera CZ. In Modification 3 (hereinafter referred to as "Modification 3") of the first embodiment, there are instead a plurality of fixed cameras CF2-1, CF2-2, ..., CF2-N (N is an arbitrary number). Each of the plurality of fixed cameras CF2-1, CF2-2, ..., CF2-N has its optical axis fixed so that the unmanned air vehicle dn appearing in the omnidirectional image GZ3 captured by the omnidirectional camera CA can be captured reliably.
  • FIG. 19 is a sequence diagram illustrating an example of an unmanned air vehicle detection operation of the unmanned air vehicle detection system 5C according to the third modification of the first embodiment.
  • the same steps as those in the sequence diagram (see FIG. 9) in the first embodiment are denoted by the same reference numerals, and the description thereof is omitted.
  • The operation of the PTZ camera CZ shown in FIG. 9 may be read as that of the plurality of fixed cameras CF2-1, CF2-2, ..., CF2-N.
  • In FIG. 19, image data is shown as being transmitted to the monitoring apparatus 10 only from the fixed camera CF2-N in procedure T7, but each of the fixed cameras CF2-1, CF2-2, ..., CF2-N transmits its captured image data to the monitoring device 10.
  • In step T16, the monitoring device 10 superimposes and displays the identification mark mk representing the unmanned air vehicle dn on the omnidirectional image GZ1 displayed on the screen of the monitor 50, and then selects, for example, the fixed camera CF2-1, because the position indicated by the identification mark mk is included in the angle of view captured by the fixed camera CF2-1 among the plurality of fixed cameras CF2-1, CF2-2, ..., CF2-N.
  • the monitoring apparatus 10 selects the fixed camera CF2-1 (T19), and makes an image distribution request to the selected fixed camera CF2-1 (T20).
  • the fixed camera CF2-1 transmits the image data of the captured image in the fixed optical axis direction to the monitoring device 10 (T21).
  • Note that each of the fixed cameras CF2-1, CF2-2, ..., CF2-N has started imaging processing in procedure T2 in accordance with the image transmission request transmitted from the monitoring apparatus 10 in procedure T1, and continues imaging.
  • When the monitoring device 10 receives the image data captured by the fixed camera CF2-1, the monitoring device 10 displays the image data on the monitor 50 (T22). An image showing the unmanned air vehicle dn is displayed on the screen of the monitor 50 (see the lower right side of the drawing in FIG. 18).
  • the unmanned air vehicle detection system 5C further includes two or more fixed cameras CF2-1,..., CF2-N that have different optical axis directions and image the imaging area.
  • the signal processing unit 33 selects a fixed camera having the detection direction of the unmanned air vehicle dn as the optical axis direction from two or more fixed cameras, and requests the selected fixed camera to distribute the captured image.
  • the monitor 50 displays the captured image distributed from the fixed camera selected based on this request.
  • Thereby, simply by selecting a fixed camera whose imaging direction (optical axis direction) is fixed in advance, an image that reliably captures the unmanned air vehicle dn is obtained and displayed on the monitor 50. A sketch of the selection step follows.
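  • One way to realize the selection — not given in this form by the embodiment — is to pick the fixed camera whose fixed optical axis is angularly closest to the detection direction; the camera table below is an assumed data structure.

```python
import math

def angular_distance(dir_a, dir_b):
    """Great-circle angle between two (azimuth_deg, elevation_deg) directions."""
    az1, el1 = (math.radians(v) for v in dir_a)
    az2, el2 = (math.radians(v) for v in dir_b)
    c = (math.sin(el1) * math.sin(el2)
         + math.cos(el1) * math.cos(el2) * math.cos(az1 - az2))
    return math.acos(max(-1.0, min(1.0, c)))

def select_fixed_camera(detection_dir, camera_axes):
    """Return the id of the fixed camera whose optical axis best matches
    the detection direction of the unmanned air vehicle.

    camera_axes -- dict: camera id -> fixed (azimuth_deg, elevation_deg) axis
    """
    return min(camera_axes,
               key=lambda cid: angular_distance(camera_axes[cid], detection_dir))

# Example (hypothetical axes):
# select_fixed_camera((246.8, 72.1),
#     {"CF2-1": (240.0, 70.0), "CF2-2": (90.0, 45.0)})  ->  "CF2-1"
```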
  • FIG. 20 is a diagram illustrating a schematic example of the configuration of an unmanned air vehicle detection system 5D according to Modification 4 of the first embodiment.
  • In the unmanned aerial vehicle detection system 5D of Modification 4 (hereinafter referred to as "Modification 4") of the first embodiment, the cameras that capture images in response to a request from the monitoring device 10 are a plurality of fixed cameras, as in Modification 3. Whereas in the first embodiment the camera arranged coaxially with the microphone array MA was the omnidirectional camera CA, in Modification 4 the camera arranged so that its optical axis coincides with the central axis of the microphone array MA is a fixed camera CF. That is, the unmanned air vehicle detection system 5D according to Modification 4 has a configuration corresponding to the combination of Modification 1 and Modification 3.
  • In the unmanned aerial vehicle detection system 5D of Modification 4, only the omnidirectional camera CA in the sequence diagram shown in FIG. 19 of Modification 3 is changed to the fixed camera CF, and the detection operation of the unmanned air vehicle dn is performed according to the same sequence as in Modification 3.
  • According to Modification 4, the detection of the unmanned air vehicle dn may be performed only for a limited area where the unmanned air vehicle dn is expected to fly, so the load of the detection process of the unmanned air vehicle dn can be reduced and the detection process can be speeded up.
  • FIG. 21 is a diagram illustrating a schematic example of the configuration of an unmanned air vehicle detection system 5E according to Modification 5 of the first embodiment.
  • In the unmanned air vehicle detection system 5E of Modification 5 (hereinafter referred to as "Modification 5") of the first embodiment, the cameras that capture images in response to a request from the monitoring device 10 are a plurality of fixed cameras CF2-1, CF2-2, ..., CF2-n (n is an arbitrary number), as in Modifications 3 and 4.
  • Whereas in the first embodiment the camera arranged coaxially with the microphone array MA was the omnidirectional camera CA, and in Modification 1 it was the fixed camera CF, in Modification 5 the camera arranged so that the central axis of the microphone array MA coincides with its own optical axis is the PTZ camera CZ1. That is, the unmanned aerial vehicle detection system 5E of Modification 5 has a configuration corresponding to the combination of Modification 2 and Modification 3.
  • In the unmanned air vehicle detection system 5E, only the omnidirectional camera CA in the sequence diagram shown in FIG. 19 is changed to the PTZ camera CZ1, and the detection operation of the unmanned air vehicle dn is performed according to the same sequence as in Modification 3.
  • According to Modification 5, imaging with the PTZ camera CZ1 improves the visibility of the unmanned air vehicle dn in the captured image. That is, in an image captured by the omnidirectional camera CA, distortion at the periphery of the image may make it difficult to accurately identify the unmanned air vehicle dn, whereas in an image captured by the PTZ camera CZ1 the outline of the unmanned air vehicle dn can be captured correctly. Thereby, the detection accuracy of the unmanned air vehicle dn is improved. Furthermore, the number of fixed cameras that image a limited area at different angles of view in different imaging directions can be reduced. Thereby, an unmanned air vehicle detection system that can detect an unmanned air vehicle accurately can be built at low cost.
  • FIG. 22 is a diagram illustrating an example of a schematic configuration of the unmanned air vehicle detection system 5F according to the second embodiment.
  • the same components as those of the unmanned air vehicle detection system 5 of the first embodiment are denoted by the same reference numerals, and the description thereof is omitted.
  • the unmanned air vehicle detection system 5F of the second embodiment includes, for example, two sound source detection units UD1 and UD2 having the same configuration as the sound source detection unit UD shown in FIG.
  • The unmanned aerial vehicle detection system 5F measures the distance from either one or both of the two sound source detection units to the unmanned air vehicle dn by triangulation, using information on the direction of the unmanned air vehicle dn detected by each of the sound source detection units UD1 and UD2 (that is, the direction from each of the sound source detection units UD1 and UD2 toward the unmanned air vehicle dn; hereinafter, "the detection direction of the unmanned air vehicle dn").
  • the unmanned air vehicle detection system 5F includes a detection device DT1, a detection device DT2, and a distance calculation device 90.
  • the detection device DT1 includes a sound source detection unit UD1, a monitoring device 10A, and a monitor 50A, and performs the same operation as the unmanned air vehicle detection systems 5 to 5E of the first embodiment. Since the internal configuration of the monitoring device 10A is the same as the internal configuration of the monitoring device 10 of the first embodiment, the description of the internal configuration of the monitoring device 10A is omitted.
  • The monitoring device 10A and the monitor 50A are integrated in a general-purpose computer device, but may be devices having separate housings.
  • the detection device DT2 includes a sound source detection unit UD2, a monitoring device 10B, and a monitor 50B, and performs the same operation as the unmanned air vehicle detection systems 5 to 5E of the first embodiment. Since the internal configuration of the monitoring device 10B is the same as the internal configuration of the monitoring device 10 of the first embodiment, the description of the internal configuration of the monitoring device 10B is omitted.
  • the distance calculation device 90 is a general-purpose computer device, and is based on detection information including the detection direction of the unmanned air vehicle dn from the detection device DT1 and detection information including the detection direction of the unmanned air vehicle dn from the detection device DT2. The distance from either or both of the detection devices DT1 and DT2 to the unmanned air vehicle dn is calculated.
  • FIG. 23 is a block diagram showing an example of the internal configuration of the distance calculation device 90 in detail.
  • the distance calculation device 90 includes a distance calculation processing unit 91, a memory 92, an operation unit 93, a setting management unit 94, a communication unit 95, and a display 96.
  • the distance calculation processing unit 91 is configured using a processor such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
  • The distance calculation processing unit 91 performs triangulation using the two detection directions of the unmanned air vehicle dn included in the detection information received from the detection devices DT1 and DT2 and the known constant distance between the sound source detection units UD1 and UD2, and calculates the distance from either or both of the sound source detection units UD1 and UD2 to the unmanned air vehicle dn.
  • the memory 92 stores a program for calculating the distance from one or both of the sound source detection units UD1 and UD2 to the unmanned air vehicle dn using the triangulation technique.
  • the operation unit 93 is an input device such as a mouse or a keyboard.
  • the operation unit 93 and the display 96 may be integrally configured with a touch panel.
  • the communication unit 95 receives the detection information from the two monitoring devices 10A and 10B, and transmits the distance information calculated by the distance calculation processing unit 91 to the monitoring devices 10A and 10B based on these detection information.
  • the display 96 displays a UI screen GM (see FIG. 28) or the like that represents the measurement result of the distance to the unmanned air vehicle dn.
  • The display 96 may be accommodated in the same housing as the distance calculation device 90.
  • The setting management unit 94 holds the information necessary for calculating, by triangulation, the distance from either or both of the sound source detection units UD1 and UD2 to the unmanned air vehicle dn (for example, the known constant distance between the sound source detection units UD1 and UD2).
  • The distance between the sound source detection units UD1 and UD2 is stored by, for example, a user performing surveying in advance and inputting the result via the operation unit 93. Note that the distance between the sound source detection units UD1 and UD2 may be automatically measured by the detection devices DT1 and DT2.
  • In that case, the distance calculation device 90 can calculate and store the distance between the sound source detection units UD1 and UD2.
  • In FIG. 22, the distance calculation device 90, which measures the distance from either or both of the sound source detection units UD1 and UD2 to the unmanned air vehicle dn, is provided separately from the detection devices DT1 and DT2; however, its function may be realized by either one of the monitoring devices 10A and 10B.
  • FIG. 24 is a flowchart showing an example of a detection operation procedure in the detection devices DT1 and DT2.
  • Each microphone array MA of the sound source detection units UD1 and UD2 collects sound including sound in the imaging area (see the first embodiment) and transmits sound data obtained by the sound collection to the monitoring devices 10A and 10B. .
  • monitoring device 10A, 10B receives the sound data transmitted from each microphone array MA, and inputs it into each signal processing part 33 (S51).
  • Each signal processing unit 33 of the monitoring devices 10A and 10B performs detection processing of the unmanned air vehicle dn by sound source search (S52).
  • the detection processing in step S52 is the same as the processing in steps S23 to S26 of FIG. 10 in the first embodiment.
  • Each signal processing unit 33 of the monitoring devices 10A and 10B determines whether or not the unmanned air vehicle dn has been detected (S53). When the unmanned air vehicle dn is not detected (S53, NO), the processing of each signal processing unit 33 returns to step S51.
  • When the unmanned air vehicle dn is detected (S53, YES), each signal processing unit 33 calculates, as the detection direction to the unmanned air vehicle dn, the directivity direction formed from the microphone array MA toward the detected unmanned air vehicle dn (S54).
  • Each communication unit 31 of the monitoring devices 10A and 10B transmits the detection information containing the detection direction calculated in step S54 to the distance calculation device 90 (S55). Thereafter, the processing of the detection devices DT1 and DT2 returns to step S51. A sketch of how such a detection direction could be obtained is given below.
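  • The patent does not give the beamforming in code; the following is a minimal delay-and-sum sketch of one way to obtain the detection direction in step S54. The candidate direction grid, the array geometry, and the sign convention for the steering delays (direction pointing from the array toward the source) are assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, assumed

def delay_and_sum_power(signals, mic_xyz, direction, fs):
    """Output power of a delay-and-sum beamformer steered toward `direction`.

    signals   -- (n_mics, n_samples) synchronized microphone samples
    mic_xyz   -- (n_mics, 3) microphone positions in meters
    direction -- unit 3-vector from the array toward the steering direction
    fs        -- sampling rate in Hz
    """
    arrivals = mic_xyz @ direction / SPEED_OF_SOUND        # early-arrival lead [s]
    shifts = np.round((arrivals.max() - arrivals) * fs).astype(int)
    usable = signals.shape[1] - int(shifts.max())
    aligned = sum(sig[s:s + usable] for sig, s in zip(signals, shifts))
    return float(np.mean(aligned ** 2))

def detection_direction(signals, mic_xyz, candidate_dirs, fs):
    """Return the candidate steering direction with maximum beamformed power."""
    return max(candidate_dirs,
               key=lambda d: delay_and_sum_power(signals, mic_xyz,
                                                 np.asarray(d, dtype=float), fs))
```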
  • FIG. 25 is a flowchart showing an example of a distance calculation procedure in the distance calculation device 90.
  • the communication unit 95 of the distance calculation device 90 determines whether or not the detection information from the detection device DT1 has been received (S61). When the detection information has not been received (S61, NO), the processing of the communication unit 95 returns to step S61.
  • the communication unit 95 determines whether or not the detection information from the detection device DT2 has been received (S62). When the detection information has not been received (S62, NO), the processing of the communication unit 95 returns to step S61.
  • the presence / absence of reception of the detection information is determined in the order of the detection devices DT1 and DT2, but may be performed in the reverse order. In other words, it may be executed in the order of step S62 ⁇ step S61.
  • Next, based on the detection information received in steps S61 and S62, the distance calculation processing unit 91 uses triangulation to calculate the distance from either or both of the sound source detection units UD1 and UD2 to the detection target (the unmanned air vehicle dn) (S63). Details of the calculation of this distance are described later.
  • the distance calculation processing unit 91 displays the UI screen GM (see FIG. 28) including the calculation result of the distance to the unmanned air vehicle dn on the display 96 (S64). Thereafter, the processing of the distance calculation device 90 returns to step S61.
  • FIG. 26 is an explanatory diagram of an example of a method for obtaining the distance from either or both of the sound source detection units UD1 and UD2 to the unmanned air vehicle dn using the triangulation technique.
  • the positions of the sound source detection units UD1 and UD2 are TM1 and TM2, respectively.
  • Let Dp be the position of the unmanned air vehicle dn.
  • the distance from the sound source detection unit UD1 to the unmanned air vehicle dn is defined as distR.
  • the distance from the sound source detection unit UD2 to the unmanned air vehicle dn is set to distL.
  • the length of the perpendicular from the position Dp of the unmanned air vehicle dn to the line segment between the position TM1 of the sound source detection unit UD1 and the position TM2 of the sound source detection unit UD2 is defined as dist.
  • FIG. 27 is an explanatory diagram of a calculation example of the position Dp of the unmanned air vehicle dn.
  • Assume a square plane lying in the xy plane whose side length is m, where m is the known length of the line segment between the sound source detection units UD1 and UD2.
  • a position Dp of the unmanned air vehicle dn flying in the air is represented by coordinates (x, y, z).
  • When the sound source detection unit UD1 is arranged at the midpoint of the side (length m) extending in the y-axis direction from the origin (0, 0, 0), which is one vertex of the square plane described above, the position TM1 of the sound source detection unit UD1 is represented by the coordinates (0, m/2, 0).
  • Similarly, when the sound source detection unit UD2 is arranged at the midpoint of the side (length m) extending in the y-axis direction from the vertex (m, 0, 0) of the square plane described above, the position TM2 of the sound source detection unit UD2 is represented by the coordinates (m, m/2, 0).
  • the length (constant distance) of the line segment between the sound source detection units UD1 and UD2 is m.
  • the detection direction of the detection information of the unmanned air vehicle dn detected by the sound source detection unit UD1 is indicated by the vertical angle v1 and the horizontal angle h1.
  • the detection direction of the detection information of the unmanned air vehicle dn detected by the sound source detection unit UD2 is indicated by the vertical angle v2 and the horizontal angle h2.
  • Let the length dist of the perpendicular from the position Dp of the unmanned air vehicle dn to the line segment between the sound source detection units UD1 and UD2 be a radius r, and let the foot of this perpendicular be represented by the coordinates (x, m/2, 0). The position Dp of the unmanned air vehicle dn then lies at the horizontal angle h2 on the circle CRC of radius r centered on the foot of this perpendicular, and the coordinates (x, y, z) of the position Dp are expressed by Equation (1), Equation (2), and Equation (3), respectively.
  • Using Equations (1) to (3), the distances distR and distL from the sound source detection units UD1 and UD2 to the unmanned air vehicle dn are calculated simply from the detection directions and the constant distance between the sound source detection units UD1 and UD2. This calculation is performed by the distance calculation processing unit 91.
  • In other words, for the calculation it suffices that each of the sound source detection units UD1 and UD2 detects the unmanned air vehicle dn, and that the detection directions (vertical angle v1, horizontal angle h1) and (vertical angle v2, horizontal angle h2) as well as the constant distance m between the sound source detection units UD1 and UD2 are known. A sketch of such a triangulation follows.
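  • Since Equations (1) to (3) are referred to above but not reproduced here, the sketch below uses one standard reconstruction consistent with the stated geometry (UD1 at (0, m/2, 0), UD2 at (m, m/2, 0), horizontal angles measured from the line segment joining the units, vertical angles as elevation). It is an assumption, not necessarily the patent's exact equations.

```python
import math

def triangulate(m, v1_deg, h1_deg, v2_deg, h2_deg):
    """Estimate Dp = (x, y, z) and the distances distR, distL.

    m        -- known constant distance between UD1 and UD2 [m]
    (v1, h1) -- vertical/horizontal detection angles from UD1 [deg]
    (v2, h2) -- vertical/horizontal detection angles from UD2 [deg]
    """
    t1 = math.tan(math.radians(h1_deg))
    t2 = math.tan(math.radians(h2_deg))
    x = m * t2 / (t1 + t2)              # intersection of the two horizontal bearings
    y = m / 2 + x * t1                  # lateral offset from the baseline
    d1 = math.hypot(x, y - m / 2)       # horizontal range from UD1
    d2 = math.hypot(m - x, y - m / 2)   # horizontal range from UD2
    # The two height estimates agree for ideal measurements; average them.
    z = (d1 * math.tan(math.radians(v1_deg)) +
         d2 * math.tan(math.radians(v2_deg))) / 2
    dist_r = math.hypot(d1, z)          # distR: UD1 -> unmanned air vehicle dn
    dist_l = math.hypot(d2, z)          # distL: UD2 -> unmanned air vehicle dn
    return (x, y, z), dist_r, dist_l
```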
  • FIG. 28 is a diagram illustrating an example of a UI screen GM including a distance from one or both of the sound source detection units UD1 and UD2 to the unmanned air vehicle dn.
  • This UI screen GM is displayed on the display 96 of the distance calculation device 90.
  • The UI screen GM includes a schematic diagram showing the positional relationship among the unmanned air vehicle dn, the sound source detection unit UD1 (first unit), and the sound source detection unit UD2 (second unit); the distance from either or both of the sound source detection units UD1 and UD2 to the unmanned air vehicle dn; the distance (constant distance) between the sound source detection units UD1 and UD2; the direction information obtained by the sound source detection unit UD1 (first direction information); and the direction information obtained by the sound source detection unit UD2 (second direction information).
  • Thereby, the user can confirm the numerical values on which the calculated distance to the unmanned air vehicle dn is based.
  • The UI screen GM displays a distance L1 and an elevation angle θ1 from the sound source detection unit UD1 to the unmanned air vehicle dn, and a distance L2 and an elevation angle θ2 from the sound source detection unit UD2 to the unmanned air vehicle dn.
  • a button bn1 for setting the distance between the two sound source detection units UD1 and UD2 is displayed.
  • the distance between the two sound source detection units UD1 and UD2 is set to 2 meters (M) by pressing the button bn1.
  • a screen wd1 including detection information of the sound source detection unit UD1 and a screen wd2 including detection information of the sound source detection unit UD2 are displayed.
  • the screen wd1 displays the IP address of the sound source detection unit UD1, the connection / disconnection switch SW1, and the data table Td1.
  • a report ID, a vertical angle, and a horizontal angle are registered.
  • The entries "vertical angle: 72.1" and "horizontal angle: 246.8" represent the detection direction det1 of the sound source detection unit UD1.
  • the IP address of the sound source detection unit UD2, the connection / disconnection switch SW2, and the data table Td2 are displayed on the screen wd2.
  • a report ID, a vertical angle, a horizontal angle, and a discovery category are registered.
  • The entries "vertical angle: 71.05" and "horizontal angle: 240.26" represent the detection direction det2 of the sound source detection unit UD2.
  • a pull-down menu bn2 for setting the distance between the two sound source detection units UD1, UD2, an environment setting read button bn3, a setting button bn4, and a default save button bn5 are arranged.
  • As described above, in the unmanned air vehicle detection system 5F of the second embodiment, the sound source detection unit UD1 (first detection unit), in which an omnidirectional camera and a microphone array are coaxially arranged, and the sound source detection unit UD2 (second detection unit), in which an omnidirectional camera and a microphone array are likewise coaxially arranged, are disposed a constant distance apart. The distance calculation device 90 derives the distance to the unmanned air vehicle dn based on this constant distance, first direction information including the detection direction det1 (vertical angle v1, horizontal angle h1) of the unmanned air vehicle dn derived by the omnidirectional camera of the sound source detection unit UD1, and second direction information including the detection direction det2 (vertical angle v2, horizontal angle h2) of the unmanned air vehicle dn derived by the omnidirectional camera of the sound source detection unit UD2, and displays the result on the display 96 (second display unit).
  • According to the unmanned air vehicle detection system 5F of the second embodiment, it is possible not only to display the flying unmanned air vehicle dn on the display 96 but also to obtain the actual distance to the unmanned air vehicle dn. As a result, it is possible to roughly grasp where the unmanned air vehicle dn is and how long it will take to reach it, which helps in preparing countermeasures against the unmanned air vehicle dn.
  • In the third embodiment, an example will be described in which a plurality of the sound source detection units UD shown in FIG. 1 are used to estimate the position in the monitoring area where the unmanned air vehicle dn exists.
  • Although a water purification plant is illustrated and described as the monitoring area of the present embodiment, the monitoring area is not limited to a water purification plant.
  • A water purification plant is a facility that takes in water from rivers and the like as well as groundwater, treats it by filtration, sterilization, and so on, and supplies water suitable for drinking to the waterworks.
  • The unmanned air vehicle detection system of the present embodiment assumes a scenario in which the unmanned air vehicle dn flies over the water purification plant and sprays poison or the like, and estimates the block (section) of the water purification plant above which the unmanned air vehicle dn exists.
  • A water purification plant has many storage tanks used for water purification treatment, and each storage tank is divided into blocks (sections). In the present embodiment, a storage tank divided into a plurality of blocks (sections) in a grid pattern is assumed, and it is estimated above which block the unmanned air vehicle dn flying over the water purification plant exists.
  • FIG. 29 is a flowchart showing an example of a section estimation procedure in the monitoring area where the unmanned air vehicle dn exists in the third embodiment.
  • As in FIG. 22, the distance calculation device 90 is connected to the detection devices DT3, DT4, DT5, and DT6 provided corresponding to the sound source detection units UD3, UD4, UD5, and UD6, and information or data communication is possible between them.
  • illustration of the respective monitoring devices and monitors constituting the detection devices DT3, DT4, DT5, and DT6 is omitted.
  • the communication unit 95 of the distance calculation device 90 determines whether or not the detection information from the detection device DT3 has been received (S71). When the detection information has not been received (S71, NO), the processing of the communication unit 95 returns to step S71.
  • the communication unit 95 determines whether or not the detection information from the detection device DT4 has been received (S72). When the detection information has not been received (S72, NO), the processing of the communication unit 95 returns to step S71.
  • the communication unit 95 determines whether or not the detection information from the detection device DT5 has been received (S73). When the detection information has not been received (S73, NO), the processing of the communication unit 95 returns to step S71.
  • the communication unit 95 determines whether or not the detection information from the detection device DT6 has been received (S74). When the detection information has not been received (S74, NO), the processing of the communication unit 95 returns to step S71. In FIG. 29, the presence / absence of reception of detection information is determined in the order of the detection devices DT3, DT4, DT5, and DT6, but it is sufficient that reception of detection information from the four detection devices is confirmed. The order of the four processes in step S74 is not limited.
  • The distance calculation processing unit 91 of the distance calculation device 90 calculates the distance from each of the sound source detection units UD3, UD4, UD5, and UD6 to the unmanned air vehicle dn (S75).
  • Based on the distances from the sound source detection units UD3, UD4, UD5, and UD6 to the unmanned air vehicle dn, the distance calculation processing unit 91 estimates the block Blk in the water purification plant Upw (see FIG. 30) above which the unmanned air vehicle dn exists (S76). In step S76, it is assumed that the map information (position information) of the storage tank divided into a plurality of blocks (sections) in a grid pattern is known and held by the distance calculation device 90. That is, the distance calculation processing unit 91 estimates the block Blk by comparing the position of the unmanned air vehicle dn, specified from the distances from the sound source detection units UD3, UD4, UD5, and UD6 to the unmanned air vehicle dn, with the map information of the storage tank described above (S76).
  • The distance calculation processing unit 91 displays the estimated block Blk (in other words, the block Blk in the water purification plant Upw above which the unmanned air vehicle dn exists) on the display 96 (S77). A sketch of the block estimation is given below.
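  • The following is a minimal sketch of mapping the estimated position onto the grid of blocks. The grid origin, block size, and (col, row) indexing are assumptions, since the embodiment only states that the map information is held by the distance calculation device 90.

```python
def estimate_block(position_xy, origin_xy, block_size, n_cols, n_rows):
    """Map an estimated position of the unmanned air vehicle to a grid block.

    position_xy -- estimated (x, y) of the unmanned air vehicle [m]
    origin_xy   -- (x, y) of the storage tank's reference corner [m]
    block_size  -- (width, height) of one block (section) [m]
    Returns (col, row) of the block Blk, or None when outside the grid.
    """
    col = int((position_xy[0] - origin_xy[0]) // block_size[0])
    row = int((position_xy[1] - origin_xy[1]) // block_size[1])
    if 0 <= col < n_cols and 0 <= row < n_rows:
        return col, row
    return None
```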
  • FIG. 30 is a diagram illustrating a screen display example of the display 96 on which the water purification plant Upw including the block Blk in which the unmanned air vehicle dn exists in the sky is displayed. The block Blk in which the unmanned air vehicle dn exists in the sky is displayed so as to be distinguishable from other blocks.
  • the four sound source detection units UD3, UD4, UD5, and UD6 are arranged so as to surround the water purification plant Upw (monitoring area).
  • The distance calculation device 90 estimates the block Blk (a section in the monitoring area) of the water purification plant Upw above which the unmanned air vehicle dn exists, based on the distances to the unmanned air vehicle dn derived respectively by the plurality of units.
  • Thereby, the block (section in the monitoring area) over which the unmanned air vehicle dn has arrived can easily be specified.
  • Appropriate measures can then be taken for the block above which the unmanned air vehicle dn is estimated to exist. For example, if the estimated block is empty, one possible measure is to bring the unmanned air vehicle dn down on the spot.
  • If the estimated block holds a large amount of water that has already been treated by filtration, sterilization, and so on, it is possible to take measures such as capturing the unmanned air vehicle dn so as to cover it.
  • In the present embodiment, the distance calculation processing unit 91 of the distance calculation device 90 calculates the distances from the sound source detection units UD3, UD4, UD5, and UD6 to the unmanned air vehicle dn. From these distances, the distance calculation device 90 can acquire and specify the position of the unmanned air vehicle dn, and can therefore also estimate the block inside the facility above which the unmanned air vehicle dn exists.
  • In the present embodiment, the water purification plant serving as the monitoring area is assumed to be rectangular, and four sound source detection units are arranged at its four corners. When the monitoring area is circular, as with a sports facility, a plurality of sound source detection units may be arranged at arbitrary places enclosing the circle to estimate the section where the unmanned air vehicle dn exists.
  • The four sound source detection units UD3, UD4, UD5, and UD6 at the four corners may incorporate GPS (Global Positioning System) measurement devices Gp3, Gp4, Gp5, and Gp6 for measuring position information, including the latitude and longitude, of their respective installation locations.
  • It suffices for the GPS measurement devices Gp3, Gp4, Gp5, and Gp6 to be incorporated in at least one of the omnidirectional camera, the microphone array, and the PTZ camera that constitute the sound source detection units UD3, UD4, UD5, and UD6.
  • the GPS measuring devices Gp3, Gp4, Gp5, and Gp6 are not limited to being incorporated in the sound source detection units UD3, UD4, UD5, and UD6, and may be connected externally.
  • the GPS measurement devices Gp3, Gp4, Gp5, and Gp6 may be connected to the corresponding sound source detection units UD3, UD4, UD5, and UD6 by wire (for example, transmission cables) or may be connected wirelessly.
  • The GPS measurement devices Gp3, Gp4, Gp5, and Gp6 measure their respective position information when the corresponding sound source detection units UD3, UD4, UD5, and UD6 are installed. Each monitoring device constituting the detection devices DT3, DT4, DT5, and DT6 sends the position information measured by the GPS measurement devices Gp3, Gp4, Gp5, and Gp6 to the distance calculation device 90 (see FIG. 22).
  • Thereby, the distance calculation device 90 can calculate each of the distances between the sound source detection units UD3, UD4, UD5, and UD6 using the information from the detection devices DT3, DT4, DT5, and DT6 (for example, the position information of the sound source detection units UD3, UD4, UD5, and UD6).
  • The distance calculation device 90 can also specify the absolute position (latitude and longitude) of the unmanned air vehicle dn using the information or data from the detection devices DT3, DT4, DT5, and DT6 (for example, the detection directions of the unmanned air vehicle dn and the position information of the sound source detection units UD3, UD4, UD5, and UD6) and the distances between the sound source detection units UD3, UD4, UD5, and UD6. That is, the measurement of the installation positions can be automated by incorporating the GPS measurement devices Gp3, Gp4, Gp5, and Gp6 into the sound source detection units UD3, UD4, UD5, and UD6, respectively.
  • Using the position information (that is, the absolute positions) of the sound source detection units UD3, UD4, UD5, and UD6 and the distances between them, it is thus also possible to specify the absolute position (latitude and longitude) of the unmanned air vehicle dn. A sketch of deriving the inter-unit distance from GPS fixes is given below.
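  • For example, the known constant distance between two units can be derived from their GPS fixes with the standard haversine formula, as sketched below (a spherical-earth approximation, adequate for baselines of the scale of a facility).

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean earth radius, spherical approximation

def baseline_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes (degrees), in meters.

    Usable for the inter-unit constant distance from the latitude/longitude
    reported by the GPS measurement devices Gp3 to Gp6.
    """
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
```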
  • FIGS. 31A to 31C are explanatory diagrams of correct / incorrect determination of the detection of the unmanned air vehicle dn by the two sound source detection units UD1 and UD2 in the fourth embodiment.
  • FIG. 31A shows a case where the detections by the two sound source detection units UD1, UD2 are in the same direction and are aligned.
  • Here, "the same direction" means, for example, that the respective view angles indicating the directions toward the unmanned air vehicle dn, as viewed from the two sound source detection units UD1 and UD2 with respect to the hemispherical monitoring area 8 of FIG. 11, are the same or substantially the same.
  • In this case, each of the monitoring devices 10A and 10B has a notification unit (not shown) and reports the determination result through this notification unit.
  • The notification unit may be, for example, a speaker, a display, or a lamp such as an LED (Light Emitting Diode). In the case of a speaker, the notification unit outputs a predetermined sound; in the case of a display, it shows a predetermined message; and in the case of a lamp, it blinks in a predetermined blinking pattern.
  • Since the lens axes (optical axes) of the omnidirectional cameras CA of the sound source detection units UD1 and UD2 are parallel and separated by a certain distance, when the detection is correct the identification marks mk1 and mk2 representing the unmanned air vehicle dn, displayed on the respective omnidirectional images GZ11 and GZ12, point inward toward each other.
  • FIG. 31B shows a case where only one of the sound source detection units (for example, the sound source detection unit UD2) erroneously detects, or fails to detect, the unmanned air vehicle dn.
  • FIG. 31C shows a case where detection by the two sound source detection units UD1 and UD2 is in different directions and is erroneous detection.
  • Since the lens axes (optical axes) of the omnidirectional cameras CA of the sound source detection units UD1 and UD2 are separated by a certain distance, in the case of erroneous detection the identification marks representing the unmanned air vehicle dn, displayed in the respective omnidirectional images GZ11 and GZ12, lean toward the same side.
  • FIG. 32 is a flowchart showing an example of a determination procedure for detecting the unmanned air vehicle dn by two sound source detection units.
  • the communication unit 95 of the distance calculation device 90 determines whether or not the detection information from the detection device DT1 has been received (S81). When the detection information has not been received (S81, NO), the processing of the communication unit 95 returns to step S81.
  • the communication unit 95 determines whether or not the detection information from the detection device DT2 has been received (S82). When the detection information is not received (S82, NO), the process of the communication unit 95 returns to step S81.
  • the presence / absence of reception of the detection information is determined in the order of the detection devices DT1 and DT2, but may be performed in the reverse order. In other words, it may be executed in the order of step S82 ⁇ step S81.
  • the distance calculation processing unit 91 superimposes and displays the identification mark mk representing the unmanned air vehicle dn on the omnidirectional images GZ11 and GZ12 based on the received detection information (S83).
  • the distance calculation processing unit 91 determines the positions of the two identification marks mk1 and mk2 displayed superimposed on the two omnidirectional images GZ11 and GZ12 (S84).
  • the distance calculation processing unit 91 determines whether or not the directions of the two identification marks mk1 and mk2 are aligned (S85). When the directions of the two identification marks mk1 and mk2 are aligned, the distance calculation processing unit 91 displays an alarm on the display 96 and issues a notification that the unmanned air vehicle dn is correctly detected (S86). The distance calculation device 90 may issue an alarm by voice from a speaker (not shown). Thereafter, the distance calculation processing unit 91 returns to the process of step S81.
  • When the directions of the two identification marks mk1 and mk2 are not aligned (S85, NO), the distance calculation processing unit 91 regards the detection of the unmanned air vehicle dn as erroneous, reports nothing, and returns to the process of step S81.
  • Here, an alarm is issued when the detection is correct and nothing is issued when the detection is erroneous; alternatively, a correct-detection report may be issued when the detection is correct and a false-detection report when it is erroneous.
  • As described above, in the fourth embodiment, the directions of the sound detected when the sound source detection unit UD1 (first detection unit) and the sound source detection unit UD2 (second detection unit) detect the unmanned air vehicle dn are compared, and the determination result is reported.
  • According to the unmanned air vehicle detection system of the fourth embodiment, it is possible to easily determine whether the detection of the unmanned air vehicle dn by the two sound source detection units UD1 and UD2 is correct or erroneous. Therefore, even if there are many sound sources in the sky, the presence or absence of the unmanned air vehicle dn can be easily determined. A sketch of such a consistency check is given below.
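  • One geometric way to implement the correct/erroneous determination — an assumption, since the embodiment describes it in terms of the on-screen identification marks — is to test whether the two detection rays nearly intersect in front of both units.

```python
import numpy as np

def detections_consistent(p1, d1, p2, d2, max_gap_m=5.0):
    """Judge whether two detection rays point at the same airborne target.

    p1, p2 -- positions of the sound source detection units (3-vectors)
    d1, d2 -- unit direction vectors built from each unit's detection angles
    The detections are taken as correct when the rays' closest approach is
    small and both closest points lie in front of the units (t, s >= 0).
    """
    p1, d1, p2, d2 = (np.asarray(v, dtype=float) for v in (p1, d1, p2, d2))
    w = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b                 # ~0 when the rays are parallel
    if abs(denom) < 1e-9:
        return False
    t = (b * e - c * d) / denom           # closest-point parameter on ray 1
    s = (a * e - b * d) / denom           # closest-point parameter on ray 2
    if t < 0 or s < 0:                    # 'intersection' behind a unit
        return False
    gap = np.linalg.norm((p1 + t * d1) - (p2 + s * d2))
    return gap <= max_gap_m
```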
  • the present invention is useful as an unmanned air vehicle detection system and an unmanned air vehicle detection method that can easily determine the presence and position of an unmanned air vehicle from a captured image when detecting an unmanned air vehicle.
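The sketches below are illustrative only and are not taken from the patent. First, a minimal Python sketch of the kind of mapping that step S83 requires: converting a detected sound direction into pixel coordinates of an upward-facing omnidirectional image so that the identification mark mk can be drawn there. The equidistant fisheye model, the function name, and the image size are assumptions.

    import math

    def direction_to_pixel(azimuth_deg: float, elevation_deg: float,
                           image_size: int = 1024):
        """Map a sound direction to (x, y) pixels in an upward-facing
        omnidirectional image, assuming an equidistant fisheye projection."""
        cx = cy = image_size / 2.0    # optical axis (zenith) at image center
        max_r = image_size / 2.0      # horizon (90 deg off-axis) at image edge
        theta = 90.0 - elevation_deg  # angle away from the upward optical axis
        r = max_r * theta / 90.0      # equidistant model: radius ~ off-axis angle
        az = math.radians(azimuth_deg)
        return cx + r * math.cos(az), cy - r * math.sin(az)

    if __name__ == "__main__":
        # Mark position for a source at azimuth 45 deg, 30 deg above the horizon.
        print(direction_to_pixel(45.0, 30.0))

A real omnidirectional camera would use its calibrated projection model rather than the idealized equidistant one assumed here.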
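Next, a sketch of the decision in steps S84 through S86: compare the directions reported by the two units and issue an alarm only when they coincide. The DetectionInfo fields and the 10-degree tolerance are assumptions made for illustration.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class DetectionInfo:
        azimuth_deg: float    # horizontal direction of the detected sound
        elevation_deg: float  # vertical direction of the detected sound

    def directions_aligned(d1: DetectionInfo, d2: DetectionInfo,
                           tol_deg: float = 10.0) -> bool:
        # Wrap-around-safe azimuth difference, plain elevation difference.
        az = abs((d1.azimuth_deg - d2.azimuth_deg + 180.0) % 360.0 - 180.0)
        el = abs(d1.elevation_deg - d2.elevation_deg)
        return az <= tol_deg and el <= tol_deg

    def judge(det1: Optional[DetectionInfo],
              det2: Optional[DetectionInfo]) -> str:
        if det1 is None or det2 is None:    # S81/S82: wait for both reports
            return "waiting"
        if directions_aligned(det1, det2):  # S85 YES -> S86: alarm
            return "alarm: unmanned air vehicle correctly detected"
        return "no alarm: judged as erroneous detection"  # S85 NO

    if __name__ == "__main__":
        print(judge(DetectionInfo(42.0, 15.0), DetectionInfo(45.0, 13.0)))
        print(judge(DetectionInfo(42.0, 15.0), DetectionInfo(210.0, 5.0)))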
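Finally, because the optical axes of the two omnidirectional cameras CA are a known distance apart, the agreed direction pair can also yield a range estimate. The patent does not spell out a distance formula; the planar triangulation below is one standard possibility, with the baseline value and the simplified geometry (both units and the target in one vertical plane) assumed.

    import math

    def triangulate(baseline_m: float, elev1_deg: float, elev2_deg: float):
        """Units at x = 0 and x = baseline_m observe the same target with
        elevation angles elev1 and elev2; returns (height, range from unit 1)."""
        t1 = math.tan(math.radians(elev1_deg))
        t2 = math.tan(math.radians(elev2_deg))
        # Horizontal offset x from unit 1 satisfies x * t1 = (baseline - x) * t2.
        x = baseline_m * t2 / (t1 + t2)
        height = x * t1
        return height, math.hypot(x, height)

    if __name__ == "__main__":
        h, r = triangulate(baseline_m=5.0, elev1_deg=60.0, elev2_deg=50.0)
        print(f"estimated height {h:.1f} m, range from UD1 {r:.1f} m")

In practice the target need not lie in the vertical plane through both units; a full implementation would intersect the two 3D bearing rays instead of this planar simplification.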

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Acoustics & Sound (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Otolaryngology (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Circuit For Audible Band Transducer (AREA)

Abstract

In an unmanned aerial vehicle detection system, an omnidirectional camera captures an image of a monitoring area. A microphone array picks up sound in the monitoring area. A monitoring device detects an unmanned aerial vehicle appearing in the monitoring area using the sound data picked up by the microphone array. When image data of the monitoring area captured by the omnidirectional camera is displayed on a monitor, a signal processing unit in the monitoring device superimposes, on the image data of the monitoring area, an identification mark obtained by converting the unmanned aerial vehicle into visual information.
PCT/JP2017/024476 2016-07-28 2017-07-04 Unmanned aerial vehicle detection system and unmanned aerial vehicle detection method WO2018020965A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP17833974.3A EP3493554A4 (fr) 2016-07-28 2017-07-04 Unmanned aerial vehicle detection system and unmanned aerial vehicle detection method
US16/316,940 US20190228667A1 (en) 2016-07-28 2017-07-04 Unmanned aerial vehicle detection system and unmanned aerial vehicle detection method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016148471 2016-07-28
JP2016-148471 2016-07-28
JP2017107170A JP2018026792A (ja) 2016-07-28 2017-05-30 Unmanned air vehicle detection system and unmanned air vehicle detection method
JP2017-107170 2017-05-30

Publications (1)

Publication Number Publication Date
WO2018020965A1 (fr)

Family

ID=61016068

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/024476 WO2018020965A1 (fr) Unmanned aerial vehicle detection system and unmanned aerial vehicle detection method

Country Status (1)

Country Link
WO (1) WO2018020965A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006168421A 2004-12-13 2006-06-29 Toshiba Corp Flying object monitoring device
JP2010183417A * 2009-02-06 2010-08-19 Hitachi Ltd Sound information display system, sound information display method, and sound information display device
JP2011015050A * 2009-06-30 2011-01-20 Nittobo Acoustic Engineering Co Ltd Array for beamforming and sound source survey and measurement system using the same
JP2014143678A 2012-12-27 2014-08-07 Panasonic Corp Sound processing system and sound processing method
JP2015029241A 2013-06-24 2015-02-12 Panasonic Intellectual Property Management Co Ltd Directivity control system and sound output control method
WO2016098315A1 * 2014-12-15 2016-06-23 Panasonic Intellectual Property Management Co Ltd Microphone array, monitoring system, and sound pickup setting method
JP2016118987A * 2014-12-22 2016-06-30 Panasonic Intellectual Property Management Co Ltd Abnormal sound detection system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3493554A4

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020135187A1 * 2018-12-27 2020-07-02 赣州德业电子科技有限公司 Unmanned aerial vehicle recognition and positioning system and method based on RGB_D and a deep convolutional network
US20210405663A1 (en) * 2019-05-15 2021-12-30 Panasonic Intellectual Property Management Co., Ltd. Information processing method, unmanned aerial vehicle, and unmanned aerial vehicle control system
US11656639B2 (en) * 2019-05-15 2023-05-23 Panasonic Intellectual Property Management Co., Ltd. Information processing method, unmanned aerial vehicle, and unmanned aerial vehicle control system
CN116543141A * 2022-12-16 2023-08-04 无锡恺韵来机器人有限公司 Unmanned aerial vehicle identification and localization method based on fusion of acoustic signals and images

Similar Documents

Publication Publication Date Title
JP2018026792A (ja) Unmanned air vehicle detection system and unmanned air vehicle detection method
JP5979458B1 (ja) Unmanned air vehicle detection system and unmanned air vehicle detection method
US10791395B2 (en) Flying object detection system and flying object detection method
JP5189536B2 (ja) Monitoring device
US9860635B2 (en) Microphone array, monitoring system, and sound pickup setting method
JP7122708B2 (ja) Image processing device, monitoring system, and image processing method
JP6598064B2 (ja) Object detection device, object detection system, and object detection method
US10397525B2 (en) Monitoring system and monitoring method
JP6493860B2 (ja) Monitoring control system and monitoring control method
WO2018020965A1 (fr) Unmanned aerial vehicle detection system and unmanned aerial vehicle detection method
JP2018101987A (ja) Sound source display system and sound source display method for a monitoring area
US20090167867A1 (en) Camera control system capable of positioning and tracking object in space and method thereof
JP2009118318A (ja) Sound monitoring device
JP2017092938A (ja) Sound source detection system and sound source detection method
JP6504539B2 (ja) Sound pickup system and sound pickup setting method
KR101519974B1 (ko) Surveillance camera that senses surrounding information and intelligent alarm system using the same
JP2017022664A (ja) Monitoring system and monitoring method
KR20180136671A (ko) Earthquake monitoring and response guidance system using image analysis techniques
JP6664119B2 (ja) Monitoring system and monitoring method
JP2019009520A (ja) Information processing device, information processing method, and program
CN116614710A (zh) Near-ultrasound positioning and camera linkage system and method
WO2019225147A1 (fr) Airborne object detection device, airborne object detection method, and airborne object detection system
JP2017175473A (ja) Monitoring system and monitoring method
JP2020068495A (ja) Information processing device, information processing method, and program
JP2017192044A (ja) Microphone array system and method for manufacturing a microphone array system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 17833974
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

ENP Entry into the national phase
Ref document number: 2017833974
Country of ref document: EP
Effective date: 20190228