WO2021063046A1 - Distributed target monitoring system and method - Google Patents

Distributed target monitoring system and method

Info

Publication number
WO2021063046A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
image
unit
probability
target object
Application number
PCT/CN2020/098588
Other languages
English (en)
Chinese (zh)
Inventor
周飞
刘倞
Original Assignee
熵康(深圳)科技有限公司
Application filed by 熵康(深圳)科技有限公司
Publication of WO2021063046A1

Classifications

    • G: PHYSICS
      • G06N: Computing arrangements based on specific computational models
        • G06N 3/045: Neural network architectures; combinations of networks
        • G06N 3/08: Neural network learning methods
      • G06T: Image data processing or generation, in general
        • G06T 5/70: Image enhancement or restoration; denoising; smoothing
        • G06T 7/13: Image analysis; segmentation; edge detection
        • G06T 7/136: Segmentation involving thresholding
        • G06T 7/187: Segmentation involving region growing, region merging or connected component labelling
        • G06T 7/70: Determining position or orientation of objects or cameras
        • G06T 2207/10004: Image acquisition modality; still image; photographic image
        • G06T 2207/20024: Special algorithmic details; filtering details
        • G06T 2207/20036: Special algorithmic details; morphological image processing
        • G06T 2207/20081: Special algorithmic details; training; learning
        • G06T 2207/20084: Special algorithmic details; artificial neural networks [ANN]
      • G06V: Image or video recognition or understanding
        • G06V 20/10: Scenes; scene-specific elements; terrestrial scenes
      • G08B: Signalling or calling systems; order telegraphs; alarm systems
        • G08B 13/191: Intruder alarms actuated by passive infrared-radiation detection using pyroelectric sensor means
        • G08B 13/196: Intruder alarms using passive radiation detection with image scanning and comparing systems using television cameras
        • G08B 13/19602: Image analysis to detect motion of the intruder, e.g. by frame subtraction
        • G08B 13/19608: Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and/or velocity to predict its new position
        • G08B 13/19684: User interface; portable terminal, e.g. mobile phone, used for viewing video remotely
        • G08B 15/00: Identifying, scaring or incapacitating burglars, thieves or intruders, e.g. by explosives
    • H: ELECTRICITY
      • H04L 67/025: Protocols based on web technology, e.g. hypertext transfer protocol [HTTP], for remote control or remote monitoring of applications
      • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This application relates to the field of target recognition and detection, and in particular to a distributed target monitoring system and method.
  • the embodiments of the present application provide a distributed target monitoring system.
  • the system includes at least one detection and alarm device, an image detection device, and a remote monitoring terminal.
  • the image detection device is respectively connected to the detection and alarm device and the remote monitoring terminal.
  • the detection and alarm equipment includes an infrared detection unit, a central processing unit, and an alarm unit, and the central processing unit is electrically connected to the infrared detection unit and the alarm unit, respectively;
  • the infrared detection unit is used to detect thermal infrared signals within a sensing range
  • the central processing unit is used to receive the thermal infrared signals, and control the alarm unit to issue an alarm signal according to the thermal infrared signals
  • the image detection device includes an image acquisition unit and a control unit that are electrically connected; the image acquisition unit is used to acquire a target area image, and the control unit is used to receive and process the target area image and to control the central processing unit according to it, so that the central processing unit controls the alarm unit to issue an alarm signal;
  • the remote monitoring terminal includes an image display unit and a remote control unit that are electrically connected; the remote control unit is used to obtain the target area image collected by the image detection device, display it on the image display unit, and control the central processing unit, so that the central processing unit controls the alarm unit to issue an alarm signal.
  • an embodiment of the present application also provides a distributed target monitoring method, which is applied to an image detection device, and the method includes:
  • the embodiments of the present application also provide a distributed target monitoring method, which is applied to the detection alarm device, and the method includes:
  • the beneficial effect of the present application is: different from the prior art, in the distributed target monitoring system and method of the embodiments of the present application, the infrared detection unit in the detection alarm device detects thermal infrared signals within its sensing range, and the central processing unit controls the alarm unit to issue an alarm signal according to the detected thermal infrared information. On the other hand, the image acquisition unit in the image detection device collects the target area image, and the control unit controls the central processing unit according to that image, so that the central processing unit controls the alarm unit to issue an alarm signal. At the same time, the remote control unit in the remote monitoring terminal acquires the target area image and displays it on the image display unit.
  • the remote control unit can likewise control the central processing unit, so that the central processing unit controls the alarm unit to issue an alarm signal.
  • the detection alarm devices and the image detection device are deployed in a distributed manner and cooperate with each other, so that key targets can be monitored in real time and alarms can be raised in time.
  • Figure 1a is a schematic diagram of the connection of an image detection device, a remote monitoring terminal, and a detection alarm device according to an embodiment of the present application;
  • FIG. 1b is a schematic diagram of the connection between the detection alarm device, the network server, and the remote monitoring terminal in an embodiment of the present application;
  • FIG. 2 is a schematic diagram of an image detection device and a remote monitoring terminal and a detection alarm device in another embodiment of the present application through wireless connection;
  • Fig. 3a is a schematic diagram of the hardware structure of a detection alarm device according to an embodiment of the present application.
  • Figure 3b is a schematic diagram of the hardware structure of a detection alarm device according to another embodiment of the present application.
  • Figure 4 is a schematic diagram of an alarm unit according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of the hardware structure of a detection alarm device according to another embodiment of the present application.
  • FIG. 6 is a schematic diagram of the hardware structure of an image detection device according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the hardware structure of a remote monitoring terminal according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the connection of a cloud server, a remote monitoring terminal, an image detection device, and a detection alarm device according to an embodiment of the present application;
  • FIG. 9 is a schematic diagram of a cloud server, a remote monitoring terminal, an image detection device, and a detection alarm device in another embodiment of the present application through wireless connection;
  • FIG. 10 is a flowchart of an embodiment of the distributed target monitoring method of the present application.
  • FIG. 11 is a schematic diagram of the total number of pests over time according to an embodiment of the present application.
  • Figure 12a is a statistical diagram of pest density distribution in an embodiment of the present application.
  • Figure 12b is a schematic diagram of pest density distribution in an embodiment of the present application.
  • FIG. 13 is a flowchart of deep neural network model training in an embodiment of the present application.
  • FIG. 14 is a flowchart of data processing using a deep learning network model in an embodiment of the present application.
  • FIG. 15 is a flowchart of obtaining a second recognition result in an embodiment of the present application.
  • FIG. 16 is a specific flow chart of obtaining a second recognition result in an embodiment of the present application.
  • FIG. 17 is a flowchart of determining a target object and a target position in an embodiment of the present application.
  • FIG. 18 is a flowchart of another embodiment of the distributed target monitoring method of the present application.
  • Figure 19 is a schematic diagram of the distribution of detection and alarm devices in an embodiment of the present application.
  • FIG. 20 is a schematic structural diagram of an embodiment of a distributed target monitoring device of the present application.
  • FIG. 21 is a schematic diagram of the hardware structure of a control unit provided by an embodiment of the present application.
  • An embodiment of the present application provides a distributed target monitoring system, which includes at least one detection alarm device, a remote monitoring terminal 20, and an image detection device 30.
  • the detection alarm device is connected to the remote monitoring terminal 20.
  • Figure 1a exemplarily shows that the image detection device 30 is connected to detection alarm device 1, detection alarm device 2, detection alarm device 3, and detection alarm device N through a bus (such as an RS-485 bus).
  • the image detection device 30 and the remote monitoring terminal 20 are connected wirelessly (for example, in the 433 MHz band).
  • the buses in this embodiment do not all run back to the image detection device; instead, the devices are cascaded with one another, which makes actual deployment more convenient and flexible.
  • the devices connected wirelessly are each equipped with a wireless communication unit.
  • the wireless communication unit can use an Internet of Things system, such as a Wi-Fi network, a Zigbee network, Bluetooth, or an NB-IoT network, or a mobile communication system, such as 2G, 3G, 4G, or 5G.
  • the wireless communication unit can also use open frequency bands such as 433 MHz.
  • an embodiment of the present application also provides a distributed target monitoring system, including at least one detection alarm device, a network server 50, and a remote monitoring terminal 20 .
  • Fig. 1b exemplarily shows that the network server 50 is connected to the detection alarm device 1, the detection alarm device 2, the detection alarm device 3, and the detection alarm device N through a bus.
  • the network server 50 is connected to the remote monitoring terminal 20 wirelessly. Multiple detection alarm devices are combined to construct an Internet of Things system, and the detected thermal infrared signals and alarm signals are sent to the remote monitoring terminal 20 through the network server 50.
  • the buses in this embodiment do not all run back to the network server; instead, the network server is connected to one of the detection alarm devices, and the remaining detection alarm devices are cascaded, which makes actual deployment more convenient and flexible.
  • the detection alarm device, the image detection device 30, and the remote monitoring terminal 20 are all provided with a wireless communication unit. FIG. 2 exemplarily shows that the image detection device 30 is connected wirelessly (for example, in the 433 MHz band) to the remote monitoring terminal 20 and to detection alarm device 1, detection alarm device 2, detection alarm device 3, and detection alarm device N; an actual environment may include more detection alarm devices.
  • the detection alarm device 10 includes an infrared detection unit 110, a central processing unit 120, and an alarm unit 130, and the central processing unit 120 is electrically connected to the infrared detection unit 110 and the alarm unit 130, respectively.
  • the infrared detection unit 110 may be an infrared sensor, a pyroelectric sensor, or another infrared thermal sensor, for detecting thermal infrared signals within a sensing range.
  • the infrared detection unit 110 is fixed at a high place, and the infrared sensing area can cover the target active area.
  • the infrared detection unit 110 can be configured according to the infrared radiation characteristics of the target to be measured.
  • the infrared sensor detects the change in the infrared spectrum of the target and sends it to the central processing unit 120.
  • the central processing unit 120 may use an STM series chip.
  • the infrared detection unit 110 may also be equipped with a Fresnel lens. It should be noted that the target in this application can be pests, animals or humans, etc.
  • the infrared sensor includes an infrared light-emitting diode and an infrared photodiode, both encapsulated in a plastic casing.
  • the infrared light-emitting diode emits infrared light that is invisible to the human eye. If there is no target in front of the infrared sensor, the infrared light travels away without being reflected.
  • if a target is present, the infrared light is reflected back onto the adjacent infrared photodiode.
  • when the infrared photodiode receives infrared light, the resistance at its output pin changes; by judging this change in resistance, the sensor can detect a target ahead.
  • the central processing unit 120 may be a central processing unit (CPU) or a graphics processing unit (GPU), etc., for receiving the thermal infrared signals and controlling the alarm unit 130 to issue an alarm signal according to them.
  • the central processing unit 120 determines whether there is an active target in the sensing range according to the thermal infrared signal; once signs of an active target are found, the alarm unit 130 is controlled to issue an alarm signal.
  • in some embodiments, the detection alarm device 10 only includes an infrared detection unit 110 and a central processing unit 120 that are electrically connected.
  • after the infrared detection unit 110 detects the thermal infrared signal within the sensing range, it sends the signal to the central processing unit 120 for processing, and the central processing unit 120 determines whether there is an active target in the sensing range according to the thermal infrared signal.
  • the alarm unit 130 includes a light generator 131 and a sound generator 132, both of which are electrically connected to the central processing unit 120.
  • the light generator 131 may be a strong light generator, such as a high-power LED lamp-bead patch, whose light source is a high-intensity light source.
  • the sound generator 132 may be a buzzer or an ultrasonic generator, for example an ultrasonic horn of model 3840 that produces a high-decibel sound.
  • the central processing unit 120 controls the light generator 131 to generate a high-intensity light source and controls the sound generator 132 to generate a high-decibel sound, so as to drive away the moving target.
  • multiple light generators 131 and sound generators 132 may be distributed at each corner of a closed space or an open space.
  • the central processing unit 120 controls the light generator 131 in each corner to generate a high-intensity light source and the sound generator 132 in each corner to generate a high-decibel sound to drive away the moving target.
  • the light generator 131 can be replaced with a light strip, which is fixed on the bottom of the wall with 3M glue, similar to a skirting line.
  • the central processing unit 120 controls the light strip to flash continuously to drive away the moving target.
  • the detection and alarm device 10 further includes a photosensitive unit 140, which may be a photosensitive diode or similar components, such as an HPI-6FER2 photosensitive diode.
  • the photosensitive unit 140 can determine whether the current moment is day or night by sensing illumination information.
  • the photosensitive unit 140 is electrically connected to the central processing unit 120 for transmitting the illumination information to the central processing unit 120.
  • the central processing unit 120 controls the infrared detection unit 110 and the alarm unit 130 to work at night according to the illumination information, thereby reducing power consumption.
  • the model of the above-mentioned photosensitive unit can be selected according to requirements without being restricted to the limitation in this embodiment.
  • the image detection device 30 includes an image acquisition unit 310 and a control unit 320.
  • the control unit 320 is electrically connected to the image acquisition unit 310, and the image detection device 30 may be fixed in a high place.
  • the image acquisition unit 310 is used to acquire images of the target area.
  • the image acquisition unit 310 may be, for example, an infrared camera and/or a visible light camera.
  • the control unit 320 may use chips such as Intel Movidius or Huawei HiSilicon Hi3519.
  • the infrared camera and/or visible light camera takes advantage of its elevated position and uploads the collected target image to the control unit 320.
  • the control unit 320 analyzes the target image through machine vision and a neural network.
  • if there is an active target in the target image, the control unit 320 sends a control signal and the image information of the target to the central processing unit 120, so that the central processing unit 120 controls the alarm unit 130 to issue an alarm signal to drive away the active target according to the received control signal and image information.
  • in some embodiments, the image detection device 30 includes only the control unit 320.
  • the image detection device 30 and the detection alarm device 10 are connected via a bus or wirelessly.
  • the control unit 320 in the image detection device 30 is used to collect and analyze data, such as target information, from multiple detection alarm devices 10, and then upload it to a cloud server.
  • the user can use a terminal device, such as a mobile phone or a computer, to check the target information in real time.
  • the remote monitoring terminal 20 includes an image display unit 210 and a remote control unit 220, and the remote control unit 220 and the image display unit 210 are electrically connected.
  • the image display unit 210 may be a display system such as a computer, a mobile phone, or a tablet, and the remote control unit 220 may be, for example, host computer software.
  • the remote control unit 220 is used to obtain the target area image sent by the control unit 320, together with the state information of the image detection device 30 and the detection alarm device 10, and send them to the image display unit 210 for display.
  • the remote control unit 220 is also used to control the control unit 320 in the image detection device 30 according to the target area image, so that the control unit 320 controls the central processing unit 120 in the detection alarm device, and the central processing unit 120 in turn controls the alarm unit 130 to issue an alarm signal to drive the target away.
  • the remote monitoring terminal 20 is wirelessly connected to the image detection device 30 and the detection alarm device, respectively.
  • after the remote control unit 220 in the remote monitoring terminal 20 obtains the target area image sent by the image detection device 30, the remote control unit 220 can directly control the detection alarm device to drive the target away.
  • that is, the remote control unit 220 controls the central processing unit 120, so that the central processing unit 120 controls the alarm unit 130 to issue an alarm signal to drive the target away.
  • the distributed target monitoring system further includes a cloud server 40.
  • the cloud server 40 is wirelessly connected to the image detection device 30, and the cloud server 40 is used to receive the target area image, target information, state information of the image detection device 30, and state information of the detection alarm device sent by the image detection device 30.
  • the cloud server 40 may be a server, such as a rack server, a blade server, a tower server, or a cabinet server, etc., or a server cluster composed of several servers, or a cloud computing service center.
  • FIG. 8 exemplarily shows a cloud server 40, a remote monitoring terminal 20, an image detection device 30, and detection alarm devices 1, 2, 3, and N.
  • the cloud server 40 and the remote monitoring terminal 20 are both wirelessly connected to the image detection device 30, and the image detection device 30 is connected via a bus to detection alarm device 1, detection alarm device 2, detection alarm device 3, and detection alarm device N.
  • the image detection device 30 receives the thermal infrared signals and alarm signals sent by the detection alarm devices through the bus, and sends those signals, together with the target area image it collects itself, the status information of the detection alarm devices, and its own status information, to the remote monitoring terminal 20 and the cloud server 40 through the wireless communication unit.
  • the user can view the above-mentioned various information directly on the remote monitoring terminal 20, or wirelessly connect to the cloud server 40 through a terminal device such as a mobile phone or a computer to view the above-mentioned various information in real time. It should be noted that all devices connected wirelessly are equipped with a wireless communication unit.
  • FIG. 9 exemplarily shows the cloud server 40, remote monitoring terminals A and B, image detection devices 1 and 2, and detection alarm devices 1, 2, 3, 4, 5, and N.
  • the cloud server 40 is wirelessly connected to image detection device 1 and image detection device 2.
  • remote monitoring terminal A is wirelessly connected to image detection device 1, and image detection device 1 is wirelessly connected to detection alarm device 1, detection alarm device 2, and detection alarm device 3; remote monitoring terminal B is wirelessly connected to image detection device 2, and image detection device 2 is wirelessly connected to detection alarm device 4, detection alarm device 5, and detection alarm device N.
  • each image detection device receives, through its wireless communication unit, the thermal infrared signals, alarm signals, and other information sent wirelessly by the detection alarm devices, and sends that information, together with the target area image it obtains itself, the status information of the detection alarm devices, and its own status information, to the remote monitoring terminal and the cloud server 40 through the wireless communication unit.
  • the user can view the foregoing information directly on the remote monitoring terminal, or wirelessly connect to the cloud server 40 through a terminal device such as a mobile phone or a computer to view it in real time, so that the target can be monitored in real time.
  • all devices connected wirelessly are equipped with a wireless communication unit.
  • an embodiment of the present application also provides a distributed target monitoring method, which is applied to an image detection device, and the method is executed by a control unit in the image detection device, and includes:
  • Step 1010 Obtain an image of the target area.
  • the target area is an area of biological or human activity, preferably an area of pest activity.
  • the image includes video images and still pictures. The image of the target area captured by the camera is obtained, and the target area image contains pests or human bodies.
  • Step 1020 Input the target area image into a preset deep neural network model to obtain a first recognition result.
  • the first recognition result is obtained by inputting the target area image captured by the camera into a preset deep neural network model, and the preset deep neural network model is obtained by learning and training on a large number of labeled target area images.
  • Step 1030 Obtain a second recognition result based on the target area image and a preset reference image.
  • the preset reference image is a pre-photographed image that does not contain the target. This image is used as a background image, and the second recognition result is obtained by comparing the captured target area image with the background image.
  • Step 1040 Obtain the target object and the target position in the target area image according to the first recognition result and the second recognition result.
  • the target object is a pest or a human body, and the target position is the position of the pest or human body in the target area image.
  • the target position may be the smallest bounding rectangle that encloses the target object, in which case the position of this rectangle is the position of the target object; the target object and the target position in the target area image are obtained through the first recognition result and the second recognition result.
  • Step 1050 Obtain the relationship between the target location, time, and frequency.
  • the target location may be marked by coordinate positions in an electronic map corresponding to the target area, and the coordinate positions may be two-dimensional plane coordinates or three-dimensional space coordinates. Since the target object may be located at different locations at different time points, and is active a different number of times at different time points, the monitoring device continuously collects the location of the target object at different time points, together with the number of occurrences at different time points, to determine the relationship between target location, time, and frequency.
  • Step 1060 Determine the activity trajectory of the target object and/or the activity density information of the target object according to the relationship between the target location, time and frequency.
  • the camera continuously collects images of the target area.
  • the control unit analyzes the collected image information and connects the target positions of the target object in time order into a line, forming the activity trajectory of the target object.
  • statistics of target location, time, and frequency determine the activity density of the target object, as sketched below.
  • from the activity trajectory and activity density of the target object, the activity area and living habits of the target object can be clearly known; for example, within a specific time range, during which period the target object is most active and in which areas it prefers to move, so that subsequent measures can be taken against the target object.
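  • By way of illustration only, the trajectory and density computation of Steps 1050 and 1060 might look as follows in Python; the (timestamp, x, y) detection format with normalized coordinates and the grid size are assumptions, not taken from the application.

```python
import numpy as np

def trajectory_and_density(detections, grid_shape=(48, 64)):
    """detections: list of (timestamp, x, y), with x and y normalized to [0, 1)."""
    # Activity trajectory: target positions connected in time order.
    trajectory = [(x, y) for _, x, y in sorted(detections)]
    # Activity density: occurrence counts accumulated per grid cell.
    density = np.zeros(grid_shape, dtype=np.int64)
    for _, x, y in detections:
        density[int(y * grid_shape[0]), int(x * grid_shape[1])] += 1
    return trajectory, density
```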
  • when the target object is a pest, the activity density information of the target object includes pest density, peak pest density, average pest density, peak number of pests, pest continuous activity time, peak pest continuous activity time, total pest count over time, and a pest density distribution map.
  • the pest density is the number of pest occurrences per unit area and unit time: ρ_i = N_i / (Δt × Δs), where N_i is the number of pests observed in the i-th unit time (unit: individuals); Δt is the unit time, measured in minutes, hours, or another time unit; Δs is the unit area, measured in square meters, square centimeters, or another area unit; and ρ_i is the pest density, with a typical unit of individuals/(m²·h). This value changes over time.
  • the peak pest density is the maximum pest density per unit area and unit time: ρ_max = MAX{ρ_i}, where ρ_i is the pest density monitored in the i-th unit time per unit area, and ρ_max is the peak pest density within a certain period of time or within a certain area.
  • the average pest density is the average of the pest densities per unit area and unit time: ρ̄ = (Σ ρ_i) / n, where ρ_i is the pest density monitored in the i-th unit time per unit area, n is the number of unit time periods within the period considered, and ρ̄ is the average pest density within a certain period of time or within a certain area.
  • the peak number of pests is the maximum number of pests per unit area and unit time: N_max = MAX{N_i}, where N_i is the number of pests in the unit area at a certain moment (unit: individuals), and N_max is the peak number of pests in the unit area within a certain period of time (unit: individuals).
  • the pest continuous activity time is the sum of the pest activity times per unit time within the field of view: T_total = Σ T_i, where T_i is the i-th continuous activity time, measured in seconds, minutes, or another time unit, and T_total is the total activity time.
  • the unit time is generally 24 hours; as long as pests are seen in the field of view, their activity time is accumulated into the total time.
  • the peak pest continuous activity time is the longest continuous activity time of pests per unit time within the field of view: T_max = MAX{T_i}, where T_i is the i-th continuous activity time, measured in minutes, seconds, or another time unit, and T_max is the peak pest continuous activity time per unit time.
  • the unit time is generally 24 hours.
  • the total pest count over time is the sum, over the field of view, of the number of pests per unit time multiplied by the activity time: n = T / Δt and NT_total = Σ N_i × Δt, where Δt is the sampling interval, measured in minutes, hours, or another time unit; N_i is the number of pests at a certain time (unit: individuals); T is the unit time, usually 24 hours; n is the number of sampling intervals within the period; and NT_total is the total pest count per unit time, with units of individuals·h (individual-hours); please refer to Figure 11.
  • the pest density distribution map represents, as a chart, the pest density per unit time within the field of view.
  • each pixel value of the density distribution map reflects the number of pest occurrences per unit time at the corresponding location; please refer to Figure 12a and Figure 12b.
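  • The count-based metrics above can be computed together; below is a minimal Python sketch, assuming counts[i] is the number of pests observed in the i-th interval of length dt_hours over an area of area_m2 square meters (these parameter names are illustrative, not from the application).

```python
import numpy as np

def pest_metrics(counts, dt_hours, area_m2):
    """counts[i]: pests observed in the i-th interval of dt_hours over area_m2."""
    counts = np.asarray(counts, dtype=float)
    rho = counts / (dt_hours * area_m2)          # rho_i, individuals/(m^2*h)
    return {
        "peak_density": rho.max(),               # rho_max = MAX{rho_i}
        "average_density": rho.mean(),           # mean of rho_i over n intervals
        "peak_count": counts.max(),              # N_max = MAX{N_i}
        "total_pest_time": float((counts * dt_hours).sum()),  # individuals*h
    }
```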
  • in summary, the target area image is obtained through the camera and input into the preset deep neural network model to obtain the first recognition result; the second recognition result is then obtained based on the target area image and the preset reference image; the target object and target location are determined according to the first and second recognition results; the relationship between target location, time, and frequency is then obtained; and the activity trajectory and/or activity density of the target object is determined from that relationship, so that pest density can be accurately monitored.
  • the method further includes:
  • Step 1310 Obtain a sample image of the target area and the target object, label the target object in the sample image, and generate label information.
  • the target object is marked with a target box in the sample image of the target area, where the target box includes information such as the coordinate position and the center position of the target.
  • Step 1320 Input the marked image into the deep neural network model for training, and obtain the preset deep neural network model.
  • the model is trained using the marked sample images.
  • the purpose is to improve the accuracy of model training, and thus the accuracy of density monitoring. The more sample images, the more situations are covered, and the higher the recognition ability of the deep neural network model.
  • the preset deep neural network model contains multiple convolutional layers and pooling layers.
  • the input target area image passes through convolutional pooling layer 1 to obtain intermediate result 1, which then passes through convolutional pooling layer 2 to obtain intermediate result 2; intermediate result 2 passes through convolutional pooling layer 4 to obtain intermediate result 4; intermediate result 1 is fused with intermediate result 4 to obtain a fusion result; the fusion result passes through convolutional pooling layer 5 to obtain intermediate result 5; intermediate result 2 also passes through convolutional pooling layer 3 to obtain intermediate result 3; and intermediate result 3 is fused with intermediate result 5 to obtain the final result, which is the first recognition result.
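  • As an illustration only, the fusion topology just described can be sketched in PyTorch as follows; the channel widths, kernel sizes, and the use of upsampling plus concatenation as the fusion step are assumptions, since the application does not specify them.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_pool(c_in, c_out):
    # One "convolutional pooling layer": 3x3 conv, ReLU, 2x2 max pool.
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))

class FusionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.cp1 = conv_pool(3, 16)    # -> intermediate result 1
        self.cp2 = conv_pool(16, 32)   # -> intermediate result 2
        self.cp3 = conv_pool(32, 64)   # -> intermediate result 3
        self.cp4 = conv_pool(32, 16)   # -> intermediate result 4
        self.cp5 = conv_pool(32, 64)   # -> intermediate result 5

    def forward(self, x):
        r1 = self.cp1(x)
        r2 = self.cp2(r1)
        r4 = self.cp4(r2)
        # Fuse result 1 with result 4: upsample r4 to r1's size, then concatenate.
        fused = torch.cat([r1, F.interpolate(r4, size=r1.shape[2:])], dim=1)
        r5 = self.cp5(fused)
        r3 = self.cp3(r2)
        # Fuse result 3 with result 5 to produce the final feature map.
        return torch.cat([r3, F.interpolate(r5, size=r3.shape[2:])], dim=1)
```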
  • the obtained target area image samples can be integrated into an image sample set, which is then divided into a training sample set and a test sample set, where the training sample set is used to train the deep neural network model and the test sample set is used to test the trained model.
  • each picture in the training sample set is input into the deep neural network model, and the pictures in the training sample set are automatically trained through the deep neural network model to obtain the trained deep neural network model.
  • each picture of the test sample set is input into the trained deep neural network model; the model recognizes each input picture to obtain the corresponding recognition result, and the recognition results of all images are integrated to obtain the recognition result set.
  • the test recognition rate can be determined according to the number of target objects in the recognition result set and the number of target objects in the test sample set.
  • the test recognition rate is used to measure the recognition ability of the trained deep neural network model. If the test recognition rate reaches the preset threshold, it indicates that the recognition ability of the trained deep neural network model meets expectations, and the trained deep neural network model can be directly used as a trained deep neural network model for image recognition. On the contrary, the parameters of the deep neural network model are continuously adjusted, and the deep neural network model is trained again until the recognition rate of the model reaches the preset threshold.
  • the obtaining of the second recognition result based on the target area image and the preset reference image includes:
  • Step 1510 Obtain a changed part image of the target area image relative to the preset reference image, and convert the changed part image into a grayscale image.
  • a grayscale image can be obtained by averaging the RGB values of the three channels at each pixel position, or by averaging the maximum and minimum RGB brightness values at each pixel position.
  • the method of converting the changed part of the image into a grayscale image is not limited to the above two; a sketch of both follows.
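  • A minimal NumPy sketch of the two conversions just mentioned, assuming an (H, W, 3) uint8 RGB input; this is illustrative only.

```python
import numpy as np

def gray_mean(rgb):
    """Average the three channels at each pixel; rgb is a (H, W, 3) uint8 array."""
    return rgb.mean(axis=2).astype(np.uint8)

def gray_minmax(rgb):
    """Average the per-pixel maximum and minimum channel brightness."""
    hi = rgb.max(axis=2).astype(np.uint16)
    lo = rgb.min(axis=2).astype(np.uint16)
    return ((hi + lo) // 2).astype(np.uint8)
```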
  • Step 1520 Perform noise filtering and threshold segmentation on the grayscale image to obtain a binarized image, and obtain connected regions in the binarized image through a connected domain algorithm.
  • linear filtering, threshold averaging, weighted averaging, or template smoothing can be used to filter the noise, yielding a filtered binary image.
  • the pixels of the filtered binary image are divided into several categories in order to find the target points of interest; foreground pixels of the target points of interest that have the same pixel value and are adjacent to each other form a connected region, as in the sketch below.
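  • A hedged OpenCV sketch of Step 1520; Gaussian smoothing and Otsu thresholding stand in for the filtering and threshold-segmentation steps, which the application names only by class.

```python
import cv2

def connected_regions(gray):
    """gray: single-channel uint8 image of the changed part."""
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)           # noise filtering
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    num_labels, labels = cv2.connectedComponents(binary)  # connected domains
    return binary, num_labels, labels
```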
  • Step 1530 Obtain a potential contour of the target object according to the connected region.
  • a connected region is formed by foreground pixels with the same pixel value and adjacent positions among the target points of interest, and the potential contour of each target can be obtained through the connected regions.
  • Step 1540 Perform a morphological operation on the potential contour of the target object to obtain a second recognition result.
  • the second recognition result includes the second target object in the target area image, the second probability corresponding to the second target object, and the second target position of the second target object in the target area image.
  • the morphological operation includes expansion and filling of closed areas. Specifically, pixels can be added to the boundary of the target object in the image, and holes in the feature contour map of the target object can be filled, so as to obtain the second target object, its corresponding second probability, and the second target position of the second target object in the target area image.
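  • A minimal OpenCV sketch of Steps 1530 and 1540, assuming a 3×3 structuring element and OpenCV 4's findContours signature; dilation and morphological closing stand in for the boundary expansion and hole filling described above.

```python
import cv2
import numpy as np

def refine_contours(binary):
    """binary: uint8 binarized image from the connected-region step."""
    kernel = np.ones((3, 3), np.uint8)
    dilated = cv2.dilate(binary, kernel)                         # expand boundaries
    closed = cv2.morphologyEx(dilated, cv2.MORPH_CLOSE, kernel)  # fill holes
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Bounding boxes of the refined contours serve as candidate target positions.
    return [cv2.boundingRect(c) for c in contours]
```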
  • the obtaining the target object and the target position in the target area image according to the first recognition result and the second recognition result includes:
  • Step 1710 Compare the first probability and the second probability.
  • the first probability refers to the probability of the target object obtained through deep neural network model recognition, and the second probability refers to the probability of the target object obtained through image processing.
  • Step 1720 If the first probability is greater than the second probability, and the first probability is greater than or equal to a preset probability threshold, the first target object is taken as the target object, and the first target position is taken as the target position.
  • the preset probability threshold serves as the acceptance criterion for the target object and can be set in advance. If the target probability identified by the deep neural network model (the first probability) is greater than the target probability obtained by image processing (the second probability), and the first probability is greater than or equal to the preset probability threshold, the first target object identified by the deep learning network model is taken as the target object, and the position of the first target object is taken as the target position.
  • for example, if the preset probability threshold is 60%, the first probability is 70%, and the second probability is 40%, then the first probability (70%) is greater than the second probability (40%) and greater than the preset probability threshold (60%), so the first target object and first target position are used.
  • Step 1730 If the first probability is less than the second probability, and the second probability is greater than or equal to the preset probability threshold, the second target object is used as the target object, and the second target position is used as the target position .
  • if the target probability identified by the deep neural network model (the first probability) is less than the target probability obtained by image processing (the second probability), and the second probability is greater than or equal to the preset probability threshold, the second target object obtained by image processing is used as the target object, and the position of the second target object is used as the target position.
  • for example, if the preset probability threshold is 60%, the first probability is 20%, and the second probability is 80%, then the first probability (20%) is less than the second probability (80%) and the second probability (80%) is greater than the preset probability threshold (60%), so the second target object is taken as the target object and the second target position is taken as the target position.
  • Step 1740 If the first probability and the second probability are both less than the preset probability threshold, but the sum of the first probability and the second probability is greater than the preset second probability threshold, then the target is regarded as a suspected target.
  • for example, if the preset probability threshold is 60% and the preset second probability threshold is 55%, with a first probability of 40% and a second probability of 18%, then both probabilities are less than the preset probability threshold (60%), but their sum (58%) is greater than the preset second probability threshold (55%), so the target is regarded as a suspected target.
  • Step 1750 If the first probability and the second probability are both less than the preset probability threshold, and the sum of the first probability and the second probability is less than the preset second probability threshold, then discard both the first recognition result and the second recognition result.
  • if the target probability recognized by the deep neural network model (the first probability) and the target probability obtained by image processing (the second probability) are both less than the preset probability threshold, and their sum is less than the preset second probability threshold, the recognition results are considered inaccurate.
  • in that case, the first recognition result obtained by the deep neural network model (the first target object, its position, and the first probability) is discarded, and the second recognition result obtained by image processing (the second target object, its position, and the second probability) is discarded as well.
  • for example, if the preset probability threshold is 60% and the preset second probability threshold is 55%, with the probability of the first target object recognized by the deep neural network model at 40% and the probability of the second target object obtained by image processing at 10%, then both probabilities are less than 60% and their sum (50%) is less than the preset second probability threshold (55%), so both recognition results are discarded.
  • the above-mentioned preset probability threshold can be set according to actual needs, and does not need to be restricted to the limitation in this embodiment.
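  • The decision rules of Steps 1710 through 1750 can be written out directly; in the Python sketch below, the (object, position, probability) tuple format and the example thresholds are illustrative assumptions, and ties between the probabilities, which the application does not address, fall through to the discard branch.

```python
def fuse_results(first, second, p_thresh=0.60, sum_thresh=0.55):
    """Fuse the DNN result and the image-processing result per Steps 1710-1750."""
    obj1, pos1, p1 = first    # from the deep neural network model
    obj2, pos2, p2 = second   # from image processing
    if p1 > p2 and p1 >= p_thresh:                          # Step 1720
        return obj1, pos1, "target"
    if p1 < p2 and p2 >= p_thresh:                          # Step 1730
        return obj2, pos2, "target"
    if p1 < p_thresh and p2 < p_thresh and p1 + p2 > sum_thresh:
        return None, None, "suspected target"               # Step 1740
    return None, None, "discarded"                          # Step 1750
```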
  • an embodiment of the present application also provides a distributed target monitoring method, as shown in FIG. 18 to FIG. 19, which is applied to the detection and alarm equipment, and the method is executed by the central processing unit in the detection and alarm equipment.
  • Step 1810 Acquire thermal infrared signals within the sensing range.
  • the detection alarm devices are placed in the corners of the monitored room or space, and the central processing unit in the detection alarm device obtains the thermal infrared signal sent by the infrared detection unit.
  • the infrared detection unit may be an infrared sensor or the like; there may be one or more of them, as described in the system embodiment.
  • Step 1820 Determine the relationship between the target position, time and frequency according to the thermal infrared signal.
  • the thermal infrared signal is a signal specific to a target object within the sensing range. Since the target object may be located at different positions at different time points and moves a different number of times at different time points, the infrared detection unit continuously collects, over 24 hours, the position of the target object at different time points and the number of occurrences at different time points, so as to determine the relationship between target position, time, and frequency.
  • Step 1830 Determine the activity trajectory of the target object and/or the activity density information of the target object according to the relationship between the target location, time and frequency.
  • the infrared detection unit continuously performs infrared sensing on the target area and sends the sensed thermal infrared signal to the central processing unit; the central processing unit connects the target positions of the target object in time order into a line, forming the activity trajectory of the target object.
  • the target position, time, and frequency are counted to determine the activity density of the target object.
  • from the activity trajectory and activity density of the target object, the activity area and living habits of the target object can be clearly known; for example, within a certain time range, during which period the target object is most active and in which areas it prefers to move, so that subsequent measures can be taken against the target object.
  • the target object is a pest
  • the activity density information of the target object includes: pest density at the detection alarm device, pest density in the detection alarm device area, average pest density in the detection alarm device area, and pest detection in the alarm device area Density distribution map, etc.
  • the pest density at the detection alarm device is the number of times the detection alarm device finds the pest per unit time.
  • N/T, where N is the total number of pests found per unit time, in units; T is unit time, in hours or days and other time measurement units; ⁇ is the density of pests, and the typical unit is per day.
  • Detect the pest density in the area of the alarm device which is the number of times the alarm device detects the pest per unit area and unit time.
  • ⁇ i N i / T, where, N i is the total number of pests found in the i-th station apparatus per unit time, unit only; T is a unit time, in hours or days, etc. time measurement unit; [rho] i for the first The typical unit of pest density monitored by i equipment is per day; n is the number of detection alarm devices in a unit area; ⁇ total is the density of pests in a certain area within a certain period of time, and the typical unit is per day.
  • The average pest density in the detection alarm device area is the average of the pest occurrence densities per unit area per unit time.
  • ρ_avg = (1/n) Σ_{i=1..n} N_i / T, where N_i is the total number of pests found by the i-th device per unit time (head count); T is the unit time, in hours, days or another time unit; ρ_i = N_i / T is the pest density monitored by the i-th device, with a typical unit of head per day; n is the number of detection alarm devices in the unit area; ρ_avg is the average pest density of the area within the period, with a typical unit of head per day.
  • The pest density distribution map of the detection alarm device area represents the pest density per unit area per unit time in chart form.
  • The value at each sensing position in the density distribution map reflects the number of occurrences of harmful organisms per unit time at that location; a worked computation sketch follows below.
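  • Purely as a worked illustration of the three density definitions above (the device counts, grid layout and time window are invented for the example, not taken from the disclosure):

      import numpy as np

      # Hypothetical counts: pests found by each of n = 4 devices over T = 3 days
      N = np.array([12, 3, 7, 0])    # head count per device in the period
      T = 3.0                        # unit time, in days
      n = len(N)

      rho = N / T                    # per-device density rho_i (head/day)
      rho_total = rho.sum()          # area density within the period
      rho_avg = rho_total / n        # average density over the devices

      # Density distribution map: place each device's density on a 2x2 grid
      positions = [(0, 0), (0, 1), (1, 0), (1, 1)]   # assumed device coordinates
      heatmap = np.zeros((2, 2))
      for (r, c), d in zip(positions, rho):
          heatmap[r, c] = d
      print(rho_total, rho_avg)      # ~7.33 head/day in total, ~1.83 on average
      print(heatmap)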
  • In this way, the thermal infrared signal within the sensing range is acquired by the detection alarm device, the relationship between target position, time and frequency is determined from the thermal infrared signal, and the activity trajectory of the target object and/or the activity density information of the target object is determined from that relationship, so that the density of harmful organisms can be accurately monitored.
  • an embodiment of the present application also provides a distributed target monitoring device 2000, and the device 2000 includes:
  • the first acquisition module 2010 is used to acquire an image of a target area;
  • the input module 2020 is configured to input the target area image into a preset deep neural network model to obtain a first recognition result;
  • the second acquisition module 2030 is configured to obtain a second recognition result based on the target area image and the preset reference image;
  • the third acquisition module 2040 is configured to obtain the target object and the target position in the target area image according to the first recognition result and the second recognition result;
  • the fourth acquisition module 2050 is configured to acquire the relationship between the target position, time and frequency;
  • the determining module 2060 is configured to determine the activity trajectory of the target object and/or the activity density information of the target object according to the relationship between the target location, time and frequency.
  • In this embodiment, the target area image is acquired through the first acquisition module; the acquired target area image is then input into the preset deep neural network model through the input module to obtain the first recognition result; the second acquisition module obtains the second recognition result based on the target area image and the preset reference image; the third acquisition module obtains the target object and the target position in the target area image according to the first recognition result and the second recognition result; the fourth acquisition module acquires the relationship between the target position, time and frequency; and the determining module determines the target object's activity trajectory and/or activity density according to that relationship, so that the pest density can be accurately monitored.
  • the apparatus 2000 further includes:
  • The labeling module 2070 obtains sample images of the target area that contain the target object, labels the target object in the sample images, and generates labeling information.
  • the training module 2080 inputs the marked image into the deep neural network model for training, and obtains the preset deep neural network model.
  • the input module 2020 is specifically configured to:
  • The preset deep neural network model includes several convolutional layers and pooling layers, and the first recognition result includes a first target object in the target area image, a first probability corresponding to the first target object, and the first target position of the first target object in the target area image; an illustrative model sketch follows below.
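  • The disclosure does not specify a concrete network architecture, so the following PyTorch sketch is only an assumed minimal model of the named shape: a few convolution and pooling layers feeding two heads that output the first probability and the first target position. All layer sizes and the two-head design are illustrative.

      import torch
      import torch.nn as nn

      class DetectorSketch(nn.Module):
          """Minimal conv+pool backbone with a probability head and a position head."""
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
              )
              self.prob_head = nn.Linear(32, 1)   # first probability (after sigmoid)
              self.pos_head = nn.Linear(32, 4)    # first target position (x, y, w, h)

          def forward(self, img):
              f = self.features(img).mean(dim=(2, 3))   # global average pooling
              return torch.sigmoid(self.prob_head(f)), self.pos_head(f)

      model = DetectorSketch()
      prob, box = model(torch.rand(1, 3, 64, 64))   # one 64x64 RGB target area image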
  • the second acquisition module 2030 is specifically configured to:
  • Noise filtering and threshold segmentation are performed on the grayscale image to obtain a binarized image, and the connected regions in the binarized image are obtained through a connected-domain algorithm (see the pipeline sketch below);
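  • As a non-authoritative sketch of this step, the code below uses OpenCV to denoise a grayscale difference image, binarize it, and extract connected regions; the Gaussian kernel, Otsu thresholding and minimum-area filter are assumptions rather than choices stated in the disclosure.

      import cv2
      import numpy as np

      def segment_targets(gray: np.ndarray, min_area: int = 20):
          """gray: 8-bit grayscale image, e.g. |target area image - reference image|."""
          denoised = cv2.GaussianBlur(gray, (5, 5), 0)                    # noise filtering
          _, binary = cv2.threshold(denoised, 0, 255,
                                    cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # threshold segmentation
          n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
          # Label 0 is the background; keep components above a minimum area
          regions = [(tuple(centroids[i]), int(stats[i, cv2.CC_STAT_AREA]))
                     for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] >= min_area]
          return binary, regions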
  • the third acquisition module 2040 is specifically configured to:
  • If the first probability is greater than the second probability, and the first probability is greater than or equal to a preset probability threshold, the first target object is used as the target object and the first target position as the target position;
  • if the second probability is greater than the first probability, and the second probability is greater than or equal to the preset probability threshold, the second target object is used as the target object, and the second target position is used as the target position;
  • if the first probability and the second probability are both less than the preset probability threshold but their sum is greater than or equal to a preset second probability threshold, the target is regarded as a suspected target;
  • if the first probability and the second probability are both less than the preset probability threshold, and the sum of the first probability and the second probability is less than the preset second probability threshold, the first recognition result and the second recognition result are discarded. These fusion rules are sketched below.
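  • The following Python sketch restates the four fusion rules above; the numeric thresholds are placeholders, and the conditions for the second-target and suspected-target branches are completed by symmetry with the stated rules, as flagged in the comments.

      def fuse(first, second, p_thresh=0.6, sum_thresh=0.8):
          """first/second: (object, probability, position) from the network and
          the reference-image comparison. Returns (kind, object, position) or
          None when both recognition results are discarded."""
          obj1, p1, pos1 = first
          obj2, p2, pos2 = second
          if p1 > p2 and p1 >= p_thresh:
              return ("target", obj1, pos1)        # trust the network result
          if p2 > p1 and p2 >= p_thresh:           # assumed symmetric condition
              return ("target", obj2, pos2)        # trust the reference-image result
          if p1 < p_thresh and p2 < p_thresh:
              if p1 + p2 >= sum_thresh:            # assumed complement of the discard rule
                  return ("suspected", obj1, pos1)
              return None                          # discard both recognition results
          return None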
  • The above-mentioned distributed target monitoring device can execute the distributed target monitoring method provided in the embodiments of the present application, and has the functional modules and beneficial effects corresponding to executing that method; for details not described here, refer to the distributed target monitoring method provided in the embodiments of the present application.
  • FIG. 21 is a schematic diagram of the hardware structure of the control unit in the image detection device provided by an embodiment of the present application. As shown in FIG. 21, the control unit 2100 includes:
  • The control unit 2100 includes one or more processors 2110 and a memory 2120; in FIG. 21, one processor 2110 is taken as an example.
  • the processor 2110 and the memory 2120 may be connected through a bus or in other ways.
  • the connection through a bus is taken as an example.
  • The memory 2120 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the distributed target monitoring method in the embodiments of the present application (for example, the first acquisition module 2010, the input module 2020, the second acquisition module 2030, the third acquisition module 2040, the fourth acquisition module 2050, and the determining module 2060 shown in FIG. 20).
  • the processor 2110 executes various functional applications and data processing of the image detection device by running non-volatile software programs, instructions, and modules stored in the memory 2120, that is, realizes the distributed target monitoring method of the foregoing method embodiment.
  • the memory 2120 may include a storage program area and a storage data area.
  • the storage program area may store an operating system and an application program required by at least one function; the storage data area may store data created according to the use of the distributed target monitoring device.
  • the memory 2120 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the memory 2120 may optionally include memories remotely provided with respect to the processor 2110, and these remote memories may be connected to the distributed target monitoring device through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • The one or more modules are stored in the memory 2120, and when executed by the one or more control units 2100, the distributed target monitoring method in any of the foregoing method embodiments is executed, for example, the method steps described above are performed.
  • the device embodiments described above are merely illustrative.
  • The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • Each implementation manner can be implemented by means of software plus a general-purpose hardware platform, and of course can also be implemented by hardware.
  • A person of ordinary skill in the art can understand that all or part of the processes in the methods of the foregoing embodiments can be implemented by a computer program instructing the relevant hardware. The program can be stored in a computer-readable storage medium and, when executed, may include the procedures of the above-mentioned method embodiments.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Alarm Systems (AREA)
  • Geophysics And Detection Of Objects (AREA)

Abstract

A distributed target monitoring system and method, relating to the field of target recognition and detection. The system comprises at least one detection alarm device (10), an image detection device (30), and a remote monitoring terminal (20), the image detection device (30) being connected to the detection alarm device (10) and to the remote monitoring terminal (20) respectively; an infrared detection unit (110) in the detection alarm device (10) detects thermal infrared information within a sensing range, and a central processing unit (120) controls an alarm unit (130) to emit an alarm signal on the basis of the thermal infrared information detected by the infrared detection unit (110); an image acquisition unit (310) in the image detection device (30) acquires a target area image and, on the basis of the target area image, controls the central processing unit (120) so that the central processing unit (120) controls the alarm unit (130) to emit an alarm signal; and a remote control unit (220) in the remote monitoring terminal (20) acquires the target area image and displays it on an image display unit (210), and the remote control unit (220) controls the central processing unit (120) so that the central processing unit (120) controls the alarm unit (130) to emit an alarm signal.
PCT/CN2020/098588 2019-09-30 2020-06-28 Distributed target monitoring system and method WO2021063046A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910942708.1 2019-09-30
CN201910942708.1A CN110728810B (zh) 2019-09-30 2019-09-30 Distributed target monitoring system and method

Publications (1)

Publication Number Publication Date
WO2021063046A1 (fr)

Family

ID=69218728

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/098588 WO2021063046A1 (fr) 2019-09-30 2020-06-28 Distributed target monitoring system and method

Country Status (2)

Country Link
CN (1) CN110728810B (fr)
WO (1) WO2021063046A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • CN113436295A (zh) * 2021-06-25 2021-09-24 平安科技(深圳)有限公司 Live-breeding monitoring trajectory drawing method, apparatus, device and storage medium
  • CN113724240A (zh) * 2021-09-09 2021-11-30 中国海洋大学 Freezer caster detection method, system and apparatus based on visual recognition
  • CN115063940A (zh) * 2022-06-06 2022-09-16 中国银行股份有限公司 Risk monitoring method and apparatus, storage medium, and electronic device
  • CN117706045A (zh) * 2024-02-06 2024-03-15 四川省德阳生态环境监测中心站 Joint control method and system for atmospheric ozone monitoring equipment based on the Internet of Things

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • CN113496444B (zh) * 2020-03-19 2024-11-01 杭州海康威视系统技术有限公司 Foothold identification method, apparatus and system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • JP2015139204A (ja) * 2014-01-24 2015-07-30 国立大学法人岐阜大学 Evaluation method for a rat detection system
  • CN108540773A (zh) * 2018-04-12 2018-09-14 云丁网络技术(北京)有限公司 Monitoring method, apparatus and system, and cloud server
  • CN109299703A (zh) * 2018-10-17 2019-02-01 思百达物联网科技(北京)有限公司 Method, apparatus and image acquisition device for compiling rodent-activity statistics
  • CN109831634A (zh) * 2019-02-28 2019-05-31 北京明略软件系统有限公司 Method and apparatus for determining density information of a target object
  • CN109922310A (zh) * 2019-01-24 2019-06-21 北京明略软件系统有限公司 Method, apparatus and system for monitoring a target object
  • CN110235890A (zh) * 2019-05-14 2019-09-17 熵康(深圳)科技有限公司 Pest detection and repelling method, apparatus, device and system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • CN102930249A (zh) * 2012-10-23 2013-02-13 四川农业大学 Farmland pest identification and counting method based on color and models
  • CN103793923A (zh) * 2014-01-24 2014-05-14 华为技术有限公司 Method and device for acquiring a moving target in an image
  • AU2015314684B2 (en) * 2014-09-12 2020-09-03 Appareo Systems, Llc Non-image-based grain quality sensor
  • CN204695482U (zh) * 2015-06-15 2015-10-07 深圳市尼得科技有限公司 Automatic alarm system for camera surveillance
  • CN107103717A (zh) * 2017-06-28 2017-08-29 四川亚润科技有限公司 Remote monitoring and early-warning system
  • CN107665355B (zh) * 2017-09-27 2020-09-29 重庆邮电大学 Agricultural pest detection method based on region-based convolutional neural networks

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • CN113436295A (zh) * 2021-06-25 2021-09-24 平安科技(深圳)有限公司 Live-breeding monitoring trajectory drawing method, apparatus, device and storage medium
  • CN113436295B (zh) * 2021-06-25 2023-09-15 平安科技(深圳)有限公司 Live-breeding monitoring trajectory drawing method, apparatus, device and storage medium
  • CN113724240A (zh) * 2021-09-09 2021-11-30 中国海洋大学 Freezer caster detection method, system and apparatus based on visual recognition
  • CN113724240B (zh) * 2021-09-09 2023-10-17 中国海洋大学 Freezer caster detection method, system and apparatus based on visual recognition
  • CN115063940A (zh) * 2022-06-06 2022-09-16 中国银行股份有限公司 Risk monitoring method and apparatus, storage medium, and electronic device
  • CN115063940B (zh) * 2022-06-06 2024-02-09 中国银行股份有限公司 Risk monitoring method and apparatus, storage medium, and electronic device
  • CN117706045A (zh) * 2024-02-06 2024-03-15 四川省德阳生态环境监测中心站 Joint control method and system for atmospheric ozone monitoring equipment based on the Internet of Things
  • CN117706045B (zh) * 2024-02-06 2024-05-10 四川省德阳生态环境监测中心站 Joint control method and system for atmospheric ozone monitoring equipment based on the Internet of Things

Also Published As

Publication number Publication date
CN110728810A (zh) 2020-01-24
CN110728810B (zh) 2021-08-17

Similar Documents

Publication Publication Date Title
WO2021063046A1 (fr) Distributed target monitoring system and method
JP2021514548A (ja) Method, apparatus and system for monitoring a target object
CN109886999B (zh) Position determination method, apparatus, storage medium and processor
EP3107382B1 (fr) Object detection systems
JP2020038845A (ja) 照明空間を特性評価するための検知照明システム及び方法
US10195008B2 (en) System, device and method for observing piglet birth
US9740921B2 (en) Image processing sensor systems
CN111723633B (zh) Person behavior pattern analysis method and system based on depth data
CN103489006A (zh) Computer-vision-based method for diagnosing rice diseases, insect pests and weeds
US11532153B2 (en) Splash detection for surface splash scoring
CN109299703A (zh) Method, apparatus and image acquisition device for compiling rodent-activity statistics
KR102492066B1 (ko) Mobile preventive warning system
CN109886555A (zh) Food safety monitoring method and apparatus
KR101944374B1 (ko) Apparatus and method for detecting abnormal objects, and imaging device including the same
CN108829762A (zh) Vision-based small-target recognition method and apparatus
CN109831634A (zh) Method and apparatus for determining density information of a target object
US12014541B2 (en) Device for managing environment of breeding farm
Lello et al. Fruit fly automatic detection and monitoring techniques: A review
Bhattacharya et al. Arrays of single pixel time-of-flight sensors for privacy preserving tracking and coarse pose estimation
KR20190143518A (ko) Apparatus and method for determining abnormal objects
KR102505691B1 (ko) Apparatus and method for detecting abnormal objects, and imaging device including the same
KR20190103510A (ko) Imaging device, and poultry management system and method including the same
KR20200009530A (ko) System and method for detecting abnormal objects
US20230172489A1 (en) Method And A System For Monitoring A Subject
CN109934099A (zh) Placement position prompting method and apparatus, storage medium, and electronic apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20870577

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20870577

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 20870577

Country of ref document: EP

Kind code of ref document: A1