CN112883856B - Monitoring method, monitoring device, electronic equipment and storage medium - Google Patents


Publication number
CN112883856B
Authority
CN
China
Prior art keywords
target object
target
image
detection
area
Prior art date
Legal status
Active
Application number
CN202110162895.9A
Other languages
Chinese (zh)
Other versions
CN112883856A (en)
Inventor
蒲涛
刘溯
魏紫薇
鲁华超
周绍军
Current Assignee
Zhejiang Huagan Technology Co ltd
Original Assignee
Zhejiang Huagan Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Huagan Technology Co ltd filed Critical Zhejiang Huagan Technology Co ltd
Priority to CN202110162895.9A
Publication of CN112883856A
Application granted
Publication of CN112883856B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 Status alarms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Alarm Systems (AREA)
  • Geophysics And Detection Of Objects (AREA)

Abstract

The application discloses a monitoring method, a monitoring device, an electronic device and a storage medium. Target detection is performed on a target object in a to-be-processed video of a monitoring area, and if the resulting detection result meets a preset alarm condition, a thermal imaging image of the target object in the detection result is obtained. If it is determined from the thermal imaging image that a target object meeting the preset alarm condition exists in the monitoring area, an alarm operation is triggered. In this way, the target object in the monitoring area can be monitored accurately, and false detections arising from target detection can be reduced or even avoided.

Description

Monitoring method, monitoring device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of security technologies, and in particular to a monitoring method, a monitoring device, an electronic device, and a storage medium.
Background
Ship detection is widely applied in ship navigation management and related fields. It requires that sea-surface ship video be analyzed automatically, without human intervention, to locate and identify ships in a dynamic scene, thereby enabling real-time monitoring of ships and alarming on abnormal events. In the prior art, infrared thermal imaging technology, which has strong anti-interference capability and a relatively long monitoring distance, is mostly adopted for all-weather monitoring of the sea-surface monitoring area.
Infrared thermal imaging technology can greatly reduce the influence of weather factors such as rain and snow. However, because the image features of a thermal imaging image are not as rich as those of a visible-light image, false detections can occur when monitoring a sea surface with complex terrain, and objects such as driftwood and reefs may be mistaken for ships.
Disclosure of Invention
The purpose of the application is to provide a monitoring method. The method is used to reduce or even avoid the false detections that arise in the prior art when monitoring a target object in a monitoring area, caused by the image features of a thermal imaging image being less rich than those of a visible-light image.
In a first aspect, an embodiment of the present application provides a monitoring method, including:
performing target detection on the video to be processed to obtain a detection result of a target object in the monitoring area;
and if the detection result meets the preset alarm condition of the monitoring area and the monitoring area contains the target object according to the thermal imaging image of the target object, triggering alarm operation.
In some possible embodiments, the detection result includes position information of the target object in the video, and determining that the monitoring area contains the target object according to the thermal imaging image of the target object includes:
cutting out an image area corresponding to the position information from the thermal imaging image according to the position information of the target object in the video;
detecting the thermal radiation energy of the image area;
and if the image area contains a detection target whose thermal radiation energy is greater than a preset energy threshold, determining that the thermal imaging image contains the target object.
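The cropping and threshold check above can be sketched in Python. This is illustrative only: the `(x, y, w, h)` bounding-box layout, the use of pixel values to stand in for thermal radiation energy, and all names are assumptions, not part of the disclosure.

```python
import numpy as np

def contains_target(thermal_image, bbox, energy_threshold):
    """Crop the image area given by bbox = (x, y, w, h) from the thermal
    image and report whether any detection target in it has thermal
    radiation energy above the preset energy threshold."""
    x, y, w, h = bbox
    region = thermal_image[y:y + h, x:x + w]
    return bool((region > energy_threshold).any())
```

Here a single pixel exceeding the threshold stands in for "contains a detection target"; a real implementation would likely aggregate energy over connected regions rather than test individual pixels.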
In some possible embodiments, for tripwire detection, the detection result includes position information of the target object and movement direction information of the target object, and the preset alarm condition of the tripwire detection includes:
the movement direction information of the target object is a specified movement direction, and the position information of the target object intersects a tripwire.
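A minimal sketch of this tripwire condition, assuming a vertical trip line at `line_x`, a horizontal bounding-box span, and a sign convention for the prohibited direction; all names and conventions here are illustrative assumptions:

```python
def tripwire_alarm(x_min, x_max, motion_dx, line_x, forbidden_sign):
    """Alarm when the target's horizontal span straddles the trip line
    and its movement direction matches the specified (prohibited) one."""
    crosses = x_min <= line_x <= x_max
    wrong_direction = motion_dx * forbidden_sign > 0
    return crosses and wrong_direction
```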
In some possible embodiments, for area detection, the detection result includes position information of the target object, and the preset alarm condition of the area detection includes:
the position information of the target object intersects the warning area where passage is prohibited.
In some possible embodiments, the detection result includes position information of the target object and motion state information of the target object, and when a forbidden-stop area where the target object is prohibited from stopping exists in the monitoring area, determining whether the detection result meets the preset alarm condition includes:
if the position information of the target object intersects the forbidden-stop area and the target object is in a static state, determining that the target object meets the preset alarm condition;
and if the position information of the target object does not intersect the forbidden-stop area and/or the target object is not in a static state, determining that the target object does not meet the preset alarm condition.
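The two branches above amount to the conjunction of a rectangle-intersection test and a static-state flag. A minimal sketch, where the `(x1, y1, x2, y2)` rectangle format and all names are assumptions:

```python
def rects_intersect(a, b):
    """Axis-aligned rectangle overlap test, rectangles as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

def forbidden_stop_alarm(target_box, forbidden_stop_area, is_static):
    """Meets the preset alarm condition only when the target's position
    intersects the forbidden-stop area AND the target is in a static state."""
    return rects_intersect(target_box, forbidden_stop_area) and is_static
```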
In some possible embodiments, the method further comprises:
if the target object triggers an alarm operation, taking at least one frame of image containing the target object as a target image, and adding, to the target object in the target image, an alarm identifier indicating the position information of the target object;
and outputting the target image.
In some possible embodiments, the method further comprises:
and if another target object that does not trigger the alarm operation exists in the target image, adding, to that other target object in the target image, a safety identifier indicating that it does not trigger the alarm operation.
In some possible embodiments, the method further comprises:
detecting a weather state of the monitoring area;
if the weather state is a preset weather state, acquiring the video to be processed by using a thermal imaging image acquisition device; the preset weather state indicates a natural phenomenon in which a substance in the environment of the monitoring area occludes the target object;
and if the weather state is not the preset weather state, acquiring the video to be processed by using a visible-light image acquisition device.
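The weather-based source selection can be sketched as a simple dispatch; the weather labels and the function name are assumptions for illustration, not part of the disclosure:

```python
# Weather states in which a substance in the environment occludes the target
OCCLUDING_WEATHER = {"rain", "snow", "fog", "haze"}

def select_capture_device(weather_state):
    """Use the thermal imaging acquisition device under occluding weather,
    and the visible-light acquisition device otherwise."""
    return "thermal" if weather_state in OCCLUDING_WEATHER else "visible"
```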
In a second aspect, an embodiment of the present application provides a monitoring apparatus, including:
the target detection module is used for carrying out target detection on the video to be processed to obtain a detection result of a target object in the monitoring area;
and the alarm triggering module is used for triggering alarm operation if the detection result meets the preset alarm condition of the monitoring area and the monitoring area contains the target object according to the thermal imaging image of the target object.
In some possible embodiments, the detection result of the target detection module includes position information of the target object in the video, and the alarm triggering module includes:
a region clipping unit, configured to clip an image area corresponding to the position information from the thermal imaging image according to the position information of the target object in the video;
a thermal radiation energy detection unit, configured to detect the thermal radiation energy of the image area;
and a target object determining unit, configured to determine that the thermal imaging image contains the target object if the image area contains a detection target whose thermal radiation energy is greater than a preset energy threshold.
In some possible embodiments, for tripwire detection, the detection result of the alarm triggering module includes position information of the target object and movement direction information of the target object, and the preset alarm condition of the tripwire detection is configured to:
the movement direction information of the target object is a specified movement direction, and the position information of the target object intersects a tripwire.
In some possible embodiments, for area detection, the detection result of the alarm triggering module includes position information of the target object, and the preset alarm condition of the area detection is configured to:
the position information of the target object intersects the warning area where passage is prohibited.
In some possible embodiments, the detection result of the alarm triggering module includes position information of the target object and motion state information of the target object, and when a forbidden-stop area where the target object is prohibited from stopping exists in the monitoring area, the alarm triggering module determines whether the detection result meets the preset alarm condition, and is configured to:
if the position information of the target object intersects the forbidden-stop area and the target object is in a static state, determine that the target object meets the preset alarm condition;
and if the position information of the target object does not intersect the forbidden-stop area and/or the target object is not in a static state, determine that the target object does not meet the preset alarm condition.
In some possible embodiments, the apparatus further comprises:
the target image output module is configured to, if the target object triggers an alarm operation, take at least one frame of image containing the target object as a target image, and add, to the target object in the target image, an alarm identifier indicating the position information of the target object;
and outputting the target image.
In some possible embodiments, the target image output module further comprises:
and a safety identifier adding unit, configured to, if another target object that does not trigger the alarm operation exists in the target image, add to that other target object a safety identifier indicating that it does not trigger the alarm operation.
In some possible embodiments, the apparatus is further configured to:
detecting a weather state of the monitoring area;
if the weather state is a preset weather state, acquire the video to be processed by using a thermal imaging image acquisition device; the preset weather state indicates a natural phenomenon in which a substance in the environment of the monitoring area occludes the target object;
and if the weather state is not the preset weather state, acquire the video to be processed by using a visible-light image acquisition device.
In a third aspect, another embodiment of the present application also provides an electronic device, including at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform any monitoring method provided by the embodiments of the present application.
In a fourth aspect, another embodiment of the present application further provides a computer storage medium storing a computer program for causing a computer to execute any one of the monitoring methods provided in the embodiments of the present application.
According to the method and the device, target detection is performed on the target object in the to-be-processed video of the monitoring area, and if the resulting detection result meets the preset alarm condition, a thermal imaging image of the target object in the detection result is obtained. If it is determined from the thermal imaging image that a target object meeting the preset alarm condition exists in the monitoring area, an alarm operation is triggered. In this way, the target object in the monitoring area can be monitored accurately, and false detections arising from target detection can be reduced or even avoided.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments of the present application will be briefly described below, and it is obvious that the drawings that are described below are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic illustration of an application environment according to one embodiment of the present application;
FIG. 2a is a flow chart of a monitoring method according to one embodiment of the present application;
FIG. 2b is a schematic diagram of tripwire detection according to an embodiment of the present application;
FIG. 2c is a schematic diagram of region detection according to one embodiment of the present application;
FIG. 2d is a diagram showing an alert operation interface according to one embodiment of the present application;
FIG. 3 is an overall flow chart in a ship monitoring scenario according to one embodiment of the present application;
FIG. 4 is a schematic structural diagram of a monitoring apparatus according to one embodiment of the present application;
fig. 5 is a block diagram of an electronic device according to one embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may indicate the three cases where A exists alone, A and B exist together, or B exists alone. In addition, in the description of the embodiments of the present application, "plural" means two or more.
In the description of the embodiments of the present application, unless otherwise indicated, the term "plurality" refers to two or more. It should be understood that the preferred embodiments described herein are only for illustrating and explaining the present application and are not intended to limit it, and that the embodiments of the present application and the features of the embodiments may be combined with each other where no conflict arises.
To further explain the technical solutions provided in the embodiments of the present application, details are described below with reference to the accompanying drawings and the specific embodiments. Although the embodiments of the present application provide the method operation steps shown in the following embodiments or figures, the methods may include more or fewer steps based on routine or non-inventive labor. For steps with no logically necessary causal relationship, the execution order is not limited to that provided by the embodiments of the present application. When the methods are executed in an actual process or by a control device, they may be performed sequentially or in parallel as shown in the embodiments or the drawings.
The inventive concept of the present application is as follows: the target object in the monitoring area is monitored in real time, and the monitoring video is processed frame by frame. Target detection is performed on each frame of image; if the detection result shows that the frame image meets the preset alarm condition, the position of the target object meeting the preset alarm condition in that frame image is acquired. Whether a target object exists at that position in the thermal imaging image is then judged against a preset energy threshold, and if so, the alarm operation is triggered. In this way, the target object in the monitoring area can be monitored accurately, and, after determining that the target object meets the preset alarm condition, the target object is confirmed against its thermal imaging image, avoiding the waste of human resources caused by alarming on objects such as driftwood and reefs mistaken for the target object.
In addition, when the target object is a ship, erroneous recognition of the water surface area in a ship monitoring scene may result in an object such as a house on the shore being taken as a target ship. Therefore, an exclusion zone is set within the monitoring area. When target detection is performed on an image, if a target ship meeting the preset alarm condition is detected and its position falls within the exclusion zone, the result is directly excluded.
The following describes the monitoring method in the embodiment of the present application in detail with reference to the accompanying drawings.
Referring to fig. 1, a schematic diagram of an application environment according to one embodiment of the present application is shown.
As shown in fig. 1, the application environment may include, for example, a network 10, a server 20, at least one monitoring device 30, and a terminal device 40, wherein the monitoring device 30 monitors the monitored area in real time and transmits the monitoring video to the server 20 through the network 10.
The terminal device 40 may send a monitoring image acquisition request to the server 20; in response to the request, the server 20 issues an image transmission command to the monitoring device 30 through the network 10, and the monitoring device 30 uploads images of the monitored area in real time in response to the command. The images uploaded in real time are displayed on the terminal device 40 through the network 10.
In some possible embodiments, the monitoring device 30 monitors the monitored area and sends the monitoring video in real time to the server 20 via the network 10. After receiving the monitoring video, the server 20 processes it frame by frame and, for each frame of image, detects through target detection whether a target object meeting the preset alarm condition exists in the monitoring area. If the detection result shows that such a target object exists in the frame image, the target object is marked and the frame image is uploaded to the terminal device 40 through the network 10 for monitoring personnel to query.
In the description herein, only a single server or terminal device is described in detail, but it should be understood by those skilled in the art that the illustrated server 20, monitoring device 30, and terminal device 40 represent the servers, monitoring devices, and terminal devices for displaying a monitoring screen involved in the technical solution of the present application. The individual servers and monitoring devices are described in detail for ease of illustration only, and do not imply limitations on the number, type, or location of terminal devices and servers.
In the following, the monitoring method of the present application is illustrated by ship detection; it should be clear that the target object monitored in the present application may be not only a ship but also a pedestrian, an animal, a vehicle, or the like.
The inventor found that in the prior art, ships in a sea-surface monitoring area are monitored by infrared thermal imaging technology and microwave imaging radar technology. When monitoring is based on microwave imaging radar, radar imaging is disturbed by rain and snow environments, which degrades the ability to resolve ships. Infrared thermal imaging technology can greatly reduce the influence of weather factors such as rain and snow, but because the image features of a thermal imaging image are not as rich as those of a visible-light image, false detections can occur when monitoring a sea surface with complex terrain. To solve the above problems, the inventor considered that a multispectral camera has a plurality of filters and can simultaneously obtain a visible-light image and a thermal imaging image of the monitoring area through combinations of filters or beamsplitters and various photosensitive films. Based on these characteristics, the multispectral camera is used for all-weather monitoring of the monitoring area: when it is daytime and the weather condition of the monitoring area is good, the visible-light video collected by the multispectral camera is used as the monitoring video, and target detection is performed on it; when it is not daytime and/or the weather condition of the monitoring area is poor, the thermal imaging video collected by the multispectral camera is used as the monitoring video, and target detection is performed on it. Fig. 2a shows a flow chart of a monitoring method according to an embodiment of the present application, including:
Step 201: and carrying out target detection on the video to be processed to obtain a detection result of the target object in the monitoring area.
When ships in a certain area of the sea need to be monitored, the common monitoring scenes can be summarized as: the monitoring area allows ships to pass in one direction only; the monitoring area prohibits ships from traveling; and the monitoring area prohibits berthing. Therefore, when target detection is performed on the surveillance video of a sea monitoring area, it is necessary to determine whether a ship exists in the monitoring area and the motion state of the ship. To ensure accurate identification of the ship and its motion state, in implementation the monitoring video is processed frame by frame, each frame of image is input into a CNN (Convolutional Neural Network) model storing a large number of ship features, and a target whose image features match the ship features stored in the CNN model with a confidence of 90% or more is determined to be a target ship. After the target ship is determined, dynamic detection is performed on the frame image and its adjacent frame images (the position information of the target ship in the adjacent frame images is determined, and the target ship is then determined to be in a moving or static state).
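The 90%-confidence filtering step can be sketched as follows. The detection tuples and the threshold default are illustrative assumptions; the CNN model itself is out of scope here:

```python
def filter_target_ships(detections, confidence_threshold=0.9):
    """Keep only detections whose matching confidence against the stored
    ship features reaches the threshold. Each detection is (bbox, confidence)."""
    return [det for det in detections if det[1] >= confidence_threshold]
```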
The inventor considered application scenarios in which only one-way passage is allowed in a certain area of the sea, where, for example, the navigation track of a ship needs to be acquired. In implementation, the sailing direction of the ship can be obtained by comparing multiple frames of images, and this direction is used as the navigation track of the target ship.
The inventor also considered application scenarios such as the presence of a no-berthing area in the sea, where the time the vessel spends in that area must be acquired. Therefore, after determining the target vessel and its motion state, timing information is added to each frame image. For example, a section of the monitoring video is processed frame by frame into 4000 frame images, and a target ship is found to be parked in the no-berthing area from the 6th frame image to the 3906th frame image. Timing information, such as the year, month, day, hour, minute, and second, is added to each of the 6th to 3906th frame images. The time the target vessel is moored in the no-berthing area can then be determined from the difference between the timing information in the 6th and 3906th frame images, and a fine can be imposed on the target vessel based on that time.
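The mooring time in the example follows directly from the frame indices and the frame rate. The 25 fps figure below is an assumption; the patent gives only the frame numbers:

```python
def mooring_seconds(first_frame, last_frame, fps):
    """Dwell time implied by the difference between the timing information
    of the first and last frames showing the moored vessel."""
    return (last_frame - first_frame) / fps

# At an assumed 25 fps, frames 6 through 3906 span (3906 - 6) / 25 = 156 seconds.
```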
After determining the target ship and related information in the image, it is necessary to detect whether the target ship meets a preset alarm condition. The application can set corresponding detection rules for the common application scenes of ship monitoring, namely a tripwire detection rule, an area detection rule, and a forbidden-stop detection rule. How target detection is performed according to the three detection rules is described below, covering the following scenes:
Scene 1: tripwire detection
Tripwire detection determines, when the monitoring region in the sea allows ships to pass in only one direction, whether a ship in that region is sailing against the permitted direction, as shown in fig. 2b.
In implementation, the monitoring video of the monitoring area is first acquired and processed frame by frame, and target detection is performed on each frame of image. If the center point of the target ship is detected to intersect the tripwire in a certain frame image, the position information of the target ship in a preset number of frame images before and after that frame is acquired, and the navigation track of the target ship is determined according to the position information of the target ship in the multiple frame images. If the navigation track shows the target ship sailing from the prohibited direction and the target ship intersects the tripwire, the preset alarm condition for tripwire intrusion detection is triggered.
Scene 2: area detection
Area intrusion detection applies when a certain region of the sea is an alarm area through which ships are prohibited from passing: ships in the region are detected, and whether the movement amplitude of a ship within the alarm area reaches one body length is determined from the ship's center point, in order to judge whether the preset alarm condition is met.
In implementation, as shown in fig. 2c, after the monitoring video is processed frame by frame, target detection is performed on each frame of video image. If a ship is detected in the alarm area in a certain frame image, the position information of the target ship in a preset number of frame images after a delay time is acquired, and the center point of the target ship is determined from that position information. A target circular area is then intercepted with the center point of the target ship as the center and one body length of the ship as the radius. If the center point of the target ship in the alarm area is later detected outside the target circular area, the preset alarm condition for area intrusion is triggered.
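The one-body-length movement test reduces to comparing the displacement of the center point with the ship length used as the circle radius. A sketch, with all names assumed:

```python
import math

def left_target_circle(center_before, center_after, body_length):
    """True when the later center point falls outside the circle whose
    center is the earlier center point and whose radius is one body length."""
    dx = center_after[0] - center_before[0]
    dy = center_after[1] - center_before[1]
    return math.hypot(dx, dy) > body_length
```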
Scene 3: forbidden stop detection
Forbidden-stop detection applies when a certain region of the sea is an area where ships are prohibited from berthing, and ships in that region are detected. In implementation, the forbidden-stop area is monitored, the monitoring video is processed frame by frame, and target detection is performed on each frame of image. If a target ship is detected in the forbidden-stop area in a certain frame image, the position information of the target ship in a preset number of frame images after a delay time is acquired. Whether the target ship is in a berthing state is determined from the position information; if the target ship is in a static berthing state, the stay time of the target ship in the forbidden-stop area is determined from the timing information on each frame image. If the stay time is greater than a preset stay time, the preset alarm condition for forbidden-stop detection is triggered.
Step 202: and if the detection result meets the preset alarm condition of the monitoring area and the monitoring area contains the target object according to the thermal imaging image of the target object, triggering alarm operation.
The inventor considers that, when identifying ships in a monitoring area, an identified target ship may not be an actual ship owing to the influence of objects such as reefs and driftwood. However, any object with a temperature above absolute zero radiates infrared light (the higher the temperature, the stronger the radiation), and when a ship is under way its running engine is at a high temperature, so the engine area may be much hotter than the rest of the vessel. The target ship region in the thermal imaging image can therefore be analyzed by its thermal radiation energy: if the thermal radiation energy at some position in the target ship region is greater than a preset energy threshold and far higher than that at the other positions in the region, it is determined that an actual ship exists in the target ship region. It should be noted that the order between the processing of the thermal imaging image in the above flow (identifying the thermal radiation energy of the target ship region) and the processing of the surveillance video (processing it frame by frame and performing target detection on each frame of image) is not limited. For example, the thermal radiation energy of each region in the thermal imaging image may be identified while target detection is performed on each frame of image.
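The engine-hotspot test described above can be sketched as follows. The `ratio` parameter, standing in for "far higher than the other positions" as a multiple of the region's median, is an assumption, as are all names; the patent does not give concrete values.

```python
def region_contains_vessel(thermal_roi, energy_thresh, ratio=3.0):
    # thermal_roi: 2-D list of per-pixel thermal radiation energy values
    # cut out around the candidate ship region.
    values = sorted(v for row in thermal_roi for v in row)
    peak = values[-1]
    background = values[len(values) // 2]  # median as the region baseline
    # A real vessel shows a hotspot above the absolute threshold AND far
    # above the rest of the region (the engine area).
    return peak > energy_thresh and peak > ratio * max(background, 1e-6)
```

A uniformly warm region (e.g. a sunlit reef) fails the contrast test even when it clears the absolute threshold, which is the point of the second condition.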
In some possible embodiments, while target detection is performed on a frame image, whether that frame is a visible light image or a thermal imaging image, the thermal imaging image corresponding to the frame is backed up and stored. If the target detection result for the frame meets the preset alarm condition, the region of the target ship that meets the condition is determined in the frame, the corresponding region where the target ship is located is marked in the thermal imaging image corresponding to the frame, and the thermal radiation energy of that region of the thermal imaging image is detected. If the detection result shows that the thermal radiation energy at some position in the region is greater than the preset energy threshold and far higher than that at the other positions in the region, it is determined that the target ship exists in the region.
In addition, the inventors consider that, when identifying ships in a monitoring area, misidentification of the water surface area may cause objects such as shore houses to be treated as ships. Since such objects may also have thermal radiation energy above the preset energy threshold, they cannot be effectively distinguished from ships by thermal radiation alone. A rule exclusion zone is therefore further set for the monitoring area: if a target ship that meets the preset alarm condition is detected inside this zone, the detection result is automatically ignored.
In some possible embodiments, the terrain environment of the monitoring area is surveyed in advance, the monitoring picture is divided into coordinates, the regions containing objects that cannot be effectively distinguished from ships (such as houses) are marked, and the marked regions are used as rule exclusion zones. If the target detection result for a certain frame of image is that a target ship in the image meets the preset alarm condition, the position information of the target ship is obtained; if the target ship is inside a rule exclusion zone in the image, the detection result is ignored and no processing is performed.
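The exclusion-zone check reduces to asking whether the ship's center lies inside a marked polygon. The patent does not specify a geometry test, so below is a standard ray-casting point-in-polygon sketch with illustrative names.

```python
def in_exclusion_zone(point, polygon):
    # polygon: list of (x, y) vertices of one marked rule exclusion zone.
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does the horizontal ray from (x, y) toward +x cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside  # each crossing toggles inside/outside
    return inside
```

In practice a library routine (e.g. an OpenCV polygon test) would be used instead, but the toggle-on-crossing idea is the same.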
Through the above flow, it can be effectively determined whether a target ship that meets the preset alarm condition in target detection is an actual ship; if so, the alarm operation is triggered. When the alarm operation is triggered, an alarm identifier is added to the target ship triggering it, and the image corresponding to the target ship is output. Specifically, as shown in fig. 2d, the target ship triggering the alarm operation is marked with a red frame and the sound-and-light linkage alarm is started, prompting supervisory personnel by sound and flashing lamps that a ship triggering the alarm operation has appeared in the monitoring area. In addition, if there are other vessels in the output image that did not trigger the alarm operation, they may be marked with a green frame for ease of viewing.
In some possible embodiments, in order to more conveniently confirm information about the ship triggering the alarm operation, the multispectral camera switches to the visible light mode in response to the alarm operation, monitors the monitoring area, and feeds back the monitoring picture in real time. When the monitoring picture is fed back, the target ship is confirmed in the picture according to the ship features acquired during target detection, the picture is centered on the target ship, and the target ship is magnified to a specified magnification.
Having described the monitoring method provided herein, to facilitate understanding of the overall flow of the present application, another flow diagram of the monitoring method is shown; specifically, as shown in fig. 3, it includes the following steps:
in step 301, the weather condition of the monitored area and the current time are determined. If the weather condition of the monitored area is good and the time is within the preset period, step 3011 is executed to obtain the monitoring video from the multispectral camera in visible light mode. If the weather condition of the monitored area is poor and/or the time is not within the preset period, step 3012 is executed to obtain the monitoring video from the multispectral camera in thermal imaging mode.
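The branch in step 301 can be sketched as a small mode selector. The weather labels and the boolean daytime-window flag are illustrative assumptions; the patent only says "good weather within the preset period" selects visible light.

```python
OCCLUDING = frozenset({"fog", "rain", "haze", "snow"})

def pick_capture_mode(weather, in_daytime_window, occluding=OCCLUDING):
    # Visible light only when nothing occludes the target and the current
    # time is within the preset (daytime) period; otherwise thermal imaging.
    if weather not in occluding and in_daytime_window:
        return "visible"   # step 3011: visible light mode
    return "thermal"       # step 3012: thermal imaging mode
```

Either branch feeds the same frame-by-frame detection in step 302; only the capture mode differs.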
After the surveillance video is acquired, step 302 is performed: the surveillance video is processed frame by frame and target detection is performed on each frame of image. During target detection, step 3021 is performed to determine whether the detection result satisfies the preset alarm condition. If it does not, target detection proceeds to the next frame of image. If it does, step 3022 is executed to determine the position information of the target ship in the frame image and, from that position information, the position of the target ship in the thermal imaging image corresponding to the frame.
Next, step 303 is executed to determine whether the target ship is in the rule exclusion zone. If the target ship is in a preset rule exclusion zone, the detection result is ignored and target detection continues with the next frame of image. If it is not, step 304 is executed to determine whether the target ship is an actual ship: the thermal radiation energy of the ship's position area in the thermal imaging image obtained in step 3022 is detected, and if the thermal radiation energy at some position in that area is greater than the preset energy threshold and far higher than that at the other positions in the area, the target ship is determined to be an actual ship, step 305 is executed, and the alarm operation is triggered.
It should be noted that the determinations of whether the detection result satisfies the preset alarm condition (step 3021), whether the target ship is in the rule exclusion zone (step 303), and whether the target ship is an actual ship (step 304) may be performed simultaneously, or the execution order of these steps may be arranged according to the actual situation; this application does not limit it.
Based on the same inventive concept, the present application further provides a monitoring apparatus 400, as shown in fig. 4, including:
the target detection module 401 is configured to perform target detection on a video to be processed, so as to obtain a detection result of a target object in the monitoring area;
and the alarm triggering module 402 is configured to trigger an alarm operation if the detection result meets a preset alarm condition of the monitoring area and it is determined that the monitoring area contains the target object according to the thermal imaging image of the target object.
In some possible embodiments, the detection result of the target detection module includes location information of the target object in the video, and the alarm triggering module includes:
the region clipping unit is used for clipping an image region corresponding to the position information from the thermal imaging image according to the position information of the target object in the video;
A heat radiation energy detection unit for detecting heat radiation energy of the image area;
and the target object determining unit is used for determining that the thermal imaging image contains the target object if the image area contains a detection target with thermal radiation energy larger than a preset energy threshold.
In some possible embodiments, for a trip wire detection, the detection result of the alarm triggering module includes location information of the target object and movement direction information of the target object, and the preset alarm condition of the trip wire detection is configured to:
the movement direction information of the target object is a specified movement direction, and the position information of the target object intersects a tripwire.
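The tripwire condition above (the displacement of the target crosses the wire and agrees with the specified direction) might be sketched as below. The dot-product sign test is an assumed simplification of "specified movement direction", and all names are illustrative.

```python
def crossed_tripwire(p_prev, p_curr, wire, allowed_dir):
    # p_prev / p_curr: target center in consecutive detections (x, y).
    # wire: ((x1, y1), (x2, y2)) endpoints of the tripwire segment.
    # allowed_dir: (dx, dy) vector of the permitted crossing direction.
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    a, b = wire
    # Proper segment intersection: endpoints on strictly opposite sides.
    d1, d2 = cross(a, b, p_prev), cross(a, b, p_curr)
    d3, d4 = cross(p_prev, p_curr, a), cross(p_prev, p_curr, b)
    intersects = (d1 * d2 < 0) and (d3 * d4 < 0)
    # Direction agreement via the sign of the dot product.
    move = (p_curr[0]-p_prev[0], p_curr[1]-p_prev[1])
    same_dir = move[0]*allowed_dir[0] + move[1]*allowed_dir[1] > 0
    return intersects and same_dir
```

For a vertical wire, a left-to-right displacement with `allowed_dir=(1, 0)` triggers, while the same crossing with `allowed_dir=(-1, 0)` does not.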
In some possible embodiments, for area detection, the detection result of the alarm triggering module includes location information of the target object, and the preset alarm condition of the area detection is configured to:
the position information of the target object is intersected with the warning area which is forbidden to pass.
In some possible embodiments, the detection result of the alarm triggering module includes location information of the target object and motion state information of the target object, and when a forbidden region where the target object is forbidden to stop exists in the monitoring region, the alarm triggering module determines whether the detection result meets the preset alarm condition, and is configured to:
And if the position information of the target object is intersected with the forbidden region and the target object is in a static state, determining that the target object meets the preset alarm condition.
And if the position information of the target object does not intersect the forbidden region and/or the target object is not in a static state, determining that the target object does not meet the preset alarm condition.
In some possible embodiments, the apparatus further comprises:
the target image output module is used for taking at least one frame of image containing the target object as a target image if the target object triggers an alarm operation, and adding an alarm mark for prompting the position information of the target object to the target object in the target image;
and outputting the target image.
In some possible embodiments, the target image output module further comprises:
and the security identification adding unit is used for adding a security identification for indicating that the other target object does not trigger the alarm operation to the other target object if the other target object which does not trigger the alarm operation exists in the target image.
In some possible embodiments, the apparatus is further configured to:
detecting a weather state of the monitoring area;
if the weather state is a preset weather state, acquiring the video to be processed by using a thermal imaging image acquisition device; the preset weather state indicates a natural phenomenon in which a substance shielding the target object exists in the environment where the monitoring area is located;
and if the weather state is not the preset weather state, acquiring the video to be processed by using a visible light image acquisition device.
For the implementation of each operation in the monitoring apparatus, reference may be made to the description of the foregoing method, which is not repeated here.
Having described the monitoring method and apparatus of the exemplary embodiments of the present application, next, an electronic device according to another exemplary embodiment of the present application is described.
Those skilled in the art will appreciate that the various aspects of the present application may be implemented as a system, method, or program product. Accordingly, aspects of the present application may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
In some possible implementations, an electronic device according to the present application may include at least one processor, and at least one memory. The memory stores therein program code that, when executed by the processor, causes the processor to perform the steps in the monitoring method according to various exemplary embodiments of the present application described above in this specification. For example, the processor may perform steps as in a monitoring method.
An electronic device 130 according to this embodiment of the present application is described below with reference to fig. 5. The electronic device 130 shown in fig. 5 is only an example and should not be construed as limiting the functionality and scope of use of embodiments of the present application.
As shown in fig. 5, the electronic device 130 is in the form of a general-purpose electronic device. Components of electronic device 130 may include, but are not limited to: the at least one processor 131, the at least one memory 132, and a bus 133 connecting the various system components, including the memory 132 and the processor 131.
Bus 133 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a processor or local bus using any of a variety of bus architectures.
Memory 132 may include readable media in the form of volatile memory such as Random Access Memory (RAM) 1321 and/or cache memory 1322, and may further include Read Only Memory (ROM) 1323.
Memory 132 may also include a program/utility 1325 having a set (at least one) of program modules 1324, such program modules 1324 including, but not limited to: an operating system, one or more application programs, other program modules, and program data; each or some combination of these examples may include an implementation of a network environment.
The electronic device 130 may also communicate with one or more external devices 134 (e.g., keyboard, pointing device, etc.), with one or more devices that enable a user to interact with the electronic device 130, and/or with any device (e.g., router, modem, etc.) that enables the electronic device 130 to communicate with one or more other electronic devices. Such communication may occur through an input/output (I/O) interface 135. Also, electronic device 130 may communicate with one or more networks, such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet, through network adapter 136. As shown, network adapter 136 communicates with the other modules of electronic device 130 over bus 133. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 130, including, but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
In some possible embodiments, aspects of the monitoring method provided herein may also be implemented in the form of a program product comprising program code which, when the program product is run on a computer device, causes the computer device to carry out the steps in the monitoring method according to the various exemplary embodiments of the application described herein above.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, Random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product for monitoring of embodiments of the present application may employ a portable compact disc read only memory (CD-ROM) and include program code and may run on an electronic device. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++ as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the consumer electronic device, partly on the consumer electronic device as a stand-alone software package, partly on the consumer electronic device and partly on a remote electronic device, or entirely on the remote electronic device or server. In the case of a remote electronic device, the remote electronic device may be connected to the consumer electronic device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external electronic device (e.g., through the Internet using an Internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such a division is merely exemplary and not mandatory. Indeed, the features and functions of two or more of the elements described above may be embodied in one element in accordance with embodiments of the present application. Conversely, the features and functions of one unit described above may be further divided into a plurality of units to be embodied.
Furthermore, although the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or suggest that these operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be decomposed into multiple steps.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flowchart and/or block of the flowchart and block diagrams, and combinations of flowcharts and block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the spirit or scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (14)

1. A method of monitoring, the method comprising:
Performing target detection on the video to be processed to obtain a detection result of a target object in the monitoring area;
if the detection result meets the preset alarm condition of the monitoring area and the monitoring area contains the target object according to the thermal imaging image of the target object, triggering alarm operation;
the detection result comprises the position information of the target object in the video, and the determination that the monitoring area contains the target object according to the thermal imaging image of the target object comprises:
Cutting out an image area corresponding to the position information from the thermal imaging image according to the position information of the target object in the video;
detecting thermal radiation energy of the image area; if the image area contains a detection target with heat radiation energy larger than a preset energy threshold value and the image area is not in a preset rule exclusion area, determining that the thermal imaging image contains the target object; wherein the video to be processed is determined by:
detecting a weather state of the monitoring area;
if the weather state is a preset weather state, acquiring the video to be processed by adopting a thermal imaging image acquisition state; otherwise, acquiring the video to be processed by adopting a visible light image acquisition device; the preset weather state characterizes the natural phenomenon that substances shielding the target object exist in the environment where the monitoring area is located.
2. The method of claim 1, wherein for a trip wire detection, the detection result includes location information of the target object and movement direction information of the target object, the preset alarm condition for the trip wire detection includes:
the movement direction information of the target object is a specified movement direction, and the position information of the target object intersects a tripwire.
3. The method according to claim 1, wherein for area detection, the detection result includes location information of the target object, and the preset alarm condition of the area detection includes:
the position information of the target object is intersected with the warning area which is forbidden to pass.
4. The method according to claim 1, wherein the detection result includes position information of the target object and movement state information of the target object, and when a forbidden region where the target object is forbidden to stop exists in the monitoring region, determining whether the detection result meets the preset alarm condition includes:
if the position information of the target object is intersected with the forbidden stop area and the target object is in a static state, determining that the target object meets the preset alarm condition;
And if the position information of the target object does not intersect the forbidden region and/or the target object is not in a static state, determining that the target object does not meet the preset alarm condition.
5. The method according to claim 1, wherein the method further comprises:
if the target object triggers an alarm operation, taking at least one frame of image containing the target object as a target image, and adding an alarm mark for prompting the position information of the target object to the target object in the target image;
and outputting the target image.
6. The method of claim 5, wherein the method further comprises:
and if other target objects which do not trigger the alarm operation exist in the target image, adding a safety identifier for indicating that the other target objects do not trigger the alarm operation to the other target objects in the target image.
7. A monitoring device, the device comprising:
the target detection module is used for carrying out target detection on the video to be processed to obtain a detection result of a target object in the monitoring area;
the alarm triggering module is used for triggering alarm operation if the detection result meets the preset alarm condition of the monitoring area and the monitoring area contains the target object according to the thermal imaging image of the target object;
The detection result of the target detection module comprises the position information of the target object in the video, and the alarm triggering module comprises:
The region clipping unit is used for clipping an image region corresponding to the position information from the thermal imaging image according to the position information of the target object in the video;
a heat radiation energy detection unit for detecting heat radiation energy of the image area;
a target object determining unit, configured to determine that the thermal imaging image contains the target object if the image area contains a detection target whose thermal radiation energy is greater than a preset energy threshold and the image area is not in a preset rule exclusion area; wherein the video to be processed is determined by:
detecting a weather state of the monitoring area;
if the weather state is a preset weather state, acquiring the video to be processed by adopting a thermal imaging image acquisition state; otherwise, acquiring the video to be processed by adopting a visible light image acquisition device; the preset weather state characterizes the natural phenomenon that substances shielding the target object exist in the environment where the monitoring area is located.
8. The apparatus of claim 7, wherein for a trip wire detection, the detection result of the alarm triggering module includes position information of the target object and movement direction information of the target object, the preset alarm condition of the trip wire detection is configured to:
the movement direction information of the target object is a specified movement direction, and the position information of the target object intersects a tripwire.
9. The apparatus of claim 7, wherein for zone detection, the detection result of the alert triggering module includes location information of the target object, and the preset alert condition of the zone detection is configured to:
the position information of the target object is intersected with the warning area which is forbidden to pass.
10. The apparatus of claim 7, wherein the detection result of the alarm triggering module includes location information of the target object and motion state information of the target object, and when a forbidden region where the target object is forbidden to stop exists in the monitored region, the alarm triggering module determines whether the detection result meets the preset alarm condition, and is configured to:
If the position information of the target object is intersected with the forbidden stop area and the target object is in a static state, determining that the target object meets the preset alarm condition;
and if the position information of the target object does not intersect the forbidden region and/or the target object is not in a static state, determining that the target object does not meet the preset alarm condition.
11. The apparatus of claim 7, wherein the apparatus further comprises:
the target image output module is used for taking at least one frame of image containing the target object as a target image if the target object triggers an alarm operation, and adding an alarm mark for prompting the position information of the target object to the target object in the target image;
and outputting the target image.
12. The apparatus of claim 11, wherein the target image output module further comprises:
a safety mark adding unit configured to, if another target object that does not trigger the alarm operation exists in the target image, add to that other target object a safety mark indicating that it does not trigger the alarm operation.
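The marking step of claims 11 and 12 — an alarm mark on triggering targets, a safety mark on non-triggering ones — might look like the following toy sketch on a character grid. A real implementation would draw colored bounding boxes on pixel data (e.g. with an image library); the grid representation and all names here are hypothetical:

```python
def draw_mark(image, box, mark):
    """Outline box (x1, y1, x2, y2) on a 2-D grid of characters with `mark`."""
    x1, y1, x2, y2 = box
    for x in range(x1, x2 + 1):
        image[y1][x] = mark  # top edge
        image[y2][x] = mark  # bottom edge
    for y in range(y1, y2 + 1):
        image[y][x1] = mark  # left edge
        image[y][x2] = mark  # right edge

def annotate_frame(image, detections):
    # detections: list of (box, triggered_alarm) pairs for one frame
    for box, triggered in detections:
        draw_mark(image, box, "A" if triggered else "S")  # alarm vs safety mark
    return image
```

Annotating one frame with an alarming target and a safe one leaves the box outlines labelled "A" and "S" respectively, while interior cells stay untouched.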
13. An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-6.
14. A computer storage medium, wherein the computer storage medium stores a computer program for causing a computer to perform the method of any one of claims 1-6.
CN202110162895.9A 2021-02-05 2021-02-05 Monitoring method, monitoring device, electronic equipment and storage medium Active CN112883856B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110162895.9A CN112883856B (en) 2021-02-05 2021-02-05 Monitoring method, monitoring device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112883856A CN112883856A (en) 2021-06-01
CN112883856B true CN112883856B (en) 2024-03-29

Family

ID=76055845

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110162895.9A Active CN112883856B (en) 2021-02-05 2021-02-05 Monitoring method, monitoring device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112883856B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115131740B (en) * 2022-08-30 2022-12-02 海易科技(北京)有限公司 Alarm information association method and device, electronic equipment and computer readable medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102426804A (en) * 2011-11-17 2012-04-25 浣石 Early warning system for protecting bridge from ship collision based on far-infrared cross thermal imaging
CN105141887A (en) * 2015-07-06 2015-12-09 国家电网公司 Submarine cable area video alarming method based on thermal imaging
CN207233223U (en) * 2017-06-30 2018-04-13 无锡市高桥检测科技有限公司 Bridge intelligent anti-collision system module connection structure
CN109901563A (en) * 2019-03-12 2019-06-18 中国能源建设集团广东省电力设计研究院有限公司 Marine wind electric field ship and the video frequency monitoring system of personnel
CN111414831A (en) * 2020-03-13 2020-07-14 深圳市商汤科技有限公司 Monitoring method and system, electronic device and storage medium
KR102161244B1 (en) * 2020-02-26 2020-09-29 (주)윈텍 Industrial equipment monitoring and alarm apparatus and its method


Also Published As

Publication number Publication date
CN112883856A (en) 2021-06-01

Similar Documents

Publication Publication Date Title
US20120269383A1 (en) Reliability in detecting rail crossing events
CN111738240A (en) Region monitoring method, device, equipment and storage medium
CN112776856A (en) Track foreign matter intrusion monitoring method, device and system and monitoring host equipment
CN111523362A (en) Data analysis method and device based on electronic purse net and electronic equipment
CN113721621B (en) Vehicle control method, device, electronic equipment and storage medium
CN112862821A (en) Water leakage detection method and device based on image processing, computing equipment and medium
KR20210135313A (en) Distracted Driving Monitoring Methods, Systems and Electronics
CN105632115A (en) Offshore oilfield security system
CN112883856B (en) Monitoring method, monitoring device, electronic equipment and storage medium
CN113780127A (en) Ship positioning and monitoring system and method
Chan Comprehensive comparative evaluation of background subtraction algorithms in open sea environments
WO2023279786A1 (en) Perimeter detection method and apparatus, and device and system
CN116229688A (en) Engineering construction safety risk early warning method and system
CN116311727A (en) Intrusion response method, device, equipment and readable storage medium
CN112346078A (en) Ship superelevation detection method and device, electronic equipment and storage medium
US20210181122A1 (en) Close object detection for monitoring cameras
KR102456190B1 (en) Black box system for offshore fishing vessels
KR102479959B1 (en) Artificial intelligence based integrated alert method and object monitoring device
CN105355090A (en) Ship information correlation method
CN112949359A (en) Convolutional neural network-based abnormal behavior identification method and device
CN111812610B (en) Water target supervision system, method, terminal equipment and storage medium
CN102740107A (en) Damage monitoring system of image surveillance equipment and method
CN111918032B (en) Unmanned ship-based overwater supervision method, system, device and intelligent equipment
CN116740874A (en) Intrusion detection method and related device
CN113642509A (en) Garbage bin overflow state detection method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230824

Address after: Room 201, Building A, Integrated Circuit Design Industrial Park, No. 858, Jianshe 2nd Road, Economic and Technological Development Zone, Xiaoshan District, Hangzhou City, Zhejiang Province, 311215

Applicant after: Zhejiang Huagan Technology Co.,Ltd.

Address before: Hangzhou City, Zhejiang province Binjiang District 310053 shore road 1187

Applicant before: ZHEJIANG DAHUA TECHNOLOGY Co.,Ltd.

GR01 Patent grant
GR01 Patent grant