CN112543497A - Depth-perception safety monitoring system - Google Patents
Depth-perception safety monitoring system
- Publication number
- CN112543497A (application CN202011436849.5A)
- Authority
- CN
- China
- Prior art keywords
- intelligent box
- monitoring
- wireless
- data
- acquisition device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W52/00—Power management, e.g. TPC [Transmission Power Control], power saving or power classes
- H04W52/02—Power saving arrangements
- H04W52/0209—Power saving arrangements in terminal devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B19/00—Alarms responsive to two or more different undesired or abnormal conditions, e.g. burglary and fire, abnormal temperature and abnormal rate of flow
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/65—Control of camera operation in relation to power supply
- H04N23/651—Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/188—Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W84/00—Network topologies
- H04W84/18—Self-organising networks, e.g. ad-hoc networks or sensor networks
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Abstract
The invention provides a depth-perception safety monitoring system comprising monitoring nodes, a sink node, an intelligent box and image acquisition devices. The monitoring nodes and the sink node form a wireless sensor network that collects detection data from each monitored area and transmits it to the intelligent box. The intelligent box is wirelessly connected to the image acquisition devices and triggers the device covering the relevant area to capture video image data. An artificial intelligence algorithm packaged in the intelligent box analyzes and identifies the received audio and video data and outputs the result to a monitoring platform. By encapsulating the monitoring module and the recognition module, the system achieves high adaptability and operates independently of the original equipment systems; multi-sensor data fusion and machine-vision simulation of human perception improve the reliability of safety monitoring at unattended stations.
Description
Technical Field
The invention relates to the technical field of multidimensional intelligent detection systems, and in particular to a depth-perception safety monitoring system.
Background
High-sulfur gas fields are sour gas fields with a high hydrogen sulfide content; their surface gathering and transportation processes are complex, hydrogen sulfide is toxic, and production risk is therefore high. To ensure production safety, each well station must be equipped with a venting flare as a safety discharge facility. The venting-flare control system of a high-sulfur gas field is a small PLC-based control system that provides flame monitoring, flare ignition, automatic re-ignition after flameout and similar functions. Once well stations operate unattended, remote detection of the ignition state of the venting flare becomes particularly important.
With the spread of automation and digitization in engineering operations, most existing control systems are automated. However, conditions that arise only occasionally still require manual monitoring to ensure safe and normal operation, such as monitoring the liquid levels of square wells and desulfurization pools, supervising corrosion-inhibitor filling operations, and detecting hazards in the surroundings.
Existing monitoring is single-modal: data from different sources are not exchanged, human-like perception cannot be simulated, safety margins are insufficient, the monitoring system itself can miss events, and monitoring results remain uncertain.
Disclosure of Invention
To solve these problems, the invention provides a depth-perception safety monitoring system. By building a network of interacting sensors, the system simulates human vision, touch, hearing and other senses from several aspects, gathers the collected data and transmits it to an intelligent box packaged with artificial intelligence recognition algorithms. Using multi-sensor data fusion and machine vision technology, the data are cleaned and used for deep learning training, and the intelligent algorithm finally recognizes and outputs the event state, improving the accuracy and safety of conventional safety monitoring systems.
The specific technical scheme of the depth-perception safety monitoring system provided by the invention is as follows:
the system comprises monitoring nodes, a sink node, an intelligent box and an image acquisition device, wherein the monitoring nodes are wirelessly connected with the sink node, the sink node is remotely connected with the intelligent box through a wireless network, and the intelligent box is wirelessly connected with the image acquisition device;
the monitoring nodes use wireless sensors to acquire data and are distributed across different positions or pieces of equipment according to the detection type; each wireless sensor is also connected with a controller, and the controller can adjust the set threshold that determines when the sensor sends a trigger signal;
the sink node is a wireless sensor network node that forms a wireless detection network with the monitoring nodes, receives and aggregates the signals transmitted by the monitoring nodes, and forwards them to the intelligent box;
according to the received signal, the intelligent box triggers the image acquisition device in the corresponding area to acquire image or video data, which is transmitted back to the intelligent box; an encapsulated artificial intelligence algorithm based on a deep learning model is stored in the intelligent box and analyzes and processes the image or video to obtain a recognition result, and the intelligent box sends the output recognition result to a management platform.
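As a non-limiting illustration of the trigger logic described above, the following minimal Python sketch shows how a monitoring node might compare readings against the controller's adjustable threshold and forward a trigger toward the sink node; the names used here (SensorReading, send_to_sink) are assumptions for the sketch, not part of the disclosed system.

```python
from dataclasses import dataclass
import time

@dataclass
class SensorReading:
    node_id: str        # which monitoring node produced the reading
    sensor_type: str    # e.g. "vibration", "audio", "temperature"
    value: float        # measured value
    timestamp: float    # acquisition time (seconds since epoch)

def send_to_sink(message: dict) -> None:
    """Stand-in for the wireless link from a monitoring node to the sink node."""
    print("-> sink:", message)

def monitor_node(readings, threshold: float) -> None:
    """Node-side loop: forward a trigger whenever the adjustable threshold
    set by the controller is exceeded."""
    for r in readings:
        if r.value > threshold:
            send_to_sink({
                "type": "trigger",
                "node_id": r.node_id,
                "sensor_type": r.sensor_type,
                "value": r.value,
                "timestamp": r.timestamp,
            })

if __name__ == "__main__":
    demo = [SensorReading("vib-01", "vibration", 0.8, time.time()),
            SensorReading("vib-01", "vibration", 2.4, time.time())]
    monitor_node(demo, threshold=1.5)   # only the second reading triggers
```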
Further, the wireless sensor and the controller are integrated and packaged together, a fixing device is provided on the outside, and the assembly is mounted on field equipment in the corresponding detection area.
Furthermore, the image acquisition device is a wireless fixed-focus visible light camera or an infrared camera arranged in a field monitoring area; a low-power-consumption communication module and a control module are built into the camera, and after the communication module receives the trigger signal, the control module makes the camera acquire video image data of a preset duration.
Furthermore, the image acquisition device and the wireless sensor are each powered by an independent power supply.
Furthermore, the artificial intelligence algorithm in the intelligent box can perform face recognition and safety monitoring; the data set used to train the deep learning model comprises an open-source image material library and labeled video images collected on site, and a model database is stored in the intelligent box.
Furthermore, the intelligent box can be remotely connected to an automatic control system and converts alarm results into electrical signals that control the operation of equipment in the corresponding area.
The invention has the following beneficial effects:
1. By building a sensor network with information interaction, the system collects multiple kinds of data to simulate human senses; the detected data are gathered, analyzed and processed by the intelligent box packaged with artificial intelligence algorithms, and the final judgment is recognized and output. The system does not affect the operation of the original automation and safety monitoring systems, and the new information is used for cross-verification, reducing or eliminating the uncertainty of the original monitoring system.
2. The monitoring nodes use wireless sensors, and the sink node receives their data and forwards it to the intelligent box; the sink node is wirelessly connected to the monitoring nodes, and the intelligent box is further connected to wireless cameras. When an abnormality is detected, the camera matched to the corresponding area is triggered to collect video image data and transmit it to the intelligent box, where the packaged artificial intelligence algorithm fuses the gathered data and recognizes and outputs a final result. The system integrates data acquisition and data processing, is highly adaptable and widely applicable, is neither controlled by nor connected to any original system, is unaffected by fault points of the original system, and can work alongside systems from any original manufacturer.
3. The system can interface with an automatic control system: when an alarm occurs, the intelligent box converts the recognized output into electrical signals that automatically control the relevant equipment.
4. The artificial intelligence algorithms in the intelligent box are packaged together with a stored model database; the trained deep learning models are used for face recognition and safety monitoring, covering supervision of workers entering and leaving the site as well as hazard recognition in the monitored areas, and the packaged algorithms can be upgraded.
5. The monitoring nodes and cameras are all low-power wireless devices: the wireless cameras stay dormant most of the time and are woken through the attached low-power communication modules, and the sensors and cameras each have independent power supplies, so the system is not limited by the field environment and its power consumption is reduced.
Drawings
FIG. 1 is a schematic diagram of the system architecture of the present invention;
FIG. 2 is a schematic view of the safety monitoring control flow of the present invention;
FIG. 3 is a schematic diagram of the field supervision control flow of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely. The described embodiments are only some, not all, of the embodiments of the invention; all other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present invention.
The embodiment of the invention provides a depth-perception safety monitoring system comprising monitoring nodes, a sink node, an intelligent box and image acquisition devices, wherein the monitoring nodes are connected with the sink node through a network;
the monitoring nodes are wirelessly connected with the sink node and form a wireless node network for data interaction among themselves; the sink node is remotely connected with the intelligent box through a wireless network, and the intelligent box is wirelessly connected with the image acquisition devices. In this embodiment the image acquisition device is a wireless fixed-focus visible light or infrared camera, referred to below simply as a wireless camera.
The wireless cameras are installed in each monitored area and each supervised area of the work site; the cameras installed in the monitored areas are matched to the corresponding monitoring nodes, remain in a dormant standby state under normal conditions, and start acquiring video image data only after receiving a trigger signal;
the cameras installed in the supervised areas of the work site are likewise dormant under normal conditions; they are started by control signals such as those from the access control system and capture the face information, behavior, dress and equipment of workers;
and the wireless camera sends the acquired video image data to the intelligent box, which analyzes and processes the information and outputs the corresponding result.
The sink node is a wireless sensor network node that forms a wireless monitoring network with the monitoring nodes, receives and aggregates the signals transmitted by the monitoring nodes, and forwards them to the intelligent box.
The data acquisition devices of the monitoring nodes are wireless sensors distributed across different positions or pieces of equipment according to the detection type. This embodiment is explained using detection of the ignition state of the venting flare at an unattended station as an example. The wireless sensors comprise vibration sensors, an audio pickup and a temperature and humidity sensor: two three-axis vibration sensors are mounted at the T-shaped pipeline joint in the flare venting area so that the vibration generated during venting can be captured accurately and promptly; the audio pickup acquires the sound produced when the pipeline vents; and the temperature and humidity sensor measures the pipeline temperature and the ambient humidity. Together these provide the input data that simulate human senses.
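For illustration only, the sensor suite just described could be represented by a record such as the one below; the field names, limits and the simple trigger rule are assumptions made for this sketch and are not specified by the patent.

```python
from dataclasses import dataclass, field
from typing import Tuple
import time

@dataclass
class FlareAreaSample:
    """One round of readings from the venting-flare monitoring nodes."""
    vibration_a: Tuple[float, float, float]   # tri-axis sensor at the T-joint
    vibration_b: Tuple[float, float, float]   # second tri-axis sensor
    audio_rms: float                          # sound level from the audio pickup
    temperature_c: float                      # pipeline temperature
    humidity_pct: float                       # ambient humidity
    timestamp: float = field(default_factory=time.time)

    def venting_suspected(self, vib_limit: float = 1.5, audio_limit: float = 0.6) -> bool:
        """Crude trigger rule: strong vibration on either sensor plus audible
        flow noise suggests the pipeline is venting."""
        strong_vib = max(*self.vibration_a, *self.vibration_b) > vib_limit
        return strong_vib and self.audio_rms > audio_limit

sample = FlareAreaSample((0.2, 0.1, 2.1), (0.3, 1.9, 0.2), 0.8, 18.5, 55.0)
print(sample.venting_suspected())   # True for this illustrative sample
```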
Each wireless sensor is connected to a controller that can adjust the set threshold which determines when the sensor sends a trigger signal. When the detected data exceed the preset threshold, the sensor sends the trigger signal and the measured data to the intelligent box through the sink node. The intelligent box then triggers the wireless camera matched to that monitoring node, which collects video for a preset duration or shoots continuously within a preset period; in this embodiment the video acquisition time is 20-60 s. The collected video image data are transmitted to the intelligent box as the data that simulate human vision.
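A minimal sketch of the dormant-camera behaviour described above, assuming a hypothetical WirelessCamera class; the patent does not specify the camera firmware or its interface.

```python
import time

class WirelessCamera:
    """Dormant by default; woken by the low-power communication module,
    records a preset duration, then returns to standby. Names are illustrative."""

    def __init__(self, camera_id: str, capture_seconds: int = 30):
        self.camera_id = camera_id
        self.capture_seconds = capture_seconds   # preset capture duration (20-60 s)
        self.sleeping = True                     # dormant standby by default

    def on_trigger(self, source_node: str) -> dict:
        """Called when a trigger arrives over the low-power communication module."""
        self.sleeping = False                    # wake the imaging pipeline
        clip = self._record(self.capture_seconds)
        self.sleeping = True                     # return to dormant standby
        return {"camera_id": self.camera_id, "source_node": source_node, "clip": clip}

    def _record(self, seconds: int) -> str:
        # Placeholder for actual video capture of the preset duration.
        return f"{seconds}s clip recorded at {time.strftime('%H:%M:%S')}"

if __name__ == "__main__":
    cam = WirelessCamera("cam-flare-01", capture_seconds=30)
    print(cam.on_trigger("vib-01"))
```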
The intelligent box stores the packaged artificial intelligence algorithms and a model database. The algorithms, based on deep learning models, are packaged into an SDK and cover face recognition, behavior recognition, dress supervision, equipment supervision, safety monitoring and the like; after the intelligent box receives the data collected by a wireless camera, it calls the corresponding algorithm to analyze and process them.
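One way the packaged algorithms could be organised is a simple registry that the intelligent box consults when video arrives; this dispatch pattern and the names used are assumptions for the sketch, not the SDK interface of the disclosed system.

```python
from typing import Callable, Dict

ALGORITHMS: Dict[str, Callable[[bytes], dict]] = {}

def register(name: str):
    """Decorator that adds an analysis routine to the packaged algorithm set."""
    def wrap(fn: Callable[[bytes], dict]):
        ALGORITHMS[name] = fn
        return fn
    return wrap

@register("face_recognition")
def face_recognition(frame: bytes) -> dict:
    return {"task": "face_recognition", "result": "placeholder"}

@register("flare_safety")
def flare_safety(frame: bytes) -> dict:
    return {"task": "flare_safety", "result": "placeholder"}

def dispatch(source: str, frame: bytes) -> dict:
    """Pick the algorithm according to where the video came from: entrance
    cameras go to face recognition, flare-area cameras to safety monitoring."""
    task = "face_recognition" if source.startswith("gate") else "flare_safety"
    return ALGORITHMS[task](frame)

print(dispatch("gate-cam-01", b""))
print(dispatch("flare-cam-01", b""))
```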
For example, after a wireless camera installed in a supervised area of the work site captures images of people entering or leaving and transmits them to the intelligent box, the box calls the face recognition and behavior recognition algorithms, extracts the facial features, action features and clothing of the persons in the images, and compares them with the data in the model database to perform face, behavior and dress recognition. It judges whether the person in the image is a worker, what the person is currently doing and whether the dress is compliant, outputs the recognition result to the data management platform of the corresponding work area, and the platform then sends the data to the QHSE management system.
The deep learning models in the recognition algorithms for face recognition, behavior recognition, dress supervision and equipment supervision are trained respectively on face images, open-source behavior image material, and labeled worker behavior, dress and equipment images acquired by the on-site cameras.
After the intelligent box receives a trigger signal from a monitoring node, it remotely starts the wireless camera in the corresponding area to collect video image data; the camera transmits the collected data back to the intelligent box according to the preset acquisition parameters. On receiving the returned video, the intelligent box calls the corresponding safety monitoring algorithm, processes and analyzes the data transmitted by the monitoring node together with the data transmitted by the camera, and outputs a final judgment. The intelligent box outputs the recognition result to the data management platform of the corresponding work area, and the platform then sends the data to the QHSE management system.
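The combined judgment could, for example, be a weighted fusion of the node evidence and the vision result, as in the sketch below; the weights, thresholds and status strings are illustrative assumptions, not values disclosed by the patent.

```python
def fuse_flare_state(vibration_score: float,
                     audio_score: float,
                     vision_ignited_prob: float) -> dict:
    """Return the final judgment sent to the management platform.

    vibration_score / audio_score: normalised 0..1 evidence that gas is venting.
    vision_ignited_prob: deep-learning confidence that the flare is burning.
    """
    venting = 0.6 * vibration_score + 0.4 * audio_score > 0.5
    ignited = vision_ignited_prob > 0.5
    if venting and not ignited:
        status = "ALARM: venting detected but flare not ignited"
    elif venting and ignited:
        status = "OK: flare burning during venting"
    else:
        status = "OK: no venting detected"
    return {"venting": venting, "ignited": ignited, "status": status}

print(fuse_flare_state(0.9, 0.8, 0.1))   # triggers the not-ignited alarm
```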
In the safety monitoring algorithm used to identify conditions in a monitored area, the training data set of the deep network model comprises an open-source image material library and labeled video images acquired on site. For venting-flare safety monitoring at unattended stations, for example, the training set uses a large number of open-flame images together with labeled flare-ignition images shot on site, so that the algorithm can make accurate recognition judgments from the video captured by the wireless camera and thereby better simulate human visual perception.
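A hedged sketch of how such a training set might be assembled and used to fine-tune a model, assuming PyTorch/torchvision (0.13 or later), a ResNet-18 backbone and placeholder directory names; none of these choices are specified by the patent.

```python
import torch
from torch.utils.data import ConcatDataset, DataLoader
from torchvision import datasets, transforms, models

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Placeholder paths: the open-source image material library and labelled frames
# grabbed from the site cameras, both assumed to use the same class sub-folders
# (e.g. ignited / not_ignited) so ImageFolder assigns consistent labels.
open_source = datasets.ImageFolder("data/open_source_library", transform=tfm)
field_frames = datasets.ImageFolder("data/labelled_field_frames", transform=tfm)
train_set = ConcatDataset([open_source, field_frames])
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Fine-tune a stock backbone as the deep learning model kept in the box's
# model database; the patent does not name a specific architecture.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = torch.nn.Linear(model.fc.in_features, len(open_source.classes))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = torch.nn.CrossEntropyLoss()

for images, labels in loader:           # one pass shown for brevity
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```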
In this embodiment, the safety monitoring system can be connected to the SCADA system. The intelligent box remotely transmits the recognition results to the data management platform, which converts them into control electrical signals to operate field equipment automatically; for example, a flare-not-lit alarm can be converted into a PLC control signal that automatically ignites the flare, or a face recognition result can be converted into an electrical control signal that operates the access control.
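A minimal sketch of mapping recognition results to control signals; write_plc_coil is a stand-in for whatever SCADA/PLC write interface the site actually exposes, and the coil addresses and rule names are invented for illustration.

```python
def write_plc_coil(address: int, value: bool) -> None:
    """Placeholder for the site's real PLC/SCADA write call."""
    print(f"PLC coil {address} <- {value}")

CONTROL_RULES = {
    # recognition result           -> (coil address, value to write)
    "flare_not_ignited_alarm":        (101, True),   # fire the igniter
    "face_recognised_authorised":     (202, True),   # release the door lock
    "face_recognised_unauthorised":   (202, False),  # keep the door locked
}

def apply_recognition_result(result: str) -> None:
    """Convert the intelligent box's output into an electrical control signal."""
    if result in CONTROL_RULES:
        address, value = CONTROL_RULES[result]
        write_plc_coil(address, value)
    else:
        print("no control action mapped for:", result)

apply_recognition_result("flare_not_ignited_alarm")
```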
The invention is not limited to the foregoing embodiments. It extends to any novel feature or any novel combination of features disclosed in this specification, and to any novel method or process step or any novel combination of steps disclosed.
Claims (6)
1. A depth-perception safety monitoring system, characterized by comprising monitoring nodes, a sink node, an intelligent box and an image acquisition device, wherein the monitoring nodes are wirelessly connected with the sink node, the sink node is remotely connected with the intelligent box through a wireless network, and the intelligent box is wirelessly connected with the image acquisition device;
the monitoring nodes use wireless sensors to acquire data and are distributed across different positions or pieces of equipment according to the detection type; each wireless sensor is also connected with a controller, and the controller can adjust the set threshold that determines when the sensor sends a trigger signal;
the sink node is a wireless sensor network node that forms a wireless detection network with the monitoring nodes, receives and aggregates the signals transmitted by the monitoring nodes, and forwards them to the intelligent box;
according to the received signal, the intelligent box triggers the image acquisition device in the corresponding area to acquire image or video data, which is transmitted back to the intelligent box; an encapsulated artificial intelligence algorithm based on a deep learning model is stored in the intelligent box and analyzes and processes the image or video to obtain a recognition result, and the intelligent box sends the output recognition result to a management platform.
2. The safety monitoring system of claim 1, wherein the wireless sensor and the controller are integrated and packaged together, a fixing device is provided on the outside, and the assembly is mounted on field equipment in the corresponding detection area.
3. The safety monitoring system according to claim 2, wherein the image acquisition device is a wireless fixed-focus visible light camera or an infrared camera arranged in a field monitoring area; a low-power-consumption communication module and a control module are built into the camera, and after the communication module receives the trigger signal, the control module makes the camera acquire video image data of a preset duration.
4. The safety monitoring system according to claim 3, wherein the image acquisition device and the wireless sensor are provided with independent power supplies for supplying power.
5. The safety monitoring system of claim 4, wherein the artificial intelligence algorithm in the intelligent box is capable of face recognition and various kinds of safety monitoring, the data set used to train the deep learning model comprises an open-source image material library and labeled video images collected on site, and the intelligent box further stores a model database.
6. The safety monitoring system according to claim 5, wherein the intelligent box is remotely connected to an automatic control system and converts alarm signals into electrical signals to control the operation of equipment in the corresponding area.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011436849.5A CN112543497A (en) | 2020-12-10 | 2020-12-10 | Depth-perception safety monitoring system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011436849.5A CN112543497A (en) | 2020-12-10 | 2020-12-10 | Depth-perception safety monitoring system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112543497A true CN112543497A (en) | 2021-03-23 |
Family
ID=75019956
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011436849.5A Pending CN112543497A (en) | 2020-12-10 | 2020-12-10 | Depth-perception safety monitoring system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112543497A (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN206629277U (en) * | 2017-04-13 | 2017-11-10 | 厦门信通慧安科技有限公司 | A kind of WSN monitoring systems |
CN107331007A (en) * | 2017-06-29 | 2017-11-07 | 惠国征信服务股份有限公司 | A kind of wagon flow logistics supervisory systems and method based on video acquisition |
CN108737509A (en) * | 2018-04-28 | 2018-11-02 | 深圳汇通智能化科技有限公司 | A kind of Intelligent data center robot inspection tour system based on augmented reality |
CN111578994A (en) * | 2020-05-19 | 2020-08-25 | 吴普侠 | Real-time monitoring system and method for forest ecological environment |
CN111770266A (en) * | 2020-06-15 | 2020-10-13 | 北京世纪瑞尔技术股份有限公司 | Intelligent visual perception system |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114639065A (en) * | 2022-05-18 | 2022-06-17 | 山东捷瑞数字科技股份有限公司 | Method and device for monitoring operation safety of forming equipment |
CN114999215A (en) * | 2022-05-27 | 2022-09-02 | 北京筑梦园科技有限公司 | Vehicle information acquisition method and device and parking management system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105867245B (en) | A kind of electric power information monitoring system | |
KR102161244B1 (en) | Industrial equipment monitoring and alarm apparatus and its method | |
CN105261079A (en) | Portable intelligent electric power inspection device | |
CN107014433A (en) | A kind of intelligent O&M cruising inspection system in hydroelectric power plant and its method | |
CN107588799A (en) | A kind of architectural electricity equipment on-line monitoring system | |
CN112543497A (en) | Safety monitoring system of degree of depth perception | |
CN103235562A (en) | Patrol-robot-based comprehensive parameter detection system and method for substations | |
CN111429107A (en) | Intelligent construction site management system | |
CN104977856B (en) | Substation monitoring analogue means, system and method | |
CN113988633A (en) | Anti-misoperation system of digital twin transformer substation | |
CN111131478A (en) | Light steel villa monitoring management system | |
CN113885526A (en) | Intelligent power plant autonomous inspection robot inspection system and method | |
CN115131505A (en) | Multi-system fusion's of transformer substation panorama perception system | |
CN112702570A (en) | Security protection management system based on multi-dimensional behavior recognition | |
CN115939996A (en) | Automatic inspection system of power inspection robot | |
CN111753780A (en) | Transformer substation violation detection system and violation detection method | |
CN211878460U (en) | Intelligent comprehensive monitoring system for distribution transformer environment | |
CN109406954A (en) | Transmission line malfunction monitors position indicator system | |
CN111932816A (en) | Fire alarm management method and device for offshore wind farm and island microgrid | |
CN118433231A (en) | Intelligent building site construction safety monitoring cloud edge cooperative early warning system based on edge mobile monitoring station, control system thereof and control method thereof | |
CN118196624A (en) | Intelligent safety helmet identification system and method for construction site | |
CN106843187A (en) | A kind of family intelligent monitoring system | |
CN113858194A (en) | A other robot that stands for job site auxiliary management | |
CN112433488A (en) | Equipment safety management system | |
CN117278960A (en) | Water environment intelligent inspection system and method with cooperation of unmanned aerial vehicle and ground sensing network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20210323 ||