CN112712604A - Machine vision-based on-site inspection quality non-inductive detection method - Google Patents
Machine vision-based on-site inspection quality non-inductive detection method
- Publication number
- CN112712604A (application CN202011426838.9A)
- Authority
- CN
- China
- Prior art keywords
- patrol
- inspection
- time
- value
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C1/00—Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people
- G07C1/20—Checking timed patrols, e.g. of watchman
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The invention discloses a machine vision-based on-site inspection quality non-inductive detection method. Inspection personnel do not need to sign in or punch a card at fixed points, so the whole process is imperceptible to them; the method uses the camera devices and control system already present at the production site, with no additional card-punching equipment, so the structure is simple and the cost is low. The time inspection personnel spend at each key point is counted automatically and the inspection quality is evaluated objectively, which further improves inspection quality and effectively avoids situations such as "walking without looking" and "looking without repairing". The whole patrol process is recorded, enabling patrol evidence collection and early warning of dangerous points.
Description
Technical Field
The invention relates to the technical field of metering automation, in particular to a machine vision-based on-site inspection quality non-inductive detection method.
Background
In China, electric energy has become an energy source essential to the survival and development of human society, and almost all human activities are closely related to electricity. As the key device for electric energy metering, the metering device must be calibrated accurately and efficiently; ensuring this accuracy and improving calibration efficiency has become an urgent problem for metering and testing departments at all levels.
Normalized, regular inspection of the automatic metering verification and detection system is one of the effective ways to ensure the accuracy and reliability of the verification system, the test equipment and the electric energy metering device. Traditional inspection modes such as fixed-point sign-in or fixed-point card punching, however, depend entirely on the sense of responsibility of the inspection personnel for their effectiveness, and they lack the necessary on-site safety control capability. To address these problems, introducing a machine vision-based on-site inspection quality non-inductive detection method is a clear trend. By relying on the camera devices already installed at the production site, such a method can verify regular inspection work without the inspector's active participation, remind inspectors in time of potential safety hazards that may appear during inspection, eliminate potential dangers, and evaluate the inspection quality objectively.
Disclosure of Invention
The invention aims to solve the technical problem that traditional fixed-point sign-in or card-punching inspection cannot prevent situations such as "walking without looking" or "looking without repairing", and provides a machine vision-based on-site inspection quality non-inductive detection method.
The invention is realized by the following technical scheme:
1. a machine vision-based on-site inspection quality non-inductive detection method comprises the following steps:
S1: a camera device collects static images of the monitored areas and sends them to the system, which thus obtains static image data for a plurality of monitored areas;
S2: the system performs patrol area labeling and patrol point labeling on the obtained static image data of the multiple monitored areas, yielding the constructed patrol route and patrol points; the patrol area is the area that patrol personnel must cover during the patrol, and a patrol point is a piece of equipment or a device that patrol personnel must inspect during the patrol;
S3: the camera device performs a first continuous image acquisition operation on the monitored area and sends the first-acquired monitored area images to the system, which thus obtains a plurality of first-acquired monitored area images; continuous acquisition means reading image data from the camera device at a specific frequency, which can be designated manually or set automatically;
S4: the system judges whether a person is present in the first-acquired monitored area images; if a person is present, proceed to step S5, otherwise return to step S3;
S5: once a person has been detected in the first-acquired monitored area images, the system starts timing and records the start time as T1;
S6: the system locates the patrol personnel in real time in the monitored area images that contain a person, and stores the personnel's coordinate information together with the trajectory map formed from the timestamp of each positioning point; the basic principle of the real-time positioning and trajectory formation is to detect personnel in the target area in real time with detectors such as YOLO or other convolutional neural networks, obtain the personnel's coordinate information, then track and predict the personnel's motion trajectory in real time with techniques such as SORT, DeepSORT, Kalman filtering and the Hungarian algorithm, and finally form the patrol trajectory map.
S7: the camera device performs a second continuous image acquisition operation on the monitored area and sends the second-acquired monitored area images to the system, which thus obtains a plurality of second-acquired monitored area images;
S8: the system judges whether a person is present in the second-acquired monitored area images; if no person is present, proceed to step S9, otherwise return to step S6;
S9: once no person is detected in the second-acquired monitored area images, the system stops timing and records the end time as T2;
S10: the system subtracts T1 from T2 and compares the difference with a preset value; if the difference is greater than the preset value, proceed to step S11, otherwise return to step S3;
S11: the system analyzes the patrol personnel's patrol trajectory map, calculates the dwell time at each patrol point and the coupling degree between the actual patrol trajectory and the patrol route preset by the system, and evaluates the resulting patrol quality.
Further, the labeling operation specifically refers to designating the patrol area, the patrol route and the patrol points on the static diagram of the field area obtained by the camera device, and obtaining the coordinate data sets corresponding to the patrol area, the patrol route and the patrol points.
Further, the labeling method comprises rectangular labeling, circular labeling and irregular curve labeling.
Further, the dwell time at a patrol point is calculated as the accumulated time during which the patrol personnel's trajectory overlaps the coordinates of that patrol point.
Further, the coupling degree between the actual patrol trajectory and the required patrol route is calculated from: (a) t1, the accumulated time during which the actual patrol trajectory overlaps the required patrol route; and (b) t0, the actual total patrol time; the coupling degree being the ratio t1/t0.
further, the patrol quality evaluation method comprises the following steps: when the stay time or the coupling degree of the patrol point position is more than 1.2 times of the set threshold value, the patrol point position is judged to be excellent; when the value is less than 1.2 times of the set threshold value but more than 0.9 times of the set threshold value, the result is judged to be good; if the value is less than 0.9 times of the set threshold value but more than 0.8 times of the set threshold value, the value is determined to be normal; and if the value is less than 0.8 times of the set threshold value, determining that the product is not qualified.
Compared with the prior art, the invention has the following advantages and beneficial effects:
the invention relates to a machine vision-based on-site inspection quality non-inductive detection method, which has the following main advantages that:
firstly, in the process of inspection, inspection personnel do not need to sign in at fixed points and punch cards at fixed points, and no sense is participated in the whole process;
secondly, the existing camera device and control system in the production field can be used, and no additional card punching device is needed, so that the structure is simple and the cost is low;
thirdly, the inspection time of the inspection personnel at each key point can be automatically counted, the inspection quality is objectively evaluated, the inspection quality is further improved, and the situations of 'only walking without looking', 'only looking without repairing', and the like are effectively avoided;
fourthly, the patrol process is completely recorded, and the patrol evidence obtaining, the danger point early warning and other auxiliary functions can be realized.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principles of the invention. In the drawings:
FIG. 1 is a flow chart of the detection method of the present invention;
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to examples and accompanying drawings, and the exemplary embodiments and descriptions thereof are only used for explaining the present invention and are not meant to limit the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that: it is not necessary to employ these specific details to practice the present invention. In other instances, well-known structures, circuits, materials, or methods have not been described in detail so as not to obscure the present invention.
Throughout the specification, reference to "one embodiment," "an embodiment," "one example," or "an example" means: the particular features, structures, or characteristics described in connection with the embodiment or example are included in at least one embodiment of the invention. Thus, the appearances of the phrases "one embodiment," "an embodiment," "one example" or "an example" in various places throughout this specification are not necessarily all referring to the same embodiment or example. Furthermore, the particular features, structures, or characteristics may be combined in any suitable combination and/or sub-combination in one or more embodiments or examples. Further, those of ordinary skill in the art will appreciate that the illustrations provided herein are for illustrative purposes and are not necessarily drawn to scale. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
In the description of the present invention, it is to be understood that the terms "front", "rear", "left", "right", "upper", "lower", "vertical", "horizontal", "high", "low", "inner", "outer", etc. indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of description and simplicity of description, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and therefore, are not to be construed as limiting the scope of the present invention.
Examples
As shown in fig. 1, in the machine vision-based on-site inspection quality non-inductive detection method, inspection personnel do not need to sign in or punch a card at fixed points, so the whole process is imperceptible to them; the camera devices and control system already present at the production site are used without installing additional card-punching equipment, so the structure is simple and the cost is low; the time inspection personnel spend at each key point is counted automatically and the inspection quality is evaluated objectively, which further improves inspection quality and effectively avoids situations such as "walking without looking" and "looking without repairing"; the patrol process is recorded completely, enabling auxiliary functions such as patrol evidence collection and early warning of dangerous points. The method comprises the following steps:
S1: a camera device collects static images of the monitored areas and sends them to the system, which thus obtains static image data for a plurality of monitored areas;
the camera device takes a picture of the field area and comprises: one camera device takes pictures of a plurality of areas and one camera device takes pictures of one area.
S2: the system performs patrol area labeling and patrol point labeling on the obtained static image data of the multiple monitored areas, yielding the constructed patrol route and patrol points; the patrol area is the area that patrol personnel must cover during the patrol, and a patrol point is a piece of equipment or a device that patrol personnel must inspect during the patrol;
S3: the camera device performs a first continuous image acquisition operation on the monitored area and sends the first-acquired monitored area images to the system, which thus obtains a plurality of first-acquired monitored area images; continuous acquisition means reading image data from the camera device at a specific frequency, which can be designated manually or set automatically;
S4: the system judges whether a person is present in the first-acquired monitored area images; if a person is present, proceed to step S5, otherwise return to step S3;
S5: once a person has been detected in the first-acquired monitored area images, the system starts timing and records the start time as T1;
S6: the system locates the patrol personnel in real time in the monitored area images that contain a person, and stores the personnel's coordinate information together with the trajectory map formed from the timestamp of each positioning point; the basic principle of the real-time positioning and trajectory formation is to detect personnel in the target area in real time with detectors such as YOLO or other convolutional neural networks, obtain the personnel's coordinate information, then track and predict the personnel's motion trajectory in real time with techniques such as SORT, DeepSORT, Kalman filtering and the Hungarian algorithm, and finally form the patrol trajectory map (a simplified sketch of this detection-and-tracking loop is given after step S11 below).
S7: the camera device performs a second continuous image acquisition operation on the monitored area and sends the second-acquired monitored area images to the system, which thus obtains a plurality of second-acquired monitored area images;
S8: the system judges whether a person is present in the second-acquired monitored area images; if no person is present, proceed to step S9, otherwise return to step S6;
S9: once no person is detected in the second-acquired monitored area images, the system stops timing and records the end time as T2;
S10: the system subtracts T1 from T2 and compares the difference with a preset value; if the difference is greater than the preset value, proceed to step S11, otherwise return to step S3;
S11: the system analyzes the patrol personnel's patrol trajectory map, calculates the dwell time at each patrol point and the coupling degree between the actual patrol trajectory and the patrol route preset by the system, and evaluates the resulting patrol quality.
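Steps S3 to S6 can be prototyped in many ways; the following Python sketch is only an illustration under stated assumptions, not the patented implementation: the hypothetical detect_persons() helper stands in for a YOLO-class detector, and a single-person trajectory is accumulated directly instead of running a full SORT/DeepSORT tracker. It also covers the frame sampling of step S3 and the T1/T2 timing of steps S5 and S9.

```python
import time
import cv2  # OpenCV is assumed available for reading frames from the camera device


def detect_persons(frame):
    """Hypothetical placeholder for a person detector (e.g. a YOLO-class CNN).
    It should return a list of (x, y) centre coordinates of detected persons."""
    raise NotImplementedError("plug in a real detector here")


def run_patrol_capture(camera_url, sample_hz=2.0, detector=detect_persons):
    """Sample frames at sample_hz; start timing when a person first appears (T1),
    accumulate (timestamp, x, y) positioning points while a person is visible,
    and stop timing when no person is seen any more (T2)."""
    cap = cv2.VideoCapture(camera_url)
    period = 1.0 / sample_hz            # step S3: configurable acquisition frequency
    trajectory = []                     # step S6: positioning points of the patrol person
    t1 = t2 = None
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            now = time.time()
            persons = detector(frame)   # steps S4/S8: is a person present?
            if persons and t1 is None:
                t1 = now                # step S5: start timing
            if persons:
                x, y = persons[0]       # a real system would track each person (SORT/DeepSORT)
                trajectory.append((now, x, y))
            elif t1 is not None:
                t2 = now                # step S9: person gone, stop timing
                break
            time.sleep(period)
    finally:
        cap.release()
    return t1, t2, trajectory
```

If T2 minus T1 exceeds the preset minimum duration (step S10), the returned trajectory would then be handed to the analysis of step S11.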
Further, the labeling operation specifically refers to designating the patrol area, the patrol route and the patrol points on the static diagram of the field area obtained by the camera device, and obtaining the coordinate data sets corresponding to the patrol area, the patrol route and the patrol points.
Further, the labeling method comprises rectangular labeling, circular labeling and irregular curve labeling.
Further, the dwell time at a patrol point is calculated as the accumulated time during which the patrol personnel's trajectory overlaps the coordinates of that patrol point.
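As an illustration of the dwell-time rule above, the sketch below accumulates the time a trajectory spends inside each labeled patrol-point region. The region classes and field names are assumptions made for the example; the patent does not prescribe a data format.

```python
from dataclasses import dataclass


@dataclass
class RectRegion:
    """Rectangular patrol-point labeling (axis-aligned, in image coordinates)."""
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


@dataclass
class CircleRegion:
    """Circular patrol-point labeling."""
    cx: float
    cy: float
    r: float

    def contains(self, x, y):
        return (x - self.cx) ** 2 + (y - self.cy) ** 2 <= self.r ** 2


def dwell_times(trajectory, patrol_points):
    """trajectory: list of (timestamp, x, y); patrol_points: dict name -> region.
    Returns the accumulated seconds during which the trajectory overlapped each
    patrol point's coordinates."""
    totals = {name: 0.0 for name in patrol_points}
    for (t_prev, x, y), (t_next, _, _) in zip(trajectory, trajectory[1:]):
        dt = t_next - t_prev
        for name, region in patrol_points.items():
            if region.contains(x, y):
                totals[name] += dt
    return totals
```

For example, calling `dwell_times(trajectory, {"meter_rack": CircleRegion(320, 240, 40)})` would report how long the inspector stayed near that (hypothetical) patrol point.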
Further, the coupling degree between the actual patrol trajectory and the required patrol route is calculated from: (a) t1, the accumulated time during which the actual patrol trajectory overlaps the required patrol route; and (b) t0, the actual total patrol time; the coupling degree being the ratio t1/t0.
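Assuming the required patrol route is labeled as a set of regions in the same way as the patrol points, the ratio t1/t0 reconstructed above can be computed with a sketch like the following; the contains() region interface is the same assumption as in the dwell-time example.

```python
def coupling_degree(trajectory, route_regions):
    """trajectory: list of (timestamp, x, y); route_regions: iterable of objects
    with a contains(x, y) method describing the required patrol route.
    Returns t1 / t0, where t1 is the accumulated time the actual trajectory
    overlaps the required route and t0 is the actual total patrol time."""
    if len(trajectory) < 2:
        return 0.0
    t0 = trajectory[-1][0] - trajectory[0][0]      # actual patrol time
    t1 = 0.0                                       # time overlapping the required route
    for (t_prev, x, y), (t_next, _, _) in zip(trajectory, trajectory[1:]):
        if any(region.contains(x, y) for region in route_regions):
            t1 += t_next - t_prev
    return t1 / t0 if t0 > 0 else 0.0
```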
further, the patrol quality evaluation method comprises the following steps: when the stay time or the coupling degree of the patrol point position is more than 1.2 times of the set threshold value, the patrol point position is judged to be excellent; when the value is less than 1.2 times of the set threshold value but more than 0.9 times of the set threshold value, the result is judged to be good; if the value is less than 0.9 times of the set threshold value but more than 0.8 times of the set threshold value, the value is determined to be normal; judging the product to be unqualified when the product is less than 0.8 times of the set threshold value; the threshold value and coefficients of 0.8, 0.9, 1.2, etc. can be adjusted or set according to actual conditions.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are merely exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.
Claims (6)
1. A machine vision-based on-site inspection quality non-inductive detection method is characterized by comprising the following steps:
S1: a camera device collects static images of the monitored areas and sends them to the system, which thus obtains static image data for a plurality of monitored areas;
S2: the system performs patrol area labeling and patrol point labeling on the obtained static image data of the multiple monitored areas, yielding the constructed patrol route and patrol points;
S3: the camera device performs a first continuous image acquisition operation on the monitored area and sends the first-acquired monitored area images to the system, which thus obtains a plurality of first-acquired monitored area images;
S4: the system judges whether a person is present in the first-acquired monitored area images; if a person is present, proceed to step S5, otherwise return to step S3;
S5: once a person has been detected in the first-acquired monitored area images, the system starts timing and records the start time as T1;
S6: the system locates the patrol personnel in real time in the monitored area images that contain a person, and stores the personnel's coordinate information together with the trajectory map formed from the timestamp of each positioning point;
S7: the camera device performs a second continuous image acquisition operation on the monitored area and sends the second-acquired monitored area images to the system, which thus obtains a plurality of second-acquired monitored area images;
S8: the system judges whether a person is present in the second-acquired monitored area images; if no person is present, proceed to step S9, otherwise return to step S6;
S9: once no person is detected in the second-acquired monitored area images, the system stops timing and records the end time as T2;
S10: the system subtracts T1 from T2 and compares the difference with a preset value; if the difference is greater than the preset value, proceed to step S11, otherwise return to step S3;
S11: the system analyzes the patrol personnel's patrol trajectory map, calculates the dwell time at each patrol point and the coupling degree between the actual patrol trajectory and the patrol route preset by the system, and evaluates the resulting patrol quality.
2. The machine vision-based on-site inspection quality non-inductive detection method according to claim 1, wherein the labeling operation specifically refers to designating the patrol area, the patrol route and the patrol points on the static diagram of the field area obtained by the camera device, and obtaining the coordinate data sets corresponding to the patrol area, the patrol route and the patrol points.
3. The machine vision-based on-site inspection quality non-inductive detection method according to claim 2, characterized in that the labeling method comprises rectangular labeling, circular labeling and irregular curve labeling.
4. The machine vision-based on-site inspection quality non-inductive detection method according to claim 1, wherein the dwell time at a patrol point is calculated as the accumulated time during which the patrol personnel's trajectory overlaps the coordinates of that patrol point.
5. The machine vision-based on-site inspection quality non-inductive detection method according to claim 1, wherein the coupling degree between the actual patrol trajectory and the required patrol route is calculated from: (a) t1, the accumulated time during which the actual patrol trajectory overlaps the required patrol route; and (b) t0, the actual total patrol time; the coupling degree being the ratio t1/t0.
6. The machine vision-based on-site inspection quality non-inductive detection method according to claim 1, characterized in that the patrol quality evaluation method is as follows: when the dwell time at a patrol point or the coupling degree is more than 1.2 times the set threshold, the result is judged excellent; when it is less than 1.2 times but more than 0.9 times the set threshold, it is judged good; when it is less than 0.9 times but more than 0.8 times the set threshold, it is judged normal; and when it is less than 0.8 times the set threshold, it is judged unqualified.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011426838.9A CN112712604A (en) | 2020-12-09 | 2020-12-09 | Machine vision-based on-site inspection quality non-inductive detection method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011426838.9A CN112712604A (en) | 2020-12-09 | 2020-12-09 | Machine vision-based on-site inspection quality non-inductive detection method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112712604A true CN112712604A (en) | 2021-04-27 |
Family
ID=75542732
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011426838.9A Pending CN112712604A (en) | 2020-12-09 | 2020-12-09 | Machine vision-based on-site inspection quality non-inductive detection method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112712604A (en) |
- 2020-12-09: application CN202011426838.9A filed in CN; published as CN112712604A; status Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1216534A1 (en) * | 1999-09-10 | 2002-06-26 | Ultra-Scan Corporation | Mobile fingerprint scanner and docking station |
EP1136937A2 (en) * | 2000-03-22 | 2001-09-26 | Kabushiki Kaisha Toshiba | Facial image forming recognition apparatus and a pass control apparatus |
CN201623806U (en) * | 2010-02-22 | 2010-11-03 | 江苏省电力公司无锡供电公司 | Intelligent video substation polling quality certificate system |
US20160182778A1 (en) * | 2012-01-25 | 2016-06-23 | Remote Ocean Systems, Inc. | Sensor system for high-radiation environments |
CN106097474A (en) * | 2016-06-08 | 2016-11-09 | 朱兰英 | System is analyzed in a kind of indoor patrol based on real-time virtual reality technology |
CN110827434A (en) * | 2019-09-23 | 2020-02-21 | 重庆特斯联智慧科技股份有限公司 | Community security patrol recording system and method for grid target identification |
CN111401146A (en) * | 2020-02-26 | 2020-07-10 | 长江大学 | Unmanned aerial vehicle power inspection method, device and storage medium |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115240289A (en) * | 2022-09-20 | 2022-10-25 | 泰豪信息技术有限公司 | Patrol monitoring method and device for prison patrol personnel, storage medium and electronic equipment |
CN115240289B (en) * | 2022-09-20 | 2022-12-20 | 泰豪信息技术有限公司 | Patrol monitoring method and device for prison patrol personnel, storage medium and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20210427 |