CN115496398A - Electric power operation safety control method and system - Google Patents


Info

Publication number
CN115496398A
CN115496398A (application CN202211245971.3A)
Authority
CN
China
Prior art keywords
image
moving target
information
frame
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211245971.3A
Other languages
Chinese (zh)
Inventor
常政威
邓元实
杨琳
陈明举
熊兴中
徐昌前
赵俊
谢正军
蒲维
吴杰
丁宣文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electric Power Research Institute of State Grid Sichuan Electric Power Co Ltd
Original Assignee
Electric Power Research Institute of State Grid Sichuan Electric Power Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Electric Power Research Institute of State Grid Sichuan Electric Power Co Ltd
Priority to CN202211245971.3A
Publication of CN115496398A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0635 Risk analysis of enterprise or organisation activities
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/10 Office automation; Time management
    • G06Q10/103 Workflow collaboration or project management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/20 Administration of product repair or maintenance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/06 Energy or water supply
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • General Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Marketing (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Health & Medical Sciences (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Game Theory and Decision Science (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Public Health (AREA)
  • Water Supply & Treatment (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Image Processing (AREA)

Abstract

The invention belongs to the technical field of electric power operation management and control and discloses an electric power operation safety management and control method and system. A static three-dimensional model of the electric power operation site is established; electrical information is collected for each device on the site; and video monitoring information, operation information and Beidou positioning information are collected for each moving target entering the site. Dangerous-situation sensing and distance detection based on a collision principle are then performed from the electrical information of each device and the video monitoring, operation and Beidou positioning information of the moving target. If the sensing result shows that the moving target has mistakenly entered a live area or mistakenly touched live equipment, or the distance detection shows that the distance between the moving target and live equipment is smaller than the safe distance, a real-time early warning is issued. The method enables accurate, real-time management and control of electric power operations.

Description

Electric power operation safety control method and system
Technical Field
The invention relates to the technical field of electric power operation management and control, in particular to an electric power operation safety management and control method and system.
Background
Safety is a constant theme in the power industry. Through long-term practice and accumulated experience, China's power industry has gradually formed a fairly complete set of safety management and technical measures that effectively reduce serious accidents. However, owing to the particular nature of the industry, the production-safety situation of power enterprises remains severe, and grid, equipment and personal accidents at operation sites still occur from time to time.
At present, video monitoring, robots, unmanned aerial vehicles and personnel-positioning technologies are being applied to power-equipment inspection, emergency command and operation monitoring, and three-dimensional models of varying fineness have been built for some substations and transmission corridors. The emphasis, however, has been on improving operation and maintenance efficiency; little research has addressed safety control of the operation site itself.
In view of this, the present application is specifically made.
Disclosure of Invention
The invention aims to provide an electric power operation safety control method and system. Starting from the need to prevent grid, equipment and personal safety accidents on the electric power operation site, and targeting potential safety hazards such as mistakenly entering a live area, insufficient safe distance and mistakenly touching equipment or facilities, the intelligent safety control system for electric power operation builds on fine three-dimensional scene modeling, high-precision target positioning and dangerous-situation sensing to realize operation-scene monitoring, operation management, deduction simulation and alarm management on the fine three-dimensional scene.
The invention is realized by the following technical scheme:
In one aspect, the invention provides an electric power operation safety control method comprising the following steps: establishing a static three-dimensional model of the electric power operation site; collecting the electrical information of each device on the site; collecting the video monitoring information, operation information and Beidou positioning information of each moving target entering the site; performing dangerous-situation sensing and collision-based distance detection from the electrical information of each device and the video monitoring, operation and Beidou positioning information of the moving target; and issuing a real-time early warning if the sensing result shows that the moving target has mistakenly entered a live area or mistakenly touched live equipment, or the distance detection shows that the distance between the moving target and live equipment is smaller than the safe distance.
Further, the electrical information comprises the equipment ID, the coordinates of the equipment in the static three-dimensional model, the live state of the equipment, the safe distance corresponding to each voltage level of the equipment and the anti-violation operation rules for the equipment; the operation information comprises the work ticket information of the moving target.
Further, before the dangerous-situation sensing and the collision-based distance detection, the method comprises the following steps: planning different paths for the moving target to reach the designated position according to the Beidou positioning information and work ticket information of the moving target entering the electric power operation site; for each path, performing safe-distance detection and operation-flow deduction on the moving target, judging from the detection result whether the moving target satisfies the safe distance along the path, and judging from the deduction result whether its operation flow complies with the anti-violation operation rules; if the safe distance is not satisfied or the anti-violation operation rules are not complied with, updating the operation information of the moving target; and if both are satisfied, performing the dangerous-situation sensing and the collision-based distance detection.
Further, the distance detection based on the collision principle comprises the following steps: establishing a spatial local coordinate system with the center point of the static three-dimensional model as origin, and acquiring the coordinates of each device of the electric power operation site in this coordinate system; establishing a space model of the moving target, and acquiring the Beidou positioning information of the space model and the real-time coordinates of each feature point on the space model; calculating the direction of motion of the space model from its Beidou positioning information; and emitting several rays in the direction of motion with the space model as starting point, obtaining the actual distance between the space model and the nearest device from the information returned by the rays, and judging whether the actual distance is less than or equal to the safe distance.
Further, the electric power operation safety control method further comprises the following steps: acquiring a preprocessed image containing the moving target and extracting the edge contour information of the moving target from it; shooting a first frame of monitoring image containing the moving target and recording the ambient illumination intensity L_J at the time of shooting; marking, in the first frame of monitoring image, the pixel points corresponding to the edge contour information, connecting them end to end into a bounding line, and cutting out the image area enclosed by the bounding line to obtain a second frame of monitoring image; cutting a first model image from the static three-dimensional model and recording the ambient illumination intensity L_M at the time the static three-dimensional model was constructed; marking, in the first model image, the pixel points corresponding to the edge contour information, connecting them end to end into a bounding line, and cutting out the enclosed image area to obtain a second model image; calculating a replacement image from the second frame of monitoring image, the ambient illumination intensities L_J and L_M, and the second model image; and covering the second model image within the first model image with the replacement image to obtain a live-action fusion model.
Further, calculating the replacement image comprises the following steps: calculating the overall color characterization coefficient C_J of the second frame of monitoring image as

C_J = (1/n) · Σ_{i=1..n} (R_Ji + G_Ji + B_Ji)    (1)

where R_Ji, G_Ji and B_Ji are the R, G and B values of the RGB color of the i-th pixel in the second frame of monitoring image and n is the total number of pixels in that image; and calculating the overall color characterization coefficient C_M of the second model image as

C_M = (1/n) · Σ_{i=1..n} (R_Mi + G_Mi + B_Mi)    (2)

where R_Mi, G_Mi and B_Mi are the R, G and B values of the RGB color of the i-th pixel in the second model image and n is the total number of pixels in that image. When 0.8 < C_J/C_M < 1.15, the brightness of the second frame of monitoring image is adjusted according to the ratio to obtain the replacement image:

A = (C_M / C_J) · B    (3)

where A is the brightness value of the second frame of monitoring image after adjustment and B its brightness value before adjustment. When C_J/C_M ≤ 0.8 or C_J/C_M ≥ 1.15, each pixel of the second frame of monitoring image is fused with the second model image using the ambient illumination intensities L_J and L_M to obtain the replacement image. The fusion is given by formulas (4) to (6), supplied in the original only as images, in which R_F, G_F and B_F are the R, G and B values of the RGB color of a pixel in the replacement image, R_J, G_J, B_J and R_M, G_M, B_M are the corresponding values of the pixel in the second frame of monitoring image and the second model image, X is the pixel's coordinate in the X direction of the X-Y coordinate system, and Y its coordinate in the Y direction.
In another aspect, the invention provides an electric power operation safety control system comprising: a model building module for building the static three-dimensional model of the electric power operation site; a data acquisition module for collecting the electrical information of each device on the electric power operation site and the video monitoring information, operation information and Beidou positioning information of each moving target entering the site; a collision detection and situation perception module for performing dangerous-situation sensing and collision-based distance detection from the electrical information of each device and the video monitoring, operation and Beidou positioning information of the moving target; and a real-time early-warning module for judging the sensing and detection results and issuing a real-time early warning if the dangerous-situation sensing shows that the moving target has mistakenly entered a live area or mistakenly touched live equipment, or the distance detection shows that the distance between the moving target and live equipment is smaller than the safe distance.
Further, the electric power operation safety control system further comprises a deduction simulation module for planning different paths for the moving target to reach the designated position according to the Beidou positioning information and work ticket information of the moving target entering the electric power operation site; for each path, performing safe-distance detection and operation-flow deduction on the moving target, judging from the detection result whether the moving target satisfies the safe distance along the path, and judging from the deduction result whether its operation flow complies with the anti-violation operation rules; if the safe distance is not satisfied or the anti-violation operation rules are not complied with, updating the operation information of the moving target; and if both are satisfied, performing the dangerous-situation sensing and the collision-based distance detection.
Further, the electric power operation safety control system further comprises a video fusion module for acquiring a preprocessed image containing the moving target and extracting the edge contour information of the moving target from it; shooting a first frame of monitoring image containing the moving target and recording the ambient illumination intensity L_J at the time of shooting; marking, in the first frame of monitoring image, the pixel points corresponding to the edge contour information, connecting them end to end into a bounding line, and cutting out the image area enclosed by the bounding line to obtain a second frame of monitoring image; cutting a first model image from the static three-dimensional model and recording the ambient illumination intensity L_M at the time the static three-dimensional model was constructed; marking, in the first model image, the pixel points corresponding to the edge contour information, connecting them end to end into a bounding line, and cutting out the enclosed image area to obtain a second model image; calculating a replacement image from the second frame of monitoring image, the ambient illumination intensities L_J and L_M, and the second model image; and covering the second model image within the first model image with the replacement image to obtain a live-action fusion model.
Further, the video fusion module includes: a first overall color characterization coefficient calculation unit for calculating the overall color characterization coefficient of the second frame of monitoring image; a second overall color characterization coefficient calculation unit for calculating the overall color characterization coefficient of the second model image; a brightness calculation unit for judging the ratio of the overall color characterization coefficient of the second frame of monitoring image to that of the second model image and, when the ratio is greater than 0.8 and less than 1.15, adjusting the brightness of the second frame of monitoring image according to the ratio to obtain the replacement image; and a pixel fusion unit for judging the same ratio and, when it is less than or equal to 0.8 or greater than or equal to 1.15, fusing each pixel of the second frame of monitoring image with the second model image using the ambient illumination intensity at the time the static three-dimensional model was constructed and the ambient illumination intensity at the time the first frame of monitoring image was shot, to obtain the replacement image.
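As a minimal sketch of the ratio test these units perform, one may assume the overall color characterization coefficient is the mean of the summed RGB channels and the brightness adjustment is a simple scaling by C_M/C_J; the function names and the flat pixel-list representation are illustrative, not from the patent:

```python
def color_coefficient(pixels):
    """Overall color characterization coefficient: mean of R+G+B over all
    pixels, where pixels is a flat list of (R, G, B) tuples."""
    return sum(r + g + b for r, g, b in pixels) / len(pixels)

def make_replacement(monitor_pixels, model_pixels):
    """Brightness-match the monitor patch to the model patch when the
    coefficient ratio lies in (0.8, 1.15); otherwise return None to signal
    that per-pixel fusion (formulas (4)-(6)) is needed instead."""
    c_j = color_coefficient(monitor_pixels)
    c_m = color_coefficient(model_pixels)
    ratio = c_j / c_m
    if 0.8 < ratio < 1.15:
        scale = c_m / c_j  # darker monitor patch is brightened, and vice versa
        return [(min(r * scale, 255), min(g * scale, 255), min(b * scale, 255))
                for r, g, b in monitor_pixels]
    return None  # fall through to the pixel-fusion unit
```

The ratio thresholds 0.8 and 1.15 are taken directly from the text; everything else is a sketch under the stated assumptions.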
Compared with the prior art, the invention has the following advantages and beneficial effects. For the electric power operation scene, scene modeling, dynamic-target modeling, multi-dimensional information fusion, deduction simulation, collision detection and dangerous-situation sensing are combined to monitor the operation site, realizing real-time and accurate control of operation safety risks. In addition, a refined real-scene 3D model technique fuses the 3D scene model with video, merging real-time local change information into the three-dimensional model; this improves the effectiveness of patrol work and reduces staff misjudgement of faults and safety risks.
Drawings
In order to more clearly illustrate the technical solutions of the exemplary embodiments of the present invention, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the invention and therefore should not be considered limiting of its scope; those skilled in the art may derive other related drawings from them without inventive effort.
Fig. 1 is a logic architecture of a power operation safety control method according to embodiment 1 of the present invention;
fig. 2 is a schematic flow chart of a power operation safety control method according to embodiment 1 of the present invention;
fig. 3 is a schematic diagram of a deduction simulation process provided in embodiment 2 of the present invention;
fig. 4 is a schematic diagram of a framework of a safety control system for electric power operation according to embodiment 4 of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the invention is described in further detail below with reference to the embodiments and the accompanying drawings. The exemplary embodiments and their description serve only to explain the invention and are not intended to limit it.
Example 1
This embodiment provides an electric power operation safety control method, addressing the current situation in which the safety management and technical measures of power operation sites in China still need improvement. The logic architecture of the method is shown in fig. 1. Starting from the need to prevent grid, equipment and personal safety accidents on the power operation site, and targeting potential safety hazards such as mistakenly entering a live area, insufficient safe distance and mistakenly touching equipment or facilities, the intelligent safety control system for power operation builds on fine three-dimensional scene modeling, high-precision target positioning and dangerous-situation sensing to realize operation-scene monitoring, operation management, deduction simulation and alarm management on the fine three-dimensional scene.
The electric power operation safety control method comprises the following steps:
Step 1: establish a static three-dimensional model of the electric power operation site, and collect the electrical information of each device on the site. The electrical information comprises the equipment ID, the coordinates of the equipment in the static three-dimensional model, the live state of the equipment, the safe distance corresponding to each voltage level of the equipment and the anti-violation operation rules for the equipment; the operation information comprises the work ticket information of the moving target.
Step 2: collect the video monitoring information, operation information and Beidou positioning information of each moving target entering the electric power operation site.
Step 3: perform dangerous-situation sensing and collision-based distance detection from the electrical information of each device and the video monitoring, operation and Beidou positioning information of the moving target.
Step 4: if the sensing result shows that the moving target has mistakenly entered a live area or mistakenly touched live equipment, or the distance detection shows that the distance between the moving target and live equipment is smaller than the safe distance, issue a real-time early warning.
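Assuming the sensing results are exposed as boolean flags and the detection result as a measured distance in meters (all names below are illustrative, not from the patent), the step-4 alarm decision reduces to a single predicate:

```python
def should_warn(entered_live_area, touched_live_equipment,
                distance_m, safe_distance_m):
    """True when any step-4 alarm condition holds: mistaken entry into a
    live area, mistaken touching of live equipment, or a measured distance
    below the safe distance."""
    return bool(entered_live_area or touched_live_equipment
                or distance_m < safe_distance_m)
```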
Taking a transformer substation as an example and following the substation operation safety control flow shown in fig. 2, a fine static-scene three-dimensional model of the substation and a fine three-dimensional model library of dynamic objects on the power operation site, such as construction vehicles and operators, are established. The information collected and used for substation operation includes live electrical-state information, video image information, Beidou positioning information, the substation's anti-violation operation rules and the like. Combined with high-precision three-dimensional spatial information, operation deduction simulation is performed using a collision-detection algorithm, and dangerous situations such as on-line misoperation, mistaken touching and mistaken entry into a live area are sensed for moving target objects, realizing operation-scene monitoring and real-time early warning.
In step 4, the distance detection based on the collision principle comprises the following steps:
S41: establish a spatial local coordinate system with the center point of the static three-dimensional model as origin, and obtain the coordinates of each device of the electric power operation site in this coordinate system;
S42: establish a space model of the moving target, and obtain the Beidou positioning information of the space model and the real-time coordinates of each feature point on the space model;
S43: calculate the direction of motion of the space model from its Beidou positioning information; emit several rays in that direction with the space model as the starting point, obtain the actual distance between the space model and the nearest device from the information returned by the rays, and judge whether the actual distance is less than or equal to the safe distance.
The distance detection based on the collision principle is explained below.
1. Principle of collision detection
In a three-dimensional model scene, a ray is emitted from a point along a given direction. When an object lies in front of the ray, the coordinates of the collision point are returned; when nothing blocks the ray, an undefined value is returned. When a collision point is returned, the distance between the current object and the collision point is computed in real time, and if it is smaller than the safe distance of the collision point (the safe distance of the live equipment), an alarm prompt is triggered.
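This principle can be sketched with live devices modeled as spheres carrying a safe distance; the sphere representation and the function names are illustrative assumptions, since the patent itself ray-casts against the fine three-dimensional model:

```python
import math

def ray_nearest_hit(origin, direction, spheres):
    """Cast a ray and return (hit_distance, safe_distance) for the nearest
    intersected sphere, or None when nothing blocks the ray (the
    'undefined value' case in the text). Each sphere is
    ((cx, cy, cz), radius, safe_distance)."""
    ox, oy, oz = origin
    norm = math.sqrt(sum(d * d for d in direction))
    dx, dy, dz = (d / norm for d in direction)
    best = None
    for (cx, cy, cz), radius, safe in spheres:
        # project the sphere center onto the ray
        t = (cx - ox) * dx + (cy - oy) * dy + (cz - oz) * dz
        if t < 0:
            continue  # sphere lies behind the ray origin
        px, py, pz = ox + t * dx, oy + t * dy, oz + t * dz
        d2 = (cx - px) ** 2 + (cy - py) ** 2 + (cz - pz) ** 2
        if d2 > radius * radius:
            continue  # ray misses this sphere
        hit = t - math.sqrt(radius * radius - d2)  # first intersection
        if best is None or hit < best[0]:
            best = (hit, safe)
    return best

def alarm(origin, direction, spheres):
    """Trigger when the nearest collision point is closer than its safe distance."""
    hit = ray_nearest_hit(origin, direction, spheres)
    return hit is not None and hit[0] < hit[1]
```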
2. Electric power operation field application
Taking the safe-distance detection of live substation equipment as an example, the minimum safe distance is 3.00 m at 220 kV and 5.00 m at 500 kV. The key steps of the safe-distance warning method based on positioning in the three-dimensional substation scene are as follows:
(1) Establish a spatial local coordinate system with the center point of the model as origin, so that the coordinate position of every live device is known; whether each device is live, together with the safe distance for its voltage level, is stored in a background database. A working interval is established in three dimensions, and the point positions of the interval region are stored in the database.
(2) A moving object entering the substation first carries a Beidou positioning terminal. With the terminal as origin, each kind of moving object is modeled in space and the feature points on the moving-object model are calculated (for a person, for example, the feature points include the top of the head and the soles of the feet, the swing amplitude of the arms when walking, the positions of the two palms with the arms spread, and so on; the more feature points, the more accurate the calculation). The Cartesian coordinate differences between the feature-point positions and the positioning terminal are stored; after the object moves, the stored differences are added to the new terminal coordinates to obtain the new feature-point coordinates of the moving object.
(3) The direction of movement of the object is calculated from the point positions reported by Beidou positioning, and several rays are emitted in that direction; each ray can obtain the id of the nearest intersected device and its distance along the ray. The system judges whether this distance is greater than the safe distance stored in the database: if so, the situation is normal; if not, the system raises an alarm.
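The feature-point bookkeeping of step (2) above can be sketched as follows; the offset-list representation and the function name are assumptions for illustration:

```python
def update_feature_points(terminal_pos, offsets):
    """New feature-point coordinates after a move: the reported Beidou
    terminal position plus the Cartesian offsets stored for each feature
    point (head top, soles of the feet, palm positions, ...)."""
    tx, ty, tz = terminal_pos
    return [(tx + dx, ty + dy, tz + dz) for dx, dy, dz in offsets]
```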
When the coordinates of each feature point are obtained, the coordinates of each charged device are known. The distance AB between feature point A (X1, Y1, Z1) and device point B (X2, Y2, Z2) is:
AB = sqrt((X1 - X2)^2 + (Y1 - Y2)^2 + (Z1 - Z2)^2)
If this distance is less than the safety distance of the equipment, the system gives an alarm.
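The key steps above can be sketched end to end: a device record lookup, the offset-based feature-point update, and the Euclidean distance test against the stored safety distance. This is a minimal illustration; the device IDs, coordinates, and offsets below are assumed values, not from the patent.

```python
import math

# (1) Background database sketch: local coordinates, charged state, and the
# safety distance for each device's voltage level (values illustrative).
devices = {
    "dev-220-01": {"coord": (3.0, 4.0, 0.0), "charged": True, "safe_dist_m": 3.00},   # 220 kV
    "dev-500-01": {"coord": (40.0, 6.0, 2.5), "charged": False, "safe_dist_m": 5.00}, # 500 kV
}

# (2) Feature points: stored Cartesian offsets from the Beidou terminal are
# re-added to the terminal's newly reported coordinates after each move.
def update_feature_points(terminal_pos, offsets):
    tx, ty, tz = terminal_pos
    return {n: (tx + dx, ty + dy, tz + dz) for n, (dx, dy, dz) in offsets.items()}

# (3) Distance test: Euclidean distance AB compared with the stored safety distance.
def distance(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def alarms(feature_points):
    """Return (device_id, feature_name) pairs that violate the safety distance."""
    out = []
    for dev_id, rec in devices.items():
        if not rec["charged"]:
            continue  # de-energized equipment imposes no clearance
        for name, pt in feature_points.items():
            if distance(pt, rec["coord"]) < rec["safe_dist_m"]:
                out.append((dev_id, name))
    return out

# Terminal at (0, 0, 1); head-top offset +0.75 m puts the feature at (0, 0, 1.75).
pts = update_feature_points((0.0, 0.0, 1.0), {"head_top": (0.0, 0.0, 0.75)})
```

Here the head-top point is about 5.30 m from the 220 kV device, above the 3.00 m minimum, so no alarm is raised; moving the terminal toward the device would trigger one.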
It should be further noted that in step 4, dangerous situation sensing acquires abnormal video information of the risk points through cameras and automatically identifies the acquired video using image recognition algorithms compiled in advance for the different risk points. For example: risk point 1, the test stand grounding wire falls off; risk point 2, the corner grounding wire falls off; risk point 3, the operator does not stand on the insulating pad; and other risk points. After a risk point is confirmed and triggered, the system automatically records photos of the risk process and gives an alarm prompt.
Example 2
This embodiment adds field work deduction simulation to Embodiment 1. That is, before dangerous situation sensing and distance detection based on the collision principle are carried out, the method comprises the following steps:
a1: planning different paths for the moving target to reach the designated position according to the Beidou positioning information and the work order information of the moving target entering the electric power operation site;
a2: aiming at each path, carrying out safety distance detection and operation flow deduction on the moving target, judging whether the moving target meets the safety distance on the path according to a detection result, and judging whether the operation flow of the moving target meets anti-violation operation rules or not according to a deduction result;
a: if the safety distance is not met or the anti-violation operation rule is not met, updating the operation information of the moving target; and if the safety distance and the anti-violation operation rule are met, sensing the dangerous situation and detecting the distance based on the collision principle.
As shown in fig. 3, a route is manually planned based on the work information, the path to the work area is simulated, and survey points are set. Taking lifting as an example, a simulated maintenance vehicle with a matched model is selected for deduction; the operator extends the arm, moves the arm, lifts, and places the lifted object at a specified survey point through the keyboard. Collision detection is performed in real time against the three-dimensional model scene according to the safety distance corresponding to the voltage level of the live equipment in the operation; if the operating distance is smaller than the safety distance, a prompt is given, videos of the surrounding cameras are displayed at the same time, and the view automatically rotates to the position of the maintenance vehicle. Through the deduction simulation module, the feasibility of the route is verified, providing a basis for the maintenance operation scheme.
1. Deduction path planning
The time required to reach the designated position from different path lanes is deduced according to the actual operation area/position of the maintenance vehicle/personnel, and the approach route for the actual operation is determined after deducing whether the maintenance vehicle/personnel meets the safety distance.
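Deduction path planning as described amounts to filtering candidate lanes by the safety distance and then choosing the fastest remaining one. A minimal sketch — lane lengths, clearances, and the travel speed are illustrative assumptions:

```python
# Sketch of deduction path planning: discard lanes whose minimum clearance to
# live equipment violates the safety distance, then pick the fastest of the
# remaining lanes. Lane data and speed below are illustrative.
def plan_approach(lanes, speed_m_s, safe_dist_m):
    """lanes: {name: (length_m, min_clearance_m)} -> (best_lane, time_s) or None."""
    times = {name: length / speed_m_s
             for name, (length, clearance) in lanes.items()
             if clearance >= safe_dist_m}
    if not times:
        return None  # no feasible approach route
    best = min(times, key=times.get)
    return best, times[best]

# lane-B is shorter but its 2.0 m clearance fails the 3.0 m safety distance.
route = plan_approach({"lane-A": (120.0, 3.5), "lane-B": (90.0, 2.0)},
                      speed_m_s=1.5, safe_dist_m=3.0)
```

With these assumed numbers, lane-A is selected with an 80-second approach time even though lane-B is geometrically shorter.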
2. Hoisting deduction
Considering dynamic adjustment of the operation plan/power failure plan, operating personnel can reduce or avoid survey trips into the transformer substation; the path and the operation flow are deduced according to the power failure plan of the operation day.
Example 3
This embodiment adds a video fusion technology to Embodiment 2. A digital twin model video fusion technology based on local scene updating is combined with the monitoring cameras arranged at fixed points in the transformer substation; the acquired video data and the live-action digital twin model are intelligently fused by the AI algorithm of a digital twin real-time video fusion engine, realizing real-time seamless fusion and real-time visual monitoring.
Overall, real-time data acquisition is first performed on the monitored target by the shooting equipment to obtain a first frame of monitoring image, and a second monitoring image is extracted according to the target contour information. The second monitoring image is processed according to the ambient illumination intensity L_J at the time of shooting and the ambient illumination intensity L_M at the time of constructing the three-dimensional live-action model, and a replacement image is calculated. The local scene at the corresponding position in the three-dimensional live-action model is updated with the generated replacement image to obtain a real-time live-action fusion model. The live-action fusion model is displayed on the display equipment according to the input instruction, realizing three-dimensional live-action inspection of the transformer substation. By inspecting in the live-action fusion model obtained through local scene updating, real-time local area change information can be fused into the three-dimensional model, improving the effectiveness of inspection work and reducing staff misjudgment of faults and safety risks. The method comprises the following steps:
b1: and (5) initializing. Selecting a target needing real-time monitoring in the transformer substation, and arranging shooting equipment in the transformer substation according to preset position and shooting angle information.
B2: and extracting target information. And acquiring a preprocessing image containing the target by utilizing shooting equipment, and acquiring edge contour information of the target from the image. In order to ensure the accuracy of the three-dimensional real-scene inspection, the edge contour information of the target is usually obtained by a manual labeling method, that is, the edge contour line of the target is labeled in the preprocessed image, the coordinates of each pixel point forming the edge contour line are recorded, and the coordinates of each pixel point forming the edge contour line form the edge contour information. In order to further improve the precision, a bilateral filter is adopted to extract the edge features in the preprocessed image, the adjacent edge features are connected, and finally the edge contour line of the target is obtained.
B3: replacement image calculation.
A first frame of monitoring image is collected by the shooting equipment, and the ambient illumination intensity L_J at the time of shooting is recorded. According to the edge contour information of the target, the corresponding pixel points are marked in the first frame of monitoring image. The adjacent marked pixel points in the first frame of monitoring image are connected to form an end-to-end enclosing line; the area enclosed by this line is the second monitoring image, containing only the target.
A three-dimensional live-action model of the transformer substation is built, and the ambient illumination intensity L_M at the time of construction is recorded. A first model image is intercepted from the three-dimensional live-action model according to the preset position and shooting angle information. The corresponding pixel points are marked in the first model image according to the edge contour information; the adjacent marked pixel points in the first model image are connected to form an end-to-end enclosing line, and the area enclosed by this line is the second model image. The replacement image is obtained by calculation from the second model image, the ambient illumination intensities, and the second monitoring image. The calculation process is as follows:
(1) The overall color characterization coefficient C_J of the second frame of monitoring image is calculated as:
C_J = (1/(3n)) × Σ_{i=1..n} (R_Ji + G_Ji + B_Ji)    (1)
In formula (1), R_Ji represents the R value among the RGB color values of the ith pixel in the second frame of monitoring image, G_Ji represents the G value, B_Ji represents the B value, and n represents the total number of pixels in the second frame of monitoring image.
(2) The overall color characterization coefficient C_M of the second model image is calculated as:
C_M = (1/(3n)) × Σ_{i=1..n} (R_Mi + G_Mi + B_Mi)    (2)
In formula (2), R_Mi represents the R value among the RGB color values of the ith pixel in the second model image, G_Mi represents the G value, B_Mi represents the B value, and n represents the total number of pixels in the second model image.
(3) The replacement image is calculated based on the overall color characterization coefficients of the two images.
When 0.8 < C_J/C_M < 1.15, the brightness of the second frame of monitoring image is adjusted according to the ratio C_M/C_J to obtain the replacement image; the adjusted brightness of the second frame of monitoring image is calculated as

A = B × (C_M/C_J)    (3)

In formula (3), A represents the brightness value of the second frame of monitoring image after adjustment, and B represents the brightness value of the second frame of monitoring image before adjustment;
When C_J/C_M ≤ 0.8 or C_J/C_M ≥ 1.15, each pixel of the second frame of monitoring image is fused with the second model image according to the ambient illumination intensities L_J and L_M to obtain the replacement image.
(4) The second model image in the first model image is covered with the replacement image to obtain the live-action fusion model. Specifically, the fusion processing is performed on each pixel of the second frame of monitoring image, as shown in the following formulas:
R_F(x, y) = (L_M × R_J(x, y) + L_J × R_M(x, y)) / (L_J + L_M)    (4)

G_F(x, y) = (L_M × G_J(x, y) + L_J × G_M(x, y)) / (L_J + L_M)    (5)

B_F(x, y) = (L_M × B_J(x, y) + L_J × B_M(x, y)) / (L_J + L_M)    (6)
In formulas (4) to (6), R_F, G_F and B_F represent the R, G and B values among the RGB color values of a pixel in the replacement image; R_J, G_J and B_J represent the R, G and B values of the corresponding pixel in the second frame of monitoring image; R_M, G_M and B_M represent the R, G and B values of the corresponding pixel in the second model image; x represents the coordinate of the pixel in the X direction of the X-Y coordinate system, and y represents the coordinate of the pixel in the Y direction.
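The two-branch replacement-image logic (ratio thresholds 0.8 and 1.15, brightness adjustment versus per-pixel fusion) can be sketched as follows. This is a minimal sketch assuming a mean-RGB form for the color coefficient and an illumination-weighted blend for the fusion branch; the exact coefficient and fusion formulas in the patent's figures may differ:

```python
import numpy as np

def color_coefficient(img):
    """Assumed overall color characterization coefficient: mean over all RGB values."""
    return float(img.mean())

def replacement_image(monitor, model, L_J, L_M):
    """monitor/model: (H, W, 3) float arrays for the second frame of monitoring
    image and the second model image; L_J/L_M: ambient illumination intensities
    at shooting time and at model construction time."""
    ratio = color_coefficient(monitor) / color_coefficient(model)
    if 0.8 < ratio < 1.15:
        # close color match: only scale the monitoring image's brightness
        # toward the model's level
        return monitor * (1.0 / ratio)
    # large mismatch: per-pixel fusion weighted by the illumination
    # intensities (assumed weighting form)
    return (L_M * monitor + L_J * model) / (L_J + L_M)
```

With equal illumination intensities the fusion branch reduces to a 50/50 average, while a near-matching pair of images takes only the cheap brightness-scaling path.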
It should be noted that the real-time fusion of the video and the three-dimensional model is to display the real-time video on site in the three-dimensional model in a mode of popping out a video window or embedding the video, and is an important means for safety operation control and alarm linkage, including video fusion and fusion video editing.
Video fusion obtains real-time video streams from various NVR/IPC devices and, after processing and format conversion, provides them to the front end for video fusion, pop-up video, video roaming and pan-tilt (PTZ) rotation operations.
For fused video editing, after the client obtains a standard Flash Video (FLV) stream, it decodes the stream through flv.js and displays the real-time video in the three-dimensional scene in fused or pop-up mode. Real-time fused rendering of the video and the three-dimensional model performs matting, rectangular or circular cutting, position adjustment, size adjustment, rotation and similar operations on the video and the corresponding model, fusing the real video into the live-action three-dimensional model. The core problem is keeping the two-dimensional video consistent with the three-dimensional model rendering: when the mouse operates the model (left/right rotation, up/down rotation, translation, zoom in, zoom out), the two-dimensional video must change along with the rendering. There are two main implementation schemes:
the first scheme is based on a real-time rendering technology of a three-dimensional model, action events of user operation are obtained in real time, and events such as amplification, reduction, left translation, right translation, up translation, down translation, rotation around an X axis and a rotation angle, rotation around a Y axis and a rotation angle, rotation around a Z axis and a rotation angle are obtained, and the real-time rendering fusion and display of a two-dimensional video and the three-dimensional model are realized by adopting the real-time rendering technology of the model; the second scheme is based on the camera visual angle data information in the model display technology, and the visual angle of the two-dimensional video is updated in real time according to the rotating visual angle of the model, so that the real-time rendering fusion and display of the two-dimensional video and the three-dimensional model can be realized.
The seamless fusion of the surveillance video and the three-dimensional model is described below. In practical applications, the surveillance video and the three-dimensional model are acquired under different weather, lighting and other conditions, so an obvious seam appears when they are fused; seamless fusion is achieved by filtering at the junction based on an image weighted fusion technique.
The principle of the weighted-average image fusion algorithm is to take the same weight directly for the pixel values of the original images and obtain the pixel values of the fused image by weighted averaging. For example, to fuse two images A and B, the pixel value of the fused image is A × 50% + B × 50%. Here, however, the objects being fused are video data and the texture data of the three-dimensional model: the video data is first extracted as image data according to the frame rate, the images are fused by the weighted-average algorithm, the result is converted back into video data, and the three-dimensional model and the video data are then rendered and displayed.
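The 50/50 weighted average described above is a one-liner on arrays; a minimal sketch fusing an extracted video frame with model texture data (the frame and texture values are illustrative):

```python
import numpy as np

def weighted_fuse(a, b, w=0.5):
    """Weighted-average fusion: each fused pixel = a*w + b*(1 - w)."""
    return a * w + b * (1.0 - w)

frame = np.full((2, 2, 3), 80.0)     # image data extracted from the video by frame rate
texture = np.full((2, 2, 3), 120.0)  # texture data of the three-dimensional model
fused = weighted_fuse(frame, texture)  # equal weights: A*50% + B*50%
```

In practice the weight would be ramped across the junction region so the video blends gradually into the model texture instead of switching at a hard boundary.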
Example 4
The present embodiment corresponding to embodiment 1 provides an electric power operation safety control system, and a system framework is shown in fig. 4, and includes:
the model building module is used for building a static three-dimensional model of the electric power operation site;
the data acquisition module is used for acquiring the electrical information of each device on the electric power operation site, and acquiring the video monitoring information, operation information and Beidou positioning information of a moving target entering the electric power operation site;
the collision detection and situation awareness module is used for sensing dangerous situations and detecting distances based on a collision principle according to the electrical information of each device, the video monitoring information of the moving target, the operation information of the moving target and the Beidou positioning information of the moving target;
the real-time early warning module is used for judging a sensing result and a detection result of the situation sensing layer, and if the sensing result of the dangerous situation is that a moving target mistakenly enters a charged area or the moving target mistakenly touches charged equipment, or the distance detection result is that the distance between the moving target and the charged equipment is smaller than a safe distance, the real-time early warning is carried out;
the deduction simulation module is used for planning different paths of the moving target to reach a specified position according to Beidou positioning information and work order information of the moving target entering an electric power operation site; aiming at each path, carrying out safety distance detection and operation flow deduction on the moving target, judging whether the moving target meets the safety distance on the path according to a detection result, and judging whether the operation flow of the moving target meets anti-violation operation rules or not according to a deduction result; if the safety distance is not met or the anti-violation operation rule is not met, updating the operation information of the moving target; if the safety distance and the anti-violation operation rule are met, sensing the dangerous situation and detecting the distance based on the collision principle;
the video fusion module is used for acquiring a preprocessed image containing the moving target and obtaining the edge contour information of the moving target from the preprocessed image; shooting a first frame of monitoring image containing the moving target, and recording the ambient illumination intensity L_J when shooting the first frame of monitoring image; marking a plurality of pixel points corresponding to the edge contour information in the first frame of monitoring image, connecting the pixel points end to end to form an enclosing line, and intercepting the image area enclosed by the line to obtain a second frame of monitoring image; intercepting a first model image from the static three-dimensional model, and recording the ambient illumination intensity L_M when the static three-dimensional model was constructed; marking a plurality of pixel points corresponding to the edge contour information in the first model image, connecting the pixel points end to end to form an enclosing line, and intercepting the image area enclosed by the line to obtain a second model image; calculating a replacement image from the second frame of monitoring image, the ambient illumination intensity L_J, the ambient illumination intensity L_M and the second model image; and covering the second model image in the first model image with the replacement image to obtain a live-action fusion model.
Wherein, the video fusion module includes:
the first integral color representation coefficient calculation unit is used for calculating and obtaining an integral color representation coefficient of the second frame of monitoring image;
the second integral color characterization coefficient calculation unit is used for calculating and obtaining an integral color characterization coefficient of the second model image;
the brightness calculation unit is used for judging the ratio of the overall color characterization coefficient of the second frame of monitoring image to the overall color characterization coefficient of the second model image, and when the ratio is more than 0.8 and less than 1.15, the brightness of the second frame of monitoring image is adjusted according to the ratio to obtain a replacement image;
and the pixel fusion unit is used for judging the ratio of the overall color characterization coefficient of the second frame of monitoring image to the overall color characterization coefficient of the second model image, and when the ratio is less than or equal to 0.8 or the ratio is greater than or equal to 1.15, fusing each pixel of the second frame of monitoring image through the second model image, the ambient illumination intensity when the static three-dimensional model is constructed and the ambient illumination intensity when the first frame of monitoring image is shot to obtain a replacement image.
The following further describes the electric power operation safety control system by taking substation A as an example:
a refined three-dimensional scene model of the substation A is established through the system, and then multi-dimensional information such as high-precision positioning, electrical states and safety operation rules is combined to realize danger situation perception based on multi-dimensional information fusion. The method specifically comprises the following functions:
1. Job management
For the actual electric power overhaul work, a work ticket is formulated and the operation information is entered in the system. After the electric power overhaul operation scheme is determined, information such as operation time, operators, construction vehicles, positioning equipment and electrified intervals needs to be completed in the system. Through the job information list, the job execution state at the current time can be viewed, along with specific operation details including personnel information, alarm information and historical tracks.
In this embodiment, substation A is taken as an example to further explain the electric power operation safety control method: by establishing a refined three-dimensional scene model of the transformer substation and combining multi-dimensional information such as high-precision positioning, electrical state and safety operation rules, dangerous situation sensing based on multi-dimensional information fusion is realized. The method specifically comprises the following steps:
1. Operation management. For the actual electric power overhaul work, a work ticket is formulated and the operation information is entered in the system. After the electric power overhaul operation scheme is determined, information such as operation time, operators, construction vehicles, positioning equipment and electrified intervals needs to be completed in the system. Through the job information list, the job execution state at the current time can be viewed, along with specific operation details including personnel information, alarm information and historical tracks.
(1) A list of job information. Before the actual approach, the work ticket content, including the time and area of the job task (determined by the earlier deduction and survey) must be synchronized in the system.
(2) Operation information entry. The entry of job information, configuration of risk points and recording of job contents are supported.
(3) Job detail query. According to business needs, the information of the operation process can be queried and browsed, the associated risk area and operation path can be displayed, and multi-risk actions are automatically recorded as alarms.
2. Deduction simulation
A route is manually planned according to the operation information, the path to the work area is simulated, and survey points are set. Taking lifting as an example, a simulated maintenance vehicle with a matched model is selected for deduction; the operator extends the arm, moves the arm, lifts, and places the lifted object at a specified survey point through the keyboard. Collision detection is performed in real time against the three-dimensional model scene according to the safety distance corresponding to the voltage level of the live equipment in the operation; if the operating distance is smaller than the safety distance, a prompt is given, videos of the surrounding cameras are displayed at the same time, and the view automatically rotates to the position of the maintenance vehicle. Through the deduction simulation module, the feasibility of the route is verified, providing a basis for the maintenance operation scheme.
(1) Deduction path planning. The time required to reach the designated position from different path lanes is deduced according to the actual operation area/position of the maintenance vehicle/personnel, and the approach route for the actual operation is determined after deducing whether the maintenance vehicle/personnel meets the safety distance.
(2) Hoisting deduction. Considering dynamic adjustment of the operation plan/power failure plan, operating personnel can reduce or avoid survey trips into the transformer substation; the path and the operation flow are deduced according to the power failure plan of the operation day.
3. Live-action monitoring
When the operation starts, the system marks the positions of the operators and construction vehicles in the three-dimensional model scene and performs situation sensing by combining their real-time positioning coordinates with electrical information for the operation, such as the range of the charged area and the safety distance of the charged equipment. If an operator is located in the buffer zone of a charged area, an early warning prompt is given; if an operator is located inside a charged area, an alarm prompt is given and the alarm information is recorded, videos of the surrounding cameras are displayed at the same time, and the view automatically rotates to the positioning position. By sensing dangerous situations of the moving target object such as online misoperation, mistaken collision and mistaken entry into a charged area, operation live-action monitoring and real-time alarming based on the three-dimensional scene are realized, covering management and control of both operation implementation and supervision.
(1) Three-dimensional scene model. In the substation scene, any location can be clicked (given camera coverage); the position of any equipment is supported and covered.
(2) Positioning and real-time presentation of operating personnel. The access of the maintenance vehicle/personnel and the operation process are deduced according to the operation plan, the power failure plan and the equipment electrification condition. For example, operation deduction is carried out at a bus: combined with the safety distance and the accurate positioning technology, the optimal operation position of the maintenance vehicle is found through repeated deduction, and after entering the field the maintenance vehicle/personnel operate directly at the position designated by the offline mark. The operator/maintenance vehicle entering the field is equipped with a Beidou terminal; once the terminal is powered on, situation awareness early warning for safe operation starts.
4. Alarm management
All historical alarm information can be viewed in the alarm information list, and the alarm information of a specific task can also be viewed through job management (see the live-action monitoring section and the early warning and alarm dynamic display parts of the deduction simulation for details). Clicking the alarm details shows the current position of the maintenance vehicle/operator and the real-time situation awareness risk value; the operation track can also be previewed, and the numbers of early warnings and alarms generated so far can be viewed.
5. Configuration management
(1) Risk point management. Users are supported in customizing risk point settings according to actual management requirements. When the system judges that personnel or construction machinery have entered the range of a risk point, an alarm signal is triggered; the system links the current camera to capture a picture as traceable documentary material, and simultaneously links intelligent broadcasting to announce it.
(2) Risk point list. Risk areas can be automatically divided in the interface, and risk points can be queried, renamed and annotated.
6. Charged interval management. The user can manually set the safety interval of the electrified equipment according to the safety management standard and station working practice.
7. Charged equipment management. The electrification condition of the electrified equipment is displayed in the interface, and association settings are made according to the actual station.
8. Camera management. The system accesses videos through the ONVIF protocol of the NVR, draws the video equipment list through websocket real-time communication, finds the position of the corresponding video equipment in the live-action three-dimensional scene, adds a video equipment model, and adjusts the video equipment to the optimal observation angle by adjusting basic attribute information such as equipment position, equipment orientation and the horizontal orientation of the observation sight line.
9. Practical tool
(1) Space measurement. Distance, height, area and angle measurement functions are provided, supplying measuring means for pre-studying equipment maintenance schemes in the three-dimensional live-action scene, accurately installing equipment models, measuring road surface width, equipment height, equipment base area and the like.
(2) Monitoring analysis. According to the live-action three-dimensional model, the corresponding video equipment is matched (or the deployed video equipment is referenced); the spatial position relation between a target equipment point and the cameras able to observe it is calculated based on a perspective or orthographic projection algorithm, and through occlusion screening and similar algorithms, the cameras with better view angles and less occlusion are selected to display real-time video, giving multi-angle, multi-azimuth video display of the target equipment.
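The camera-screening step above can be sketched as choosing the visible camera whose optical axis points most directly at the target. A minimal sketch — the camera positions, directions, and the boolean occlusion flags are illustrative stand-ins for a real ray-test against the model:

```python
import math

def best_camera(target, cameras):
    """cameras: {name: {"pos": (x, y, z), "dir": (x, y, z), "occluded": bool}}.
    Returns the non-occluded camera most directly aimed at the target, or None."""
    def view_cos(cam):
        # cosine between the camera's viewing direction and the vector to the target
        v = [t - p for t, p in zip(target, cam["pos"])]
        dot = sum(a * b for a, b in zip(v, cam["dir"]))
        return dot / (math.dist(target, cam["pos"]) * math.hypot(*cam["dir"]))
    visible = {n: c for n, c in cameras.items() if not c["occluded"]}
    return max(visible, key=lambda n: view_cos(visible[n])) if visible else None

cameras = {
    "cam-1": {"pos": (0.0, 0.0, 5.0), "dir": (1.0, 0.0, -1.0), "occluded": False},
    "cam-2": {"pos": (10.0, 0.0, 5.0), "dir": (0.0, 0.0, -1.0), "occluded": False},
    "cam-3": {"pos": (5.0, 0.0, 5.0), "dir": (0.0, 0.0, -1.0), "occluded": True},
}
choice = best_camera((5.0, 0.0, 0.0), cameras)  # cam-3 is occluded and skipped
```

A production version would replace the occlusion flag with a ray cast through the three-dimensional model and could weight distance as well as view angle.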
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are merely exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. A safety control method for electric power operation is characterized by comprising the following steps:
establishing a static three-dimensional model of an electric power operation site; collecting electrical information of each device on an electric power operation site;
collecting video monitoring information, operation information and Beidou positioning information of a moving target entering an electric power operation site;
sensing danger situations and detecting distances based on a collision principle according to the electrical information of each device, the video monitoring information of the moving target, the operation information of the moving target and the Beidou positioning information of the moving target;
and if the result of the sensing of the dangerous situation is that the moving target enters the electrified area by mistake or the moving target collides electrified equipment by mistake or the result of the distance detection is that the distance between the moving target and the electrified equipment is smaller than the safe distance, performing real-time early warning.
2. The electric power operation safety control method according to claim 1, wherein the electrical information comprises an equipment ID, coordinates of the equipment in the static three-dimensional model, an electrified state of the equipment, a safety distance corresponding to each voltage level of the equipment and an anti-violation operation rule corresponding to the equipment; the job information includes work ticket information of the moving target.
3. The electric power operation safety control method according to claim 2, wherein before the dangerous situation sensing and the distance detection based on the collision principle, the method comprises the following steps:
planning different paths for the moving target to reach the designated position according to the Beidou positioning information and the work order information of the moving target entering the electric power operation site;
aiming at each path, carrying out safety distance detection and operation flow deduction on the moving target, judging whether the moving target meets the safety distance on the path according to a detection result, and judging whether the operation flow of the moving target meets anti-violation operation rules or not according to a deduction result;
if the safety distance is not met or the anti-violation operation rule is not met, updating the operation information of the moving target; and if the safety distance and the anti-violation operation rule are met, sensing the dangerous situation and detecting the distance based on the collision principle.
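The per-path pre-check of claim 3 amounts to filtering the planned paths through two tests before collision-based monitoring begins; `check_safety_distance` and `check_operation_rules` below are assumed callbacks standing in for the patent's detection and deduction steps:

```python
def precheck_paths(paths, check_safety_distance, check_operation_rules):
    """Split planned paths into those cleared for collision-based
    monitoring and those requiring the job information to be updated."""
    passed, failed = [], []
    for path in paths:
        if check_safety_distance(path) and check_operation_rules(path):
            passed.append(path)
        else:
            failed.append(path)
    return passed, failed

ok, bad = precheck_paths(
    ["path-A", "path-B"],
    check_safety_distance=lambda p: True,           # both paths keep the safe distance
    check_operation_rules=lambda p: p == "path-A",  # only path-A passes the deduction
)
print(ok, bad)  # prints: ['path-A'] ['path-B']
```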
4. The electric power operation safety control method according to claim 1, wherein the distance detection based on the collision principle comprises the steps of:
establishing a spatial local coordinate system by taking the central point of the static three-dimensional model as an origin, and acquiring coordinates of each device of the electric power operation site in the spatial local coordinate system;
establishing a space model of the moving target, and acquiring Beidou positioning information of the space model and real-time coordinates of each feature point on the space model;
calculating the motion direction of the space model according to the Beidou positioning information of the space model; and emitting a plurality of rays in the movement direction by taking the space model as a starting point, obtaining the actual distance between the space model and the nearest equipment according to the information returned by the rays, and judging whether the actual distance is less than or equal to the safety distance.
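The ray-based distance check of claim 4 can be illustrated by casting rays along the motion direction and keeping the nearest hit; approximating each device by a bounding sphere is an assumption made here for brevity, not something the claim specifies:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a unit-length ray to a sphere, or None if it misses."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c  # quadratic discriminant with a = 1 for a unit direction
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t >= 0 else None

def nearest_device_distance(origin, direction, devices):
    """devices: iterable of (center, radius); returns the nearest hit distance."""
    hits = [d for d in (ray_sphere_hit(origin, direction, c, r)
                        for c, r in devices) if d is not None]
    return min(hits) if hits else None

# Moving target at the origin heading along +x; a device 5 m ahead, 1 m radius.
d = nearest_device_distance((0, 0, 0), (1, 0, 0), [((5, 0, 0), 1.0)])
print(d, d <= 2.0)  # prints: 4.0 False  (4 m away, outside a 2 m safe distance)
```

Comparing the returned distance with the voltage-level safe distance then gives the claim's less-than-or-equal judgment.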
5. The electric power operation safety control method according to claim 1, further comprising the steps of:
acquiring a preprocessing image containing a moving target, and acquiring edge contour information of the moving target from the preprocessing image;
shooting to obtain a first frame of monitoring image containing the moving target, and recording the ambient light intensity L_J when the first frame of monitoring image is shot; marking a plurality of pixel points corresponding to the edge contour information in the first frame of monitoring image, connecting the pixel points end to end to form a surrounding line, and intercepting the image area enclosed by the surrounding line to obtain a second frame of monitoring image;
intercepting a first model image from the static three-dimensional model, and recording the ambient light intensity L_M when the static three-dimensional model was constructed; marking a plurality of pixel points corresponding to the edge contour information in the first model image, connecting the pixel points end to end to form a surrounding line, and intercepting the image area enclosed by the surrounding line to obtain a second model image;
calculating the second model image according to the second frame of monitoring image, the ambient light intensity L_J and the ambient light intensity L_M to obtain a replacement image;
and covering a second model image in the first model image by using the replacement image to obtain a live-action fusion model.
6. The method according to claim 5, wherein the step of calculating the replacement image comprises the steps of:
calculating the overall color characterization coefficient C_J of the second frame of monitoring image:

[formula (1), published as image FDA0003886633660000021]

in formula (1), R_Ji represents the R value among the RGB color values of the i-th pixel in the second frame of monitoring image, G_Ji represents the G value, B_Ji represents the B value, and n represents the total number of pixels in the second frame of monitoring image;
calculating the overall color characterization coefficient C_M of the second model image:

[formula (2), published as image FDA0003886633660000022]

in formula (2), R_Mi represents the R value among the RGB color values of the i-th pixel in the second model image, G_Mi represents the G value, B_Mi represents the B value, and n represents the total number of pixels in the second model image;
when 0.8 < C_J / C_M < 1.15 (the condition is published as images FDA0003886633660000023 and FDA0003886633660000024; the thresholds are restated in claim 10), adjusting the brightness of the second frame of monitoring image according to the ratio to obtain the replacement image; the adjusted brightness of the second frame of monitoring image is calculated as

[formula (3), published as image FDA0003886633660000025]

in formula (3), A represents the brightness value of the second frame of monitoring image after adjustment, and B represents the brightness value of the second frame of monitoring image before adjustment;
when C_J / C_M ≤ 0.8 or C_J / C_M ≥ 1.15 (the conditions are published as images FDA0003886633660000026 and FDA0003886633660000027), performing fusion processing on each pixel of the second frame of monitoring image by means of the second model image, the ambient light intensity L_J and the ambient light intensity L_M to obtain the replacement image; the fusion processing is calculated by formulas (4) to (6):

[formulas (4) to (6), published as images FDA0003886633660000031, FDA0003886633660000032 and FDA0003886633660000033]
in formulas (4) to (6), R_F, G_F and B_F represent the R, G and B values among the RGB color values of a pixel in the replacement image; R_J, G_J and B_J represent the R, G and B values among the RGB color values of the pixel in the second frame of monitoring image; R_M, G_M and B_M represent the R, G and B values among the RGB color values of the pixel in the second model image; X represents the coordinate of the pixel in the X direction of the X-Y coordinate system, and Y represents the coordinate of the pixel in the Y direction of the X-Y coordinate system.
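Since formulas (1) to (6) are published only as images, the branch structure of claim 6 can only be sketched under assumptions: below, the overall color characterization coefficient is assumed to be the mean of all RGB channel values, while the 0.8 and 1.15 thresholds are taken from claim 10.

```python
def color_coefficient(pixels):
    """Assumed coefficient: mean of all RGB channel values over n pixels."""
    n = len(pixels)
    return sum(r + g + b for r, g, b in pixels) / (3 * n)

def choose_replacement_strategy(cj, cm):
    """Pick the claim-6 branch from the coefficients C_J and C_M."""
    ratio = cj / cm
    if 0.8 < ratio < 1.15:
        return "brightness-adjust"  # formula (3): scale the monitoring image's brightness
    return "pixel-fusion"           # formulas (4)-(6): per-pixel fusion using L_J and L_M

cj = color_coefficient([(120, 130, 140), (100, 110, 120)])  # C_J = 120.0
cm = color_coefficient([(118, 128, 138), (98, 108, 118)])   # C_M = 118.0
print(choose_replacement_strategy(cj, cm))  # prints: brightness-adjust
```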
7. An electric power operation safety control system, characterized by comprising:
the model building module is used for building a static three-dimensional model of the electric power operation site;
the data acquisition module, used for collecting electrical information of each device on the electric power operation site, and collecting video monitoring information, operation information and Beidou positioning information of a moving target entering the electric power operation site;
the collision detection and situation awareness module is used for sensing dangerous situations and detecting distances based on a collision principle according to the electrical information of each device, the video monitoring information of the moving target, the operation information of the moving target and the Beidou positioning information of the moving target;
and the real-time early warning module, used for judging the sensing result and the detection result of the collision detection and situation awareness module: if the result of dangerous-situation sensing is that the moving target has mistakenly entered an electrified area or mistakenly touched electrified equipment, or the result of distance detection is that the distance between the moving target and the electrified equipment is smaller than the safe distance, performing real-time early warning.
8. The electric power operation safety control system according to claim 7, further comprising: the deduction simulation module is used for planning different paths of the moving target to reach a specified position according to Beidou positioning information and work order information of the moving target entering an electric power operation site; aiming at each path, carrying out safety distance detection and operation flow deduction on the moving target, judging whether the moving target meets the safety distance on the path according to a detection result, and judging whether the operation flow of the moving target meets anti-violation operation rules or not according to a deduction result; if the safety distance is not met or the anti-violation operation rule is not met, updating the operation information of the moving target; and if the safety distance and the anti-violation operation rule are met, sensing the dangerous situation and detecting the distance based on the collision principle.
9. The electric power operation safety control system according to claim 7 or 8, further comprising a video fusion module, used for: acquiring a preprocessed image containing the moving target, and acquiring edge contour information of the moving target from the preprocessed image; shooting to obtain a first frame of monitoring image containing the moving target, and recording the ambient light intensity L_J when the first frame of monitoring image is shot; marking a plurality of pixel points corresponding to the edge contour information in the first frame of monitoring image, connecting the pixel points end to end to form a surrounding line, and intercepting the image area enclosed by the surrounding line to obtain a second frame of monitoring image; intercepting a first model image from the static three-dimensional model, and recording the ambient light intensity L_M when the static three-dimensional model was constructed; marking a plurality of pixel points corresponding to the edge contour information in the first model image, connecting the pixel points end to end to form a surrounding line, and intercepting the image area enclosed by the surrounding line to obtain a second model image; calculating the second model image according to the second frame of monitoring image, the ambient light intensity L_J and the ambient light intensity L_M to obtain a replacement image; and covering the second model image in the first model image with the replacement image to obtain a live-action fusion model.
10. The electric power operation safety control system according to claim 9, wherein the video fusion module comprises:
the first overall color characterization coefficient calculation unit, used for calculating the overall color characterization coefficient of the second frame of monitoring image;
the second overall color characterization coefficient calculation unit, used for calculating the overall color characterization coefficient of the second model image;
the brightness calculation unit is used for judging the ratio of the overall color characterization coefficient of the second frame of monitoring image to the overall color characterization coefficient of the second model image, and when the ratio is more than 0.8 and less than 1.15, the brightness of the second frame of monitoring image is adjusted according to the ratio to obtain a replacement image;
and the pixel fusion unit is used for judging the ratio of the overall color characterization coefficient of the second frame of monitoring image to the overall color characterization coefficient of the second model image, and when the ratio is less than or equal to 0.8 or the ratio is greater than or equal to 1.15, fusing each pixel of the second frame of monitoring image through the second model image, the ambient illumination intensity when the static three-dimensional model is constructed and the ambient illumination intensity when the first frame of monitoring image is shot to obtain a replacement image.
CN202211245971.3A 2022-10-12 2022-10-12 Electric power operation safety control method and system Pending CN115496398A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211245971.3A CN115496398A (en) 2022-10-12 2022-10-12 Electric power operation safety control method and system


Publications (1)

Publication Number Publication Date
CN115496398A true CN115496398A (en) 2022-12-20

Family

ID=84473821

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211245971.3A Pending CN115496398A (en) 2022-10-12 2022-10-12 Electric power operation safety control method and system

Country Status (1)

Country Link
CN (1) CN115496398A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116384255A (en) * 2023-05-11 2023-07-04 四川新迎顺信息技术股份有限公司 Park dangerous situation perception method and system based on multi-source data fusion
CN116384255B (en) * 2023-05-11 2023-08-11 四川新迎顺信息技术股份有限公司 Park dangerous situation perception method and system based on multi-source data fusion
CN116797031A (en) * 2023-08-25 2023-09-22 深圳市易图资讯股份有限公司 Safety production management method and system based on data acquisition
CN116797031B (en) * 2023-08-25 2023-10-31 深圳市易图资讯股份有限公司 Safety production management method and system based on data acquisition
CN117114420A (en) * 2023-10-17 2023-11-24 南京启泰控股集团有限公司 Image recognition-based industrial and trade safety accident risk management and control system and method
CN117114420B (en) * 2023-10-17 2024-01-05 南京启泰控股集团有限公司 Image recognition-based industrial and trade safety accident risk management and control system and method
CN117765451A (en) * 2024-02-22 2024-03-26 江苏征途技术股份有限公司 Joint control analysis method and system based on AI intelligent auxiliary control system equipment
CN117765451B (en) * 2024-02-22 2024-04-30 江苏征途技术股份有限公司 Joint control analysis method and system based on AI intelligent auxiliary control system equipment
CN117854252A (en) * 2024-03-08 2024-04-09 广东电网有限责任公司东莞供电局 Out-of-range alarm method, device and equipment for electric power places and storage medium

Similar Documents

Publication Publication Date Title
CN115496398A (en) Electric power operation safety control method and system
CN111091609B (en) Transformer substation field operation management and control system and method based on three-dimensional dynamic modeling
EP3138754B1 (en) Rail track asset survey system
CN112633535A (en) Photovoltaic power station intelligent inspection method and system based on unmanned aerial vehicle image
CN113781450A (en) Automatic intelligent defect analysis system based on unmanned aerial vehicle image acquisition of power transmission and distribution line
CN112149212B (en) Visual construction management platform for engineering project
CN106710001A (en) Substation inspection robot based centralized monitoring and simulation system and method thereof
CN108491758A (en) A kind of track detection method and robot
CN106787186A (en) Ultra-high voltage transformer station comprehensive intelligent managing and control system based on three-dimensional live integration
CN109559381B (en) Transformer substation acceptance method based on AR space measurement technology
CN111222190B (en) Ancient building management system
CN113225387B (en) Visual monitoring method and system for machine room
AU2017232220B2 (en) Railroadtrack asset survey system
CN106875081A (en) A kind of enhancing virtual reality method for electricity substation
CN110610542A (en) Substation equipment state monitoring panoramic analysis system
CN214704735U (en) Comprehensive pipe rack wisdom system of patrolling and examining
CN110874866A (en) Transformer substation three-dimensional monitoring method and system based on videos
CN102834848A (en) Method for visualizing zones of higher activity in monitoring scenes
CN116720242A (en) Digital twin panoramic monitoring system for high-voltage cable tunnel
CN114419231A (en) Traffic facility vector identification, extraction and analysis system based on point cloud data and AI technology
CN111384776B (en) VR-based transformer substation three-dimensional panoramic state monitoring method and system
CN213518003U (en) A patrol and examine robot and system of patrolling and examining for airport pavement
CN115665213B (en) Digital twin system of new equipment on-line commissioning base
CN116033240A (en) Equipment inspection method and system based on station operation cockpit
KR102354870B1 (en) Game Engine-based Digital Construction Safty Monitoring System

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination