CN110989599B - Autonomous operation control method and system for fire-fighting robot of transformer substation - Google Patents


Publication number
CN110989599B
Authority
CN
China
Legal status: Active
Application number
CN201911253475.0A
Other languages: Chinese (zh)
Other versions: CN110989599A (en)
Inventors
王海磊
阮鹏程
王宇航
马晓锋
李建祥
许玮
慕世友
周大洲
王海鹏
郭锐
张海龙
刘海波
赵玉良
司金保
曾金保
Current Assignee
State Grid Intelligent Technology Co Ltd
Original Assignee
State Grid Intelligent Technology Co Ltd
Application filed by State Grid Intelligent Technology Co Ltd
Priority to CN201911253475.0A
Publication of CN110989599A
Application granted
Publication of CN110989599B

Classifications

    • G05D 1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0225: Control of position or course in two dimensions specially adapted to land vehicles, involving docking at a fixed facility, e.g. base station or loading bay
    • B25J 11/00: Manipulators not otherwise provided for
    • B25J 13/00: Controls for manipulators
    • B25J 9/1661: Programme controls characterised by task planning, object-oriented languages
    • B25J 9/1664: Programme controls characterised by motion, path, trajectory planning
    • G06T 17/10: Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes

Abstract

The invention discloses an autonomous operation control method and system for a substation fire-fighting robot. The method comprises the following steps: establishing a three-dimensional data model of the substation, deploying wireless AP (access point) devices at a plurality of fire-fighting medium supply points in the substation, and building an in-station wireless network; positioning the fire-fighting robot by means of the wireless network; determining the position of the optimal fire-fighting medium supply point and planning the optimal path for the fire-fighting robot to reach the abnormal equipment; controlling the fire-fighting robot to travel autonomously to the supply-point position and complete automatic docking of the fire hose; and controlling the fire-fighting robot to travel autonomously to the vicinity of the abnormal equipment and automatically calculating the optimal spraying angle and flow from the position of the fire point. The invention enables a substation fire-fighting robot to perform the complete closed-loop workflow of detecting a fire, traveling to the site, automatically connecting to the fire-fighting medium supply equipment, selecting the optimal operating position, and extinguishing the fire automatically, filling a gap in the technical field of automated robotic fire-fighting in substations.

Description

Autonomous operation control method and system for fire-fighting robot of transformer substation
Technical Field
The invention relates to the technical field of substation fire-fighting robots, and in particular to an autonomous operation control method and system for a substation fire-fighting robot.
Background
The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
A substation contains a large number of high-voltage, high-current devices, and fires are easily triggered by equipment faults, line defects, and similar problems. In the prior art, substation fires are monitored by a substation fire-fighting robot; once a fire breaks out, the robot can bring it under control at the first moment, improving the monitoring and early warning of incipient fire hazards in the substation.
However, the inventors found that unattended substations are mostly located far away, so when a fire occurs it takes a professional fire brigade considerable time to reach the scene. The fire therefore cannot be extinguished within the optimal window: a small fire may grow into a serious accident, and a large fire can severely affect building safety and the electricity supply of multiple regions of the power system. Achieving autonomous operation of the fire-fighting robot inside the substation is therefore the main problem to be solved, and its solution faces the following difficulties:
1. When a fire breaks out in a substation, the dense smoke produced by burning equipment makes the in-station environment highly complex. Existing in-station robot products navigate mainly by laser modeling, visual navigation, and similar means, whose accuracy is degraded by the fire source and the dense smoke it generates, which in turn compromises the reliability of fire-fighting operations.
2. At present, most fire-fighting robots rely on a fire engine for pressurized water supply and cannot be supplied with water continuously.
3. The prior art can only detect whether the docking of the fire-fighting robot with the fire hose has succeeded; it cannot monitor the whole docking process, so automatic docking is impossible. Today the docking of robot and hose, the deployment and retraction of the hose, and the detection of hose damage all inevitably require manual participation.
Disclosure of Invention
To solve the above problems, the invention provides an autonomous operation control method and system for a substation fire-fighting robot. A wireless navigation-and-positioning network is built with the water supply points as reference positions, which effectively removes the interference of environmental factors and improves navigation accuracy; the entire docking process between the fire-fighting robot and the fire hose can be monitored end to end, achieving autonomous operation of the robot inside the substation.
In some embodiments, the following technical scheme is adopted:
a transformer substation fire-fighting robot autonomous operation control method comprises the following steps:
establishing a three-dimensional data model of the transformer substation, laying wireless AP (access point) equipment at a plurality of fire-fighting medium supply points in the transformer substation, and establishing a wireless network in the transformer substation; positioning the fire-fighting robot according to the wireless network;
determining the position of an optimal fire-fighting medium supply point according to received equipment abnormity monitoring data sent by fire monitoring equipment in the transformer substation, and planning an optimal path for a fire-fighting robot to reach the abnormal equipment;
controlling the fire-fighting robot to automatically operate to a fire-fighting medium supply point position to complete automatic butt joint of fire-fighting hoses;
controlling the fire-fighting robot to automatically run to the vicinity of abnormal equipment, and automatically selecting a fire extinguishing medium according to the type of the equipment on fire; and automatically calculating the optimal spraying angle and spraying flow according to the ignition point position.
In other embodiments, the following technical solutions are adopted:
a transformer substation fire-fighting robot autonomous operation control system comprises:
means for building a three-dimensional data model of the substation;
the device is used for laying wireless AP equipment at a plurality of fire-fighting medium supply points in the substation and constructing a wireless network in the substation;
means for locating a fire-fighting robot according to the wireless network;
the device is used for determining the position of an optimal fire-fighting medium supply point according to received equipment abnormity monitoring data sent by fire monitoring equipment in the transformer substation and planning an optimal path for the fire-fighting robot to reach the abnormal equipment;
the device is used for controlling the fire-fighting robot to automatically operate to a fire-fighting medium supply point position to complete automatic butt joint of fire-fighting hoses;
a device for controlling the fire-fighting robot to automatically operate to the vicinity of the abnormal equipment and automatically selecting a fire extinguishing medium according to the type of the equipment on fire;
and the device is used for automatically calculating the optimal spraying angle and spraying flow according to the ignition point position.
In some embodiments, the following technical scheme is adopted:
the transformer substation fire-fighting robot adopts the transformer substation fire-fighting robot autonomous operation control method to achieve autonomous fire-fighting operation in a transformer substation.
Compared with the prior art, the invention has the beneficial effects that:
(1) The invention provides an autonomous operation control technique for substation fire-fighting robots that allows flexible, dynamic pairing of the robot with the water supply points. A three-dimensional visual model of the substation is built with the water supply points as reference positions, and a wireless navigation-and-positioning network is erected on them, so that the optimal fire-fighting medium supply point can be determined and the optimal path to the abnormal equipment planned. This effectively removes the interference of environmental factors, improves navigation accuracy, allows a fire to be fought promptly once it breaks out, locates the fire source quickly and precisely, and suppresses the fire while it is still incipient.
(2) A multi-sensor-fusion monitoring technique for fire-hose docking is designed: real-time monitoring by the first sensor and the pressure sensor confirms the docking state precisely, makes the docking process intelligent, and tracks the operation progress accurately so that docking faults can be handled at the first moment; the deployment and retraction of the hose are monitored at the same time, helping the robot find the optimal operating point.
(3) Fire-fighting medium supply points are installed in the substation, and the robot can dock with them automatically, solving the problem that a fire engine cannot supply fire-fighting medium continuously. The substation fire-fighting robot can thus carry out the complete closed-loop workflow of detecting the fire, traveling to the site, automatically connecting the fire-fighting medium, selecting the optimal operating position, and extinguishing the fire automatically, filling a gap in the technical field of automated robotic fire-fighting in substations.
(4) Cameras at several different angles inside the station and the oblique-photography technique are used to model the key areas, and the fire-fighting medium supply equipment placed at the water supply points serves as coordinate points for the robot's wireless navigation-and-positioning network. Interference is thereby reduced even when fire and smoke are present in the station, navigation accuracy is improved, and a precondition for the robot's autonomous operation is established.
(5) Full-process monitoring of the docking between the robot body and the fire hose allows the operation progress to be tracked precisely, so that docking faults are handled at the first moment; the deployment and retraction of the hose are monitored at the same time, providing the technical basis needed for the robot's autonomous operation.
(6) A technique for rapidly identifying the fire point of substation equipment from infrared images is proposed: by combining the collected data, flame image information is converted into flame spatial coordinates, solving the problem that the flame position could not be localized precisely and localizing the flame quickly and accurately.
(7) A multi-view-vision spray-curve adjustment technique for the fire-fighting robot is proposed: a spray-curve model of the fire-fighting medium is built, and the optimal spraying angle and flow are determined, improving the effect of the fire-fighting operation. By combining the robot's dual water-column/water-mist spraying modes with its three-stage pressurization capability, different spraying modes are designed and the algorithm is tuned, improving operating efficiency, extinguishing capability, and the rapid identification and localization of the equipment's fire point. This supplies data for choosing the optimal extinguishing mode; together with the spray-curve calculation algorithm it enables automatic aiming of the spraying device, so the fire source is extinguished quickly and effectively.
(8) During operation the multi-view vision equipment monitors progress in real time, and the spraying angle and flow are adjusted according to the state of the fire, so the fire is extinguished precisely without affecting other normally operating equipment in the station, giving strong technical support for the robot's autonomous, precise operation.
Drawings
Fig. 1 is a flowchart of the autonomous operation control method for a substation fire-fighting robot according to an embodiment of the invention;
Fig. 2 is a schematic diagram of the in-substation wireless network according to an embodiment of the invention;
Fig. 3 is a schematic diagram of the docking process between the substation fire-fighting robot and the fire hose according to an embodiment of the invention;
Fig. 4 is a schematic diagram of the spray-curve adjustment process according to an embodiment of the invention.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit example embodiments according to the present application. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well unless the context clearly indicates otherwise; it should further be understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof.
Example one
In one or more embodiments, an autonomous operation control method for a substation fire-fighting robot is disclosed. When monitoring equipment in the substation, such as fire sensors and an infrared thermal-imaging system, detects an abnormal equipment temperature, the monitoring system sends the abnormal-equipment information to the background control system of the substation fire-fighting robot. The background control system plans the optimal path for the robot to reach the abnormal equipment from the three-dimensional model of the substation and determines the position of the optimal fire-fighting medium supply equipment near the abnormal equipment. The robot follows the optimal path to that supply equipment and completes automatic docking of the fire hose, then moves rapidly to the vicinity of the abnormal equipment, identifies its fire point with the visible-light and infrared thermal-imaging system, calculates the optimal extinguishing angle with the multi-view spray-curve calculation method, and selects the optimal extinguishing medium from media such as water, dry powder, and mixed foam to carry out the extinguishing operation. When the fire is out, the robot automatically disconnects the fire hose, the fire-fighting medium supply equipment completes automatic retraction of the hose, and the robot returns to its standby position.
The specific process of the method is shown in fig. 1, and comprises the following steps:
(1) establishing a three-dimensional data model of the substation;
(2) deploying wireless AP devices at a plurality of fire-fighting medium supply points in the substation to build an in-station wireless network;
(3) positioning the fire-fighting robot by means of the wireless network;
(4) determining the position of the optimal fire-fighting medium supply point from the equipment-abnormality monitoring data received from the fire monitoring equipment in the substation, and planning the optimal path for the fire-fighting robot to reach the abnormal equipment;
(5) controlling the fire-fighting robot to travel autonomously to the supply-point position and complete automatic docking of the fire hose;
(6) controlling the fire-fighting robot to travel autonomously to the vicinity of the abnormal equipment, automatically selecting the extinguishing medium according to the type of burning equipment, and automatically calculating the optimal spraying angle and flow from the position of the fire point.
In the step (1), the specific process of establishing the three-dimensional data model of the transformer substation comprises the following steps:
(1-1) establishing a three-dimensional visual model of the transformer substation;
by using the multi-view vision equipment carried by the robot and using the structural characteristic of the equipment in the station as constraint, an integral primary model is obtained by using multi-view reconstruction. The parallax of two images is shot by using multiple cameras (two cameras can be used in some embodiments) to construct a three-dimensional scene, and after a target is detected, the three-dimensional information of the target is obtained by calculating the position deviation between corresponding points of the images.
By analyzing the pictures returned by the cameras, various distances can be derived, such as the distance between devices or between a shooting point and a device; the known length and height information of the in-station devices serves as a reference for ranging.
Model building by stereo-vision technology is mature but not accurate enough, which is why it yields only a preliminary model.
An unmanned aerial vehicle carrying multiple sensors applies the oblique-photography technique, capturing images simultaneously from five different angles (vertical, forward, backward, left, and right) to obtain rich building top-surface textures and high-resolution side textures. The method truthfully reflects the situation of ground objects, acquires their texture information with high precision, and generates a realistic three-dimensional model through positioning, fusion, modeling, and related techniques.
Dense matching is then performed in combination with the preliminary model built by the multi-view vision equipment to generate an accurate three-dimensional visual model. Digital-surface-model matching detects corner points with an operator, describes the detected corners with feature descriptors, and matches the image feature points according to the corresponding matching criteria.
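The descriptor-matching step can be sketched with a nearest-neighbor search plus Lowe's ratio test, a common matching criterion. This is a hypothetical illustration: descriptors are plain tuples here, whereas a real system would use SIFT/ORB-style vectors.

```python
def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Return (i, j) pairs where descriptor i of image A matches j of image B.
    A match is kept only if the best distance beats the second-best by `ratio`."""
    def dist(u, v):
        return sum((x - y) ** 2 for x, y in zip(u, v)) ** 0.5

    matches = []
    for i, da in enumerate(desc_a):
        ranked = sorted(range(len(desc_b)), key=lambda j: dist(da, desc_b[j]))
        best, second = ranked[0], ranked[1]
        if dist(da, desc_b[best]) < ratio * dist(da, desc_b[second]):
            matches.append((i, best))
    return matches
```

The ratio test discards ambiguous correspondences, which matters in a substation scene full of visually repetitive equipment.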
(1-2) establishing a three-dimensional laser model of the transformer substation;
by scanning the outdoor equipment of the transformer substation through the laser, high-precision three-dimensional point cloud data can be obtained, and a more accurate three-dimensional laser model is established.
(1-3) The visual three-dimensional model built from multi-view vision and UAV oblique modeling is matched algorithmically with the laser model built by the laser navigation equipment, so that the coordinate points of the two models correspond one to one; registration and fusion then yield the new three-dimensional data model of the substation.
The process of performing registration fusion is as follows:
image registration: two images with the same size are mapped into the same coordinate system, so that the characteristics of the two images correspond to each other. One of the images has unchanged coordinates and is called a fixed image, and the other image is translated, rotated and scaled and is called a floating image.
Image fusion: after the two images are registered, the two images can be superposed, and the process is called simple image fusion. I.e. the joining of a number of images into a large figure.
The image registration fusion result can be optimized through a neural network optimization algorithm, the structure of the neural network algorithm is adjusted through neural network optimization training according to the environment in the station and the characteristics of equipment in the station, and the optimal image registration fusion result is finally output.
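The translate/rotate/scale applied to the floating image is a 2D similarity transform. A minimal sketch of how one floating-image point is mapped into the fixed image's frame; the parameter values in the example are illustrative only:

```python
import math

def apply_similarity(point, scale, theta, tx, ty):
    """Map a floating-image point into the fixed image's coordinate system
    via a 2D similarity transform: rotate by theta, scale, then translate."""
    x, y = point
    xr = scale * (x * math.cos(theta) - y * math.sin(theta)) + tx
    yr = scale * (x * math.sin(theta) + y * math.cos(theta)) + ty
    return xr, yr
```

Registration amounts to estimating (scale, theta, tx, ty) so that transformed floating-image features land on their fixed-image counterparts; the patent's neural-network optimization refines exactly this alignment.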
In the new substation three-dimensional data model built above, the UAV performs three-dimensional modeling by oblique photography, so a robot working in the station can itself become part of the model and obtain a coordinate in the visual model; the laser scanning equipment carried on the robot can be matched against the pre-built laser model to obtain a coordinate position in the laser model; registering and fusing these two coordinates according to the model-fusion rule yields the robot's coordinate position in the new model.
In step (2), referring to fig. 2, wireless AP devices are installed at several water-supply-point positions in the station to build the in-substation wireless network, ensuring that every point in the station has signal coverage, and the coordinate position of each water-supply-point AP is marked in the new three-dimensional model.
In step (3), a robot connected to the wireless network calculates its coordinate position relative to the wireless APs by triangulation, using the APs' coordinate positions in the new three-dimensional model;
the principle of the triangulation method is as follows: and determining the position of the unknown point according to the distances between the three points of the known position on the map and the unknown point.
The robot's coordinate position in the three-dimensional data model of the substation is compared with its coordinate position relative to the wireless APs; the purpose of the comparison is to improve accuracy. If the two coordinates agree within the specified error range, the robot's current coordinate is considered accurate; if the discrepancy exceeds the specified value, the positioning is considered insufficiently accurate and an alarm is raised so that it can be adjusted.
In an environment shrouded by dense smoke, the coordinate accuracy of the model built by visual means is affected, but the accuracy of the laser model is not; under dense smoke, using the laser coordinate together with the wireless-AP triangulation coordinate improves accuracy when the visual coordinate fails.
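The consistency check and smoke fallback described above can be sketched as a small selection rule. This is a hypothetical illustration; the function name, the averaging of laser and AP estimates, and the tolerance value are assumptions, not taken from the patent:

```python
def choose_position(vision_xy, laser_xy, ap_xy, smoke_detected, tol=0.5):
    """Pick a robot position: trust vision unless smoke is present or the
    vision estimate disagrees with AP triangulation by more than tol meters."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    if smoke_detected or dist(vision_xy, ap_xy) > tol:
        # Vision is unreliable: fall back to laser + AP (here simply averaged).
        return ((laser_xy[0] + ap_xy[0]) / 2, (laser_xy[1] + ap_xy[1]) / 2)
    return vision_xy
```

In practice the fallback could weight the two sources by their expected error rather than averaging them equally.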
In step (4), the position of the optimal fire-fighting medium supply point is determined from the equipment-abnormality monitoring data received from the fire monitoring equipment in the substation, and the optimal path for the fire-fighting robot to reach the abnormal equipment is planned;
fire-fighting medium supply point position selection: and marking the position of the ignition point in the three-dimensional data model of the transformer substation, calling a shortest path algorithm to measure the shortest path from each water supply point to the position of the ignition point, and selecting the water supply equipment with the shortest path.
The robot's traveling route: because the robot is designed for high trafficability, it can move unobstructed through the station without worrying about obstacles such as steps, so the traveling route is still determined by calculating the shortest path from the robot to the faulty equipment.
In step (5), the fire-fighting robot is controlled to travel autonomously along the planned optimal path to the fire-fighting medium supply point and complete automatic docking of the fire hose; referring to fig. 3, the specific process is as follows:
the rear portion of fire-fighting robot is equipped with first to the interface, and the one end and the fire-fighting medium of fire hose supply to equip and be connected, and the other end is equipped with and equips with first to the interface assorted second to the interface, all is equipped with a plurality of vision cameras on fire-fighting robot and the fire-fighting medium supply to equip for the butt joint state of real time monitoring first to the interface and second to the interface.
At minimum, a first vision camera and a second vision camera are mounted at the front and rear of the fire-fighting robot, and the fire-fighting medium supply equipment carries a third vision camera. The first vision camera captures images of the area ahead of the robot for fire recognition; the second and third vision cameras both capture images of the first and second docking interfaces, and through their cooperation, full-process monitoring of the docking between the fire-fighting robot and the fire hose is achieved, and hence a highly automated docking.
The fire hose is wound on a hose reel, on which a fourth vision camera is mounted to capture the state of the fire hose in real time, covering at least the couplings at both ends of the hose.
The hose reel is installed in, or fixed to, the fire-fighting medium supply equipment, with the second docking interface exposed. When the fire-fighting robot needs to dock with the fire hose, the robot reverses so that the first docking interface approaches the supply equipment, and the flexible docking device on the supply equipment docks the hose's second interface with the robot's first interface automatically within a certain error range.
The first, second, third, and fourth vision cameras are all connected to the fire-control platform over the wireless network, transmitting the captured equipment-state images in real time to the platform for image processing and recognition.
The fire control platform obtains the relative position of the first and second docking interfaces by extracting salient feature values from the images, and judges the current docking state by continuously capturing frames and processing them. Since the hose couplings are metal parts of fixed models and most hoses are white, they can be identified easily from salient image features.
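As a rough illustration of how a relative position could be derived from extracted features, the sketch below (not from the patent, which leaves the algorithm unspecified) treats each coupling's detected feature pixels as a boolean mask and uses the centroid offset to drive the docking adjustment:

```python
import numpy as np

# Illustrative sketch only: the patent does not specify the algorithm.
# Each coupling's salient features are assumed to be a boolean pixel mask;
# the centroid offset approximates their relative position in the image plane.

def coupling_offset(mask_first: np.ndarray, mask_second: np.ndarray):
    """Pixel offset (d_row, d_col) from the first coupling to the second."""
    c1 = np.argwhere(mask_first).mean(axis=0)   # centroid of first coupling
    c2 = np.argwhere(mask_second).mean(axis=0)  # centroid of second coupling
    return tuple(c2 - c1)
```

A calibrated stereo or depth pipeline would be needed for metric positions; this only shows the image-plane idea.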
When the first and second docking interfaces are successfully mated, the first sensor sends a docking-success signal to the fire control platform.
The fire control platform processes and recognizes the hose status images captured in real time. When the junction between the hose reel and the hose appears in the image, the hose is judged to be fully deployed and the robot stops moving; when the coupling that docks the hose to the robot appears in the image, the hose is judged to be fully retracted and the hose reel stops.
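The two stop conditions above amount to a small visual state rule; a minimal sketch, with hypothetical landmark names standing in for what the recognition step would report seeing in the frame:

```python
# Minimal sketch of the hose-state rule; the landmark names are hypothetical
# stand-ins for the image-recognition results, not identifiers from the patent.

def hose_state(visible_landmarks: set) -> str:
    if "reel_hose_junction" in visible_landmarks:
        return "fully_deployed"    # reel/hose junction visible: hose fully paid out
    if "robot_coupling" in visible_landmarks:
        return "fully_retracted"   # free-end coupling visible: hose fully wound in
    return "retracting_or_deploying"
```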
Damage to the fire hose can cause insufficient water pressure. A pressure threshold is preset according to the number of booster pumps started, and a pressure sensor arranged on the fire hose, or at the first or second docking interface, detects the water pressure in real time and transmits it to the fire control platform, where it is compared with the stored preset threshold. When the water pressure falls below the threshold, an insufficient-pressure alert is raised and a pipeline leak or hose damage is judged to exist.
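A minimal sketch of this pressure check, assuming an illustrative per-pump pressure contribution and safety margin (the patent gives no numbers):

```python
# Hedged sketch: threshold scales with the number of booster pumps started.
# PRESSURE_PER_PUMP_MPA and the margin are illustrative assumptions.
PRESSURE_PER_PUMP_MPA = 0.8

def pressure_threshold(active_pumps: int, margin: float = 0.9) -> float:
    """Preset threshold for the given number of started booster pumps."""
    return active_pumps * PRESSURE_PER_PUMP_MPA * margin

def check_hose(measured_mpa: float, active_pumps: int) -> str:
    """Compare measured pressure against the preset threshold."""
    if measured_mpa < pressure_threshold(active_pumps):
        return "ALERT: insufficient pressure - possible pipeline leak or hose damage"
    return "OK"
```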
In step (6), the fire-fighting robot is controlled to travel autonomously to the vicinity of the abnormal equipment, and the optimal fire extinguishing medium is selected according to the type of burning equipment from media such as a water column, fine water mist, or dry powder, with or without foam admixture, before the fire-fighting operation is carried out; the optimal spray angle and spray flow are then calculated automatically from the position of the ignition point.
Referring to fig. 4, the specific process of automatically calculating the optimal spray angle and spray flow from the ignition point position is as follows:
(6-1) Acquire image information of the site environment through the visible-light cameras of the multi-view vision device, including images of the equipment, the fire situation, the smoke concentration, and so on. At a fire scene, visual information such as the burning equipment, the size of the fire, and the smoke concentration can thus be collected.
An infrared image of the site environment is acquired through the infrared camera of the multi-view vision device, mainly covering the temperature of each part of the scene, the maximum temperature, the location where the maximum temperature appears, the shape of the flame, and so on.
(6-2) Apply image processing such as graying, segmentation, and filtering to the acquired images, and determine the corresponding suspicious fire areas.
First, color detection is performed on the image, looking for large orange or black patches, together with preliminary processing such as calculating their area proportion.
Then, graying and motion detection are applied to the preliminarily processed image to determine whether a suspicious flame region exists.
The suspicious flame region is filtered, a color histogram of the filtered image is extracted, image feature values are extracted and matched, and the suspicious fire area in the image is determined.
Finally, the suspicious fire area is segmented and normalized to serve as the basic unit for subsequent judgment.
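The color-detection and normalization steps of (6-2) can be sketched as follows, using NumPy only; the RGB thresholds are illustrative assumptions, and a real pipeline would add the motion detection, filtering, and histogram matching described above:

```python
import numpy as np

# Hedged sketch of (6-2): flag flame-colored pixels, then normalize the
# suspicious area to a bounding box. Thresholds are illustrative assumptions.

def suspicious_fire_mask(rgb: np.ndarray) -> np.ndarray:
    """Flag orange-ish pixels: high red, medium green, low blue."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 180) & (g > 60) & (g < 180) & (b < 100)

def bounding_box(mask: np.ndarray):
    """Normalize the flagged area to (row0, row1, col0, col1), or None."""
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None
    return int(rows.min()), int(rows.max()), int(cols.min()), int(cols.max())
```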
For the acquired infrared image, processing is simpler: because the multi-view vision device selects an infrared camera suited to the waveband of flame infrared radiation, the image contains little interference beyond the flame's own radiation signal, and interference sources such as other heat-emitting equipment or reflections have fairly regular shapes. After graying preprocessing, the infrared image is segmented, feature values of the segmented image are extracted and input into a trained neural network model for recognition, yielding the suspicious fire area of the infrared image.
(6-3) Locate the fire area based on the preprocessing results of the visual and infrared image information. In this embodiment, the fire area includes a credible fire area and a suspected fire area.
The suspicious fire area obtained from visual image processing is compared with that obtained from infrared image processing: the overlapping suspicious area is taken as the credible fire area, a suspicious area appearing in only one of the two results is taken as a suspected fire area, and an area flagged by neither is judged to be free of fire.
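The fusion rule in (6-3) reduces to simple set operations on the flagged pixel (or grid-cell) sets; a minimal sketch:

```python
# Sketch of the (6-3) fusion rule on sets of flagged cells:
# both modalities agree -> credible fire area; exactly one flags a cell ->
# suspected fire area; cells flagged by neither are simply absent (fire-free).

def fuse_fire_areas(visual: set, infrared: set):
    credible = visual & infrared                 # flagged by both
    suspected = (visual | infrared) - credible   # flagged by exactly one
    return credible, suspected
```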
(6-4) Establish a spray curve model based on the fire area, identify the landing point of the water column, and determine the optimal spray angle and spray flow.
Aiming is performed after the fire area is determined. Taking the bottom of the credible fire area as the target area, a spray curve model can be established because the curve of the water column sprayed by the equipment, and hence its landing point, is fixed for a given setting. The angle and height of the pan-tilt head are adjusted so that the landing point predicted by the curve model falls within the credible fire area. After spraying, the spray pictures returned by the other cameras carried by the robot are processed by calling an image-processing algorithm, and the landing point of the sprayed water column is identified in the image.
The specific process of processing the spray image and identifying the landing point of the sprayed water column includes:
preprocessing the spray image, including denoising, smoothing, and transformation;
extracting feature values of the sprayed water column from the preprocessed image;
and inputting the extracted feature values into a neural network image recognition model to identify the landing point of the sprayed water column.
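As a hedged illustration of the spray curve model in (6-4), the sketch below treats the water column as a drag-free projectile launched from the nozzle height and grid-searches the elevation angle whose predicted landing point is closest to the target; the speed, height, and drag-free assumption are illustrative, not from the patent:

```python
import math

# Hedged ballistic sketch of the (6-4) spray curve model. Water jets have
# significant drag and breakup, so a fielded system would fit an empirical
# curve; this drag-free model only shows the angle-search idea.

G = 9.81  # m/s^2

def landing_range(v: float, h: float, angle_deg: float) -> float:
    """Horizontal distance where a jet at speed v, launched from height h, lands."""
    a = math.radians(angle_deg)
    vx, vy = v * math.cos(a), v * math.sin(a)
    # flight time: solve h + vy*t - 0.5*G*t^2 = 0 for the positive root
    t = (vy + math.sqrt(vy * vy + 2 * G * h)) / G
    return vx * t

def best_angle(v: float, h: float, target: float) -> int:
    """Elevation angle (0-60 deg grid) whose landing point is nearest the target."""
    return min((abs(landing_range(v, h, a) - target), a) for a in range(61))[1]
```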
When the fire extinguishing medium is dry powder or water mist, it is sufficient to control the medium so that its coverage includes the ignition point.
When no credible fire area exists, the bottom of the obtained suspected fire area is taken as the target area, a spray curve model is established, and the angle and height of the pan-tilt head are adjusted so that the landing point predicted by the curve model falls within the suspected fire area. After spraying, the pictures returned by the other cameras carried by the robot are processed, an algorithm is called to identify the landing point of the sprayed water column in the image, and the optimal spray angle is determined from the coordinate difference between the water column's landing point and the suspected fire area.
(6-5) During the fire-extinguishing process, the fire intensity of the burning equipment is judged from the ratio of the area of the fire region to the area of the equipment.
The fire condition of on-site burning equipment is judged mainly from the relative size of the flames, by comparing the ratio of the fire area to the area of the whole device. The judgment rules differ between devices: for power equipment about 1 m in length, width, and height, a fire area exceeding one half of the designed surface area counts as a large fire, about one third as a medium fire, and less than one third as a small fire; for power equipment about 3 m in size, one third of the area already counts as a large fire.
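The rule of thumb above can be sketched as a small classifier; the 1 m and 3 m large-fire thresholds follow the text, while the medium-fire boundary for large equipment is scaled by assumption:

```python
# Hedged sketch of the (6-5) fire-intensity rule. The large-fire thresholds
# (1/2 for ~1 m equipment, 1/3 for ~3 m equipment) come from the text;
# the medium boundary is assumed to scale as two thirds of the large one.

def fire_intensity(fire_area: float, equipment_area: float, size_m: float) -> str:
    ratio = fire_area / equipment_area
    large = 1 / 3 if size_m >= 3 else 1 / 2
    if ratio >= large:
        return "large"
    if ratio >= large * 2 / 3:  # equals 1/3 for ~1 m equipment, as in the text
        return "medium"
    return "small"
```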
Different sample libraries are established for the different devices in the station. When the robot identifies a burning device or receives alarm information (for example, "device xx on fire"), information such as the appropriate fire-extinguishing distance and the fire-judgment criteria can be retrieved directly from the library, and the robot operates by making real-time judgments on that basis.
The information in the sample library is obtained through prior training, and the real-time judgment results obtained by the robot in each operation are stored back into the library.
In this embodiment, the spray flow is divided into three levels, high, medium, and low, corresponding to spray flows from high to low. Normally the high level is selected when the operation starts, with the medium and low levels selected as the fire diminishes; if the fire is initially small or the primary purpose is cooling, the medium or low level is selected from the start.
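A minimal sketch of this flow-level rule, including the cooling special case; the mapping from fire intensity to level is the obvious one implied by the text:

```python
# Sketch of the three-level flow rule: intensity maps directly to a level,
# and a cooling-only operation starts no higher than medium (assumption).

def spray_flow_level(intensity: str, cooling_only: bool = False) -> str:
    if cooling_only:
        return "medium" if intensity != "small" else "low"
    return {"large": "high", "medium": "medium", "small": "low"}[intensity]
```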
In this embodiment, during the operation of the fire-fighting robot, the multi-view vision equipment monitors the operation progress in real time, and the spray angle and flow are adjusted according to the fire condition, achieving precise fire extinguishing without affecting other normally running equipment in the station.
For precise extinguishing of specific burning equipment in the station, the site pictures taken by the visible-light and infrared cameras carried by the robot can be processed immediately by the algorithm, the site conditions analyzed, and the spray position and flow adjusted automatically.
During operation, the operating mode is adjusted in real time according to its effect, the site environment is analyzed automatically, and multiple spray modes can be switched between, giving a more effective response to different fire scenarios and facilitating autonomous operation of the fire-fighting robot.
Example two
In one or more embodiments, an autonomous operation control system for a transformer substation fire-fighting robot is disclosed, comprising:
means for building a three-dimensional data model of the substation;
means for deploying wireless AP equipment at a plurality of fire-fighting medium supply points in the substation and constructing a wireless network in the substation;
means for locating the fire-fighting robot according to the wireless network;
means for determining the position of the optimal fire-fighting medium supply point, according to received equipment-abnormality monitoring data sent by fire monitoring equipment in the substation, and planning the optimal path for the fire-fighting robot to reach the abnormal equipment;
means for controlling the fire-fighting robot to travel autonomously to the fire-fighting medium supply point and complete the automatic docking of the fire hose;
means for controlling the fire-fighting robot to travel autonomously to the vicinity of the abnormal equipment and automatically selecting a fire extinguishing medium according to the type of burning equipment;
and means for automatically calculating the optimal spray angle and spray flow from the position of the ignition point.
The working process of each of the above means follows the process described in the first embodiment and is not repeated here.
Example three
In one or more embodiments, a substation fire-fighting robot is disclosed, which employs the substation fire-fighting robot autonomous operation control method described in embodiment one to implement autonomous fire-fighting operation in a substation.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, they are not intended to limit the scope of the present invention; it should be understood by those skilled in the art that various modifications and variations can be made, without inventive effort, based on the technical solution of the present invention.

Claims (10)

1. A transformer substation fire-fighting robot autonomous operation control method is characterized by comprising the following steps:
establishing a three-dimensional data model of a transformer substation, constructing a wireless network in the transformer substation, and positioning the fire-fighting robot according to the wireless network; wherein constructing the wireless network in the transformer substation and positioning the fire-fighting robot according to the wireless network specifically comprises:
wireless network equipment is arranged at a plurality of fire-fighting medium supply points in the station,
calculating a first coordinate position of the fire-fighting robot relative to the wireless AP according to the coordinate position of the wireless AP equipment in the three-dimensional data model of the transformer substation;
determining a second coordinate position of the fire-fighting robot in the three-dimensional data model of the transformer substation, comparing the first coordinate position with the second coordinate position, and determining whether the current coordinate position of the robot is correct or not;
determining the position of an optimal fire-fighting medium supply point, and planning an optimal path for the fire-fighting robot to reach abnormal equipment;
controlling the fire-fighting robot to travel autonomously to the fire-fighting medium supply point position to complete the automatic docking of the fire hose; wherein the automatic docking process of the fire hose further comprises: processing and recognizing the status image of the fire hose captured in real time, and judging that the fire hose is fully deployed when the junction between the hose reel and the hose appears in the image; and judging that the fire hose is fully retracted when the coupling that docks the fire hose to the fire-fighting robot appears in the image;
controlling the fire-fighting robot to automatically run to the vicinity of abnormal equipment, and automatically selecting a fire extinguishing medium according to the type of the equipment on fire; automatically calculating the optimal spraying angle and spraying flow according to the position of the ignition point; the optimal spraying angle and spraying flow are automatically calculated according to the position of the ignition point, and the method specifically comprises the following steps:
determining the position and the area of the fire point according to the visual image information and the infrared image information of the site environment;
identifying a drop point of the jet medium, taking the bottom of the ignition area as a target area, and judging the coordinate difference between the drop point of the jet medium and the target area;
constructing a fire-fighting medium injection curve model by taking the minimum coordinate difference between the drop point of the injection medium and the target area as a target, and determining the optimal injection angle;
and adjusting the jet flow according to the area ratio of the credible fire area to the suspected fire area in the fire area.
2. The method for controlling the autonomous operation of the fire-fighting robot in the substation according to claim 1, wherein the establishing of the three-dimensional data model of the substation specifically comprises:
establishing a three-dimensional visual model of the transformer substation by using multi-view visual equipment carried by a fire-fighting robot;
scanning equipment outside a transformer substation by laser to obtain three-dimensional point cloud data of the equipment in the transformer substation, and establishing a three-dimensional laser model;
and carrying out registration fusion on the obtained model to obtain a three-dimensional data model of the transformer substation.
3. The autonomous operation control method of the fire-fighting robot of the transformer substation of claim 1, characterized in that the position of the optimal fire-fighting medium supply point is determined, and the optimal path for the fire-fighting robot to reach the abnormal equipment is planned; the method specifically comprises the following steps:
determining the coordinate position of the ignition point in the three-dimensional data model of the transformer substation, calculating the path distance from each medium supply point to the ignition point of the fire-fighting robot, and selecting the medium supply point corresponding to the shortest path distance as the medium supply point position of the fire-fighting robot;
and determining the traveling route of the fire-fighting robot to the medium supply point and the fire point position by adopting a shortest path method.
4. The autonomous operation control method for the transformer substation fire-fighting robot according to claim 1, wherein the automatic docking of the fire hose is completed by the following specific process:
acquiring, with an image acquisition device, image information of the docking interfaces of the fire-fighting robot and the fire hose in real time; obtaining the relative position of the two docking interfaces by extracting feature values from the images; continuously adjusting the position of the docking interface on the fire-fighting robot; and judging the current docking state by continuously capturing images, performing image processing, and combining multi-sensor data, until the fire-fighting robot and the fire hose are successfully docked.
5. The autonomous operation control method of a fire-fighting robot of a substation according to claim 4,
the automatic docking process of the fire hose further comprises: detecting in real time the pressure at the docking interface between the fire hose and the fire-fighting robot, comparing it with a stored preset pressure threshold, indicating insufficient pressure when the pressure is below the preset threshold, and judging that there is a pipeline leak or hose damage.
6. The substation fire-fighting robot autonomous operation control method according to claim 1, characterized in that the position and area of a fire point are determined according to visual image information and infrared image information of a site environment, specifically:
respectively preprocessing the obtained visual image information and infrared image information;
comparing the suspicious fire area obtained after visual image processing with the suspicious fire area obtained after infrared image processing, taking the overlapping suspicious fire area as a credible fire area, taking a non-overlapping suspicious fire area as a suspected fire area, and judging an area flagged by neither as an area without fire; and converting the credible fire area into flame space coordinate information.
7. The autonomous operation control method of the substation fire-fighting robot according to claim 6, wherein the obtained visual image information and infrared image information are preprocessed, specifically:
preprocessing the visual image: carrying out gray processing and motion detection on the preprocessed image to determine whether a suspicious flame area exists in the visual image; filtering the suspicious flame region, extracting a color histogram of the filtered image, extracting an image characteristic value, performing matching processing, and determining the suspicious fire region in the visual image; dividing and normalizing the suspicious fire area;
preprocessing infrared image information: and (3) segmenting the infrared image after carrying out image graying pretreatment, extracting the feature value of the segmented image, inputting the extracted feature value of the image into a trained neural network model for identification, and obtaining the suspicious fire area of the infrared image.
8. The substation fire-fighting robot autonomous operation control method according to claim 1, wherein the autonomous operation method further comprises: judging the fire intensity of the ignition equipment according to the ratio of the area of the ignition area to the area of the ignition equipment; and adjusting the jet flow in real time according to the fire intensity.
9. An autonomous operation control system for a transformer substation fire-fighting robot, characterized by comprising:
means for building a three-dimensional data model of the substation;
means for constructing a wireless network within a substation;
means for locating the fire-fighting robot according to the wireless network; wherein constructing the wireless network in the substation and locating the fire-fighting robot according to the wireless network specifically comprises:
wireless network equipment is arranged at a plurality of fire-fighting medium supply points in the station,
calculating a first coordinate position of the fire-fighting robot relative to the wireless AP according to the coordinate position of the wireless AP equipment in the three-dimensional data model of the transformer substation;
determining a second coordinate position of the fire-fighting robot in the three-dimensional data model of the transformer substation, comparing the first coordinate position with the second coordinate position, and determining whether the current coordinate position of the robot is correct or not;
means for determining the location of an optimal fire-fighting medium supply point while planning an optimal path for the fire-fighting robot to reach the abnormal equipment;
the device is used for controlling the fire-fighting robot to travel autonomously to the fire-fighting medium supply point position to complete the automatic docking of the fire hose; wherein the automatic docking process of the fire hose further comprises: processing and recognizing the status image of the fire hose captured in real time, and judging that the fire hose is fully deployed when the junction between the hose reel and the hose appears in the image; and judging that the fire hose is fully retracted when the coupling that docks the fire hose to the fire-fighting robot appears in the image;
a device for controlling the fire-fighting robot to automatically operate to the vicinity of the abnormal equipment and automatically selecting a fire extinguishing medium according to the type of the equipment on fire;
means for automatically calculating an optimum spray angle and spray flow rate based on the location of the fire point; the optimal spraying angle and spraying flow are automatically calculated according to the position of the ignition point, and the method specifically comprises the following steps:
determining the position and the area of the fire point according to the visual image information and the infrared image information of the site environment;
identifying a drop point of the jet medium, taking the bottom of the ignition area as a target area, and judging the coordinate difference between the drop point of the jet medium and the target area;
constructing a fire-fighting medium injection curve model by taking the minimum coordinate difference between the drop point of the injection medium and the target area as a target, and determining the optimal injection angle;
and adjusting the jet flow according to the area ratio of the credible fire area to the suspected fire area in the fire area.
10. A transformer substation fire-fighting robot, characterized in that it adopts the autonomous operation control method for a transformer substation fire-fighting robot according to any one of claims 1 to 8.
CN201911253475.0A 2019-12-09 2019-12-09 Autonomous operation control method and system for fire-fighting robot of transformer substation Active CN110989599B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911253475.0A CN110989599B (en) 2019-12-09 2019-12-09 Autonomous operation control method and system for fire-fighting robot of transformer substation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911253475.0A CN110989599B (en) 2019-12-09 2019-12-09 Autonomous operation control method and system for fire-fighting robot of transformer substation

Publications (2)

Publication Number Publication Date
CN110989599A CN110989599A (en) 2020-04-10
CN110989599B true CN110989599B (en) 2022-06-24

Family

ID=70091611

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911253475.0A Active CN110989599B (en) 2019-12-09 2019-12-09 Autonomous operation control method and system for fire-fighting robot of transformer substation

Country Status (1)

Country Link
CN (1) CN110989599B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113064373B (en) * 2021-04-07 2022-04-15 四川中鼎智能技术有限公司 Industrial hydroelectric equipment logic signal control method, system, terminal and storage medium based on video image recognition
CN113144470B (en) * 2021-04-07 2022-06-17 安徽相品智能科技有限公司 Fire-fighting emergency early warning treatment and fire extinguishing integrated control system
CN113595239B (en) * 2021-06-17 2024-03-26 南瑞集团有限公司 Cloud side end cooperative intelligent management and control system for transformer substation
CN114668995B (en) * 2022-04-29 2023-03-21 西安交通大学 Transformer substation intelligent robot fire fighting system and method based on high-pressure water mist
CN115562255A (en) * 2022-09-13 2023-01-03 中国安全生产科学研究院 Multi-intelligent fire-fighting robot fire hose anti-winding method based on air-ground cooperation

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105396251A (en) * 2015-10-16 2016-03-16 南京航空航天大学 Robot-assisted three-dimensional intelligent fire extinguishing system
CN106526535A (en) * 2016-11-08 2017-03-22 北京创想智控科技有限公司 Indoor robot positioning method and device
CN208003294U (en) * 2017-10-30 2018-10-26 北京自安科技发展有限公司 A kind of Robot Extinguishing Fire system and Intelligent fire-fighting robot of Multi-sensor Fusion
CN109447030A (en) * 2018-11-12 2019-03-08 重庆知遨科技有限公司 A kind of fire-fighting robot movement real-time instruction algorithm for fire scenario
CN109646853A (en) * 2018-12-17 2019-04-19 华北科技学院 A kind of autonomous fire fighting robot device and monitoring system
CN109876345A (en) * 2019-03-05 2019-06-14 华北科技学院 A kind of intelligent fire fighting method and firefighting robot
CN110180114A (en) * 2019-06-05 2019-08-30 山东国兴智能科技股份有限公司 Fire-fighting robot co-located, scouting, fire source identification and aiming extinguishing method
CN110339516A (en) * 2019-08-08 2019-10-18 北京新松融通机器人科技有限公司 A kind of device of view-based access control model detection and arm automatic butt fire hose in parallel
CN110420421A (en) * 2019-07-12 2019-11-08 东南大学 A kind of cable passage inspection firefighting robot

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012018497A2 (en) * 2010-07-25 2012-02-09 Raytheon Company ENHANCED SITUATIONAL AWARENESS AND TARGETING (eSAT) SYSTEM
CN102968681A (en) * 2012-09-30 2013-03-13 安科智慧城市技术(中国)有限公司 Firefighting management system and method
CN103413313B (en) * 2013-08-19 2016-08-10 国家电网公司 The binocular vision navigation system of electrically-based robot and method
CN106470478B (en) * 2015-08-20 2020-03-24 西安云景智维科技有限公司 Positioning data processing method, device and system
CN105031859A (en) * 2015-08-26 2015-11-11 上海格拉曼国际消防装备有限公司 Fire-fighting robot
CN105758450B (en) * 2015-12-23 2017-11-24 西安石油大学 Met an urgent need based on multisensor the fire-fighting early warning sensory perceptual system construction method of robot
RU2623632C1 (en) * 2016-08-22 2017-06-28 Акционерное общество "Квантум Системс" Method for processing volumetric objects
CN106447585A (en) * 2016-09-21 2017-02-22 武汉大学 Urban area and indoor high-precision visual positioning system and method
CN106334283A (en) * 2016-10-10 2017-01-18 南京工程学院 Fire-fighting and rescue robot system and control method
CN107223269B (en) * 2016-12-29 2021-09-28 达闼机器人有限公司 Three-dimensional scene positioning method and device
CN108992825A (en) * 2018-06-06 2018-12-14 北方工业大学 Intelligent fire-fighting robot and control method thereof
CN109102566A (en) * 2018-08-29 2018-12-28 郑州祥和电力设计有限公司 A kind of indoor outdoor scene method for reconstructing and its device of substation
CN109583366B (en) * 2018-11-28 2022-04-08 哈尔滨工业大学 Sports building evacuation crowd trajectory generation method based on video images and WiFi positioning
CN209312190U (en) * 2018-12-12 2019-08-27 智洋创新科技股份有限公司 Electric power tunnel intelligent fire-pretection system
CN110180112B (en) * 2019-06-05 2020-11-13 山东国兴智能科技股份有限公司 Cooperative reconnaissance fire-extinguishing operation method for unmanned aerial vehicle and fire-fighting robot


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Research on 3D Fine Imaging Technology Based on the Combination of Imaging Lidar and Dual CCD"; Jiao Hongwei; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2014-04-15 (No. 04); I136-65 *
"Design and Construction of a UAV Fire Detection Platform"; Wang Sijia; China Master's Theses Full-text Database, Information Science and Technology; 2011-04-15 (No. 04); I138-760 *

Also Published As

Publication number Publication date
CN110989599A (en) 2020-04-10

Similar Documents

Publication Publication Date Title
CN110989599B (en) Autonomous operation control method and system for fire-fighting robot of transformer substation
CN110917529B (en) Transformer substation fire-fighting robot and operation method thereof
CN110841220B (en) Intelligent fire-fighting system and method for transformer substation
CN110898353A (en) Panoramic monitoring and linkage control method and system for fire-fighting robot of transformer substation
CN110133440B (en) Electric unmanned aerial vehicle based on pole tower model matching and visual navigation and inspection method
CA3137995C (en) Multi-mode visual servo control fire-fighting system and working method thereof
CN109773783B (en) Patrol intelligent robot based on space point cloud identification and police system thereof
JP6484695B1 (en) Ship block joint welding defect marking method
CN110837822B (en) Fire-fighting robot injection curve adjusting method and device based on multi-view vision
CN113791641A (en) Aircraft-based facility detection method and control equipment
CN107193277A (en) Autonomous detects the fire-fighting robot and control method of fire extinguishing automatically
CN111512256A (en) Automated and adaptive three-dimensional robotic site survey
CN105759834A (en) System and method of actively capturing low altitude small unmanned aerial vehicle
CN109737981B (en) Unmanned vehicle target searching device and method based on multiple sensors
CN101574567A (en) Computer vision technique based method and system for detecting and extinguishing fire disaster intelligently
CN111679695B (en) Unmanned aerial vehicle cruising and tracking system and method based on deep learning technology
CN107830860B (en) A kind of unmanned boat lifting recycling visual guide method
CN110940316B (en) Navigation method and system for fire-fighting robot of transformer substation in complex environment
CN112258682B (en) Transformer substation robot inspection system and inspection method thereof
CN110975194A (en) Transformer substation fire-fighting robot auxiliary method and system
CN113730860A (en) Autonomous fire extinguishing method of fire-fighting robot in unknown environment
CN112270267A (en) Camera shooting recognition system capable of automatically capturing line faults
CN106052695A (en) Flight inspection tour system and method performing navigation by utilizing 360-degree laser scanner
CN112863113A (en) Intelligent fire-fighting system and method for automatic detector alarming and fire extinguishing and storage medium
CN105373130A (en) Special device accident on-site information detection system based on stereo modeling

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant