CN110962128B - Substation inspection and stationing method and inspection robot control method - Google Patents
- Publication number
- CN110962128B (application CN201911267761.2A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- equipment
- space coordinate
- stationing
- calibration label
- Prior art date
- Legal status
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Multimedia (AREA)
- Manipulator (AREA)
Abstract
The application provides a substation inspection and stationing method and an inspection robot control method. The stationing method comprises the following steps: acquiring a detection surface image of the equipment to be detected by using camera equipment, wherein the detection surface image comprises a visual calibration label pasted on the detection surface of the equipment to be detected; establishing a first space coordinate system with the visual calibration label as the origin according to the detection surface image; and determining the positions of the detection points of the equipment to be detected in the first space coordinate system to obtain an equipment stationing model corresponding to the visual calibration label. Because the application adopts relative-position stationing — the first space coordinate system is constructed with the visual calibration label in the detection surface image as the reference, and the detection point positions on the equipment are recorded in that coordinate system — the inspection stationing information of the equipment is obtained without bringing the robot on site, which solves the technical problem of low stationing efficiency of substation inspection robots caused by the existing online stationing mode.
Description
Technical Field
The application relates to the technical field of power equipment inspection, in particular to a substation inspection and stationing method and an inspection robot control method.
Background
A transformer substation is an important node of the main power grid and generally operates at a high voltage level; if substation equipment fails, power supply over a wide area is affected. The equipment safety of the transformer substation therefore plays a crucial role in the power supply reliability of the grid.
At present, most transformer substations have automated their equipment inspection: the robot moves to a specific stop point and adjusts its camera module to acquire inspection images of the equipment, thereby completing the inspection.
Current robot stationing is performed online: the robot is transported to the site, and for every operating point it must be moved to the designated position and turned to the designated angle before the preset program can be configured. Likewise, when an operating point is added or an operating arrangement is modified, the robot must again be placed at the designated operating position. This leads to the technical problem that stationing for existing substation inspection robots is inefficient.
Disclosure of Invention
The application provides a substation inspection and stationing method and an inspection robot control method, which are used for solving the technical problem of low stationing efficiency of substation inspection robots caused by the existing online stationing mode.
In view of this, the first aspect of the present application provides a substation inspection and stationing method, including:
acquiring a detection surface image of equipment to be detected by using camera equipment, wherein the detection surface image comprises a visual calibration label pasted on the detection surface of the equipment to be detected;
establishing a first space coordinate system by taking the visual calibration label as an origin according to the detection surface image;
and determining the position of the detection point of the equipment to be detected in the first space coordinate system according to the first space coordinate system to obtain an equipment stationing model corresponding to the visual calibration label.
Optionally, the determining, according to the first spatial coordinate system, the position of the detection point of the device to be detected in the first spatial coordinate system to obtain the device stationing model corresponding to the visual calibration label specifically includes:
determining a first projection distance from the detection point to a reference plane, a second projection distance from the camera equipment to the reference plane, a horizontal direction distance from the detection point to an origin of the first space coordinate system and a vertical direction distance from the detection point to the origin of the first space coordinate system according to the first space coordinate system, wherein the reference plane is a Y-Z plane of the first space coordinate system;
and recording the first projection distance, the second projection distance, the horizontal direction distance and the vertical direction distance corresponding to the detection point to obtain an equipment stationing model corresponding to the visual calibration label.
Optionally, the visual calibration label is specifically an AprilTag tag.
A second aspect of the present application provides a patrol robot control method, including:
when the inspection robot is detected to move to a preset stop point, acquiring an equipment image of equipment to be detected through a robot vision module, wherein the equipment image comprises a vision calibration label pasted on the equipment to be detected;
establishing a second space coordinate system with the robot vision module as an original point according to the equipment image, and determining space parameters of the inspection robot based on the second space coordinate system and the position of the vision calibration label in the second space coordinate system;
acquiring an equipment stationing model corresponding to the visual calibration label according to the image of the visual calibration label to obtain a first space coordinate system in the equipment stationing model;
and obtaining the motion control parameters of the inspection robot through coordinate conversion and a kinematics calculation mode according to the first space coordinate system and the second space coordinate system.
Optionally, the determining the spatial parameters of the inspection robot based on the second spatial coordinate system and the position of the visual calibration tag in the second spatial coordinate system specifically includes:
and obtaining, based on the second spatial coordinate system and the position of the visual calibration label in the second spatial coordinate system, a position vector and rotation matrix parameters from the origin of the second coordinate system to the reference plane by means of a visual algorithm.
A third aspect of the present application provides a substation inspection and stationing apparatus, comprising:
the device comprises an image acquisition unit, a processing unit and a display unit, wherein the image acquisition unit is used for acquiring a detection surface image of equipment to be detected through camera equipment, and the detection surface image comprises a visual calibration label pasted on the detection surface of the equipment to be detected;
the first space coordinate system building unit is used for building a first space coordinate system by taking the visual calibration label as an origin according to the detection surface image;
and the stationing model constructing unit is used for determining the position of the detection point of the equipment to be detected in the first space coordinate system according to the first space coordinate system to obtain an equipment stationing model corresponding to the visual calibration label.
Optionally, the stationing model building unit is specifically configured to:
determining a first projection distance from the detection point to a reference plane, a second projection distance from the camera equipment to the reference plane, a horizontal direction distance from the detection point to an origin of the first space coordinate system and a vertical direction distance from the detection point to the origin of the first space coordinate system according to the first space coordinate system, wherein the reference plane is a Y-Z plane of the first space coordinate system;
and recording the first projection distance, the second projection distance, the horizontal direction distance and the vertical direction distance corresponding to the detection point to obtain an equipment stationing model corresponding to the visual calibration label.
A fourth aspect of the present application provides an inspection robot, comprising:
the equipment image acquisition unit is used for acquiring an equipment image of equipment to be detected through the robot vision module when the inspection robot is detected to move to a preset stop point, wherein the equipment image comprises a vision calibration label pasted on the equipment to be detected;
the robot space parameter calculation unit is used for establishing a second space coordinate system with the robot vision module as an original point according to the equipment image, and determining the space parameters of the inspection robot based on the second space coordinate system and the position of the vision calibration label in the second space coordinate system;
the first space coordinate system pairing unit is used for acquiring an equipment stationing model corresponding to the visual calibration label according to the image of the visual calibration label to obtain a first space coordinate system in the equipment stationing model;
and the motion control parameter calculation unit is used for obtaining the motion control parameters of the inspection robot through coordinate conversion and a kinematics calculation mode according to the first space coordinate system and the second space coordinate system.
Optionally, the robot spatial parameter calculating unit is specifically configured to:
and obtaining, based on the second spatial coordinate system and the position of the visual calibration label in the second spatial coordinate system, a position vector and rotation matrix parameters from the origin of the second coordinate system to the reference plane by means of a visual algorithm.
A fifth aspect of the present application provides an indoor substation inspection system, comprising: equipment to be detected and an inspection robot according to the fourth aspect of the present application;
the device to be detected is adhered with a visual calibration label, and the visual calibration label corresponds to the detection surface of the device to be detected one by one;
the inspection robot is used for inspecting the equipment to be inspected.
According to the technical scheme, the embodiment of the application has the following advantages:
the application provides a substation inspection and stationing method, which comprises the following steps: acquiring a detection surface image of equipment to be detected by using camera equipment, wherein the detection surface image comprises a visual calibration label pasted on the detection surface of the equipment to be detected; establishing a first space coordinate system by taking the visual calibration label as an origin according to the detection surface image; and determining the position of the detection point of the equipment to be detected in the first space coordinate system according to the first space coordinate system to obtain an equipment stationing model corresponding to the visual calibration label.
This application adopts relative-position stationing: a detection surface image of the equipment is acquired, a first space coordinate system is constructed with the visual calibration label in the image as the reference, and the detection point positions on the equipment are recorded in that coordinate system, yielding the inspection stationing information of the equipment. Since the stationing process does not require setting up the inspection robot on site, the technical problem of low stationing efficiency of substation inspection robots caused by the existing online stationing mode is solved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a schematic flow chart of an embodiment of a substation inspection and stationing method provided in the present application;
fig. 2 is a schematic flowchart of an embodiment of a method for controlling a patrol robot according to the present application;
fig. 3 is a schematic structural diagram of an embodiment of a substation inspection and stationing apparatus provided by the present application;
fig. 4 is a schematic diagram of a control architecture of the inspection robot provided by the present application.
Detailed Description
The embodiment of the application provides a substation inspection and stationing method and an inspection robot control method, which are used for solving the technical problem of low stationing efficiency of substation inspection robots caused by the existing online stationing mode.
In order to make the objects, features and advantages of the present invention more apparent and understandable, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the embodiments described below are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, a first embodiment of the present application provides a substation inspection and stationing method, including:
it should be noted that, when the spotting method of this embodiment is implemented, first, a detection surface image of the device to be detected to which the visual calibration label is attached is obtained, and the obtained detection surface image should include an image of the visual calibration label.
It can be understood that the image capturing device mentioned in this embodiment may be a mobile phone, a camera, or other portable image capturing and recording devices, which are not described in detail herein.
Step 102: establishing a first space coordinate system with the visual calibration label as the origin, according to the detection surface image.
Next, from the detection surface image acquired in step 101, a first spatial coordinate system is established with the visual alignment label in the detection surface image as the origin.
Step 103: determining, according to the first space coordinate system, the position of the detection point of the equipment to be detected in the first space coordinate system, to obtain the device stationing model corresponding to the visual calibration label.
It should be noted that, based on the first spatial coordinate system obtained in step 102, the position parameters of the detection points of the device to be detected are determined in that coordinate system and collected to obtain the device stationing model, which contains the stationing information of all detection points on the detection surface. This completes the inspection stationing of one detection surface; other detection surfaces are stationed in the same way.
More specifically, the determining in step 103 of this embodiment, according to the first spatial coordinate system, of the position of the detection point of the device to be detected in the first spatial coordinate system to obtain the device stationing model corresponding to the visual calibration label specifically includes:
determining a first projection distance X1 from a detection point to a reference plane, a second projection distance X2 from a camera device to the reference plane, a horizontal direction distance Y from the detection point to the origin of the first space coordinate system and a vertical direction distance Z from the detection point to the origin of the first space coordinate system according to the first space coordinate system, wherein the reference plane is a Y-Z plane of the first space coordinate system;
and recording a first projection distance X1, a second projection distance X2, a horizontal direction distance Y and a vertical direction distance Z corresponding to the detection points to obtain an equipment stationing model corresponding to the visual calibration label.
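The patent does not prescribe a storage format for the device stationing model. As an illustration only, the following minimal Python sketch (the class, field, and point names are invented for this example) records the four distances X1, X2, Y, and Z for each detection point, keyed by the ID of the visual calibration tag of the detection face:

```python
from dataclasses import dataclass, field

@dataclass
class DetectionPoint:
    """Position of one detection point in the tag-centred ("first") coordinate
    system; the Y-Z plane of the tag is the reference plane."""
    x1: float  # projection distance from the detection point to the Y-Z reference plane
    x2: float  # projection distance from the camera to the Y-Z reference plane
    y: float   # horizontal offset of the detection point from the tag origin
    z: float   # vertical offset of the detection point from the tag origin

@dataclass
class StationingModel:
    """Device stationing model: one visual calibration tag per detection face,
    with the detection points recorded relative to that tag."""
    tag_id: int
    points: dict = field(default_factory=dict)  # point name -> DetectionPoint

    def add_point(self, name: str, x1: float, x2: float, y: float, z: float) -> None:
        self.points[name] = DetectionPoint(x1, x2, y, z)

# Example: record two hypothetical detection points on the face carrying tag 7.
model = StationingModel(tag_id=7)
model.add_point("pressure_gauge", x1=0.05, x2=1.20, y=0.30, z=-0.10)
model.add_point("oil_level",      x1=0.02, x2=1.20, y=-0.15, z=0.25)
```

Because every recorded position is relative to the tag origin, such a model can be built offline from a single photograph of the detection face, without the robot on site.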
It should be noted that the inspection detection points of existing power equipment are mostly arranged on a single surface; that is, in the general case one device corresponds to one detection surface, and one detection surface corresponds to one visual calibration label. If a device has two detection surfaces, two visual calibration labels should be arranged correspondingly, and so on.
Further, the visual calibration label of the present embodiment may employ an AprilTag tag.
By adopting relative-position stationing, this embodiment acquires the detection surface image of the equipment, constructs the first space coordinate system with the visual calibration label in the image as the reference, and records the detection point positions on the equipment in that coordinate system, thereby obtaining the inspection stationing information of the equipment. The stationing process does not require setting up the inspection robot on site, realizing offline stationing and solving the technical problem of low stationing efficiency of substation inspection robots caused by the existing online stationing mode.
The above is a detailed description of an embodiment of the substation inspection and stationing method provided by the present application; the following describes in detail an embodiment of the inspection robot control method provided by the present application, which corresponds to the stationing method of the first embodiment.
Referring to fig. 2, a second embodiment of the present application provides a patrol robot control method based on stationing information obtained by a substation patrol stationing method provided in the first embodiment of the present application, including:
Step 201: when the inspection robot is detected to have moved to a preset stop point, acquiring a device image of the equipment to be detected through the robot vision module, wherein the device image comprises a visual calibration label pasted on the equipment to be detected.
It can be understood that the inspection robot of this embodiment is equipped with a robot vision module; when the robot is detected to have moved to the preset stop point, the vision module installed on it acquires the device image of the equipment to be detected.
It will be appreciated that, in order for the captured device image to include the visual calibration label affixed to the device under test, the stop point of this embodiment should be located in front of, preferably directly in front of, the detection face of the device under test.
Step 202: establishing, according to the device image obtained in step 201, a second space coordinate system with the robot vision module as the origin, and determining the spatial parameters of the inspection robot based on the second space coordinate system and the position of the visual calibration label in that coordinate system.
More specifically, based on the second spatial coordinate system and the position of the visual calibration label in it, a position vector P from the origin of the second coordinate system to the reference plane and rotation matrix parameters R are obtained by means of a visual algorithm.
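The patent leaves the concrete form of these spatial parameters open; in practice a tag-pose estimator returns exactly such a pair (P, R), which is commonly packed into a single 4x4 homogeneous transform. A minimal sketch, in which the numeric pose values are made up for illustration:

```python
import math

def rotz(theta):
    """Rotation matrix for a rotation of `theta` radians about the Z axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def to_homogeneous(R, P):
    """Pack a 3x3 rotation matrix R and a position vector P into one 4x4
    rigid transform, the usual output form of a tag-pose estimator."""
    return [R[0] + [P[0]],
            R[1] + [P[1]],
            R[2] + [P[2]],
            [0.0, 0.0, 0.0, 1.0]]

# Hypothetical pose of the calibration tag as seen by the robot vision module:
# slightly off-centre, 1.2 m away, rotated 10 degrees about the optical axis.
R = rotz(math.radians(10.0))
P = [0.05, -0.02, 1.20]
T_cam_tag = to_homogeneous(R, P)
```

The transform T_cam_tag fully describes the relative position and attitude of the robot with respect to the tag, which is what the subsequent coordinate conversion consumes.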
Step 203: acquiring, according to the image of the visual calibration label, the device stationing model corresponding to the label, to obtain the first space coordinate system in the device stationing model.
Step 204: obtaining the motion control parameters of the inspection robot through coordinate conversion and kinematics calculation, according to the first space coordinate system and the second space coordinate system.
It should be noted that, by combining the values X1, X2, Y, and Z in the device stationing model with the position vector P and rotation matrix R obtained in step 202, the motion parameters required by the robot arm are obtained through coordinate-system transformation and forward and inverse kinematics calculation; the robot arm is then controlled so that the detection end reaches the designated position and spatial attitude of the task point to perform the detection operation.
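The coordinate conversion in step 204 can be illustrated as follows: a detection point stored in the device stationing model (tag frame) is mapped into the robot camera frame with p_cam = R · p_tag + P. The axis convention below (X normal to the detection face, matching the Y-Z reference plane of the first coordinate system) and all numeric values are assumptions for illustration; the forward/inverse kinematics of the arm are omitted.

```python
def mat_vec(R, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def tag_point_to_camera_frame(R, P, point_tag):
    """Map a detection point expressed in the tag ("first") coordinate system
    into the robot-camera ("second") coordinate system: p_cam = R @ p_tag + P."""
    rotated = mat_vec(R, point_tag)
    return [rotated[i] + P[i] for i in range(3)]

# Hypothetical inputs: identity attitude, tag 1.2 m in front of the camera along X,
# detection point 0.05 m proud of the face (X1), 0.30 m across (Y), 0.10 m below (Z).
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
P = [1.2, 0.0, 0.0]
p_tag = [0.05, 0.30, -0.10]
p_cam = tag_point_to_camera_frame(R, P, p_tag)
# p_cam ≈ [1.25, 0.30, -0.10]
```

Once the detection point is expressed in the robot frame, inverse kinematics can solve for the joint angles that bring the detection end to it.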
The inspection robot control method of this embodiment is based on the inspection stationing obtained in the first embodiment. The distance and spatial attitude between the detection end and the detection equipment/detection surface can be corrected using the position vector P and rotation matrix R calculated by the visual algorithm. After distance correction, accurate approach control can be realized in the Y direction (the projection direction from the task point to the detection surface), avoiding collisions. After attitude correction, the detection surface can be made parallel to the detection end surface, ensuring that during partial discharge detection the partial discharge module squarely abuts the surface of the task point, which improves the accuracy of the partial discharge detection data. With the spatial attitude corrected, the mechanical arm drives the detection end to the position of a window task point, and accurate rotation control of the detection end can be achieved.
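As a sketch of the attitude correction described above: the residual misalignment between the detection end and the detection face can be summarized as the rotation angle encoded by R, using the standard trace identity for rotation matrices. The 5-degree example value is invented for illustration.

```python
import math

def attitude_error_deg(R):
    """Rotation angle (degrees) of a 3x3 rotation matrix R, via the identity
    cos(angle) = (trace(R) - 1) / 2.  A result of zero means the detection
    end is already parallel to the detection face."""
    trace = R[0][0] + R[1][1] + R[2][2]
    c = max(-1.0, min(1.0, (trace - 1.0) / 2.0))  # clamp against rounding error
    return math.degrees(math.acos(c))

# Hypothetical measurement: the tag appears rotated 5 degrees about the Z axis.
t = math.radians(5.0)
R = [[math.cos(t), -math.sin(t), 0.0],
     [math.sin(t),  math.cos(t), 0.0],
     [0.0, 0.0, 1.0]]
err = attitude_error_deg(R)  # ≈ 5.0 degrees to correct before abutting
```

Driving this angle toward zero before the detection end advances is what guarantees the face-parallel contact needed for reliable partial discharge readings.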
The above is a detailed description of an embodiment of the inspection robot control method provided by the present application; the following describes in detail an embodiment of the substation inspection and stationing apparatus provided by the present application.
A third embodiment of the present application provides a substation inspection and stationing apparatus, comprising:
the image acquisition unit 301 is configured to acquire a detection surface image of the device to be detected through the camera, where the detection surface image includes a visual calibration label pasted on the detection surface of the device to be detected;
a first spatial coordinate system establishing unit 302, configured to establish a first spatial coordinate system with the visual calibration label as an origin according to the detection plane image;
and the stationing model constructing unit 303 is configured to determine, according to the first spatial coordinate system, a position of a detection point of the device to be detected in the first spatial coordinate system, to obtain a device stationing model corresponding to the visual calibration label.
Further, the stationing model building unit 303 is specifically configured to:
determining a first projection distance from a detection point to a reference plane, a second projection distance from a camera to the reference plane, a horizontal direction distance from the detection point to an origin of a first space coordinate system and a vertical direction distance from the detection point to the origin of the first space coordinate system according to the first space coordinate system, wherein the reference plane is a Y-Z plane of the first space coordinate system;
and recording the first projection distance, the second projection distance, the horizontal direction distance and the vertical direction distance corresponding to the detection point to obtain the device stationing model corresponding to the visual calibration label.
The above is a detailed description of an embodiment of the substation inspection and stationing apparatus provided by the present application; the following describes in detail an embodiment of the inspection robot provided by the present application.
A fourth embodiment of the present application provides an inspection robot, comprising:
the equipment image acquiring unit 401 is configured to acquire an equipment image of the equipment to be detected through the robot vision module when it is detected that the inspection robot moves to a preset stop point, where the equipment image includes a vision calibration label attached to the equipment to be detected;
the robot space parameter calculating unit 402 is configured to establish a second space coordinate system with the robot vision module as an origin according to the device image, and determine a space parameter of the inspection robot based on the second space coordinate system and the position of the vision calibration tag in the second space coordinate system;
a first spatial coordinate system pairing unit 403, configured to obtain, according to the image of the visual calibration tag, an equipment stationing model corresponding to the visual calibration tag, and obtain a first spatial coordinate system in the equipment stationing model;
and a motion control parameter calculation unit 404, configured to obtain a motion control parameter of the inspection robot through coordinate conversion and a kinematics calculation method according to the first spatial coordinate system and the second spatial coordinate system.
Further, the robot spatial parameter calculating unit 402 is specifically configured to:
and obtaining, based on the second spatial coordinate system and the position of the visual calibration label in the second spatial coordinate system, a position vector and rotation matrix parameters from the origin of the second coordinate system to the reference plane by means of a visual algorithm.
The above is a detailed description of an embodiment of the inspection robot provided by the present application; the following describes in detail an embodiment of the indoor substation inspection system provided by the present application.
A fifth embodiment of the present application provides an indoor substation inspection system, comprising: equipment to be detected and an inspection robot according to the fourth embodiment of the present application;
visual calibration labels are pasted on the equipment to be detected, the labels being in one-to-one correspondence with the detection surfaces of the equipment;
the inspection robot is used for inspecting the equipment to be inspected.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The terms "first," "second," "third," "fourth," and the like in the description of the application and the above-described figures, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.
Claims (8)
1. A substation inspection and stationing method is characterized by comprising the following steps:
acquiring a detection surface image of equipment to be detected by using camera equipment, wherein the detection surface image comprises a visual calibration label pasted on the detection surface of the equipment to be detected;
establishing a first space coordinate system by taking the visual calibration label as an origin according to the detection surface image;
determining a first projection distance from a detection point to a reference plane, a second projection distance from the camera equipment to the reference plane, a horizontal direction distance from the detection point to an origin of the first space coordinate system and a vertical direction distance from the detection point to the origin of the first space coordinate system according to the first space coordinate system, wherein the reference plane is a Y-Z plane of the first space coordinate system;
and recording the first projection distance, the second projection distance, the horizontal direction distance and the vertical direction distance corresponding to the detection point to obtain an equipment stationing model corresponding to the visual calibration label.
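The stationing model of claim 1 reduces to four scalar measurements recorded per detection point in the label-centred frame. As an illustrative sketch only (not part of the claimed subject matter; the name `StationingModel` and the sample values are invented), the record and the resulting detection-point coordinates could look like:

```python
from dataclasses import dataclass

@dataclass
class StationingModel:
    """Hypothetical record of the four quantities stored per detection point.

    All values are expressed in the first space coordinate system, whose
    origin is the visual calibration label and whose Y-Z plane serves as
    the reference plane.
    """
    tag_id: int       # identifies the visual calibration label
    d_point: float    # first projection distance: detection point -> Y-Z plane
    d_camera: float   # second projection distance: camera -> Y-Z plane
    dy: float         # horizontal distance from the detection point to the origin
    dz: float         # vertical distance from the detection point to the origin

    def point_in_label_frame(self) -> tuple:
        # With the Y-Z plane as the reference plane, the projection distance
        # is the X coordinate, so the detection point sits at (x, y, z).
        return (self.d_point, self.dy, self.dz)

model = StationingModel(tag_id=3, d_point=0.15, d_camera=1.20, dy=0.40, dz=-0.25)
print(model.point_in_label_frame())  # (0.15, 0.4, -0.25)
```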
2. The substation inspection and stationing method according to claim 1, wherein the visual calibration label is specifically an AprilTag tag.
3. An inspection robot control method, characterized by comprising the following steps:
when the inspection robot is detected to move to a preset stop point, acquiring an equipment image of equipment to be detected through a robot vision module, wherein the equipment image comprises a vision calibration label pasted on the equipment to be detected;
establishing a second space coordinate system with the robot vision module as an origin according to the equipment image, and determining space parameters of the inspection robot based on the second space coordinate system and the position of the visual calibration label in the second space coordinate system;
acquiring an equipment stationing model corresponding to the visual calibration label according to the image of the visual calibration label to obtain a first space coordinate system in the equipment stationing model, wherein the equipment stationing model is obtained by the substation inspection and stationing method according to claim 1 or 2;
and obtaining the motion control parameters of the inspection robot through coordinate conversion and a kinematics calculation mode according to the first space coordinate system and the second space coordinate system.
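The coordinate conversion in claim 3 amounts to a rigid transform: the vision module estimates the pose of the label (the first coordinate system) in the robot's camera frame (the second coordinate system), and any detection point recorded in the label frame is mapped through that pose. A minimal sketch with illustrative numbers (the 90° rotation and 2 m offset are invented for the example, not taken from the patent):

```python
import math

def rot_z(theta):
    """Rotation matrix about the Z axis (one factor of a general rotation)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def transform(R, t, p):
    """Map a point p from the label (first) frame into the camera (second)
    frame: p_cam = R @ p_label + t."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))

# Suppose the vision module reports the label 2 m ahead of the camera,
# rotated 90 degrees about Z, and the stationing model places the
# detection point at (0.15, 0.40, -0.25) in the label frame.
R_cam_label = rot_z(math.pi / 2)
t_cam_label = (0.0, 0.0, 2.0)
p_label = (0.15, 0.40, -0.25)
p_cam = transform(R_cam_label, t_cam_label, p_label)
print([round(v, 3) for v in p_cam])  # [-0.4, 0.15, 1.75]
```

The resulting camera-frame coordinates are what kinematic calculation would then turn into motion control parameters for the robot's actuators.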
4. The inspection robot control method according to claim 3, wherein the determining the spatial parameters of the inspection robot based on the second spatial coordinate system and the position of the visual calibration tag in the second spatial coordinate system specifically comprises:
and obtaining a position vector and a rotation matrix parameter from the origin of the second space coordinate system to the reference plane through a visual algorithm operation mode based on the second space coordinate system and the position of the visual calibration label in the second space coordinate system.
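Claim 4's position vector and rotation matrix from the second system's origin to the reference plane follow directly once the visual algorithm has estimated the label's pose (R, t) in the camera frame: the camera's own pose relative to the label is the inverse rigid transform. A sketch under that assumption (the example pose is invented):

```python
def invert_pose(R, t):
    """Invert a rigid transform.  If (R, t) maps label-frame points into
    the camera frame, the inverse (R^T, -R^T t) gives the camera's pose
    expressed in the label frame -- i.e. the position vector from the
    second coordinate system's origin to the reference plane."""
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]  # transpose of R
    t_inv = tuple(-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3))
    return Rt, t_inv

# Illustrative pose: label 1.5 m straight ahead of the camera, no rotation.
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = (0.0, 0.0, 1.5)
R_inv, t_inv = invert_pose(R, t)
# The camera sits 1.5 m behind the label along the label's Z axis.
```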
5. A substation inspection and stationing device, characterized by comprising:
an image acquisition unit, which is used for acquiring a detection surface image of equipment to be detected through camera equipment, wherein the detection surface image comprises a visual calibration label pasted on the detection surface of the equipment to be detected;
the first space coordinate system building unit is used for building a first space coordinate system by taking the visual calibration label as an origin according to the detection surface image;
the stationing model building unit is used for determining the position of a detection point of the equipment to be detected in the first space coordinate system according to the first space coordinate system to obtain an equipment stationing model corresponding to the visual calibration label;
the stationing model building unit is specifically configured to:
determining a first projection distance from the detection point to a reference plane, a second projection distance from the camera equipment to the reference plane, a horizontal direction distance from the detection point to an origin of the first space coordinate system and a vertical direction distance from the detection point to the origin of the first space coordinate system according to the first space coordinate system, wherein the reference plane is a Y-Z plane of the first space coordinate system;
and recording the first projection distance, the second projection distance, the horizontal direction distance and the vertical direction distance corresponding to the detection point to obtain an equipment stationing model corresponding to the visual calibration label.
6. An inspection robot having the substation inspection and stationing device according to claim 5, comprising:
the equipment image acquisition unit is used for acquiring an equipment image of equipment to be detected through the robot vision module when the inspection robot is detected to move to a preset stop point, wherein the equipment image comprises a vision calibration label pasted on the equipment to be detected;
the robot space parameter calculation unit is used for establishing a second space coordinate system with the robot vision module as an origin according to the equipment image, and determining the space parameters of the inspection robot based on the second space coordinate system and the position of the visual calibration label in the second space coordinate system;
the first space coordinate system pairing unit is used for acquiring an equipment stationing model corresponding to the visual calibration label according to the image of the visual calibration label to obtain a first space coordinate system in the equipment stationing model;
and the motion control parameter calculation unit is used for obtaining the motion control parameters of the inspection robot through coordinate conversion and a kinematics calculation mode according to the first space coordinate system and the second space coordinate system.
7. The inspection robot according to claim 6, wherein the robot spatial parameter calculating unit is specifically configured to:
and obtaining a position vector and a rotation matrix parameter from the origin of the second space coordinate system to the reference plane through a visual algorithm operation mode based on the second space coordinate system and the position of the visual calibration label in the second space coordinate system.
8. A substation indoor inspection system, characterized by comprising: equipment to be detected and the inspection robot according to any one of claims 6 and 7;
a visual calibration label is pasted on the equipment to be detected, and the visual calibration labels correspond one-to-one to the detection surfaces of the equipment to be detected;
the inspection robot is used for inspecting the equipment to be inspected.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911267761.2A CN110962128B (en) | 2019-12-11 | 2019-12-11 | Substation inspection and stationing method and inspection robot control method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110962128A CN110962128A (en) | 2020-04-07 |
CN110962128B true CN110962128B (en) | 2021-06-29 |
Family
ID=70033738
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911267761.2A Active CN110962128B (en) | 2019-12-11 | 2019-12-11 | Substation inspection and stationing method and inspection robot control method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110962128B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114063643A (en) * | 2021-11-06 | 2022-02-18 | 山西东高舜博科技有限公司 | Remote monitoring control system and control method for transformer substation |
CN114997195A (en) * | 2022-05-12 | 2022-09-02 | 安徽大学绿色产业创新研究院 | Component checking and positioning method based on inspection robot |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106323294A (en) * | 2016-11-04 | 2017-01-11 | 新疆大学 | Positioning method and device for patrol robot of transformer substation |
CN106514068A (en) * | 2016-11-15 | 2017-03-22 | 成都陵川特种工业有限责任公司 | Control method of robot intelligent welding |
CN106625713A (en) * | 2017-01-11 | 2017-05-10 | 长春工业大学 | Method of improving gumming accuracy of gumming industrial robot |
CN106826822A (en) * | 2017-01-25 | 2017-06-13 | 南京阿凡达机器人科技有限公司 | A kind of vision positioning and mechanical arm crawl implementation method based on ROS systems |
DE102018109774A1 (en) * | 2017-05-17 | 2018-11-22 | Omron Corporation | Image processing system, image processing device and image processing program |
CN109018591A (en) * | 2018-08-09 | 2018-12-18 | 沈阳建筑大学 | A kind of automatic labeling localization method based on computer vision |
CN109345588A (en) * | 2018-09-20 | 2019-02-15 | 浙江工业大学 | A kind of six-degree-of-freedom posture estimation method based on Tag |
CN109584216A (en) * | 2018-11-12 | 2019-04-05 | 上海交通大学 | Object manipulator grabs deformable material bag visual identity and the localization method of operation |
CN109976344A (en) * | 2019-03-30 | 2019-07-05 | 南京理工大学 | Crusing robot posture antidote |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8720382B2 (en) * | 2010-08-31 | 2014-05-13 | Technologies Holdings Corp. | Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||
Address after: Room 501-503, annex building, Huaye building, No.1-3 Chuimao new street, Xihua Road, Yuexiu District, Guangzhou City, Guangdong Province 510000
Applicant after: China Southern Power Grid Power Technology Co.,Ltd.
Address before: Room 501-503, annex building, Huaye building, No.1-3 Chuimao new street, Xihua Road, Yuexiu District, Guangzhou City, Guangdong Province 510000
Applicant before: GUANGDONG ELECTRIC POWER SCIENCE RESEARCH INSTITUTE ENERGY TECHNOLOGY Co.,Ltd.
GR01 | Patent grant | ||