CN113485401A - Vision feedback-based hovering control method and device for inspection robot - Google Patents

Vision feedback-based hovering control method and device for inspection robot

Info

Publication number
CN113485401A
CN113485401A (application CN202110844341.7A)
Authority
CN
China
Prior art keywords
expected
inspection robot
controller
image
hovering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110844341.7A
Other languages
Chinese (zh)
Inventor
邸龙
吴卫堃
董丽梦
李梓玮
谭麒
余航
王伦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Power Grid Co Ltd
Zhaoqing Power Supply Bureau of Guangdong Power Grid Co Ltd
Original Assignee
Guangdong Power Grid Co Ltd
Zhaoqing Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Power Grid Co Ltd, Zhaoqing Power Supply Bureau of Guangdong Power Grid Co Ltd filed Critical Guangdong Power Grid Co Ltd
Priority to CN202110844341.7A
Publication of CN113485401A
Legal status: Pending

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a hovering control method and device for an inspection robot based on visual feedback, belonging to the technical field of power transmission line inspection. The method acquires and processes a video information image of the hovering target, extracts a current feature vector and an expected feature vector of the hovering target from the processed image to obtain a feature error vector, performs the operation in an IBVS (image-based visual servoing) controller, and outputs an initial control instruction; the feature information of the current state of the inspection flying robot is fused with the initial control instruction and used as the input of a self-balancing controller to generate a final control instruction. These steps are repeated, and the hovering position of the inspection flying robot is continuously corrected in real time according to the generated final control instructions, which solves the problem that an existing inspection flying robot can hardly lock the target and achieve stable hovering under external disturbance.

Description

Vision feedback-based hovering control method and device for inspection robot
Technical Field
The invention belongs to the technical field of power transmission line inspection, and particularly relates to a hovering control method and device of an inspection robot based on visual feedback.
Background
Today's society is increasingly dependent on the supply of electricity. Power outages therefore cause significant financial losses to power producers, sellers and consumers, so the industry now focuses on how to reduce power line faults. In order to detect faults as quickly as possible and to arrange an effective maintenance plan, power lines must be inspected regularly. The traditional manual walking inspection mode and the helicopter-assisted inspection mode both have drawbacks. Manual walking inspection requires workers to observe the power line along its route, but only the surface of the line can be seen from the ground; moreover, visual inspection by hand is a single means of examination, has low working efficiency and is subjective, so large faults are easily overlooked. Helicopter-assisted inspection is currently a common method; although it is far more efficient than manual walking inspection and is not affected by terrain, it is expensive.
In the prior art, a quad-rotor aircraft carrying a vision sensor is adopted as the flying robot for power line inspection. The quad-rotor aircraft flies above the line, takes aerial photographs, and transmits the aerial photographing results to the ground workstation in real time. Sometimes, due to mission requirements, the quad-rotor aircraft is required to hover for observation in order to obtain the required inspection information. However, the robot encounters various disturbances (such as air flow and external interference signals) during high-altitude flight, and under such external disturbance it is difficult for the existing inspection flying robot to ensure that the target is locked and stable hovering is achieved.
Disclosure of Invention
In view of the above, the invention aims to solve the problem that the existing inspection flying robot can hardly lock a target and achieve stable hovering under external disturbance.
In order to solve the technical problems, the invention provides the following technical scheme:
in a first aspect, the invention provides a hovering control method of an inspection robot based on visual feedback, which comprises the following steps:
acquiring a video information image of a target at a hovering position in real time through an airborne visual sensor, and processing the video information image;
extracting a current characteristic vector and an expected characteristic vector of a target at a hovering position from the processed video information image, and comparing the current characteristic vector with the expected characteristic vector to obtain a characteristic error vector;
inputting the characteristic error vector into an IBVS controller for operation, and processing output data to obtain an initial control instruction of the inspection robot;
the inspection robot acquires the characteristic information of the current state of the inspection robot, fuses the characteristic information of the current state and an initial control instruction and then uses the fused characteristic information as the input of a self-balancing controller to generate a final control instruction, wherein the final control instruction is used for correcting the hovering position;
and repeating the steps, and continuously correcting the hovering position of the inspection robot in real time according to the generated final control instruction.
Further, the video information image specifically includes:
a desired image obtained at initial hover and a target image obtained during any time period after hover.
Further, the processing of the video information image specifically includes:
acquiring a desired image of a target at hovering and a target image;
respectively carrying out fuzzy processing on the expected image and the target image by adopting Gaussian transformation;
respectively carrying out graying processing on the desired image and the target image after the fuzzy processing;
and performing threshold operation on the grayed expected image and the target image to respectively obtain corresponding binary images.
Further, the extracting the current feature vector and the expected feature vector of the hovering target from the processed video information image specifically includes:
carrying out contour detection on the processed video information image, and calculating a centroid coordinate of a target area;
and acquiring pixel coordinate values of the target centroid in a camera digital coordinate system, and converting the pixel coordinate values into a camera real 2D image plane coordinate system according to internal parameters of the camera to obtain a current feature vector of a current target image and an expected feature vector of an expected image.
Further, the outputting the data specifically includes:
a pitch control command, a roll control command, a desired yaw control command, a desired altitude control command, a lateral control command, and a longitudinal control command.
Further, the processing of the output data to obtain the initial control instruction of the inspection robot specifically includes:
and the expected roll angle instruction and the expected pitch angle instruction are fused with the expected yaw angle control instruction and the expected altitude control instruction to form an initial control instruction.
Further, the feature information of the current state specifically includes:
yaw angle, pitch angle, roll angle, altitude information, and velocity estimates in three-dimensional space.
Further, the final control instruction specifically includes:
a difference between the desired yaw angle control command and the current yaw angle, a difference between the desired altitude control command and the current altitude, a desired roll angle command, and a desired pitch angle command.
In a second aspect, the present invention provides a hovering control device for an inspection robot based on visual feedback, including:
the airborne sensor is used for acquiring the current yaw angle, pitch angle, roll angle and height information of the inspection robot, the speed estimation value of the three-dimensional space and the video information image of the hovering target in real time;
the image processing module is used for acquiring, from the video information image of the target at the hovering position, an expected image obtained at initial hovering and a target image obtained in any time period after hovering, and processing the expected image and the target image;
the visual servo controller is used for extracting a current feature vector of a target from the processed target image, extracting an expected feature vector from the expected image, comparing the current feature vector with the expected feature vector to obtain an input parameter feature error vector, and outputting a pitch angle control instruction, a roll angle control instruction, an expected yaw angle control instruction, an expected height control instruction, a transverse control instruction and a longitudinal control instruction based on the feature error vector;
the self-balancing controller is used for fusing the characteristic information of the current state of the inspection robot with a pitch angle control instruction, a roll angle control instruction, an expected yaw angle control instruction and an expected height control instruction to obtain a final control instruction;
and the inspection robot controller obtains the rotating speed of each rotor through a rotor rotating speed calculation function according to the final control instruction, controls each rotor to rotate according to the corresponding rotating speed, and corrects the hovering position of the inspection robot in real time.
Further, the self-balancing controller comprises a yaw angle controller, a height controller, a pitch angle controller and a roll angle controller;
the pitch angle controller is used for generating the speed of rotating around the Y axis at the next moment by taking the expected pitch angle instruction output by the horizontal speed controller as input and sending the speed to the inspection robot controller;
the roll angle controller is used for generating the speed of rotating around the X axis at the next moment by taking the expected roll angle instruction output by the horizontal speed controller as input and sending the speed to the inspection robot controller;
the height controller is used for generating the height at the next moment by taking the difference value between the expected height control instruction and the current height as input, and sending the height to the inspection robot controller;
and the yaw angle controller is used for generating the speed of rotating around the Z axis at the next moment by taking the difference value between the expected yaw angle control command and the current yaw angle as input, and sending the speed to the inspection robot controller.
Compared with the prior art, the invention has the following beneficial effects:
the invention provides a hovering control method and a hovering control device of an inspection robot based on visual feedback, which are characterized in that a video information image of a hovering target is obtained and processed, a current feature vector and an expected feature vector of the hovering target are extracted from the processed video information image, the current feature vector and the expected feature vector are compared to obtain a feature error vector, operation is carried out in an IBVS (interactive text-to-video switching) controller, and an initial control instruction of the inspection robot is output; the inspection robot acquires the characteristic information of the current state of the inspection robot, fuses the characteristic information of the current state and an initial control instruction and then uses the fused characteristic information as the input of a self-balancing controller to generate a final control instruction, wherein the final control instruction is used for correcting the hovering position; and repeating the steps, and continuously correcting the hovering position of the inspection robot in real time according to the generated final control instruction. According to the invention, on one hand, the hovering position of the inspection robot is corrected in real time based on the video information image fed back by the vision, on the other hand, various parameters of the inspection robot are calculated in real time based on the vision servo controller and the self-balancing controller, and a real-time control instruction is continuously output, so that the inspection robot can still correct the hovering position of the inspection robot and lock an inspection target even under the condition of external interference.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a schematic flow chart of a hovering control method of an inspection robot based on visual feedback according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a control framework of a hovering control method of an inspection robot based on visual feedback according to an embodiment of the present invention;
fig. 3 is a schematic control principle diagram of an inspection robot hovering control device based on visual feedback according to an embodiment of the present invention.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the embodiments described below are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the embodiment provides a hovering control method for an inspection robot based on visual feedback, which is characterized by including the following steps:
s100: and acquiring a video information image of the target at the hovering position in real time through the airborne visual sensor, and processing the video information image.
It should be noted that, the onboard vision sensor may be a camera, the video information image specifically includes an expected image obtained during initial hovering and a target image obtained in any time period after hovering, and the processing of the video information image specifically includes:
s101: a desired image of the target at hover and a target image are acquired.
S102: and respectively carrying out blurring processing on the expected image and the target image by adopting Gaussian transformation.
S103: and performing graying processing on the desired image and the target image after the blurring processing respectively.
The image after the graying process also needs to be subjected to Gamma conversion to increase the contrast. Specifically, Gamma transformation is a nonlinear operation performed on the gray value of the input image, so that the gray value of the output image and the gray value of the input image have an exponential relationship:
output = max × (r / max)^γ
wherein: max is the maximum pixel value in the image, r is the gray value of each pixel, and γ is 1.5. The purpose of this transformation is to stretch the gray levels of the brighter regions and compress those of the darker regions, darkening the image as a whole.
That is, the expected image or the target image is first converted to gray scale and the gray image is gamma-transformed to increase the contrast; each pixel in the image is then scanned with a template, and the value of the central pixel of the template is replaced with the weighted average gray value of the pixels in the neighborhood determined by the template.
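As an illustration of steps S102-S104 up to the gamma step, the following is a minimal sketch of the preprocessing chain (Gaussian blurring, graying, and gamma transformation with γ = 1.5), assuming OpenCV and NumPy; the function name, kernel size and normalization by the image maximum are illustrative choices, not specified by the patent.

```python
import cv2
import numpy as np

def preprocess(image_bgr, gamma=1.5, kernel=(5, 5)):
    """Blur, gray and gamma-correct one frame (desired image or target image)."""
    blurred = cv2.GaussianBlur(image_bgr, kernel, 0)       # Gaussian blurring (S102)
    gray = cv2.cvtColor(blurred, cv2.COLOR_BGR2GRAY)       # graying (S103)
    max_val = float(gray.max()) or 255.0                   # max pixel value in the image
    # Gamma transform: output = max * (r / max) ** gamma; gamma = 1.5 darkens the image
    corrected = max_val * (gray.astype(np.float32) / max_val) ** gamma
    return corrected.astype(np.uint8)
```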
S104: and performing threshold operation on the grayed expected image and the target image to respectively obtain corresponding binary images.
It should be noted that the desired image is taken as the overall input background; for each target image, the difference between the target image and this background is calculated to obtain a difference map, and the difference map is subjected to binarization threshold processing.
The pixel value of the first frame and the pixel value of the corresponding position of each subsequent frame are subtracted to obtain:
diff=|image(x1,y1)-image(x,y)|
in the binarization threshold processing, 5-10 threshold values are selected to be proper, and the better threshold value is 7; pixels above this threshold are reassigned 255 and pixels below this threshold are reassigned 0.
S200: and extracting the current characteristic vector and the expected characteristic vector of the target at the hovering position from the processed video information image, and comparing the current characteristic vector with the expected characteristic vector to obtain a characteristic error vector.
It should be noted that, the extracting the current feature vector and the expected feature vector of the hover target from the processed video information image specifically includes:
carrying out contour detection on the processed video information image, and calculating a centroid coordinate of a target area;
and acquiring pixel coordinate values of the target centroid in a camera digital coordinate system, and converting the pixel coordinate values into a camera real 2D image plane coordinate system according to internal parameters of the camera to obtain a current feature vector of a current target image and an expected feature vector of an expected image.
The coordinates of the centroid in the camera digital image coordinate system are converted into coordinates in the camera real 2D image coordinate system using the following formula:
x_i = (u - c_u) × ρ_u,  y_i = (v - c_v) × ρ_v
wherein: ρ_v, ρ_u, c_v and c_u are intrinsic parameters of the camera, (x_i, y_i) are the coordinates of the centroid in the camera real 2D image coordinate system, and (u, v) are the coordinates of the centroid in the camera digital image coordinate system.
The current feature vector of the current target image is obtained by adopting the following formula:
Figure BDA0003179946610000061
wherein: f is an intrinsic parameter of the camera, and (x_i, y_i) are the coordinates of the centroid in the camera real 2D image coordinate system.
The expected feature vector of the expected image is obtained by adopting the following formula:
Figure BDA0003179946610000071
wherein: f is an intrinsic parameter of the camera, and (x_i, y_i) are the coordinates of the centroid in the camera real 2D image coordinate system.
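To make the feature-extraction step concrete, here is a sketch that detects contours in the binary image, computes the centroid of the target region, and converts the centroid pixel coordinates (u, v) into the camera real 2D image plane using the intrinsics ρ_u, ρ_v, c_u, c_v. It assumes OpenCV 4.x; taking the largest contour as the target region and the exact form of the feature vector are assumptions, not taken from the patent.

```python
import cv2
import numpy as np

def centroid_feature(binary, rho_u, rho_v, c_u, c_v):
    """Contour detection, target-area centroid, and pixel-to-image-plane conversion."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    target = max(contours, key=cv2.contourArea)              # assume the largest blob is the target
    m = cv2.moments(target)
    u = m["m10"] / m["m00"]                                  # centroid in digital (pixel) coordinates
    v = m["m01"] / m["m00"]
    x_i = (u - c_u) * rho_u                                  # real 2D image-plane coordinates
    y_i = (v - c_v) * rho_v
    return np.array([x_i, y_i])

# Feature error vector: current feature vector minus expected feature vector
# e = centroid_feature(current_binary, ...) - centroid_feature(desired_binary, ...)
```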
S300: and inputting the characteristic error vector into an IBVS controller for operation, and processing output data to obtain an initial control instruction of the inspection robot.
It should be noted that the output data after the operation in the IBVS controller specifically includes:
a pitch control command, a roll control command, a desired yaw control command, a desired altitude control command, a lateral control command, and a longitudinal control command.
The lateral control instruction and the longitudinal control instruction are processed by the horizontal speed controller, which then outputs an expected roll angle instruction and an expected pitch angle instruction; these are fused with the expected yaw angle control instruction and the expected altitude control instruction to form the initial control instruction.
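The patent does not spell out the computation inside the IBVS controller; the sketch below uses the classic image-based visual servoing law v = -λ·L⁺·e, in which the pseudo-inverse of the point-feature interaction matrix maps the feature error to a camera velocity screw, from which the lateral, longitudinal, altitude and yaw components can be taken. The gain λ, the depth estimate Z and the single-point formulation are assumptions; in practice several feature points are usually stacked.

```python
import numpy as np

def ibvs_velocity(error, x, y, f=1.0, Z=2.0, lam=0.5):
    """Classic IBVS: map a point-feature error to a camera velocity command."""
    # Interaction matrix of one image point (x, y) at depth Z, column order
    # (vx, vy, vz, wx, wy, wz)
    L = np.array([
        [-f / Z, 0.0,    x / Z, x * y / f,      -(f + x * x / f),  y],
        [0.0,   -f / Z,  y / Z, f + y * y / f,  -x * y / f,       -x],
    ])
    v = -lam * np.linalg.pinv(L) @ np.asarray(error)   # 6-DoF velocity screw
    # v[0], v[1] -> lateral / longitudinal commands, v[2] -> altitude command,
    # v[5] -> yaw-rate command for the downstream self-balancing controller
    return v
```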
S400: the inspection robot acquires the characteristic information of the current state of the inspection robot, fuses the characteristic information of the current state and the initial control instruction and then uses the fused characteristic information and the initial control instruction as the input of the self-balancing controller to generate a final control instruction, and the final control instruction is used for correcting the hovering position.
It should be noted that the feature information of the current state specifically includes: yaw angle, pitch angle, roll angle, altitude information, and velocity estimates in three-dimensional space.
The final control instruction specifically includes: a difference between the desired yaw angle control command and the current yaw angle, a difference between the desired altitude control command and the current altitude, a desired roll angle command, and a desired pitch angle command.
S500: and repeating the steps S100-S400, and continuously correcting the hovering position of the inspection robot in real time according to the generated final control instruction.
The embodiment provides a hovering control method of an inspection robot based on visual feedback, which comprises the steps of obtaining and processing a video information image of a hovering target, extracting a current feature vector and an expected feature vector of the hovering target from the processed video information image, comparing the current feature vector with the expected feature vector to obtain a feature error vector, performing operation in an IBVS (interactive text-to-video switching) controller, and outputting an initial control instruction of the inspection robot; the inspection robot acquires the characteristic information of the current state of the inspection robot, fuses the characteristic information of the current state and an initial control instruction and then uses the fused characteristic information as the input of a self-balancing controller to generate a final control instruction, wherein the final control instruction is used for correcting the hovering position; and repeating the steps, and continuously correcting the hovering position of the inspection robot in real time according to the generated final control instruction.
The inspection flying robot can continuously generate final control instructions according to the video information images acquired in real time, so that even under external interference such as air flow and external interference signals it can correct its hovering position in time and keep the inspection target locked.
The foregoing is a detailed description of an embodiment of the inspection robot hovering control method based on visual feedback, and the following is a detailed description of an embodiment of the inspection robot hovering control apparatus based on visual feedback.
Referring to fig. 2-3, an embodiment of the present invention provides a hovering control device for an inspection robot based on visual feedback, including:
and the airborne sensor is used for acquiring the current yaw angle, pitch angle, roll angle, height information, the speed estimation value of the three-dimensional space and the video information image of the hovering target of the inspection robot in real time.
And the image processing module is used for acquiring, from the video information image of the target at the hovering position, an expected image obtained at initial hovering and a target image obtained in any time period after hovering, and processing the expected image and the target image.
And the visual servo controller is used for extracting a current characteristic vector of the target from the processed target image, extracting an expected characteristic vector from the expected image, comparing the current characteristic vector with the expected characteristic vector to obtain an input parameter characteristic error vector, and outputting a pitch angle control command, a roll angle control command, an expected yaw angle control command, an expected height control command, a transverse control command and a longitudinal control command based on the characteristic error vector.
And the self-balancing controller is used for fusing the characteristic information of the current state of the inspection robot with a pitch angle control instruction, a roll angle control instruction, an expected yaw angle control instruction and an expected height control instruction to obtain a final control instruction.
It should be noted that the self-balancing controller includes a yaw angle controller, a height controller, a pitch angle controller and a roll angle controller;
the pitch angle controller is used for generating the speed of rotating around the Y axis at the next moment by taking the expected pitch angle instruction output by the horizontal speed controller as input and sending the speed to the inspection robot controller;
the roll angle controller is used for generating the speed of rotating around the X axis at the next moment by taking the expected roll angle instruction output by the horizontal speed controller as input and sending the speed to the inspection robot controller;
the height controller is used for generating the height at the next moment by taking the difference value between the expected height control instruction and the current height as input, and sending the height to the inspection robot controller;
and the yaw angle controller is used for generating the speed of rotating around the Z axis at the next moment by taking the difference value between the expected yaw angle control command and the current yaw angle as input, and sending the speed to the inspection robot controller.
And the inspection robot controller obtains the rotating speed of each rotor through a rotor rotating speed calculation function according to the final control instruction, controls each rotor to rotate according to the corresponding rotating speed, and corrects the hovering position of the inspection robot in real time.
When the inspection flying robot rotates around the Y axis (or X axis), the rotation causes the inspection robot to translate horizontally along the X axis (or Y axis); that is, the pitch angle and roll angle produced by changes in the rotor speeds cause horizontal movement of the aircraft along the X and Y axes. Therefore, according to the relation between translation speed and tilt angle, a corresponding horizontal speed controller is added in front of the pitch angle controller and the roll angle controller in the self-balancing controller. The translation speed controller takes the translation speeds along the X axis and the Y axis as input and outputs the expected roll angle and expected pitch angle of the inspection robot at the next moment, so that the independent control inputs for rotation of the inspection robot around the X and Y axes are removed. By jointly controlling the horizontal movement speed and the tilt angle, the number of control parameters of the inspection robot is reduced, the operation steps are fewer, the operation efficiency is improved, and a quick control response is achieved.
The desired pitch angle command output by the horizontal velocity controller is obtained with the lateral control command as an input, and the desired roll angle command output by the horizontal velocity controller is obtained with the longitudinal control command as an input.
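A minimal sketch of such a horizontal (translation) speed controller, assuming a PD law and a small-angle relation between tilt and horizontal acceleration: it takes the lateral and longitudinal velocity commands together with the fed-back velocities and outputs the expected pitch and roll angles for the next moment. The gains, tilt limit and sign convention are assumptions, not taken from the patent.

```python
import numpy as np

class HorizontalSpeedController:
    """Turn horizontal velocity commands (Vx, Vy) into expected pitch/roll angles."""

    def __init__(self, kp=0.3, kd=0.05, max_tilt=np.radians(20.0)):
        self.kp, self.kd, self.max_tilt = kp, kd, max_tilt
        self.prev_err = np.zeros(2)

    def update(self, v_cmd, v_meas, dt):
        err = np.asarray(v_cmd, dtype=float) - np.asarray(v_meas, dtype=float)
        d_err = (err - self.prev_err) / dt
        self.prev_err = err
        # Convention assumed here: X-axis (lateral) velocity error sets the pitch
        # command, Y-axis (longitudinal) error sets the roll command
        pitch_des = self.kp * err[0] + self.kd * d_err[0]
        roll_des = -(self.kp * err[1] + self.kd * d_err[1])
        clip = lambda a: float(np.clip(a, -self.max_tilt, self.max_tilt))
        return clip(pitch_des), clip(roll_des)
```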
The yaw angle controller and the height controller in the self-balancing controller are also improved: the difference between the expected height at the next moment and the height fed back at the current moment is input into the height controller, and the difference between the expected yaw angle at the next moment and the yaw angle fed back at the current moment is input into the yaw angle controller. This weakens the yaw angle and height control inputs of the inspection robot, reduces the amplitude of its movement along the Z axis and rotation around the Z axis, improves the self-balancing control effect, and guarantees hovering stability.
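For the improved height and yaw channels, which take the difference between the expected value and the fed-back value as input, a simple PID per axis is one possible realization; the gains below are placeholders and the PID structure itself is an assumption.

```python
class AxisPID:
    """One PID channel of the self-balancing controller (height or yaw)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, error, dt):
        """error = expected value at the next moment - fed-back current value."""
        self.integral += error * dt
        derivative = (error - self.prev_err) / dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example wiring: the height controller outputs the height command T, the yaw
# controller outputs the rotation command tau_z about the Z axis.
height_controller = AxisPID(kp=2.0, ki=0.1, kd=0.8)
yaw_controller = AxisPID(kp=1.5, ki=0.0, kd=0.3)
```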
Fig. 3 is a schematic control principle diagram of an embodiment of the inspection robot hovering control device based on visual feedback, wherein,
the method comprises the steps that an onboard camera obtains video information of a hovering target, an expected image obtained during initial hovering and a target image obtained in any time period after hovering are intercepted from the video information of the hovering target, and processing is conducted; extracting a current characteristic vector S of a target from the processed target image, extracting an expected characteristic vector S from the expected image, comparing the current characteristic vector with the expected characteristic vector to obtain a characteristic error vector, and outputting an expected yaw angle control instruction Wz, an expected height control instruction Vz, a transverse control instruction Vx and a longitudinal control instruction Vy after being processed by a visual servo controller; and the transverse control instruction Vx and the longitudinal control instruction Vy are processed by a translation speed controller, and an expected roll angle instruction and an expected pitch angle instruction of the inspection robot at the next moment are output.
The expected pitch angle instruction is fused with the fed-back current pitch angle of the inspection robot and input into the pitch angle controller, which generates the speed τy of rotation around the Y axis at the next moment and sends it to the rotor speed calculation function for processing. The expected roll angle instruction is fused with the fed-back current roll angle and input into the roll angle controller, which generates the speed τx of rotation around the X axis at the next moment and sends it to the rotor speed calculation function. The expected yaw angle control instruction is fused with the fed-back current yaw angle and input into the yaw angle controller, which generates the speed τz of rotation around the Z axis at the next moment and sends it to the rotor speed calculation function. The expected height control instruction is fused with the fed-back current height m0 of the inspection robot and input into the height controller, which generates the height T at the next moment and sends it to the rotor speed calculation function for processing.
The rotor speed calculation function computes the rotating speeds m1, m2, m3 and m4 of the four rotors from the received τy, τx, τz and T, and inputs them into the inspection robot controller, thereby controlling the rotating speeds of the four rotors and adjusting the attitude of the inspection robot.
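The rotor speed calculation function is not detailed in the patent; the following sketch shows the usual mixing step for a plus-configuration quadrotor, distributing the thrust T and the three rotation commands τx, τy, τz to four rotor speeds m1-m4. The rotor layout, the thrust and drag coefficients k_f and k_m, the arm length and the square-root thrust model are all assumptions.

```python
import numpy as np

def rotor_speeds(T, tau_x, tau_y, tau_z, k_f=1e-5, k_m=1e-7, arm=0.2):
    """Plus configuration: rotor 1 on +X, rotor 2 on +Y, rotor 3 on -X, rotor 4 on -Y;
    rotors 1 and 3 spin counter-clockwise, rotors 2 and 4 clockwise."""
    # [T, tau_x, tau_y, tau_z]^T = M @ [w1^2, w2^2, w3^2, w4^2]^T
    M = np.array([
        [ k_f,         k_f,        k_f,         k_f       ],
        [ 0.0,         arm * k_f,  0.0,        -arm * k_f ],
        [-arm * k_f,   0.0,        arm * k_f,   0.0       ],
        [-k_m,         k_m,       -k_m,         k_m       ],
    ])
    w_squared = np.linalg.solve(M, np.array([T, tau_x, tau_y, tau_z]))
    w_squared = np.clip(w_squared, 0.0, None)    # rotor speeds cannot be negative
    return np.sqrt(w_squared)                    # m1, m2, m3, m4
```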
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A hovering control method of an inspection robot based on visual feedback is characterized by comprising the following steps:
acquiring a video information image of a target at a hovering position in real time through an airborne visual sensor, and processing the video information image;
extracting a current characteristic vector and an expected characteristic vector of a target at a hovering position from the processed video information image, and comparing the current characteristic vector with the expected characteristic vector to obtain a characteristic error vector;
inputting the characteristic error vector into an IBVS controller for operation, and processing output data to obtain an initial control instruction of the inspection robot;
the inspection robot acquires the characteristic information of the current state of the inspection robot, fuses the characteristic information of the current state and the initial control instruction and then uses the fused characteristic information as the input of a self-balancing controller to generate a final control instruction, wherein the final control instruction is used for correcting the hovering position;
and repeating the steps, and continuously correcting the hovering position of the inspection robot in real time according to the generated final control instruction.
2. The inspection robot hovering control method according to claim 1, wherein the video information image specifically includes:
a desired image obtained at initial hover and a target image obtained during any time period after hover.
3. The visual feedback-based hovering control method for an inspection robot according to claim 2, wherein the processing the video information image specifically includes:
acquiring a desired image of a target at hovering and a target image;
respectively carrying out fuzzy processing on the expected image and the target image by adopting Gaussian transformation;
performing graying processing on the desired image and the target image after the blurring processing respectively;
and performing threshold operation on the grayed expected image and the target image to respectively obtain corresponding binary images.
4. The visual feedback-based inspection robot hovering control method according to claim 1, wherein the extracting a current feature vector and an expected feature vector of a hovering target from the processed video information image specifically comprises:
carrying out contour detection on the processed video information image, and calculating a centroid coordinate of a target area;
the method comprises the steps of obtaining pixel coordinate values of a target centroid in a camera digital coordinate system, converting the pixel coordinate values into a camera real 2D image plane coordinate system according to internal parameters of the camera, and obtaining a current feature vector of a current target image and an expected feature vector of an expected image.
5. The visual feedback-based inspection robot hovering control method according to claim 1, wherein the outputting the data specifically includes:
a pitch control command, a roll control command, a desired yaw control command, a desired altitude control command, a lateral control command, and a longitudinal control command.
6. The inspection robot hovering control method according to claim 5, wherein the processing the output data to obtain the initial control instruction of the inspection robot includes:
and the transverse control instruction and the longitudinal control instruction are processed by a horizontal speed controller and then output an expected roll angle instruction and an expected pitch angle instruction, and the expected roll angle instruction and the expected pitch angle instruction are fused with the expected yaw angle control instruction and the expected altitude control instruction to form the initial control instruction.
7. The inspection robot hovering control method according to claim 1, wherein the characteristic information of the current state specifically includes:
yaw angle, pitch angle, roll angle, altitude information, and velocity estimates in three-dimensional space.
8. The inspection robot hovering control method according to claim 1, wherein the final control instruction specifically includes:
a difference between the desired yaw angle control command and the current yaw angle, a difference between the desired altitude control command and the current altitude, a desired roll angle command, and a desired pitch angle command.
9. A hovering control device for an inspection robot based on visual feedback, characterized by comprising:
the airborne sensor is used for acquiring the current yaw angle, pitch angle, roll angle and height information of the inspection robot, the speed estimation value of the three-dimensional space and the video information image of the hovering target in real time;
the image processing module is used for acquiring, from the video information image of the target at the hovering position, an expected image obtained at initial hovering and a target image obtained in any time period after hovering, and processing the expected image and the target image;
the visual servo controller is used for extracting a current feature vector of a target from the processed target image, extracting an expected feature vector from the expected image, comparing the current feature vector with the expected feature vector to obtain an input parameter feature error vector, and outputting a pitch angle control command, a roll angle control command, an expected yaw angle control command, an expected height control command, a transverse control command and a longitudinal control command based on the feature error vector;
the self-balancing controller is used for fusing the characteristic information of the current state of the inspection robot with a pitch angle control instruction, a roll angle control instruction, an expected yaw angle control instruction and an expected height control instruction to obtain a final control instruction;
and the inspection robot controller obtains the rotating speed of each rotor through a rotor rotating speed calculation function according to the final control instruction, controls each rotor to rotate according to the corresponding rotating speed, and corrects the hovering position of the inspection robot in real time.
10. The inspection robot hovering control device based on visual feedback according to claim 9, wherein the self-balancing controller comprises a yaw angle controller, a height controller, a pitch angle controller and a roll angle controller;
the pitch angle controller is used for generating the speed of rotating around the Y axis at the next moment by taking the expected pitch angle instruction output by the horizontal speed controller as input and sending the speed to the inspection robot controller;
the roll angle controller is used for generating the speed of rotating around the X axis at the next moment by taking the expected roll angle instruction output by the horizontal speed controller as input and sending the speed to the inspection robot controller;
the height controller is used for generating the height at the next moment by taking the difference value between the expected height control instruction and the current height as input, and sending the height to the inspection robot controller;
and the yaw angle controller is used for generating the speed of rotating around the Z axis at the next moment by taking the difference value between the expected yaw angle control command and the current yaw angle as input, and sending the speed to the inspection robot controller.
CN202110844341.7A 2021-07-26 2021-07-26 Vision feedback-based hovering control method and device for inspection robot Pending CN113485401A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110844341.7A CN113485401A (en) 2021-07-26 2021-07-26 Vision feedback-based hovering control method and device for inspection robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110844341.7A CN113485401A (en) 2021-07-26 2021-07-26 Vision feedback-based hovering control method and device for inspection robot

Publications (1)

Publication Number Publication Date
CN113485401A true CN113485401A (en) 2021-10-08

Family

ID=77942559

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110844341.7A Pending CN113485401A (en) 2021-07-26 2021-07-26 Vision feedback-based hovering control method and device for inspection robot

Country Status (1)

Country Link
CN (1) CN113485401A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114489102A (en) * 2022-01-19 2022-05-13 上海复亚智能科技有限公司 Self-inspection method and device for electric power tower, unmanned aerial vehicle and storage medium
CN114967458A (en) * 2022-05-30 2022-08-30 江南大学 Drive control system of micro-nano robot cluster

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108062104A (en) * 2016-11-09 2018-05-22 哈尔滨工大天才智能科技有限公司 A kind of flying robot's Hovering control system setup method
CN108062103A (en) * 2016-11-09 2018-05-22 哈尔滨工大天才智能科技有限公司 A kind of flying robot regards the vision feedback control method of object hovering
CN110362098A (en) * 2018-03-26 2019-10-22 北京京东尚科信息技术有限公司 Unmanned plane vision method of servo-controlling, device and unmanned plane
CN110488847A (en) * 2019-08-09 2019-11-22 中国科学院自动化研究所 The bionic underwater robot Hovering control mthods, systems and devices of visual servo
CN110675443A (en) * 2019-09-24 2020-01-10 西安科技大学 Coal briquette area detection method for underground coal conveying image
CN111368661A (en) * 2020-02-25 2020-07-03 华南理工大学 Finger vein image enhancement method based on image processing
CN112446889A (en) * 2020-07-01 2021-03-05 龚循安 Medical video reading method based on ultrasound
CN112462797A (en) * 2020-11-30 2021-03-09 深圳技术大学 Visual servo control method and system using grey prediction model

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
周磊: "Visual feedback control method for object-gazing hovering of a flying robot", China Master's Theses Full-text Database, Information Science and Technology, no. 2, pages 140-362 *

Similar Documents

Publication Publication Date Title
CN109784333B (en) Three-dimensional target detection method and system based on point cloud weighted channel characteristics
CN113485401A (en) Vision feedback-based hovering control method and device for inspection robot
CN108229587B (en) Autonomous transmission tower scanning method based on hovering state of aircraft
CN110163177B (en) Unmanned aerial vehicle automatic sensing and identifying method for wind turbine generator blades
CN111339893B (en) Pipeline detection system and method based on deep learning and unmanned aerial vehicle
CN110910350B (en) Nut loosening detection method for wind power tower cylinder
EP2713310A2 (en) System and method for detection and tracking of moving objects
CN112505065A (en) Method for detecting surface defects of large part by indoor unmanned aerial vehicle
CN114281093B (en) Defect detection system and method based on unmanned aerial vehicle power inspection
CN111998862B (en) BNN-based dense binocular SLAM method
CN112598637A (en) Automatic flight method for routing inspection of blades of wind turbine generator in blade area
Ohta et al. Image acquisition of power line transmission towers using UAV and deep learning technique for insulators localization and recognition
CN113327296A (en) Laser radar and camera online combined calibration method based on depth weighting
CN109358640B (en) Real-time visualization method and system for laser detection of unmanned aerial vehicle and storage medium
CN114581632A (en) Method, equipment and device for detecting assembly error of part based on augmented reality technology
CN115355952B (en) Intelligent inspection system for crude oil storage tank
CN116149193A (en) Anti-disturbance control method and system for rotor unmanned aerial vehicle based on vision
CN113639643B (en) Crop seedling stage height detection method based on RGB-D depth camera
CN113206951B (en) Real-time electronic image stabilization method based on flapping wing flight system
CN110992286B (en) Photovoltaic module image correction method based on CCD camera
CN110618696B (en) Air-ground integrated surveying and mapping unmanned aerial vehicle
Maurya et al. Vision-based fractional order sliding mode control for autonomous vehicle tracking by a quadrotor uav
CN113467503B (en) Stability enhancement control method and device for power transmission line inspection robot
CN113433958A (en) Unmanned aerial vehicle inspection method and device
CN112517473A (en) Photovoltaic cleaning robot stable operation method and system based on artificial intelligence

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination