CN110850872A - Robot inspection method and device, computer readable storage medium and robot - Google Patents

Robot inspection method and device, computer readable storage medium and robot

Info

Publication number
CN110850872A
CN110850872A
Authority
CN
China
Prior art keywords
robot
inspection
image
distance
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911049771.9A
Other languages
Chinese (zh)
Inventor
黄高波
柴黎林
熊友军
Current Assignee
Shenzhen Ubtech Technology Co ltd
Original Assignee
Shenzhen Ubtech Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Ubtech Technology Co ltd filed Critical Shenzhen Ubtech Technology Co ltd
Priority to CN201911049771.9A
Publication of CN110850872A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: specially adapted to land vehicles
    • G05D1/0212: with means for defining a desired trajectory
    • G05D1/0214: in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221: involving a learning process
    • G05D1/0231: using optical position detecting means
    • G05D1/0238: using obstacle or wall sensors
    • G05D1/024: in combination with a laser
    • G05D1/0246: using a video camera in combination with image processing means
    • G05D1/0253: extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D1/0257: using a radar
    • G05D1/0276: using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Manipulator (AREA)

Abstract

The application belongs to the technical field of computers, and particularly relates to a robot inspection method and device, a computer readable storage medium and a robot. The method comprises the steps of: obtaining position information of a target inspection point, and controlling a robot to move to the target inspection point according to the position information; shooting an inspection target object according to preset shooting parameters to obtain an inspection image of the inspection target object; and adjusting the orientation and/or position of the robot according to the inspection image and a preset reference image until the difference between the inspection image and the reference image is within a preset precision range. Because the reference image is set in advance, it provides a reliable basis for judging the accuracy of the inspection image, and the orientation and/or position of the robot can be adjusted according to the inspection image and the reference image until the final inspection image meets the requirements of image recognition, improving the accuracy of the inspection result.

Description

Robot inspection method and device, computer readable storage medium and robot
Technical Field
The application belongs to the technical field of computers, and particularly relates to a robot inspection method and device, a computer readable storage medium and a robot.
Background
In recent years, with the development of robotics, indoor inspection robots for power distribution rooms and telecom IDC machine rooms have appeared. The main function of an indoor inspection robot is to move around the machine room and check whether each instrument and device is working normally. The principal way the robot makes this judgment is to shoot images and recognize whether the meter readings in them are within the specified normal range and whether the equipment indicator lights in them are normal. It can be seen that accurate image recognition is the key to successful inspection.
To ensure accurate image recognition, the recognition process and algorithm are very important, but so is the correctness of the shooting itself: the position and orientation from which the robot shoots during inspection must match those used at deployment, or deviate from them as little as possible. Currently, this position and orientation is mainly guaranteed by lidar SLAM. However, lidar SLAM is easily interfered with by environmental factors, so the positioning may drift slightly, sometimes by about 7 cm, and such a deviation can make the accuracy of the inspection result extremely low.
Disclosure of Invention
In view of this, embodiments of the present application provide a robot inspection method, an apparatus, a computer-readable storage medium, and a robot, so as to solve the problem that the accuracy of an inspection result is extremely low due to a deviation of an inspection shot image in an existing robot inspection method.
A first aspect of an embodiment of the present application provides a robot inspection method, which may include:
acquiring position information of a target inspection point, and controlling the robot to move to the target inspection point according to the position information;
shooting an inspection target object according to preset shooting parameters to obtain an inspection image of the inspection target object;
and adjusting the orientation and/or position of the robot according to the inspection image and a preset reference image until the difference between the inspection image and the reference image is within a preset precision range.
Further, the adjusting the orientation and/or position of the robot according to the inspection image and a preset reference image includes:
comparing the inspection image with the reference image, and judging whether the orientation of the robot is within a preset orientation range;
and if the orientation of the robot is not in the orientation range, adjusting the orientation of the robot, and returning to execute the step of shooting the inspection target object according to preset shooting parameters.
Further, the robot inspection method further comprises the following steps:
if the orientation of the robot is within the orientation range, judging whether the position of the robot is within a preset position range;
and if the position of the robot is not in the position range, adjusting the position of the robot, and returning to execute the step of shooting the inspection target object according to preset shooting parameters.
Further, the robot inspection method further comprises the following steps:
if the position of the robot is within the position range, judging whether the distance between the robot and the inspection target object is within a preset distance range;
and if the distance between the robot and the inspection target is not within the distance range, adjusting the distance between the robot and the inspection target, and returning to execute the step of shooting the inspection target according to preset shooting parameters.
Further, the adjusting the distance between the robot and the inspection target includes:
if the distance between the robot and the inspection target object is larger than a first distance value, controlling the robot to adjust its advancing direction so that it faces the inspection target object, wherein the first distance value is the upper limit value of the distance range;
and controlling the robot to advance for a preset distance.
Further, the adjusting the distance between the robot and the inspection target includes:
if the distance between the robot and the inspection target object is smaller than a second distance value, controlling the robot to adjust its advancing direction so that it faces away from the inspection target object, wherein the second distance value is the lower limit value of the distance range;
and controlling the robot to advance for a preset distance.
Further, the robot inspection method further comprises the following steps:
and if the distance between the robot and the inspection target object is within the distance range, determining that the difference between the inspection image and the reference image is within the precision range, and determining the current inspection image as a target inspection image.
A second aspect of the embodiments of the present application provides a robot inspection device, which may include:
the inspection mobile module is used for acquiring the position information of a target inspection point and controlling the robot to move to the target inspection point according to the position information;
the inspection image shooting module is used for shooting an inspection target object according to preset shooting parameters to obtain an inspection image of the inspection target object;
and the shooting adjusting module is used for adjusting the orientation and/or position of the robot according to the inspection image and a preset reference image until the difference between the inspection image and the reference image is within a preset precision range.
Further, the photographing adjusting module may include:
the orientation judging unit is used for comparing the inspection image with the reference image and judging whether the orientation of the robot is in a preset orientation range or not;
and the orientation adjusting unit is used for adjusting the orientation of the robot if the orientation of the robot is not in the orientation range, and returning to execute the step of shooting the inspection target object according to preset shooting parameters.
Further, the shooting adjustment module may further include:
the position judging unit is used for judging whether the position of the robot is in a preset position range or not if the orientation of the robot is in the orientation range;
and the position adjusting unit is used for adjusting the position of the robot if the position of the robot is not in the position range, and returning to execute the step of shooting the inspection target object according to the preset shooting parameters.
Further, the shooting adjustment module may further include:
the distance judging unit is used for judging whether the distance between the robot and the inspection target object is within a preset distance range or not if the position of the robot is within the position range;
and the distance adjusting unit is used for adjusting the distance between the robot and the inspection target object if the distance between the robot and the inspection target object is not within the distance range, and returning to execute the step of shooting the inspection target object according to preset shooting parameters.
Further, the distance adjusting unit may include:
a first adjusting subunit, configured to control the robot to adjust an advancing direction to keep the advancing direction facing the inspection target if a distance between the robot and the inspection target is greater than a first distance value, where the first distance value is an upper limit value of the distance range;
and the first advancing subunit is used for controlling the robot to advance for a preset distance.
Further, the distance adjusting unit may further include:
a second adjusting subunit, configured to control the robot to adjust an advancing direction to keep the advancing direction facing away from the inspection target if a distance between the robot and the inspection target is smaller than a second distance value, where the second distance value is a lower limit value of the distance range;
and the second advancing subunit is used for controlling the robot to advance for a preset distance.
Further, the shooting adjustment module may further include:
and the inspection image determining unit is used for determining that the difference between the inspection image and the reference image is within the precision range and determining the current inspection image as a target inspection image if the distance between the robot and the inspection target object is within the distance range.
A third aspect of embodiments of the present application provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of any of the robot inspection methods described above.
A fourth aspect of the embodiments of the present application provides a robot, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of any one of the robot inspection methods when executing the computer program.
A fifth aspect of embodiments of the present application provides a computer program product, which, when run on a robot, causes the robot to perform the steps of any of the robot inspection methods described above.
Compared with the prior art, the embodiments of the present application have the following advantages: the method comprises obtaining position information of a target inspection point and controlling the robot to move to the target inspection point according to the position information; shooting an inspection target object according to preset shooting parameters to obtain an inspection image of the inspection target object; and adjusting the orientation and/or position of the robot according to the inspection image and a preset reference image until the difference between the inspection image and the reference image is within a preset precision range. Because the reference image is set in advance, it provides a reliable basis for judging the accuracy of the inspection image, and the orientation and/or position of the robot can be adjusted according to the inspection image and the reference image until the final inspection image meets the requirements of image recognition, improving the accuracy of the inspection result.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained by those skilled in the art based on these drawings without inventive effort.
Fig. 1 is a flowchart of an embodiment of a robot inspection method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a reference image;
FIG. 3 is a schematic diagram of an inspection image with an obvious characteristic trapezoidal difference;
FIG. 4 is a schematic diagram of an inspection image with an obvious front-and-back offset;
FIG. 5 is a schematic diagram of an inspection image with image features significantly magnified;
fig. 6 is a structural diagram of an embodiment of a robot inspection device in an embodiment of the present application;
FIG. 7 is a schematic block diagram of a robot in an embodiment of the present application;
fig. 8 is a schematic structural diagram of a robot in an embodiment of the present application.
Detailed Description
In order to make the objects, features and advantages of the present application more apparent and understandable, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the embodiments described below are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present application.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In addition, in the description of the present application, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, an embodiment of a robot inspection method in an embodiment of the present application may include:
and S101, acquiring the position information of a target inspection point, and controlling the robot to move to the target inspection point according to the position information.
The position information of the target inspection point can be preset during an inspection deployment stage. Specifically, in the deployment stage, the robot can be controlled to move to a designated inspection shooting area, and its position and shooting parameters are adjusted until the robot's camera directly faces the preset inspection target object and all the content to be shot is within the image. At this point, the currently shot image is stored as a reference image to serve as the reference in subsequent inspection shooting; the robot's current position information is saved as the position information of the target inspection point; and the camera's current shooting parameters are saved, also to serve as the reference in subsequent inspection shooting. The position information comprises coordinates (x, y) and orientation (theta); the shooting parameters include, but are not limited to, a shooting angle, a focus parameter (focus) and a zoom parameter (zoom), and if the camera is mounted on a pan-tilt, they may further include angle parameters such as the pitch angle (pitch) and yaw angle (yaw) of the pan-tilt. Preferably, the pan-tilt can rotate 360 degrees.
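The deployment-stage record described above (reference image, pose, shooting parameters) could be grouped per inspection point roughly as follows. Every field name in this sketch is illustrative, not taken from this application:

```python
from dataclasses import dataclass

@dataclass
class InspectionPoint:
    """Data saved at the deployment stage for one inspection point
    (field names are illustrative assumptions, not from the patent)."""
    x: float                   # target point coordinates on the map
    y: float
    theta: float               # robot orientation at the point (radians)
    focus: float               # camera focus parameter
    zoom: float                # camera zoom parameter
    pitch: float = 0.0         # pan-tilt pitch angle, if a gimbal is fitted
    yaw: float = 0.0           # pan-tilt yaw angle
    reference_image: str = ""  # path to the stored reference image

# One record per inspection point, reloaded when the inspection task runs.
point = InspectionPoint(x=3.2, y=1.5, theta=1.57, focus=0.8, zoom=2.0,
                        reference_image="point_01_ref.png")
```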
When the inspection task needs to be executed, the position information of the target inspection point, which is stored in advance, can be acquired, and the robot is controlled to move to the target inspection point according to the position information.
Specifically, a preset electronic map may be obtained first, and a starting position of the robot in the electronic map may be determined by positioning. The electronic map can be pre-stored in a storage medium of the robot, or can be acquired by the robot from a preset inspection management terminal, and after the electronic map is imported, the robot can determine the starting position of the robot in the electronic map through positioning. The specific positioning method may be any one of the positioning methods commonly used in the prior art, which is not specifically limited in the embodiment of the present application.
And then, global path planning can be carried out to obtain an optimal global path from the initial position to the target inspection point, after the optimal global path is obtained, the robot is controlled to move according to the optimal global path, and obstacle avoidance processing and local path planning are carried out in the moving process to avoid obstacles on the optimal global path and finally reach the target inspection point. After the target patrol point is reached, the orientation of the robot is adjusted according to the position information, and therefore the staged navigation task is completed.
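The application does not specify a particular planner. Purely as an illustrative stand-in for the global path planning mentioned above, a shortest path on an occupancy grid can be found with breadth-first search:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 = free, 1 = obstacle).
    Returns a list of (row, col) cells from start to goal, or None if blocked.
    Illustrative only; any standard global planner would serve here."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}            # doubles as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:            # reconstruct by walking predecessors back
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

Obstacle avoidance and local re-planning during motion, as described above, would run on top of such a global path.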
Step S102, shooting the inspection target object according to preset shooting parameters to obtain an inspection image of the inspection target object.
Because the position information and shooting parameters at this time are consistent with those of the inspection deployment stage, the image shot now (i.e., the inspection image) should in theory be basically consistent with the reference image, with all the content to be shot within the inspection image. In practice, however, due to lidar positioning deviation, the inspection image may differ considerably from the reference image, and part of the content to be shot may be missing from it.
And S103, adjusting the orientation and/or position of the robot according to the patrol inspection image and a preset reference image until the difference between the patrol inspection image and the reference image is within a preset precision range.
Specifically, the inspection image and the reference image may be compared to determine whether the orientation of the robot is within a preset orientation range.
Fig. 2 is a schematic diagram of the reference image, and it should be noted that the frame lines in the diagram are not actual images, but are merely used for facilitating the schematic description of the embodiments of the present application, and represent image features extracted from the images by the vision algorithm. In this embodiment of the application, the image features of the inspection image and the image features of the reference image may be compared, and if there are significant characteristic trapezoid differences (i.e., greater than a preset trapezoid difference threshold), it is indicated that the orientation of the robot is not in the orientation range, otherwise, if there are no significant characteristic trapezoid differences (i.e., less than or equal to the trapezoid difference threshold), it is indicated that the orientation of the robot is in the orientation range.
For example, if the patrol inspection image is as shown in fig. 3, the image features of the patrol inspection image and the image features of the reference image have a significant characteristic trapezoidal difference, and in this case, it may be determined that the orientation of the robot is not within the orientation range; if the patrol image is also as shown in fig. 2, the image features of the patrol image and the image features of the reference image do not have obvious feature trapezoidal differences, and in this case, it can be determined that the orientation of the robot is within the orientation range.
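As an illustrative sketch of the trapezoid check described above: when the robot's orientation is off, a rectangular feature projects as a trapezoid whose left and right edges differ in height. The quadrilateral representation of the extracted image feature and the threshold value here are assumptions, not specified by this application:

```python
def trapezoid_difference(quad):
    """quad: four (x, y) corners of the extracted image feature,
    ordered top-left, top-right, bottom-right, bottom-left.
    Returns the relative height difference of the left vs right edges."""
    (tlx, tly), (trx, try_), (brx, bry), (blx, bly) = quad
    left = abs(bly - tly)
    right = abs(bry - try_)
    return abs(left - right) / max(left, right)

def orientation_in_range(inspect_quad, ref_quad, threshold=0.05):
    """Orientation is acceptable when the inspection image shows no
    significantly larger trapezoid distortion than the reference."""
    return abs(trapezoid_difference(inspect_quad)
               - trapezoid_difference(ref_quad)) <= threshold
```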
If the orientation of the robot is not in the orientation range, the orientation of the robot is adjusted, and the step S102 and the subsequent steps are returned to, that is, the inspection target object is photographed again according to the preset photographing parameters to obtain a new inspection image, and the image is compared with the reference image again.
If the orientation of the robot is in the orientation range, the inspection image is continuously compared with the reference image, and whether the position of the robot is in a preset position range or not is judged.
In this embodiment of the application, the image feature of the inspection image and the image feature of the reference image may be compared, and if there is a significant front-back deviation (i.e., greater than a preset deviation threshold value), it indicates that the position of the robot is not in the position range, otherwise, if there is no significant front-back deviation (i.e., less than or equal to the deviation threshold value), it indicates that the position of the robot is in the position range.
For example, if the inspection image is as shown in fig. 4, the image features of the inspection image and the image features of the reference image have obvious front-back offsets, and at this time, it can be determined that the position of the robot is not within the position range; if the inspection image is also as shown in fig. 2, the image features of the inspection image and the image features of the reference image do not have obvious front-back offset, and at this time, it can be determined that the position of the robot is within the position range.
And if the position of the robot is not in the position range, adjusting the position of the robot. For example, it may be assumed that the inspection target is located at the right side of the advancing direction of the robot, and the camera is also directed toward the right side of the robot, and at this time, if the image feature of the inspection image is significantly shifted forward compared to the image feature of the reference image, the robot may be controlled to retreat by a preset distance, and if the image feature of the inspection image is significantly shifted backward compared to the image feature of the reference image, the robot may be controlled to advance by a preset distance.
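The front-back offset check and the resulting advance/retreat decision could be sketched as follows. The pixel threshold, step size, and sign convention (inspection target on the robot's right, as in the example above) are all assumptions:

```python
def horizontal_offset(inspect_quad, ref_quad):
    """Signed horizontal shift (pixels) of the feature centroid
    between the inspection image and the reference image."""
    cx = lambda q: sum(x for x, _ in q) / len(q)
    return cx(inspect_quad) - cx(ref_quad)

def position_adjustment(offset_px, threshold_px=20, step_m=0.05):
    """Map the image offset to a forward (+) / backward (-) move in metres.
    Sign convention assumes the target sits to the robot's right, so a
    forward-shifted feature means the robot overshot and should retreat."""
    if abs(offset_px) <= threshold_px:
        return 0.0                                 # within the position range
    return -step_m if offset_px > 0 else step_m    # retreat or advance
```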
After the position of the robot is adjusted, the process may return to step S102 and the subsequent steps, that is, the inspection target object is photographed again according to the preset photographing parameters to obtain a new inspection image, and the inspection image is compared with the reference image again.
And if the position of the robot is in the position range, judging whether the distance between the robot and the inspection target object is in a preset distance range.
In this embodiment of the application, the image feature of the inspection image and the image feature of the reference image may be compared, and if there is an obvious distance difference (that is, greater than a preset distance threshold), it is indicated that the position of the robot is not within the distance range, otherwise, if there is no obvious distance difference (that is, less than or equal to the distance threshold), it is indicated that the position of the robot is within the distance range.
For example, if the inspection image is as shown in fig. 5, the image features of the inspection image are significantly enlarged compared with those of the reference image, that is, the robot is too close to the inspection target; there is a significant distance difference between the two sets of image features, and it can be determined that the distance between the robot and the inspection target is not within the distance range. Similarly, if the image features of the inspection image are significantly reduced, the robot is too far from the inspection target, there is again a significant distance difference, and the distance is likewise not within the distance range. If the inspection image is instead as shown in fig. 2, there is no significant distance difference between the image features, and it can be determined that the distance between the robot and the inspection target is within the distance range.
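The enlarged/reduced judgment can be sketched as a ratio test on the apparent size of the target's image feature. This is an illustrative sketch only; the 15% ratio threshold stands in for the preset distance threshold and is an assumption.

```python
def distance_status(ref_span, insp_span, ratio_threshold=0.15):
    """Classify the robot-target distance from the apparent size of the
    target's image feature (e.g. the pixel span of its bounding box).

    An enlarged feature means the robot is too close (as in fig. 5); a
    shrunken feature means it is too far; otherwise the distance is
    within the preset range.
    """
    ratio = insp_span / ref_span
    if ratio > 1.0 + ratio_threshold:
        return "too_close"   # feature significantly enlarged
    if ratio < 1.0 - ratio_threshold:
        return "too_far"     # feature significantly reduced
    return "in_range"
```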
If the distance between the robot and the inspection target is not within the distance range, the distance between the robot and the inspection target is adjusted. Specifically, if the image features of the inspection image are significantly reduced compared with those of the reference image, that is, the distance between the robot and the inspection target is greater than a first distance value (the upper limit of the distance range), the robot may be controlled to adjust its advancing direction (for example, rotating 90 degrees clockwise) so that it faces the inspection target, and then to advance by a preset distance, thereby shortening the distance between the robot and the inspection target.
If the image features of the inspection image are significantly enlarged compared with those of the reference image, that is, the distance between the robot and the inspection target is smaller than a second distance value (the lower limit of the distance range), the robot may be controlled to adjust its advancing direction (for example, rotating 90 degrees counterclockwise) so that it faces away from the inspection target, and then to advance by a preset distance, thereby increasing the distance between the robot and the inspection target.
After the distance between the robot and the inspection target is adjusted, the process may return to step S102 and the subsequent steps, that is, the inspection target is photographed again according to the preset photographing parameters to obtain a new inspection image, and the image is compared with the reference image again.
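The two adjustment branches above can be summarized as a small mapping from the distance classification to chassis commands. The command tuples and the 90-degree example rotations are illustrative assumptions, not a real chassis API.

```python
def distance_adjustment(status, step_m=0.1):
    """Map the distance classification to chassis commands, following the
    two branches described in the text."""
    if status == "too_far":
        # Turn to face the target (e.g. 90 degrees clockwise), then
        # advance a preset distance to shorten the robot-target distance.
        return [("rotate_deg", 90), ("advance_m", step_m)]
    if status == "too_close":
        # Turn to face away from the target (e.g. 90 degrees counter-
        # clockwise), then advance to enlarge the distance.
        return [("rotate_deg", -90), ("advance_m", step_m)]
    return []  # in range: no adjustment needed
```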
If the distance between the robot and the inspection target is within the distance range, it can be determined that the difference between the inspection image and the reference image is within the accuracy range, and the current inspection image can be determined as the target inspection image. The robot can then perform the corresponding inspection operation on the target inspection image, for example running image recognition on it to check whether the instruments of the inspection target are operating normally, whether there is fire alarm information, and so on.
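Putting the checks together, the method is a shoot-compare-adjust loop: after every adjustment the target is photographed again (step S102) and re-compared with the reference until all checks pass. The sketch below captures that control flow; the callables stand in for the camera, the visual comparison, and the chassis control, and the round limit is an assumption for safety.

```python
def acquire_target_image(shoot, checks, adjust, max_rounds=10):
    """Run the shoot-compare-adjust loop until the inspection image
    matches the reference within the preset accuracy range.

    `checks` is an ordered list of (name, predicate) pairs, e.g.
    orientation, then position, then distance, mirroring the order of
    the judgments in the text.
    """
    for _ in range(max_rounds):
        image = shoot()
        for name, check in checks:
            if not check(image):
                adjust(name)   # e.g. "orientation", "position", "distance"
                break          # re-shoot and re-compare from the start
        else:
            return image       # all checks passed: target inspection image
    return None                # give up after too many rounds
```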
To sum up, the embodiment of the present application obtains the position information of the target inspection point and controls the robot to move to the target inspection point according to the position information; photographs the inspection target according to preset shooting parameters to obtain an inspection image of the inspection target; and adjusts the orientation and/or position of the robot according to the inspection image and a preset reference image until the difference between the inspection image and the reference image is within a preset accuracy range. In this embodiment, a reference image is set in advance, which provides a reliable basis for judging the accuracy of the inspection image, so that the orientation and/or position of the robot can be adjusted according to the inspection image and the reference image. The inspection image finally captured can therefore meet the requirements of image recognition, improving the accuracy of the inspection results.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Corresponding to the robot inspection method in the foregoing embodiments, fig. 6 shows a structure diagram of an embodiment of a robot inspection device according to an embodiment of the present disclosure.
In this embodiment, a robot inspection device can include:
the inspection moving module 601 is used for acquiring position information of a target inspection point and controlling the robot to move to the target inspection point according to the position information;
an inspection image shooting module 602, configured to shoot an inspection target according to preset shooting parameters, to obtain an inspection image of the inspection target;
a shooting adjustment module 603, configured to adjust the orientation and/or position of the robot according to the inspection image and a preset reference image until a difference between the inspection image and the reference image is within a preset accuracy range.
Further, the photographing adjusting module may include:
the orientation judging unit is used for comparing the inspection image with the reference image and judging whether the orientation of the robot is in a preset orientation range or not;
and the orientation adjusting unit is used for adjusting the orientation of the robot if the orientation of the robot is not in the orientation range, and returning to execute the step of shooting the inspection target object according to preset shooting parameters.
Further, the shooting adjustment module may further include:
the position judging unit is used for judging whether the position of the robot is in a preset position range or not if the orientation of the robot is in the orientation range;
and the position adjusting unit is used for adjusting the position of the robot if the position of the robot is not in the position range, and returning to execute the step of shooting the inspection target object according to the preset shooting parameters.
Further, the shooting adjustment module may further include:
the distance judging unit is used for judging whether the distance between the robot and the inspection target object is within a preset distance range or not if the position of the robot is within the position range;
and the distance adjusting unit is used for adjusting the distance between the robot and the inspection target object if the distance between the robot and the inspection target object is not within the distance range, and returning to execute the step of shooting the inspection target object according to preset shooting parameters.
Further, the distance adjusting unit may include:
a first adjusting subunit, configured to control the robot to adjust an advancing direction to keep the advancing direction facing the inspection target if a distance between the robot and the inspection target is greater than a first distance value, where the first distance value is an upper limit value of the distance range;
and the first advancing subunit is used for controlling the robot to advance for a preset distance.
Further, the distance adjusting unit may further include:
a second adjusting subunit, configured to control the robot to adjust an advancing direction to keep the advancing direction facing away from the inspection target if a distance between the robot and the inspection target is smaller than a second distance value, where the second distance value is a lower limit value of the distance range;
and the second advancing subunit is used for controlling the robot to advance for a preset distance.
Further, the shooting adjustment module may further include:
and the inspection image determining unit is used for determining that the difference between the inspection image and the reference image is within the precision range and determining the current inspection image as a target inspection image if the distance between the robot and the inspection target object is within the distance range.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses, modules and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Fig. 7 is a schematic structural diagram of a robot according to an embodiment of the present application, and only a part related to the embodiment of the present application is shown for convenience of description.
As shown in fig. 7, the robot 7 of this embodiment includes: a processor 70, a memory 71 and a computer program 72 stored in said memory 71 and executable on said processor 70. The processor 70, when executing the computer program 72, implements the steps in the various robot inspection method embodiments described above, such as the steps S101 through S103 shown in fig. 1. Alternatively, the processor 70, when executing the computer program 72, implements the functions of each module/unit in the above-mentioned device embodiments, for example, the functions of the modules 601 to 603 shown in fig. 6.
Illustratively, the computer program 72 may be partitioned into one or more modules/units that are stored in the memory 71 and executed by the processor 70 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 72 in the robot 7.
The Processor 70 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 71 may be an internal storage unit of the robot 7, such as a hard disk or a memory of the robot 7. The memory 71 may also be an external storage device of the robot 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the robot 7. Further, the memory 71 may also include both an internal storage unit and an external storage device of the robot 7. The memory 71 is used for storing the computer program and other programs and data required by the robot 7. The memory 71 may also be used to temporarily store data that has been output or is to be output.
It will be appreciated by those skilled in the art that fig. 7 is merely an example of the robot 7, and does not constitute a limitation of the robot 7, and may include more or less components than those shown, or combine some components, or different components, for example, the robot 7 may further include input and output devices, network access devices, buses, etc.
As shown in fig. 8, the robot may further include a mobile chassis, a camera, a radar sensor, a map module, a radar SLAM navigation module, a non-map local movement module, a visual algorithm module, and an inspection movement control module.
The mobile chassis is the device by which the robot moves; navigation movement and non-map local movement are carried out through this module so that the robot can move to a specified position.
The radar SLAM navigation module is an important movement control module of the robot. It detects and analyzes the robot's current position in the map mainly through radar data, and plans a movement path to the target position according to the target position and the obstacles in the map. Because the robot's positioning in the map has certain errors, the final navigation position may also have errors; if the error is large, the position needs to be corrected through visual comparison.
The inspection movement control module is the core control unit of the robot. It first controls the robot to reach the target inspection point through the radar SLAM navigation module. After navigation is finished, it judges visually whether the robot's orientation and position are within the precision range of the pre-deployed point. If they are outside the precision range, a picture is shot with the specified camera, the direction and/or position to be adjusted are analyzed by the visual algorithm module, and the inspection movement control module controls the mobile chassis to move through the non-map local movement module.
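The coordination between these modules might be wired up as below. This is a sketch only: the module names come from the text, but every method name and the correction representation are assumed interfaces.

```python
class InspectionMoveController:
    """Sketch of the inspection movement control module coordinating the
    other modules named in the text (interfaces are assumptions)."""

    def __init__(self, navigator, camera, vision, chassis):
        self.navigator = navigator  # radar SLAM navigation module
        self.camera = camera
        self.vision = vision        # visual algorithm module
        self.chassis = chassis      # non-map local movement module

    def inspect_point(self, point, max_corrections=5):
        # Coarse approach by radar SLAM navigation, then fine visual
        # correction of the residual navigation error.
        self.navigator.navigate_to(point)
        for _ in range(max_corrections):
            image = self.camera.shoot()
            correction = self.vision.required_adjustment(image)
            if correction is None:  # within the precision range
                return image
            self.chassis.local_move(correction)
        return None
```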
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/robot and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/robot are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and can realize the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, etc. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A robot inspection method is characterized by comprising the following steps:
acquiring position information of a target inspection point, and controlling the robot to move to the target inspection point according to the position information;
shooting an inspection target object according to preset shooting parameters to obtain an inspection image of the inspection target object;
and adjusting the orientation and/or position of the robot according to the patrol inspection image and a preset reference image until the difference between the patrol inspection image and the reference image is within a preset precision range.
2. The robot inspection method according to claim 1, wherein the adjusting the orientation and/or position of the robot according to the inspection image and a preset reference image includes:
comparing the inspection image with the reference image, and judging whether the orientation of the robot is within a preset orientation range;
and if the orientation of the robot is not in the orientation range, adjusting the orientation of the robot, and returning to execute the step of shooting the inspection target object according to preset shooting parameters.
3. The robot inspection method according to claim 2, further comprising:
if the orientation of the robot is within the orientation range, judging whether the position of the robot is within a preset position range;
and if the position of the robot is not in the position range, adjusting the position of the robot, and returning to execute the step of shooting the inspection target object according to preset shooting parameters.
4. The robot inspection method according to claim 3, further comprising:
if the position of the robot is within the position range, judging whether the distance between the robot and the inspection target object is within a preset distance range;
and if the distance between the robot and the inspection target is not within the distance range, adjusting the distance between the robot and the inspection target, and returning to execute the step of shooting the inspection target according to preset shooting parameters.
5. The robot inspection method according to claim 4, wherein the adjusting the distance between the robot and the inspection target includes:
if the distance between the robot and the inspection target object is larger than a first distance value, controlling the robot to adjust the advancing direction so as to keep the advancing direction facing the inspection target object, wherein the first distance value is an upper limit value of the distance range;
and controlling the robot to advance for a preset distance.
6. The robot inspection method according to claim 4, wherein the adjusting the distance between the robot and the inspection target includes:
if the distance between the robot and the inspection target object is smaller than a second distance value, controlling the robot to adjust the advancing direction so as to keep the advancing direction back to the inspection target object, wherein the second distance value is the lower limit value of the distance range;
and controlling the robot to advance for a preset distance.
7. The robot inspection method according to claim 4, further comprising:
and if the distance between the robot and the inspection target object is within the distance range, determining that the difference between the inspection image and the reference image is within the precision range, and determining the current inspection image as a target inspection image.
8. A robot inspection device, comprising:
the inspection mobile module is used for acquiring the position information of a target inspection point and controlling the robot to move to the target inspection point according to the position information;
the inspection image shooting module is used for shooting an inspection target object according to preset shooting parameters to obtain an inspection image of the inspection target object;
and the shooting adjusting module is used for adjusting the orientation and/or position of the robot according to the patrol inspection image and a preset reference image until the difference between the patrol inspection image and the reference image is within a preset precision range.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the robot inspection method according to any one of claims 1 to 7.
10. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the robot inspection method according to any one of claims 1 to 7 when executing the computer program.
CN201911049771.9A 2019-10-31 2019-10-31 Robot inspection method and device, computer readable storage medium and robot Pending CN110850872A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911049771.9A CN110850872A (en) 2019-10-31 2019-10-31 Robot inspection method and device, computer readable storage medium and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911049771.9A CN110850872A (en) 2019-10-31 2019-10-31 Robot inspection method and device, computer readable storage medium and robot

Publications (1)

Publication Number Publication Date
CN110850872A true CN110850872A (en) 2020-02-28

Family

ID=69599169

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911049771.9A Pending CN110850872A (en) 2019-10-31 2019-10-31 Robot inspection method and device, computer readable storage medium and robot

Country Status (1)

Country Link
CN (1) CN110850872A (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111399517A (en) * 2020-03-31 2020-07-10 中通服创立信息科技有限责任公司 Track type inspection robot following monitoring method based on UWB positioning system
CN111579119A (en) * 2020-04-22 2020-08-25 深圳市优必选科技股份有限公司 Temperature measurement method and device, computer readable storage medium and robot
CN111899373A (en) * 2020-08-05 2020-11-06 中国工商银行股份有限公司 Method and device for determining inspection point of machine room, robot and storage medium
CN111932609A (en) * 2020-07-08 2020-11-13 广州科易光电技术有限公司 Cloud deck calibration method and device for valve hall equipment inspection robot and storage medium
CN111948684A (en) * 2020-08-21 2020-11-17 广东电网有限责任公司 Distribution network obstacle inspection system and method based on differential positioning
CN112016820A (en) * 2020-08-17 2020-12-01 广东电网有限责任公司广州供电局 Patrol robot scheduling method, system and device and computer equipment
CN112033543A (en) * 2020-07-21 2020-12-04 深圳市优必选科技股份有限公司 Blackbody alignment method and device, robot and computer readable storage medium
CN112067134A (en) * 2020-07-31 2020-12-11 深圳市优必选科技股份有限公司 Temperature detection method, device, terminal and storage medium
CN112104842A (en) * 2020-09-15 2020-12-18 上海思源弘瑞自动化有限公司 Image acquisition equipment correction method
CN112585554A (en) * 2020-03-27 2021-03-30 深圳市大疆创新科技有限公司 Unmanned aerial vehicle inspection method and device and unmanned aerial vehicle
CN112669487A (en) * 2020-12-21 2021-04-16 北京佳讯飞鸿电气股份有限公司 Target tracking method and inspection robot
CN112792798A (en) * 2020-12-29 2021-05-14 重庆电子工程职业学院 Track robot inspection positioning device, method and equipment and readable storage medium
CN113301306A (en) * 2021-05-24 2021-08-24 中国工商银行股份有限公司 Intelligent inspection method and system
CN113433560A (en) * 2021-06-25 2021-09-24 北京铁道工程机电技术研究所股份有限公司 Positioning method and device for side inspection of robot, electronic equipment and medium
CN113500605A (en) * 2021-09-13 2021-10-15 中科开创(广州)智能科技发展有限公司 Inspection task visualization method and device, computer equipment and storage medium
CN113538358A (en) * 2021-07-09 2021-10-22 深圳市行知行机器人技术有限公司 Robot walking deviation rectifying method and device, intelligent robot and storage medium
CN113554778A (en) * 2021-07-28 2021-10-26 广东电网有限责任公司 Small target identification method and device for power transmission line inspection robot
CN113627400A (en) * 2021-10-12 2021-11-09 成都川江信息技术有限公司 Industrial instrument video identification system
CN113701890A (en) * 2021-08-09 2021-11-26 国网上海市电力公司 Sensing monitoring method and system for tracking cable hot spot and damage
CN113727022A (en) * 2021-08-30 2021-11-30 杭州申昊科技股份有限公司 Inspection image acquisition method and device, electronic equipment and storage medium
CN113791626A (en) * 2021-11-12 2021-12-14 南方电网数字电网研究院有限公司 Power inspection method and device, quadruped robot, system and storage medium
WO2021253247A1 (en) * 2020-06-16 2021-12-23 深圳市大疆创新科技有限公司 Inspection method and apparatus for movable platform, and movable platform and storage medium
CN113900436A (en) * 2021-09-07 2022-01-07 杭州申昊科技股份有限公司 Inspection control method, device, equipment and storage medium
CN114005026A (en) * 2021-09-29 2022-02-01 达闼科技(北京)有限公司 Image recognition method and device for robot, electronic device and storage medium
CN114063641A (en) * 2021-10-19 2022-02-18 深圳市优必选科技股份有限公司 Robot patrol method, patrol robot and computer readable storage medium
CN114362357A (en) * 2021-12-10 2022-04-15 深圳供电局有限公司 Monitoring method of transformer substation
CN114639163A (en) * 2022-02-25 2022-06-17 纯米科技(上海)股份有限公司 Walking program scoring method, system, electronic device and storage medium
WO2022142808A1 (en) * 2020-12-30 2022-07-07 深圳市海柔创新科技有限公司 Storage robot, camera assembly and positioning method
CN115278063A (en) * 2022-07-08 2022-11-01 深圳市施罗德工业集团有限公司 Inspection method, inspection device and inspection robot
CN115648212A (en) * 2022-10-31 2023-01-31 湖北中烟工业有限责任公司 Instrument visual identification method and device based on cooperative robot mobile platform
CN115706854A (en) * 2021-08-06 2023-02-17 北京小米移动软件有限公司 Camera control method and device for foot type robot and foot type robot
CN115760989A (en) * 2023-01-10 2023-03-07 西安华创马科智能控制系统有限公司 Hydraulic support robot track alignment method and device
CN115900835A (en) * 2023-01-09 2023-04-04 广东电网有限责任公司 Method and system for detecting basic parameters of power inspection robot
CN116027798A (en) * 2022-09-30 2023-04-28 三峡大学 Unmanned aerial vehicle power inspection system and method based on image correction

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103197679A (en) * 2013-03-22 2013-07-10 长沙理工大学 Accurate positioning method for orbit type routing-inspection robot
CN104298248A (en) * 2014-10-08 2015-01-21 南京航空航天大学 Accurate visual positioning and orienting method for rotor wing unmanned aerial vehicle
CN106468918A (en) * 2015-08-18 2017-03-01 航天图景(北京)科技有限公司 A kind of standardized data acquisition method of line data-logging and system
CN107018304A (en) * 2016-01-28 2017-08-04 中兴通讯股份有限公司 A kind of image-pickup method and image collecting device
CN107680135A (en) * 2017-11-16 2018-02-09 珊口(上海)智能科技有限公司 Localization method, system and the robot being applicable
CN109459437A (en) * 2018-11-07 2019-03-12 天津市普迅电力信息技术有限公司 Multi-rotor unmanned aerial vehicle transmission tower defect identification method based on high accuracy positioning
CN109703465A (en) * 2018-12-28 2019-05-03 百度在线网络技术(北京)有限公司 The control method and device of vehicle-mounted imaging sensor
CN109940603A (en) * 2019-01-21 2019-06-28 浙江大学滨海产业技术研究院 A kind of crusing robot arrives point tolerance compensating control method

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112585554A (en) * 2020-03-27 2021-03-30 深圳市大疆创新科技有限公司 Unmanned aerial vehicle inspection method and device and unmanned aerial vehicle
CN111399517A (en) * 2020-03-31 2020-07-10 中通服创立信息科技有限责任公司 Track type inspection robot following monitoring method based on UWB positioning system
CN111399517B (en) * 2020-03-31 2023-12-12 中通服创立信息科技有限责任公司 Following monitoring method of track type inspection robot based on UWB positioning system
CN111579119A (en) * 2020-04-22 2020-08-25 深圳市优必选科技股份有限公司 Temperature measurement method and device, computer readable storage medium and robot
WO2021253247A1 (en) * 2020-06-16 2021-12-23 深圳市大疆创新科技有限公司 Inspection method and apparatus for movable platform, and movable platform and storage medium
CN111932609A (en) * 2020-07-08 2020-11-13 广州科易光电技术有限公司 Gimbal calibration method and device for valve hall equipment inspection robot, and storage medium
CN112033543A (en) * 2020-07-21 2020-12-04 深圳市优必选科技股份有限公司 Blackbody alignment method and device, robot and computer readable storage medium
CN112067134A (en) * 2020-07-31 2020-12-11 深圳市优必选科技股份有限公司 Temperature detection method, device, terminal and storage medium
CN111899373A (en) * 2020-08-05 2020-11-06 中国工商银行股份有限公司 Method and device for determining inspection point of machine room, robot and storage medium
CN112016820A (en) * 2020-08-17 2020-12-01 广东电网有限责任公司广州供电局 Patrol robot scheduling method, system and device and computer equipment
CN112016820B (en) * 2020-08-17 2022-10-25 广东电网有限责任公司广州供电局 Patrol robot scheduling method, system and device and computer equipment
CN111948684B (en) * 2020-08-21 2024-01-23 广东电网有限责任公司 Distribution network obstacle inspection system and method based on differential positioning
CN111948684A (en) * 2020-08-21 2020-11-17 广东电网有限责任公司 Distribution network obstacle inspection system and method based on differential positioning
CN112104842A (en) * 2020-09-15 2020-12-18 上海思源弘瑞自动化有限公司 Image acquisition equipment correction method
CN112104842B (en) * 2020-09-15 2023-01-06 上海思源弘瑞自动化有限公司 Image acquisition equipment correction method, device, equipment and medium
CN112669487A (en) * 2020-12-21 2021-04-16 北京佳讯飞鸿电气股份有限公司 Target tracking method and inspection robot
CN112792798A (en) * 2020-12-29 2021-05-14 重庆电子工程职业学院 Track robot inspection positioning device, method and equipment and readable storage medium
CN112792798B (en) * 2020-12-29 2022-05-06 重庆电子工程职业学院 Track robot inspection positioning device, method and equipment and readable storage medium
WO2022142808A1 (en) * 2020-12-30 2022-07-07 深圳市海柔创新科技有限公司 Storage robot, camera assembly and positioning method
CN113301306A (en) * 2021-05-24 2021-08-24 中国工商银行股份有限公司 Intelligent inspection method and system
CN113433560B (en) * 2021-06-25 2023-12-26 北京铁道工程机电技术研究所股份有限公司 Positioning method and device for robot side inspection, electronic equipment and medium
CN113433560A (en) * 2021-06-25 2021-09-24 北京铁道工程机电技术研究所股份有限公司 Positioning method and device for side inspection of robot, electronic equipment and medium
CN113538358A (en) * 2021-07-09 2021-10-22 深圳市行知行机器人技术有限公司 Robot walking deviation rectifying method and device, intelligent robot and storage medium
CN113554778A (en) * 2021-07-28 2021-10-26 广东电网有限责任公司 Small target identification method and device for power transmission line inspection robot
CN115706854A (en) * 2021-08-06 2023-02-17 北京小米移动软件有限公司 Camera control method and device for legged robot, and legged robot
CN113701890A (en) * 2021-08-09 2021-11-26 国网上海市电力公司 Sensing monitoring method and system for tracking cable hot spot and damage
CN113727022A (en) * 2021-08-30 2021-11-30 杭州申昊科技股份有限公司 Inspection image acquisition method and device, electronic equipment and storage medium
CN113900436B (en) * 2021-09-07 2023-11-07 杭州申昊科技股份有限公司 Inspection control method, inspection control device, inspection control equipment and storage medium
CN113900436A (en) * 2021-09-07 2022-01-07 杭州申昊科技股份有限公司 Inspection control method, device, equipment and storage medium
CN113500605A (en) * 2021-09-13 2021-10-15 中科开创(广州)智能科技发展有限公司 Inspection task visualization method and device, computer equipment and storage medium
CN114005026A (en) * 2021-09-29 2022-02-01 达闼科技(北京)有限公司 Image recognition method and device for robot, electronic device and storage medium
CN113627400A (en) * 2021-10-12 2021-11-09 成都川江信息技术有限公司 Industrial instrument video identification system
CN114063641B (en) * 2021-10-19 2024-04-16 深圳市优必选科技股份有限公司 Robot patrol method, patrol robot and computer readable storage medium
CN114063641A (en) * 2021-10-19 2022-02-18 深圳市优必选科技股份有限公司 Robot patrol method, patrol robot and computer readable storage medium
CN113791626A (en) * 2021-11-12 2021-12-14 南方电网数字电网研究院有限公司 Power inspection method and device, quadruped robot, system and storage medium
CN114362357A (en) * 2021-12-10 2022-04-15 深圳供电局有限公司 Monitoring method of transformer substation
CN114639163B (en) * 2022-02-25 2024-06-07 纯米科技(上海)股份有限公司 Scoring method and scoring system for walking program, electronic device and storage medium
CN114639163A (en) * 2022-02-25 2022-06-17 纯米科技(上海)股份有限公司 Walking program scoring method, system, electronic device and storage medium
CN115278063A (en) * 2022-07-08 2022-11-01 深圳市施罗德工业集团有限公司 Inspection method, inspection device and inspection robot
CN116027798B (en) * 2022-09-30 2023-11-17 三峡大学 Unmanned aerial vehicle power inspection system and method based on image correction
CN116027798A (en) * 2022-09-30 2023-04-28 三峡大学 Unmanned aerial vehicle power inspection system and method based on image correction
CN115648212A (en) * 2022-10-31 2023-01-31 湖北中烟工业有限责任公司 Instrument visual identification method and device based on cooperative robot mobile platform
CN115900835A (en) * 2023-01-09 2023-04-04 广东电网有限责任公司 Method and system for detecting basic parameters of power inspection robot
CN115900835B (en) * 2023-01-09 2024-04-16 广东电网有限责任公司 Detection method and system for basic parameters of power inspection robot
CN115760989A (en) * 2023-01-10 2023-03-07 西安华创马科智能控制系统有限公司 Hydraulic support robot track alignment method and device

Similar Documents

Publication Publication Date Title
CN110850872A (en) Robot inspection method and device, computer readable storage medium and robot
US11422261B2 (en) Robot relocalization method and apparatus and robot using the same
CN109807885B (en) Visual calibration method and device for manipulator and intelligent terminal
CN110491060B (en) Robot, safety monitoring method and device thereof, and storage medium
CN110278382B (en) Focusing method, device, electronic equipment and storage medium
CN111381586A (en) Robot and movement control method and device thereof
CN113418543B (en) Automatic driving sensor detection method and device, electronic equipment and storage medium
CN107742304B (en) Method and device for determining movement track, mobile robot and storage medium
CN111192331A (en) External parameter calibration method and device for laser radar and camera
CN109828250B (en) Radar calibration method, calibration device and terminal equipment
CN111060132B (en) Calibration method and device for travelling crane positioning coordinates
CN112033543B (en) Blackbody alignment method and device, robot and computer readable storage medium
CN112967347B (en) Pose calibration method, pose calibration device, robot and computer readable storage medium
CN112215887A (en) Pose determination method and device, storage medium and mobile robot
CN112212851B (en) Pose determination method and device, storage medium and mobile robot
CN115407355B (en) Library position map verification method and device and terminal equipment
CN111336938A (en) Robot and object distance detection method and device thereof
CN116386373A (en) Vehicle positioning method and device, storage medium and electronic equipment
CN113776520B (en) Map construction, using method, device, robot and medium
CN110426674B (en) Spatial position determination method and device, electronic equipment and storage medium
CN111136689A (en) Self-checking method and device
CN111353932B (en) Coordinate conversion method and device, electronic equipment and storage medium
CN110763232B (en) Robot and navigation positioning method and device thereof
CN112580402A (en) Monocular vision pedestrian distance measurement method and system, vehicle and medium thereof
CN113227708B (en) Method and device for determining pitch angle and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination