CN115139339A - Method, device, equipment and storage medium for testing working coverage rate of robot - Google Patents


Info

Publication number
CN115139339A
CN115139339A (application CN202210781544.0A)
Authority
CN
China
Prior art keywords
robot
working
scene
area
coverage area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210781544.0A
Other languages
Chinese (zh)
Inventor
瞿卫新
杨品
姚晓辉
叶炜铖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Aotemin Robot Technology Service Co ltd
Original Assignee
Suzhou Aotemin Robot Technology Service Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Aotemin Robot Technology Service Co ltd filed Critical Suzhou Aotemin Robot Technology Service Co ltd
Priority to CN202210781544.0A
Publication of CN115139339A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/0095 Means or methods for testing manipulators
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a method, device, equipment and storage medium for testing the working coverage rate of a robot. The method comprises: calibrating a target scene based on a fixed coordinate system to construct a test scene; generating a motion track of the robot according to the position information of the robot in the test scene, which is acquired by the image acquisition equipment in real time; determining a working coverage area of the robot according to the motion track; and determining the working coverage rate of the robot according to the working coverage area and the test scene. With this method, an accurate working coverage rate of the robot can be obtained.

Description

Method, device, equipment and storage medium for testing working coverage rate of robot
Technical Field
The embodiment of the invention relates to the field of computers, in particular to a method, a device, equipment and a storage medium for testing coverage rate of a robot.
Background
At present, the working coverage rate of a robot is an important index for evaluating its working condition. The coverage rate is usually calculated by using a high-speed camera to capture the motion state of the robot and obtain its running track, and then mapping that running track onto a standard test scene. However, there is often a certain error between the running condition of the robot in the actual test scene and the running track in the standard test scene, so an accurate coverage rate of the robot cannot be obtained. Therefore, how to reflect the running condition of the robot in an actual test scene, so as to obtain an accurate coverage rate, is a problem to be solved at present.
Disclosure of Invention
The invention provides a method, device, equipment and storage medium for testing the working coverage rate of a robot, with which an accurate working coverage rate of the robot can be obtained.
According to an aspect of the present invention, there is provided a work coverage testing method of a robot, including:
calibrating a target scene based on a fixed coordinate system to construct a test scene;
generating a motion track of the robot according to the position information of the robot in the test scene, which is acquired by the image acquisition equipment in real time;
determining a working coverage area of the robot according to the motion track;
and determining the working coverage rate of the robot according to the working coverage area and the test scene.
According to another aspect of the present invention, there is provided a work coverage testing apparatus of a robot, the apparatus including:
the test scene construction module is used for calibrating the target scene based on the fixed coordinate system so as to construct a test scene;
the motion trail generation module is used for generating a motion trail of the robot according to the position information of the robot in the test scene, which is acquired by the image acquisition equipment in real time;
the working coverage area determining module is used for determining the working coverage area of the robot according to the motion track;
and the working coverage rate determining module is used for determining the working coverage rate of the robot according to the working coverage area and the test scene.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to enable the at least one processor to perform a method of testing a working coverage of a robot according to any embodiment of the invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions for causing a processor to implement a method for testing a working coverage of a robot according to any one of the embodiments of the present invention when the computer instructions are executed.
According to the technical scheme of the embodiment of the invention, a target scene is calibrated based on a fixed coordinate system to construct a test scene; a motion track of the robot is generated according to position information of the robot in the test scene, acquired by image acquisition equipment in real time; a working coverage area of the robot is determined according to the motion track; and the working coverage rate of the robot is determined according to the working coverage area and the test scene. In this scheme, it is not necessary to first acquire the running track of the robot and then map it onto a standard test scene; instead, the position information of the robot in the actual test scene is acquired directly, so the motion track of the robot in the test scene can be obtained more accurately from that position information. A more accurate working coverage rate of the robot is thus obtained, and the efficiency of the coverage rate calculation is improved at the same time.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present invention, nor do they necessarily limit the scope of the invention. Other features of the present invention will become apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a flowchart of a method for testing a working coverage of a robot according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for testing the working coverage of a robot according to a second embodiment of the present invention;
fig. 3 is a flowchart of a method for testing the working coverage of a robot according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a working coverage rate testing apparatus for a robot according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "current," "history," "target," and the like in the description and claims of the present invention and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example one
Fig. 1 is a flowchart of a method for testing the working coverage rate of a robot according to an embodiment of the present invention; the embodiment is applicable to testing the working coverage rate of a robot. The method may be performed by a working coverage testing apparatus of a robot, which may be implemented in hardware and/or software and configured in an electronic device. As shown in fig. 1, the method includes:
and S110, calibrating the target scene based on the fixed coordinate system to construct a test scene.
The fixed coordinate system is a space rectangular coordinate system set according to actual requirements. The target scene is used for testing the working coverage rate of the robot, and can be selected in advance according to actual requirements. The test scene is a plane scene graph of the target scene determined according to the coordinate points of the target scene.
Specifically, a target scene for testing the working coverage rate of the robot is selected according to actual needs; it may be, for example, a hotel scene or a residential scene. After the target scene is selected, a fixed coordinate system is set within it. The fixed coordinate system may be set by selecting a coordinate origin in the target scene, placing the right-angled vertex of an L-shaped ruler at the origin, using the two sides of the ruler as the X-axis and Y-axis of the fixed coordinate system, and using the axis perpendicular to both at the origin as the Z-axis. All contour points in the target scene are then calibrated in the fixed coordinate system; that is, the coordinate information of each contour point in the fixed coordinate system is recorded to obtain the contour point coordinates. Contour points refer to all inflection points within the target scene.
After all the contour points are calibrated, the corner point coordinates of the target scene are determined. Corner points refer to contour points on the boundary lines of the target scene. For example, if the target scene is a rectangular room, the corner points are the four right-angled vertices of the room. After determining the corner coordinates of the target scene, inputting the corner coordinates of the target scene and the contour points of the target scene into a test system, and drawing the test scene by the test system according to the corner coordinates of the target scene and the contour points of the target scene.
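The scene construction above can be sketched in code. The following minimal illustration uses hypothetical names and a hypothetical room (neither is from the patent): corner coordinates calibrated in the fixed coordinate system define the scene outline, and the shoelace formula gives the scene area used later when computing the coverage rate.

```python
# Minimal sketch of constructing a test scene from calibrated corner
# coordinates; the names and the concrete room are illustrative assumptions.

def shoelace_area(points):
    """Area of a simple polygon whose vertices are listed in order."""
    total = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

# Corner coordinates of a 5 m x 4 m rectangular room, calibrated in the
# fixed coordinate system (origin at one corner, axes along the walls).
corner_coords = [(0.0, 0.0), (5.0, 0.0), (5.0, 4.0), (0.0, 4.0)]
scene_area = shoelace_area(corner_coords)  # 20.0 square metres
```

The same routine also handles non-rectangular rooms, since the contour points of any simple polygonal outline can be fed to it in order.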
S120, generating a motion track of the robot according to the position information of the robot in the test scene, which is acquired by the image acquisition equipment in real time.
The image acquisition device refers to a device capable of acquiring the pose of the robot, and may be a motion capture camera, for example. The robot pose includes position information and pose information of the robot. The image acquisition device is disposed in the optical motion capture system. The optical motion capture system is based on the computer vision principle, and the motion capture camera monitors the feature points of the robot from different angles, so that the motion capture of the robot is realized according to the monitored feature points through a rigid body calculation algorithm. The feature points refer to points where the image gradation value changes drastically or points where the curvature is large on the image edge. For any feature point in the test scene, as long as the feature point can be simultaneously acquired by more than two motion capture cameras, the position of the feature point in the test scene can be determined.
Specifically, robot images can be acquired in real time by at least two image acquisition devices and sent to the test system. The test system determines the feature points of the robot from the robot images through an image processing algorithm and marks them in the test scene to obtain the position information of the robot in the test scene, where this position information refers to the coordinates of the robot feature points in the fixed coordinate system. The motion track of the robot is then determined according to this position information. For example, the motion track may be determined as follows: connect the feature point coordinates corresponding to the position information to obtain a feature point connecting line, and take this connecting line as the motion track of the robot.
For example, the position information may be marked in the test scene according to the position information acquisition time of the robot in the test scene, so as to determine the motion track of the robot in the test scene. Specifically, the method can be realized by the following substeps:
s1201, acquiring the position information of the robot in the test scene, acquired by the image acquisition equipment, in real time.
Specifically, an image acquisition device is adopted to acquire a robot image of the robot in a test scene in real time, and the acquired robot image and acquisition time corresponding to the robot image are sent to a test system. And the test system determines the feature point coordinates of the robot feature points in the fixed coordinate system according to the robot image, and uses the feature point coordinates as the position information of the robot in the test scene. And storing the position information and the acquisition time of the position information.
S1202, marking the position information in the test scene in sequence based on the acquisition time of the position information.
The acquisition time of the position information refers to the acquisition time of the robot image corresponding to the position information.
Specifically, after the position information is acquired, it is marked in the test scene in sequence, based on the order of the acquisition times. Preferably, the position information marked in the test scene carries an acquisition time identifier of the position information.
S1203, determining a motion track of the robot in the test scene according to the test scene after the position information is marked.
Specifically, after the position information is marked in sequence in the test scene, the feature point coordinates corresponding to the position information are connected in sequence according to the sequence of marking the position information to obtain a feature point connecting line, and the feature point connecting line is used as the motion trail of the robot.
It should be noted that marking the position information in the test scene according to its acquisition time, so as to determine the motion track of the robot in the test scene, makes it possible to obtain the real position information of the robot in the actual test scene.
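Steps S1201 to S1203 amount to sorting the sampled positions by acquisition time and joining them into a polyline. A minimal sketch follows; the sample data and all names are hypothetical:

```python
from math import hypot

# (acquisition_time_s, x, y) as the feature-point coordinates might arrive
# from the test system; deliberately out of order to show the time sort.
samples = [(2.0, 2.0, 0.0), (0.0, 0.0, 0.0), (3.0, 2.0, 1.0), (1.0, 1.0, 0.0)]

# S1202: mark positions in order of acquisition time.
trajectory = [(x, y) for _t, x, y in sorted(samples)]

# S1203: connect successive feature points into the motion track; its
# length is the sum of the segment lengths.
track_length = sum(hypot(x2 - x1, y2 - y1)
                   for (x1, y1), (x2, y2) in zip(trajectory, trajectory[1:]))
```

Sorting on the tuples works because the acquisition time is the first element; in a real test system the timestamps would come from the stored acquisition-time identifiers.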
S130, determining the working coverage area of the robot according to the motion track.
The working coverage area of the robot refers to a working area which can be involved in the working process of the robot.
Specifically, after the motion track of the robot in the test scene is determined, the working width of the robot is further obtained, the working target area of the robot when the robot executes the task on each robot feature point on the motion track is determined according to the working width, and then all the working target areas are overlapped, so that the working coverage area of the robot can be obtained.
The working width of the robot refers to a width value of a maximum working area which can be reached when the robot is in a fixed state when the robot executes a task. The working target area is a circular area formed by taking the working width of the robot as the diameter and taking the characteristic point of the robot as the center of a circle.
For example, if the width of the maximum sweeping area that can be reached when the sweeping robot performs a sweeping task at a fixed point is two meters, the working width of the sweeping robot is two meters, and the working area of the sweeping robot at the fixed point is a circular area with a diameter of two meters.
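The superposition of circular working target areas described above can be approximated numerically. The sketch below is my own choice of method (grid rasterisation; the patent does not prescribe one), with illustrative names: each trajectory point contributes a disc of diameter equal to the working width, and the union is estimated by counting grid cells whose centre falls inside at least one disc.

```python
from math import hypot

def union_of_discs_area(trajectory, working_width, scene_w, scene_h, cell=0.05):
    """Approximate the area of the union of circular working target areas
    (diameter = working_width) centred on each trajectory point, by
    rasterising the scene onto a square grid of side `cell`."""
    radius = working_width / 2.0
    covered_cells = 0
    for iy in range(int(scene_h / cell)):
        for ix in range(int(scene_w / cell)):
            cx, cy = (ix + 0.5) * cell, (iy + 0.5) * cell
            if any(hypot(cx - px, cy - py) <= radius for px, py in trajectory):
                covered_cells += 1
    return covered_cells * cell * cell

# A robot standing still at the centre of a 2 m x 2 m scene with a 2 m
# working width covers roughly a disc of radius 1 m (area close to pi).
area = union_of_discs_area([(1.0, 1.0)], working_width=2.0,
                           scene_w=2.0, scene_h=2.0)
```

The working coverage rate of step S140 would then be this area divided by the scene area. A finer `cell` trades speed for accuracy.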
S140, determining the working coverage rate of the robot according to the working coverage area and the test scene.
The working coverage rate of the robot refers to the ratio of the working coverage area of the robot to the test scene. The working coverage rate can be used as one of the working performance indexes of the robot and also can be used as one of the reference parameters when the robot is subsequently maintained.
For example, the working coverage of the robot may be determined according to the coverage area of the working coverage area and the scene area of the test scene. Specifically, the method can be realized by the following substeps:
and S1401, determining the coverage area of the working coverage area.
Wherein the coverage area refers to the area of the working coverage area.
Specifically, after the working coverage area is determined, the coverage area of the working coverage area is calculated through the test system.
S1402, determining the scene area of the test scene.
Specifically, the scene area of the test scene is calculated through the test system.
S1403, taking the ratio of the coverage area to the scene area as the working coverage rate of the robot.
It should be noted that the working coverage of the robot is determined according to the coverage area of the working coverage area and the scene area of the test scene, so that a more accurate calculation result of the coverage of the robot can be obtained.
According to the technical scheme provided by this embodiment, a target scene is calibrated based on a fixed coordinate system to construct a test scene; a motion track of the robot is generated according to position information of the robot in the test scene, acquired by image acquisition equipment in real time; a working coverage area of the robot is determined according to the motion track; and the working coverage rate of the robot is determined according to the working coverage area and the test scene. In this scheme, it is not necessary to first acquire the running track of the robot and then map it onto a standard test scene; instead, the position information of the robot in the actual test scene is acquired directly, so the motion track of the robot in the test scene can be obtained more accurately from that position information. A more accurate working coverage rate of the robot is thus obtained, and the efficiency of the coverage rate calculation is improved at the same time.
Example two
Fig. 2 is a flowchart of a method for testing the working coverage rate of a robot according to a second embodiment of the present invention, which is optimized on the basis of the first embodiment. This embodiment provides a preferred implementation in which a calibration scene is obtained according to the fixed coordinate system and the target scene, calibration obstacles are obtained according to the fixed coordinate system and the obstacles in the target scene, and a test scene is constructed according to the calibration scene and the calibration obstacles. Specifically, as shown in fig. 2, the method includes:
and S210, calibrating the contour points in the target scene on the spot based on the fixed coordinate system to obtain a calibration scene.
Specifically, the fixed coordinate system is used to calibrate all contour points in the target scene, that is, the coordinate information of each contour point in the fixed coordinate system is calibrated to obtain the contour point coordinates. After all the contour points are calibrated, the corner point coordinates of the target scene are determined. The corner coordinates and the contour points of the target scene are then input into the test system, and the test system draws the calibration scene according to them.
S220, calibrating the obstacles in the target scene based on the fixed coordinate system to obtain calibration obstacles.
The obstacle may be an obstacle added in the target scene according to actual needs. The step of calibrating the obstacle is to draw the obstacle in a calibration scene according to the coordinates of the information point of the obstacle on a fixed coordinate system.
Specifically, if the cross section of the obstacle is square and its area is smaller than a preset area threshold, the coordinates of the center point of the square in the fixed coordinate system are determined, and the center point is marked in the calibration scene based on those coordinates. The side length of the square is then determined and input into the test system, so that the test system draws a cross-sectional image of the obstacle in the calibration scene according to the side length and the marked center point; this cross-sectional image is taken as the calibration obstacle.
If the cross section of the obstacle is circular, the coordinates of the center point of the circle in the fixed coordinate system are determined. Based on these coordinates, the center point is marked in the calibration scene and the radius of the circle is input into the test system; the test system draws the cross-sectional image of the obstacle in the calibration scene according to the center point and the radius, and this cross-sectional image is taken as the calibration obstacle.
If the cross section of the obstacle is a polygon, determining the coordinates of the outline points of the polygon in a fixed coordinate system, marking the coordinates of the outline points of the polygon in a calibration scene, connecting the coordinates of the outline points of the polygon with straight lines, drawing a cross section image of the obstacle in the calibration scene, and taking the cross section image of the obstacle as the calibration obstacle.
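The three shape cases can be sketched as one dispatch on the obstacle description. The dict layout and all names below are assumptions for illustration, not the patent's data model; the areas computed here feed into the effective-area calculation later in this embodiment.

```python
from math import pi

# Hypothetical sketch of S220: one calibrated obstacle description per
# cross-section type, reduced here to the area each contributes.

def obstacle_area(desc):
    if desc["shape"] == "square":          # marked by center point + side length
        return desc["side"] ** 2
    if desc["shape"] == "circle":          # marked by center point + radius
        return pi * desc["radius"] ** 2
    if desc["shape"] == "polygon":         # marked by its contour points
        pts = desc["points"]
        s = sum(x1 * y2 - x2 * y1
                for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]))
        return abs(s) / 2.0                # shoelace formula
    raise ValueError("unknown obstacle shape")

square = {"shape": "square", "center": (1.0, 1.0), "side": 0.4}
circle = {"shape": "circle", "center": (3.0, 2.0), "radius": 0.3}
tri = {"shape": "polygon", "points": [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]}
```

Drawing the actual cross-section images in the calibration scene would use the same stored center points, radii, and contour points.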
S230, constructing a test scene according to the calibration scene and the calibration obstacles.
Specifically, after all the calibration obstacles are determined, the calibration scene containing the calibration obstacles is taken as the test scene, so that the construction of the test scene is completed.
S240, generating a motion track of the robot according to the position information of the robot in the test scene, which is acquired by the image acquisition equipment in real time.
Specifically, when the motion track of the robot is generated from the position information of the robot in the test scene, it is necessary to ensure that the motion track does not overlap with the calibration obstacles in the test scene.
S250, determining the working coverage area of the robot according to the motion track.
S260, determining the working coverage rate of the robot according to the working coverage area and the test scene.
It should be noted that when determining the working coverage rate of the robot according to the working coverage area and the test scene, the scene area of the test scene is determined first, then the obstacle area of the calibration obstacles in the test scene is determined, and the difference between the scene area and the obstacle area is taken as the effective area. The ratio of the coverage area of the working coverage area to the effective area is then taken as the working coverage rate of the robot.
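The effective-area rate in this note is simple arithmetic; a sketch with hypothetical numbers:

```python
# Hypothetical numbers illustrating the embodiment-two rate: the total
# obstacle area is subtracted from the scene area before taking the ratio.
scene_area = 20.0                    # m^2, from the calibration scene
obstacle_areas = [1.0, 0.5]          # m^2, one entry per calibration obstacle
coverage_area = 14.8                 # m^2, working coverage area

effective_area = scene_area - sum(obstacle_areas)       # 18.5 m^2
working_coverage_rate = coverage_area / effective_area  # 0.8
```

Subtracting the obstacle area keeps unreachable floor space out of the denominator, which is what raises the accuracy of the rate compared with embodiment one.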
According to the technical scheme of the embodiment, the contour points in the target scene are calibrated on the spot based on the fixed coordinate system to obtain a calibration scene; calibrating the obstacles in the target scene based on the fixed coordinate system to obtain calibrated obstacles; constructing a test scene according to the calibration scene and the calibration barrier; generating a motion track of the robot according to position information of the robot in a test scene, which is acquired by image acquisition equipment in real time; determining a working coverage area of the robot according to the motion track; and determining the working coverage rate of the robot according to the working coverage area and the test scene. According to the scheme, when the working coverage rate test scene of the robot is constructed, the influence of the obstacles in the test scene on the working coverage rate of the robot is fully considered, and the effect of improving the working coverage rate calculation accuracy of the robot is achieved.
Example three
Fig. 3 is a flowchart of a method for testing the working coverage rate of a robot according to a third embodiment of the present invention, which is optimized on the basis of the foregoing embodiments. This embodiment provides a preferred implementation in which the current working area of the robot is determined according to the motion track and the working area width of the robot, and the working coverage area of the robot is determined according to the current working area and the historical working area. Specifically, as shown in fig. 3, the method includes:
and S310, calibrating the target scene based on the fixed coordinate system to construct a test scene.
S320, generating a motion track of the robot according to the position information of the robot in the test scene, which is acquired by the image acquisition equipment in real time.
S330, determining the current working area of the robot according to the motion track and the working area width value of the robot.
The current working area refers to a working coverage area of the robot obtained when the working coverage area of the robot is tested. The width value of the working area of the robot is the working width of the robot.
Specifically, before the working coverage of the robot is tested, the model of the robot and the working area width value corresponding to the model of the robot are input into the test system. After the motion track of the robot in the test scene is determined, the working area width value of the robot is further obtained according to the model of the robot, the working target area of the robot when the robot executes tasks on each robot feature point on the motion track under the current condition is determined according to the working area width value, and then all the working target areas are overlapped, so that the current working area of the robot can be obtained.
S340, determining the working coverage area of the robot according to the current working area and the historical working area.
The historical working area refers to a working coverage area of the robot obtained in a previous test of the working coverage area of the robot.
Specifically, the current working area is covered on the historical working area, so that a superposed working area of a superposed part of the current working area and the historical working area can be obtained, and an independent working area of which the current working area and the historical working area are not superposed can be obtained. The independent working area comprises an area which is not overlapped with the historical working area in the current working area and an area which is not overlapped with the current working area in the historical working area. And adding the overlapped working area and the independent working area to obtain a working coverage area of the robot.
Preferably, the method for determining the working coverage area of the robot may be: and performing superposition calculation on the current working area and the historical working area of the robot for a limited number of times to obtain the working coverage area of the robot. Specifically, the method can be realized through the following substeps:
and S3401, sequentially performing superposition calculation on the current working area and the historical working area according to the determined time of the working area to obtain a superposition coverage area.
And the determination time of the working area is the time when the testing system determines the working area.
Specifically, according to the sequence of the determined time from near to far, the current working area and the historical working area are sequentially subjected to superposition calculation, and the superposition calculation result is used as a superposition coverage area. The manner of the superposition calculation is consistent with the manner of calculating the working coverage area of the robot, and is not described herein again.
S3402, if the number of superposition calculations is less than or equal to the superposition-count threshold, taking the superposition coverage area as the working coverage area of the robot.
The superposition-count threshold is a value set according to actual needs.
Specifically, as the current working area and the historical working areas are superposed according to the determination times of the working areas, the number of superposition calculations is counted; if this number is less than or equal to the superposition-count threshold, the superposition coverage area is taken as the working coverage area of the robot and the superposition calculation is completed.
For example, if the superposition-count threshold is set to five and the superposition calculation completes after four superpositions, the superposition coverage area obtained from those four calculations is taken as the working coverage area of the robot.
Further, if the number of superposition calculations would exceed the superposition-count threshold, the superposition coverage area obtained after the threshold number of superposition calculations is taken as the working coverage area of the robot.
Specifically, as the superposition proceeds, the number of superposition calculations is counted; once this number exceeds the superposition-count threshold, the superposition calculation is stopped, and the superposition coverage area obtained from the threshold number of calculations is taken as the working coverage area of the robot.
The benefit of this arrangement is that performing only a limited number of superposition calculations on the current and historical working areas of the robot improves the efficiency of obtaining the working coverage area while still ensuring its accuracy.
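Steps S3401–S3402 can be sketched as follows, assuming working areas are represented as sets of grid cells as above (a hypothetical representation; the patent does not prescribe a data structure, and the names below are illustrative):

```python
def superpose_working_areas(areas_by_time, threshold):
    """areas_by_time: list of (determination_time, set_of_cells) covering
    the current working area and the historical working areas.
    Unions them from the nearest determination time to the farthest,
    stopping once `threshold` superposition calculations have been done."""
    ordered = sorted(areas_by_time, key=lambda item: item[0], reverse=True)
    coverage = set(ordered[0][1]) if ordered else set()
    superpositions = 0
    for _, area in ordered[1:]:
        if superpositions >= threshold:
            break  # superposition-count threshold reached: stop here
        coverage |= area  # union = overlapped + independent regions
        superpositions += 1
    return coverage
```

The set union realizes "overlapped working area plus independent working area" in one operation, and the counter caps the work exactly as S3402 describes.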
S350, determining the working coverage rate of the robot according to the working coverage area and the test scene.
In the technical solution of this embodiment, a target scene is calibrated based on a fixed coordinate system to construct a test scene; a motion trajectory of the robot is generated according to position information of the robot in the test scene acquired in real time by the image acquisition device; the current working area of the robot is determined according to the motion trajectory and the working-area width value of the robot; the working coverage area of the robot is determined according to the current working area and the historical working area; and the working coverage rate of the robot is determined according to the working coverage area and the test scene. This solves the problem that a single acquisition of the robot's working coverage area may carry an error, making the calculated working coverage rate inaccurate. By superposing the current working area with the historical working area, a more accurate working coverage area, and hence a more accurate test result for the robot's working coverage rate, is obtained.
Example four
Fig. 4 is a schematic structural diagram of a device for testing the working coverage of a robot according to the fourth embodiment of the present invention. This embodiment is applicable to the case of processing data. As shown in Fig. 4, the device for testing the working coverage of the robot includes: a test scene construction module 410, a motion trajectory generation module 420, a work coverage area determining module 430, and a work coverage rate determining module 440.
The test scene constructing module 410 is configured to calibrate a target scene based on a fixed coordinate system to construct a test scene;
the motion trail generation module 420 is configured to generate a motion trail of the robot according to position information of the robot in a test scene, which is acquired by the image acquisition device in real time;
the work coverage area determining module 430 is configured to determine a work coverage area of the robot according to the motion trajectory;
and the work coverage rate determining module 440 is used for determining the work coverage rate of the robot according to the work coverage area and the test scene.
In the technical solution provided by this embodiment, a target scene is calibrated based on a fixed coordinate system to construct a test scene; a motion trajectory of the robot is generated according to position information of the robot in the test scene acquired in real time by the image acquisition device; a working coverage area of the robot is determined according to the motion trajectory; and the working coverage rate of the robot is determined according to the working coverage area and the test scene. With this solution, the position information of the robot in the actual test scene can be acquired directly, without first recording the robot's running track and then mapping it into the test scene; the motion trajectory of the robot in the test scene can thus be obtained more accurately from the position information, yielding a more accurate working coverage rate while improving the efficiency of its calculation.
The test scenario construction module 410 is specifically configured to:
based on a fixed coordinate system, carrying out on-site calibration on contour points in a target scene to obtain a calibration scene;
calibrating the obstacles in the target scene based on the fixed coordinate system to obtain calibrated obstacles;
and constructing a test scene according to the calibration scene and the calibration obstacles.
Illustratively, the motion trajectory generation module 420 is specifically configured to:
acquiring position information of the robot in a test scene acquired by image acquisition equipment in real time;
marking the position information in the test scene in sequence based on the acquisition time of the position information;
and determining the motion track of the robot in the test scene according to the test scene marked with the position information.
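The behaviour of the motion trajectory generation module can be sketched as ordering the collected position samples by acquisition time (the names and data shapes below are hypothetical, for illustration only):

```python
def build_motion_trajectory(samples):
    """samples: list of (acquisition_time, (x, y)) position fixes of the
    robot captured by the image acquisition device. Marking the positions
    in the test scene in order of acquisition time amounts to sorting them
    by that time; the ordered positions form the motion trajectory."""
    return [pos for _, pos in sorted(samples, key=lambda s: s[0])]
```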
Illustratively, the work coverage area determination module 430 includes:
the current working area determining unit is used for determining the current working area of the robot according to the motion track and the working area width value of the robot;
and the work coverage area calculation unit is used for determining the work coverage area of the robot according to the current work area and the historical work area.
Further, the work coverage area calculating unit is specifically configured to:
according to the determination time of each working area, sequentially perform superposition calculation on the current working area and the historical working areas to obtain a superposition coverage area;
if the number of superposition calculations is less than or equal to the superposition-count threshold, take the superposition coverage area as the working coverage area of the robot;
and if the number of superposition calculations is greater than the superposition-count threshold, take the superposition coverage area obtained after the threshold number of superposition calculations as the working coverage area of the robot.
Illustratively, the operational coverage determination module 440 is specifically configured to:
determining the coverage area of a working coverage area;
determining the scene area of a test scene;
and taking the ratio of the coverage area to the scene area as the working coverage rate of the robot.
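The computation of module 440 reduces to a ratio of two areas; a grid-based sketch under the same hypothetical cell-set representation used above:

```python
def working_coverage_rate(coverage_cells, scene_cells, cell=0.05):
    """Ratio of the working coverage area to the test-scene area.
    Only cells that lie inside the scene count toward coverage."""
    coverage_area = len(coverage_cells & scene_cells) * cell ** 2
    scene_area = len(scene_cells) * cell ** 2
    return coverage_area / scene_area if scene_area else 0.0
```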
The device for testing the working coverage of the robot provided by this embodiment can perform the method for testing the working coverage of the robot provided by any embodiment of the present invention, and has the corresponding functions and beneficial effects.
EXAMPLE five
FIG. 5 illustrates a schematic diagram of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in Fig. 5, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random access memory (RAM) 13. The memory stores a computer program executable by the at least one processor, and the processor 11 can perform various suitable actions and processes according to the computer program stored in the ROM 12 or loaded from the storage unit 18 into the RAM 13. The RAM 13 can also store various programs and data required for the operation of the electronic device 10. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
A number of components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, or the like; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
Processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, or the like. The processor 11 performs the various methods and processes described above, such as the work coverage test method of the robot.
In some embodiments, the operational coverage testing method of the robot may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the above described method of work coverage testing of a robot may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the working coverage test method of the robot by any other suitable means (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowcharts and/or block diagrams to be performed. A computer program can execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine, or entirely on a remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of difficult management and weak service scalability found in traditional physical hosts and VPS (Virtual Private Server) services.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired results of the technical solution of the present invention can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for testing the working coverage rate of a robot is characterized by comprising the following steps:
calibrating a target scene based on a fixed coordinate system to construct a test scene;
generating a motion track of the robot according to the position information of the robot in the test scene, which is acquired by the image acquisition equipment in real time;
determining a working coverage area of the robot according to the motion track;
and determining the working coverage rate of the robot according to the working coverage area and the test scene.
2. The method of claim 1, wherein the calibrating the target scene based on the fixed coordinate system to construct the test scene comprises:
based on a fixed coordinate system, carrying out on-site calibration on contour points in a target scene to obtain a calibration scene;
calibrating the obstacles in the target scene based on the fixed coordinate system to obtain calibrated obstacles;
and constructing a test scene according to the calibration scene and the calibration obstacles.
3. The method according to claim 1, wherein the generating the motion trajectory of the robot according to the position information of the robot in the test scene, which is acquired by the image acquisition device in real time, comprises:
acquiring the position information of the robot in the test scene acquired by the image acquisition equipment in real time;
marking the position information in a test scene in sequence based on the acquisition time of the position information;
and determining the motion trail of the robot in the test scene according to the test scene marked with the position information.
4. The method of claim 1, wherein determining a work coverage area of the robot from the motion profile comprises:
determining the current working area of the robot according to the motion track and the working area width value of the robot;
and determining the working coverage area of the robot according to the current working area and the historical working area.
5. The method of claim 4, wherein determining the work coverage area of the robot based on the current work area and historical work areas comprises:
according to the determination time of each working area, sequentially performing superposition calculation on the current working area and the historical working area to obtain a superposition coverage area;
if the number of superposition calculations is less than or equal to a superposition-count threshold, taking the superposition coverage area as the working coverage area of the robot;
and if the number of superposition calculations is greater than the superposition-count threshold, taking the superposition coverage area obtained after the threshold number of superposition calculations as the working coverage area of the robot.
6. The method of claim 1, wherein determining robot coverage from the work coverage area and the test scenario comprises:
determining the coverage area of the working coverage area;
determining a scene area of the test scene;
and taking the ratio of the coverage area to the scene area as the working coverage rate of the robot.
7. A device for testing the working coverage rate of a robot, comprising:
the test scene construction module is used for calibrating the target scene based on the fixed coordinate system so as to construct a test scene;
the motion trail generation module is used for generating a motion trail of the robot according to the position information of the robot in the test scene, which is acquired by the image acquisition equipment in real time;
the working coverage area determining module is used for determining the working coverage area of the robot according to the motion track;
and the working coverage rate determining module is used for determining the working coverage rate of the robot according to the working coverage area and the test scene.
8. The apparatus of claim 7, wherein the test scenario construction module is specifically configured to:
based on a fixed coordinate system, carrying out on-site calibration on contour points in a target scene to obtain a calibration scene;
calibrating the obstacles in the target scene based on the fixed coordinate system to obtain calibrated obstacles;
and constructing a test scene according to the calibration scene and the calibration obstacles.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of work coverage testing of a robot of any of claims 1-6.
10. A computer-readable storage medium storing computer instructions for causing a processor to perform the method of work coverage testing of a robot as claimed in any one of claims 1 to 6 when executed.
CN202210781544.0A 2022-07-04 2022-07-04 Method, device, equipment and storage medium for testing working coverage rate of robot Pending CN115139339A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210781544.0A CN115139339A (en) 2022-07-04 2022-07-04 Method, device, equipment and storage medium for testing working coverage rate of robot


Publications (1)

Publication Number Publication Date
CN115139339A true CN115139339A (en) 2022-10-04

Family

ID=83410272


Country Status (1)

Country Link
CN (1) CN115139339A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination