CN111136689A - Self-checking method and device - Google Patents

Self-checking method and device

Info

Publication number
CN111136689A
Authority
CN
China
Prior art keywords: self, robot, checking, motion, data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911419107.9A
Other languages
Chinese (zh)
Other versions
CN111136689B (en)
Inventor
罗沛
孙其民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uditech Co Ltd
Original Assignee
Uditech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uditech Co Ltd
Priority to CN201911419107.9A
Publication of CN111136689A
Application granted
Publication of CN111136689B
Legal status: Active
Anticipated expiration


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/0095Means or methods for testing manipulators

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application belongs to the technical field of robots and provides a self-checking method applied to a robot, comprising the following steps: if self-checking trigger information is detected, acquiring actual motion data of the robot; and obtaining a motion behavior self-checking result according to the actual motion data and reference motion data, wherein the reference motion data is motion data collected when the robot is in a normal motion behavior state. By the method, the motion capability of the robot can be known in time.

Description

Self-checking method and device
Technical Field
The application belongs to the technical field of robots, and particularly relates to a self-checking method and device.
Background
Motion capability is one of the basic capabilities of a robot and is the guarantee that the robot can deliver its services.
At present, at an unattended service site, abnormal motion capability of a robot is discovered only when the robot fails to execute a task; that is, the motion capability of the robot cannot be known in time with the prior art.
Disclosure of Invention
The embodiments of the present application provide a self-checking method and a self-checking device, which can solve the problem that the motion capability of a robot cannot be known in time.
In a first aspect, an embodiment of the present application provides a self-inspection method, where the self-inspection method is applied to a robot, and the self-inspection method includes:
if the self-checking trigger information is detected, acquiring actual motion data of the robot;
and obtaining a motion behavior self-checking result according to the actual motion data and reference motion data, wherein the reference motion data is motion data collected when the robot is in a normal motion behavior state.
In a first possible implementation manner of the first aspect, before the acquiring actual motion data of the robot if the self-checking trigger information is detected, the method includes:
executing a specified task;
correspondingly, the acquiring actual motion data of the robot if the self-checking trigger information is detected includes:
if the self-checking trigger information is detected, continuing to execute the specified task, and acquiring actual motion data corresponding to the specified task that the robot continues to execute.
in a second possible implementation manner of the first aspect, the self-checking trigger information is designated geographic location information, and correspondingly, before the acquiring actual motion data of the robot if the self-checking trigger information is detected, the method includes:
determining the geographic position information of the robot;
correspondingly, if the self-checking trigger information is detected, the actual motion data of the robot is obtained as follows:
and if the detected geographic position information of the robot is the designated geographic position information, acquiring the actual motion data of the robot.
In a third possible implementation manner of the first aspect, the self-checking trigger information is designated marker information, and correspondingly, before the acquiring actual motion data of the robot if the self-checking trigger information is detected, the method includes:
determining information of the articles within a preset range;
correspondingly, if the self-checking trigger information is detected, the actual motion data of the robot is obtained as follows:
and if the information of the articles in the preset range is detected to be the designated marker information, acquiring the actual motion data of the robot.
In a fourth possible implementation manner of the first aspect, the actual motion data includes: an actual motion trajectory, and the reference motion data includes: a reference motion trajectory; correspondingly, the obtaining a motion behavior self-checking result according to the actual motion data and the reference motion data includes:
and obtaining a motion behavior self-checking result according to the deviation between the actual motion track and the reference motion track.
Based on the fourth possible implementation manner of the first aspect of the present application, in a fifth possible implementation manner, the obtaining a motion behavior self-check result according to a deviation between the actual motion trajectory and the reference motion trajectory includes:
and obtaining a motion behavior self-checking result according to the deviation between the form of the actual motion track and the form of the reference motion track.
In a sixth possible implementation manner of the first aspect, the actual motion data includes: a motion control command corresponding to the actual motion trajectory, and the reference motion data includes: a motion control command corresponding to the reference motion trajectory; correspondingly, the obtaining a motion behavior self-checking result according to the actual motion data and the reference motion data includes:
and obtaining a motion behavior self-checking result according to the deviation between the motion control command corresponding to the actual motion track and the motion control command corresponding to the reference motion track.
In a second aspect, an embodiment of the present application provides a self-inspection apparatus, where the self-inspection apparatus is applied to a robot, and the self-inspection apparatus includes:
the first acquisition unit is used for acquiring actual motion data of the robot if self-checking trigger information is detected;
and the second acquisition unit is used for obtaining a motion behavior self-checking result according to the actual motion data and reference motion data, wherein the reference motion data is motion data collected when the robot is in a normal motion behavior state.
In a third aspect, an embodiment of the present application provides a robot, including: a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the self-test method when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium which stores a computer program that, when executed by a processor, implements the steps of the self-checking method described above.
In a fifth aspect, the present application provides a computer program product, which when run on a robot, causes the robot to perform the steps of the self-test method according to any one of the first aspect.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Compared with the prior art, the embodiments of the present application have the following beneficial effects: when the self-checking trigger information is detected, the actual motion data of the robot can be acquired, and a motion behavior self-checking result is then obtained according to the actual motion data and the reference motion data. The motion behavior self-checking result can reflect the motion capability of the robot, instead of the motion capability being revealed only after the robot fails to execute a task, so operation and maintenance personnel can learn the motion capability of the robot in time.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1 is a schematic flow chart of a self-checking method provided in an embodiment of the present application;
Fig. 2 is a schematic structural diagram of a self-checking apparatus provided in an embodiment of the present application;
Fig. 3 is a schematic structural diagram of a robot provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Example one:
Fig. 1 shows a schematic flow chart of the self-checking method provided in an embodiment of the present application. The self-checking method is applied to a robot and is detailed as follows:
and S101, if self-checking trigger information is detected, acquiring actual motion data of the robot.
Specifically, if the self-checking trigger information is detected by a designated device, the actual motion data of the robot is acquired through a self-checking program. The self-checking trigger information is used to trigger (start) the self-checking program.
By way of example and not limitation, the designated device may include either of the following: a camera or a laser radar. The self-checking trigger information includes, but is not limited to, either of the following: designated two-dimensional image information or designated three-dimensional article information. When the designated device is a camera, step S101 is specifically: if the designated two-dimensional image information is detected through the camera, acquiring the actual motion data of the robot. Alternatively, when the designated device is a laser radar, step S101 is specifically: if the designated three-dimensional article information is detected through the laser radar, acquiring the actual motion data of the robot.
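As a purely illustrative sketch (not part of the patent text), the trigger check described above can be reduced to matching detected identifiers against the designated trigger information; the identifier strings are assumed to be produced by the camera or laser radar processing pipeline and are hypothetical.

# Minimal sketch of the trigger check (illustrative only, not the patent's implementation).
from typing import Iterable

def self_check_triggered(detected_ids: Iterable[str], trigger_ids: set) -> bool:
    """Return True if any detected 2D-image or 3D-article identifier matches
    the designated self-checking trigger information."""
    return any(d in trigger_ids for d in detected_ids)

# Example with made-up identifiers:
if self_check_triggered(["shelf_03", "selfcheck_marker_01"], {"selfcheck_marker_01"}):
    print("self-check triggered: start acquiring actual motion data")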
Optionally, the robot often needs to execute tasks. If the execution of a designated task and the execution of the self-checking method are independent of each other, the self-checking efficiency or the task execution efficiency may be negatively affected: if the designated task is executed first and the self-checking method afterwards, the self-checking efficiency is reduced; if the self-checking method is executed first and the designated task afterwards, the task execution efficiency of the robot is reduced. Therefore, in order to ensure both the self-checking efficiency and the task execution efficiency of the robot and to save the total time needed for the robot to execute the task and the self-checking method, before step S101 the method includes: executing a specified task; correspondingly, step S101 includes: if the self-checking trigger information is detected, continuing to execute the specified task, and acquiring actual motion data corresponding to the specified task that the robot continues to execute.
Specifically, the implementation of the specified task depends on the robot performing spatial motion. That is, the continuing to execute the specified task and acquiring the corresponding actual motion data if the self-checking trigger information is detected includes: if the self-checking trigger information is detected, continuing to execute the specified task, and, while the specified task continues to be executed, acquiring the actual motion data corresponding to the continued execution at intervals of a preset time length, until the self-checking stop indication information is detected.
The self-checking stop indication information is used to instruct the robot to stop the self-checking. For example, the self-checking stop indication information may include either of the following: designated task completion information, or information that the self-checking stop time point has been reached.
By way of example and not limitation, assume that the designated task is to travel to a designated charging pile (the robot starts charging after reaching it), the self-checking trigger information is information of a geographic position at a preset distance from the designated charging pile, the preset time length is 0.2 seconds, and the self-checking stop indication information is the designated task completion information, specifically information that the designated charging pile has been reached, or information that the charging pile connection is ready and charging can begin. The designated task is executed as follows: a motion path is planned according to the geographic position of the designated charging pile, and the robot moves along the planned motion path. Correspondingly, the continuing to execute the designated task and acquiring the corresponding actual motion data if the self-checking trigger information is detected is specifically: if the information of the geographic position at the preset distance from the designated charging pile is detected, continuing to move along the planned motion path, and acquiring the actual motion data corresponding to the movement along the planned motion path every 0.2 seconds until the designated task completion information is detected.
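A minimal sketch of the periodic sampling described in this example is given below; it is illustrative only, and get_pose and stop_detected are hypothetical hooks into the robot's navigation stack rather than functions defined by the patent.

import time
from typing import Callable, List, Tuple

Pose = Tuple[float, float]  # (x, y) position in the map frame

def collect_actual_motion_data(get_pose: Callable[[], Pose],
                               stop_detected: Callable[[], bool],
                               period_s: float = 0.2) -> List[Pose]:
    """Sample the robot's pose every period_s seconds while the designated
    task keeps running, until the self-checking stop indication is detected."""
    trajectory: List[Pose] = []
    while not stop_detected():
        trajectory.append(get_pose())  # one sample of actual motion data
        time.sleep(period_s)
    return trajectory

The 0.2-second period and the task-completion stop condition come from the charging-pile example above; any other stop indication (for example, a stop time point) could be supplied through the same stop_detected callable.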
Optionally, the self-checking trigger information is designated geographic location information. Correspondingly, before step S101, the method includes: determining the geographic position information of the robot; correspondingly, step S101 is specifically: if the detected geographic position information of the robot is the designated geographic location information, acquiring the actual motion data of the robot.
Specifically, the determining the geographic position information of the robot includes: acquiring environment perception information collected by a designated device, matching the environment perception information with map information, and determining the geographic position information of the robot according to the matching result.
By way of example and not limitation, the designated device is a laser radar, and the environment perception information is environmental space information obtained by processing the sensor data; the environmental space information may take the form of point cloud data, and the position of the robot may be determined based on the currently acquired environmental space information and the navigation map information.
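Once the robot has been localized, the location-based trigger reduces to a distance test against the designated geographic position. The sketch below is illustrative only; the tolerance value is an assumed parameter, not one given by the patent.

import math
from typing import Tuple

Point = Tuple[float, float]  # (x, y) in the navigation map frame

def at_designated_location(current: Point, designated: Point,
                           tolerance_m: float = 0.5) -> bool:
    """Return True when the robot's localized position matches the designated
    geographic location (within an assumed tolerance), i.e. when the
    self-checking trigger information is considered detected."""
    return math.dist(current, designated) <= tolerance_m

# Example: the trigger fires when the robot is 0.3 m from the designated point.
print(at_designated_location((10.3, 5.0), (10.0, 5.0)))  # True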
Optionally, the self-checking trigger information is designated marker information. Correspondingly, before step S101, the method includes: determining information of articles within a preset range; correspondingly, step S101 is specifically: if the information of an article within the preset range is detected to be the designated marker information, acquiring the actual motion data of the robot.
By way of example and not limitation, the information of the article is identification information of the article, for example, the information of the article is a name of the article, and the designated marker information is a designated marker name. Assuming that the name of the designated marker is a two-dimensional code, if the name of the article in the preset range is detected to be the two-dimensional code, acquiring actual motion data of the robot.
And S102, obtaining a motion behavior self-checking result according to the actual motion data and reference motion data, wherein the reference motion data is motion data collected when the robot is in a normal motion behavior state.
Specifically, the actual motion data and the reference motion data are compared, and a motion behavior self-checking result is obtained according to the comparison result, where the motion behavior self-checking result is either a self-checking result indicating that the motion behavior is normal or a self-checking result indicating that the motion behavior is abnormal.
Optionally, in order to acquire the reference motion data and thereby ensure smooth execution of step S102, before step S102 the method includes: if an initialization state trigger instruction is received, entering an initialization state, collecting motion sample data of the robot in the initialization state (the motion sample data being the actual motion data of the robot in the initialization state), and obtaining the reference motion data according to the motion sample data.
By way of example and not limitation, the robot in the initialization state collects motion sample data as follows: each time the robot in the initialization state detects the self-checking trigger information again, it acquires motion sample data again, until the amount of acquired motion sample data reaches a preset data quantity threshold.
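The patent does not prescribe how the reference motion data is derived from the samples; one simple possibility, shown below only as an illustrative sketch, is to average the sample trajectories point by point, assuming every sample has the same length and sampling interval.

from typing import List, Sequence, Tuple

Point = Tuple[float, float]

def build_reference_trajectory(samples: Sequence[Sequence[Point]]) -> List[Point]:
    """Average several sample trajectories (collected while the robot's motion
    behavior is known to be normal) into one reference trajectory.
    Assumes every sample has the same number of sampling points."""
    n_points = len(samples[0])
    if any(len(t) != n_points for t in samples):
        raise ValueError("sample trajectories must have equal length")
    reference: List[Point] = []
    for i in range(n_points):
        xs = [t[i][0] for t in samples]
        ys = [t[i][1] for t in samples]
        reference.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return reference

# Example: two nearly identical straight-line samples.
sample_data = [[(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)],
               [(0.0, 0.1), (1.0, 0.1), (2.0, 0.1)]]
print(build_reference_trajectory(sample_data))  # [(0.0, 0.05), (1.0, 0.05), (2.0, 0.05)]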
Optionally, the actual motion data includes: an actual motion trajectory, and the reference motion data includes: a reference motion trajectory; correspondingly, step S102 includes: obtaining a motion behavior self-checking result according to the deviation between the actual motion trajectory and the reference motion trajectory.
Specifically, the actual motion trajectory and the reference motion trajectory are compared to obtain a comparison result, and the deviation between the actual motion trajectory and the reference motion trajectory is determined according to the comparison result. If the deviation is smaller than or equal to a preset deviation threshold, a self-checking result indicating that the motion behavior is normal is determined as the motion behavior self-checking result; if the deviation is larger than the preset deviation threshold, a self-checking result indicating that the motion behavior is abnormal is determined as the motion behavior self-checking result.
In some embodiments, the sampling time intervals of the actual motion trajectory and the reference motion trajectory are the same, and the initial sampling point of the actual motion trajectory is the same as a specified sampling point of the reference motion trajectory. Correspondingly, the obtaining a motion behavior self-checking result according to the deviation between the actual motion trajectory and the reference motion trajectory includes: obtaining the motion behavior self-checking result according to the deviation between the coordinates of the sampling points of the actual motion trajectory and the coordinates of the sampling points of the reference motion trajectory, where each trajectory has more than one sampling point. Because the motion behavior self-checking result is obtained from coordinates, that is, from quantified data, its accuracy can be improved.
In a real situation, the robot may avoid an obstacle, so that the initial sampling point of the actual motion trajectory does not coincide with any specified sampling point of the reference motion trajectory. Therefore, in order to be able to obtain the motion behavior self-checking result according to the deviation between the coordinates of the sampling points of the actual motion trajectory and those of the reference motion trajectory, the method includes, before that step: performing a coordinate transformation on all sampling points of the reference motion trajectory so that the coordinates of one designated sampling point of the reference motion trajectory become the same as the coordinates of the initial sampling point of the actual motion trajectory. Correspondingly, the obtaining of the motion behavior self-checking result according to the deviation between the coordinates of the sampling points of the two trajectories includes: obtaining the motion behavior self-checking result according to the deviation between the coordinates of the sampling points of the actual motion trajectory and the coordinates of the coordinate-transformed sampling points of the reference motion trajectory.
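The two steps just described, translating the reference trajectory so that a designated sampling point coincides with the actual starting point and then comparing sampling-point coordinates against a preset deviation threshold, can be sketched as follows. This is an illustrative reading only: the threshold value and the use of the mean point-to-point distance as the deviation measure are assumptions, not requirements of the patent.

import math
from typing import List, Sequence, Tuple

Point = Tuple[float, float]

def align_reference(reference: Sequence[Point], designated_idx: int,
                    actual_start: Point) -> List[Point]:
    """Translate every sampling point of the reference trajectory so that the
    designated sampling point coincides with the actual trajectory's initial
    sampling point (the coordinate transformation described above)."""
    dx = actual_start[0] - reference[designated_idx][0]
    dy = actual_start[1] - reference[designated_idx][1]
    return [(x + dx, y + dy) for x, y in reference]

def motion_behavior_self_check(actual: Sequence[Point], reference: Sequence[Point],
                               designated_idx: int = 0,
                               deviation_threshold: float = 0.3) -> str:
    """Compare sampling-point coordinates of the actual and aligned reference
    trajectories; the mean point distance is used as the deviation here and the
    threshold of 0.3 m is an assumed value."""
    aligned = align_reference(reference, designated_idx, actual[0])
    n = min(len(actual), len(aligned) - designated_idx)
    deviation = sum(math.dist(actual[k], aligned[designated_idx + k]) for k in range(n)) / n
    return "normal" if deviation <= deviation_threshold else "abnormal"

# Example: the actual trajectory drifts sideways, so the result is "abnormal".
ref = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
act = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
print(motion_behavior_self_check(act, ref))  # abnormal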
In some embodiments, since the form of the actual motion trajectory of the robot can represent the actual motion behavior of the robot, the motion behavior self-checking result may be obtained from that form. In this case, the obtaining a motion behavior self-checking result according to the deviation between the actual motion trajectory and the reference motion trajectory includes: obtaining the motion behavior self-checking result according to the deviation between the form of the actual motion trajectory and the form of the reference motion trajectory.
By way of example and not limitation, assuming that the actual motion trajectory is in an "L-shaped" form, and the reference motion trajectory is in an "S-shaped" form, it is determined that a deviation between the "L-shaped" and the "S-shaped" is greater than a preset deviation threshold, and a self-checking result indicating that the motion behavior is abnormal is determined as a motion behavior self-checking result.
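The patent does not say how the "form" of a trajectory is quantified. One possible, purely illustrative descriptor is the total absolute heading change along the trajectory, which separates an essentially single-turn "L-shaped" path from a multi-turn "S-shaped" one; the descriptor and its use are assumptions for illustration.

import math
from typing import Sequence, Tuple

Point = Tuple[float, float]

def total_turning_angle(traj: Sequence[Point]) -> float:
    """Sum of absolute heading changes (radians) between consecutive segments,
    used here as a crude, assumed measure of trajectory form."""
    total = 0.0
    prev_heading = None
    for (x0, y0), (x1, y1) in zip(traj, traj[1:]):
        heading = math.atan2(y1 - y0, x1 - x0)
        if prev_heading is not None:
            diff = (heading - prev_heading + math.pi) % (2 * math.pi) - math.pi
            total += abs(diff)
        prev_heading = heading
    return total

# An "L-shaped" path turns roughly 90 degrees once; an "S-shaped" path turns more.
l_path = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
s_path = [(0, 0), (1, 1), (2, 0), (3, -1), (4, 0)]
print(total_turning_angle(l_path), total_turning_angle(s_path))  # ~1.57 vs ~3.14

The deviation between forms could then be taken as the difference between the two descriptor values and compared against a preset deviation threshold, as in the example above.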
Optionally, since the actual motion behavior of the robot is generated based on motion control commands, the motion control command corresponding to the actual motion trajectory of the robot can represent the actual motion behavior of the robot. In order to obtain the motion behavior self-checking result from this, the actual motion data includes: a motion control command corresponding to the actual motion trajectory, and the reference motion data includes: a motion control command corresponding to the reference motion trajectory. Correspondingly, step S102 includes: obtaining the motion behavior self-checking result according to the deviation between the motion control command corresponding to the actual motion trajectory and the motion control command corresponding to the reference motion trajectory.
By way of example and not limitation, the motion control commands corresponding to the reference motion trajectory may include, but are not limited to, a motion direction control command and/or a steering angle control command, and the motion control commands corresponding to the actual motion trajectory may likewise include a motion direction control command and/or a steering angle control command. Generally, the motion control commands corresponding to a substantially straight reference motion trajectory contain only a small number of steering angle control commands. For example, if the motion control commands corresponding to the reference motion trajectory contain 2 steering angle control commands with a 30-degree steering angle, while the motion control commands corresponding to the actual motion trajectory contain 20 such commands, the motion behavior self-checking result is obtained according to the difference between these two numbers.
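A minimal sketch of this command-level comparison is shown below, illustrative only; the dictionary representation of commands and the count threshold are assumptions, not part of the patent.

from typing import Sequence

def steering_count_deviation(actual_cmds: Sequence[dict],
                             reference_cmds: Sequence[dict],
                             angle_deg: float = 30.0) -> int:
    """Difference between the number of steering-angle control commands of a
    given angle in the actual and reference command streams. Commands are
    assumed to be dicts such as {"type": "steer", "angle": 30.0}."""
    def count_steer(cmds: Sequence[dict]) -> int:
        return sum(1 for c in cmds
                   if c.get("type") == "steer" and c.get("angle") == angle_deg)
    return abs(count_steer(actual_cmds) - count_steer(reference_cmds))

# Example from the text: 2 steering commands expected, 20 observed.
ref_cmds = [{"type": "steer", "angle": 30.0}] * 2 + [{"type": "forward"}] * 10
act_cmds = [{"type": "steer", "angle": 30.0}] * 20 + [{"type": "forward"}] * 10
diff = steering_count_deviation(act_cmds, ref_cmds)
print("abnormal" if diff > 5 else "normal")  # the threshold of 5 is an assumed value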
Optionally, since the motion behavior self-checking result can reflect the motion capability of the robot, in order to help operation and maintenance personnel learn the motion capability of the robot, after step S102 the method includes: uploading the motion behavior self-checking result to a designated terminal.
By way of example and not limitation, the designated terminal includes either of the following: a designated server or a designated mobile phone terminal.
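As a hedged illustration only, uploading the result to a designated server could look like the following; the endpoint URL and payload fields are hypothetical, and the patent does not specify any transport or message format.

import json
import urllib.request

def upload_self_check_result(result: str, robot_id: str,
                             endpoint: str = "http://example.com/api/self-check") -> int:
    """POST the motion behavior self-checking result to a designated server.
    The endpoint and payload schema are illustrative assumptions."""
    payload = json.dumps({"robot_id": robot_id, "motion_self_check": result}).encode("utf-8")
    req = urllib.request.Request(endpoint, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:  # returns the HTTP status code
        return resp.status

# Example call (requires a reachable endpoint):
# upload_self_check_result("abnormal", "robot_007")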
In the embodiment of the present application, when the self-checking trigger information is detected, the actual motion data of the robot can be acquired, and a motion behavior self-checking result is then obtained according to the actual motion data and the reference motion data. The motion behavior self-checking result can reflect the motion capability of the robot, instead of the motion capability being revealed only after the robot fails to execute a task, so operation and maintenance personnel can learn the motion capability of the robot in time.
Example two:
in correspondence with the above-described embodiments, fig. 2 shows a schematic structural diagram of a self-inspection apparatus provided in an embodiment of the present application, which is applied to a robot, and for convenience of explanation, only the parts related to the embodiment of the present application are shown.
The self-checking device comprises: a first obtaining unit 201 and a second obtaining unit 202.
The first obtaining unit 201 is configured to obtain actual motion data of the robot if the self-test trigger information is detected;
optionally, since the robot often needs to execute the task, if the execution process of the designated task and the execution process of the self-checking method are independent of each other, the self-checking efficiency or the task execution efficiency may be negatively affected, for example, if the designated task is executed first and then the self-checking method is executed, the self-checking efficiency may be reduced; if the self-checking method is executed first and then a designated task is executed, the task execution efficiency of the robot may be reduced, so that, in order to ensure the self-checking efficiency and the task execution efficiency of the robot and save the total time for executing the task and the robot to execute the self-checking method, the first obtaining unit 201 is further configured to, before executing the self-checking trigger information and obtaining the actual motion data of the robot, further: executing a specified task; correspondingly, when the first obtaining unit 201 is configured to perform the following operation, when the actual motion data of the robot is obtained if the self-test trigger information is detected, specifically: and if the self-checking trigger information is detected, continuing to execute the specified task, and acquiring actual motion data corresponding to the specified task which is continuously executed by the robot.
Optionally, the self-checking trigger information is designated geographic location information. Correspondingly, the self-checking apparatus further includes a first information determination unit configured to: determine the geographic position information of the robot before the first obtaining unit 201 acquires the actual motion data of the robot upon detection of the self-checking trigger information. Correspondingly, when acquiring the actual motion data of the robot upon detection of the self-checking trigger information, the first obtaining unit 201 is specifically configured to: if the detected geographic position information of the robot is the designated geographic location information, acquire the actual motion data of the robot.
Optionally, the self-checking trigger information is designated marker information. Correspondingly, the self-checking device further includes a second information determination unit configured to: determine information of articles within a preset range before the first obtaining unit 201 acquires the actual motion data of the robot upon detection of the self-checking trigger information. Correspondingly, when acquiring the actual motion data of the robot upon detection of the self-checking trigger information, the first obtaining unit 201 is specifically configured to: if the information of an article within the preset range is detected to be the designated marker information, acquire the actual motion data of the robot.
The second obtaining unit 202 is configured to obtain a motion behavior self-inspection result according to the actual motion data and reference motion data, where the reference motion data is motion data collected when the robot is in a normal motion behavior state.
Optionally, in order to acquire the reference motion data and thereby ensure that the second obtaining unit 202 can smoothly obtain the motion behavior self-checking result according to the actual motion data and the reference motion data, the self-checking apparatus further includes: a reference motion data acquisition unit.
The reference motion data acquisition unit is configured to: before the second obtaining unit 202 obtains the motion behavior self-checking result according to the actual motion data and the reference motion data, control the robot to enter an initialization state if an initialization state trigger instruction is received, collect motion sample data while the robot is in the initialization state (the motion sample data being the actual motion data of the robot in the initialization state), and obtain the reference motion data according to the motion sample data.
Optionally, the actual motion data comprises: actual motion trajectory, the reference motion data comprising: a reference motion trajectory; correspondingly, when the second obtaining unit 202 executes the motion behavior self-test result obtained according to the actual motion data and the reference motion data, it is specifically configured to: and obtaining a motion behavior self-checking result according to the deviation between the actual motion track and the reference motion track.
In some embodiments, the sampling time intervals of the actual motion trajectory and the reference motion trajectory are the same, and the initial sampling point of the actual motion trajectory is the same as a specified sampling point of the reference motion trajectory. Correspondingly, when obtaining the motion behavior self-checking result according to the deviation between the actual motion trajectory and the reference motion trajectory, the second obtaining unit 202 is specifically configured to: obtain the motion behavior self-checking result according to the deviation between the coordinates of the sampling points of the actual motion trajectory and the coordinates of the sampling points of the reference motion trajectory, where each trajectory has more than one sampling point. Because the motion behavior self-checking result is obtained from coordinates, that is, from quantified data, its accuracy can be improved.
In a real situation, the robot may avoid an obstacle, so that the initial sampling point of the actual motion trajectory does not coincide with any specified sampling point of the reference motion trajectory. Therefore, in order for the second obtaining unit 202 to obtain the motion behavior self-checking result according to the deviation between the coordinates of the sampling points of the actual motion trajectory and those of the reference motion trajectory, the self-checking apparatus further includes: a coordinate transformation unit.
The coordinate transformation unit is configured to: before the second obtaining unit 202 obtains the motion behavior self-checking result according to the deviation between the coordinates of the sampling points of the actual motion trajectory and those of the reference motion trajectory, perform a coordinate transformation on all sampling points of the reference motion trajectory so that the coordinates of one designated sampling point of the reference motion trajectory become the same as the coordinates of the initial sampling point of the actual motion trajectory. Correspondingly, when obtaining the motion behavior self-checking result according to the deviation between the coordinates of the sampling points of the two trajectories, the second obtaining unit 202 is specifically configured to: obtain the motion behavior self-checking result according to the deviation between the coordinates of the sampling points of the actual motion trajectory and the coordinates of the coordinate-transformed sampling points of the reference motion trajectory.
Optionally, since the motion behavior self-checking result can reflect the motion capability of the robot, in order to help operation and maintenance personnel learn the motion capability of the robot, the self-checking apparatus further includes: a result uploading unit.
The result uploading unit is configured to: after the second obtaining unit 202 obtains the motion behavior self-checking result according to the actual motion data and the reference motion data, upload the motion behavior self-checking result to the designated terminal.
In the embodiment of the present application, when the self-checking trigger information is detected, the actual motion data of the robot can be acquired, and a motion behavior self-checking result is then obtained according to the actual motion data and the reference motion data. The motion behavior self-checking result can reflect the motion capability of the robot, instead of the motion capability being revealed only after the robot fails to execute a task, so operation and maintenance personnel can learn the motion capability of the robot in time.
Example three:
fig. 3 is a schematic structural diagram of a robot according to an embodiment of the present application. As shown in fig. 3, the robot 3 of this embodiment includes: at least one processor 30 (only one shown in fig. 3), a memory 31, and a computer program 32 stored in the memory 31 and executable on the at least one processor 30, wherein the processor 30 executes the computer program 32 to implement the steps of any of the self-test method embodiments described above.
The robot may include, but is not limited to, the processor 30 and the memory 31. Those skilled in the art will appreciate that Fig. 3 is merely an example of the robot 3 and does not constitute a limitation of the robot 3; the robot may include more or fewer components than those shown, a combination of some components, or different components, such as input and output devices, network access devices, and the like.
The processor 30 may be a Central Processing Unit (CPU), and may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 31 may in some embodiments be an internal storage unit of the robot 3, such as a hard disk or a memory of the robot 3. The memory 31 may also be an external storage device of the robot 3 in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the robot 3. Further, the memory 31 may also include both an internal storage unit and an external storage device of the robot 3. The memory 31 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer program. The memory 31 may also be used to temporarily store data that has been output or is to be output.
It should be noted that, because the contents of information interaction, execution process, and the like between the above units are based on the same concept as that of the embodiment of the method of the present application, specific functions and technical effects thereof may be specifically referred to a part of the embodiment of the method, and details thereof are not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application provide a computer program product, which when running on a robot, enables the robot to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the robot, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random-Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash disk, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for instance, the division of the modules or units is only one logical functional division, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A self-checking method, applied to a robot, comprising:
if the self-checking trigger information is detected, acquiring actual motion data of the robot;
and obtaining a motion behavior self-checking result according to the actual motion data and reference motion data, wherein the reference motion data is motion data collected when the robot is in a normal motion behavior state.
2. The self-checking method of claim 1, wherein before the obtaining actual motion data of the robot if the self-checking trigger information is detected, the method comprises:
executing a specified task;
correspondingly, if the self-checking trigger information is detected, the acquiring of the actual motion data of the robot comprises:
and if the self-checking trigger information is detected, continuing to execute the specified task, and acquiring actual motion data corresponding to the specified task which is continuously executed by the robot.
3. The self-checking method according to claim 1, wherein the self-checking trigger information is designated geographical location information, and correspondingly, before acquiring actual motion data of the robot if the self-checking trigger information is detected, the method includes:
determining the geographic position information of the robot;
correspondingly, if the self-checking trigger information is detected, the actual motion data of the robot is obtained as follows:
and if the detected geographic position information of the robot is the designated geographic position information, acquiring the actual motion data of the robot.
4. The self-checking method according to claim 1, wherein the self-checking trigger information is designated marker information, and correspondingly, before acquiring actual motion data of the robot if the self-checking trigger information is detected, the method includes:
determining information of the articles within a preset range;
correspondingly, if the self-checking trigger information is detected, the actual motion data of the robot is obtained as follows:
and if the information of the articles in the preset range is detected to be the designated marker information, acquiring the actual motion data of the robot.
5. The self-checking method of claim 1, wherein the actual motion data comprises: an actual motion trajectory, and the reference motion data comprises: a reference motion trajectory; correspondingly, the obtaining a motion behavior self-checking result according to the actual motion data and the reference motion data comprises:
and obtaining a motion behavior self-checking result according to the deviation between the actual motion track and the reference motion track.
6. The self-checking method of claim 5, wherein the obtaining of the self-checking result of the motion behavior according to the deviation between the actual motion trajectory and the reference motion trajectory comprises:
and obtaining a motion behavior self-checking result according to the deviation between the form of the actual motion track and the form of the reference motion track.
7. The self-checking method of claim 1, wherein the actual motion data comprises: a motion control command corresponding to the actual motion trajectory, and the reference motion data comprises: a motion control command corresponding to the reference motion trajectory; correspondingly, the obtaining a motion behavior self-checking result according to the actual motion data and the reference motion data comprises:
and obtaining a motion behavior self-checking result according to the deviation between the motion control command corresponding to the actual motion track and the motion control command corresponding to the reference motion track.
8. A self-inspection apparatus applied to a robot, comprising:
the first acquisition unit is used for acquiring actual motion data of the robot if self-checking trigger information is detected;
and the second acquisition unit is used for obtaining a motion behavior self-checking result according to the actual motion data and reference motion data, wherein the reference motion data is motion data collected when the robot is in a normal motion behavior state.
9. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the steps of the method according to any of claims 1 to 7 are implemented when the computer program is executed by the processor.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN201911419107.9A 2019-12-31 2019-12-31 Self-checking method and device Active CN111136689B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911419107.9A CN111136689B (en) 2019-12-31 2019-12-31 Self-checking method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911419107.9A CN111136689B (en) 2019-12-31 2019-12-31 Self-checking method and device

Publications (2)

Publication Number Publication Date
CN111136689A true CN111136689A (en) 2020-05-12
CN111136689B CN111136689B (en) 2022-01-11

Family

ID=70523099

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911419107.9A Active CN111136689B (en) 2019-12-31 2019-12-31 Self-checking method and device

Country Status (1)

Country Link
CN (1) CN111136689B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006312208A (en) * 2005-05-09 2006-11-16 Toyota Motor Corp Emergency stop device of movable robot
CN104781781A (en) * 2014-11-14 2015-07-15 深圳市大疆创新科技有限公司 Control method, apparatus and moving equipment for object moving
CN107657866A (en) * 2016-07-25 2018-02-02 北京东易晖煌国际教育科技有限公司 A kind of Intelligent teaching robot of Modular programmable
CN107671887A (en) * 2017-08-22 2018-02-09 广东美的智能机器人有限公司 Robot self-test control method, robot and dispatch server
CN108214554A (en) * 2018-02-05 2018-06-29 刘春梅 A kind of self-checking system for intelligent track-traffic crusing robot
CN109878452A (en) * 2019-03-27 2019-06-14 北京三快在线科技有限公司 A kind of theft preventing method and device of unmanned dispensing device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113673199A (en) * 2021-10-21 2021-11-19 苏州浪潮智能科技有限公司 Design self-checking method, system and device for controllable depth drilling and storage medium

Also Published As

Publication number Publication date
CN111136689B (en) 2022-01-11

Similar Documents

Publication Publication Date Title
US11002840B2 (en) Multi-sensor calibration method, multi-sensor calibration device, computer device, medium and vehicle
CN110850872A (en) Robot inspection method and device, computer readable storage medium and robot
EP3629052A1 (en) Data collecting method and system
CN110470333B (en) Calibration method and device of sensor parameters, storage medium and electronic device
CN107578427B (en) Method and device for detecting dynamic obstacle and computer readable storage medium
CN112179361B (en) Method, device and storage medium for updating work map of mobile robot
CN111381586A (en) Robot and movement control method and device thereof
CN110774319B (en) Robot and positioning method and device thereof
CN111288971B (en) Visual positioning method and device
CN111060132A (en) Calibration method and device for travelling crane positioning coordinates
CN112686951A (en) Method, device, terminal and storage medium for determining robot position
CN113907663A (en) Obstacle map construction method, cleaning robot and storage medium
CN110850882A (en) Charging pile positioning method and device of sweeping robot
CN113029167A (en) Map data processing method, map data processing device and robot
CN111136689B (en) Self-checking method and device
KR20200010506A (en) Method and apparatus for determining the location of a static object
KR20190081334A (en) Method for tracking moving trajectory based on complex positioning and apparatus thereof
CN110861087A (en) Robot initialization positioning method and device, mobile robot and storage medium
CN114001728A (en) Control method and device for mobile robot, storage medium and electronic equipment
WO2022088613A1 (en) Robot positioning method and apparatus, device and storage medium
CN111157012B (en) Robot navigation method and device, readable storage medium and robot
CN113932790A (en) Map updating method, device, system, electronic equipment and storage medium
CN112223281A (en) Robot and positioning method and device thereof
CN115507840A (en) Grid map construction method, grid map construction device and electronic equipment
CN111158364B (en) Robot repositioning method and device and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant