CN113858208B - Robot detection method and device, electronic equipment and storage medium - Google Patents

Robot detection method and device, electronic equipment and storage medium

Info

Publication number
CN113858208B
Authority
CN
China
Prior art keywords
component
robot
detected
detection result
determining
Prior art date
Legal status
Active
Application number
CN202111255933.1A
Other languages
Chinese (zh)
Other versions
CN113858208A (en)
Inventor
吴燕芸
凌圆梦
郭双
Current Assignee
Shanghai Keenlon Intelligent Technology Co Ltd
Original Assignee
Shanghai Keenlon Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Keenlon Intelligent Technology Co Ltd filed Critical Shanghai Keenlon Intelligent Technology Co Ltd
Priority to CN202111255933.1A priority Critical patent/CN113858208B/en
Publication of CN113858208A publication Critical patent/CN113858208A/en
Application granted granted Critical
Publication of CN113858208B publication Critical patent/CN113858208B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/0095 Means or methods for testing manipulators

Abstract

The embodiment of the invention discloses a robot detection method, a robot detection device, electronic equipment and a storage medium. The method comprises the following steps: before the robot executes a task, a first connectivity detection result of a first component to be detected of the robot is determined in response to a detection request of the robot, the first component to be detected comprising a main control algorithm board; and if the first connectivity detection result is successful, a second connectivity detection result of at least one second component to be detected of the robot is determined. The technical solution provided by the embodiment of the invention solves the problem in the prior art that a fault is only handled after it actually occurs, whereupon the user contacts maintenance personnel for remote troubleshooting and the fault is handled only after being located on the robot, which reduces the working efficiency of the robot; the effects of improving the detection efficiency and the working efficiency of the robot are thereby achieved.

Description

Robot detection method and device, electronic equipment and storage medium
Technical Field
The embodiments of the invention relate to robot technology, and in particular to a robot detection method, a robot detection device, electronic equipment and a storage medium.
Background
With the popularization of intelligent robots, they are widely applied in industries such as services, reducing labor costs and improving working efficiency.
However, faults may occur while a robot is operating. In the prior art, a fault is only dealt with after it actually occurs: the user contacts maintenance personnel to remotely assist in troubleshooting, and the fault is handled only after it has been located on the robot, which reduces the working efficiency of the robot.
Disclosure of Invention
The embodiment of the invention provides a robot detection method, a robot detection device, electronic equipment and a storage medium, so as to improve the detection efficiency and the working efficiency of a robot.
In a first aspect, an embodiment of the present invention provides a robot detection method, including:
before a robot executes a task, responding to a detection request of the robot, and determining a first connectivity detection result of a first component to be detected of the robot; the first component to be detected comprises a main control algorithm board;
if the first connectivity detection result is successful, determining a second connectivity detection result of at least one second component to be detected of the robot; the second component to be detected comprises at least one of a laser radar, a stereoscopic vision component, an image acquisition component, a chassis control board, a motor drive board, an inertial sensor and a collision bar safety edge component.
In a second aspect, an embodiment of the present invention further provides a robot detection apparatus, including:
the first result determining module is used for determining a first connectivity detection result of a first component to be detected of the robot in response to a detection request of the robot before the robot executes a task; the first component to be detected comprises a main control algorithm board;
the second result determining module is used for determining a second connectivity detection result of at least one second component to be detected of the robot if the first connectivity detection result is successful; the second component to be detected comprises at least one of a laser radar, a stereoscopic vision component, an image acquisition component, a chassis control board, a motor drive board, an inertial sensor and a collision bar safety edge component.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
one or more processors;
storage means for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the robot detection method as described above.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the robot detection method as described above.
Before a robot executes a task, the embodiment of the invention responds to a detection request of the robot to determine a first connectivity detection result of a first component to be detected of the robot; the first component to be detected comprises a main control algorithm board. If the first connectivity detection result is successful, a second connectivity detection result of at least one second component to be detected of the robot is determined; the second component to be detected comprises at least one of a laser radar, a stereoscopic vision component, an image acquisition component, a chassis control board, a motor drive board, an inertial sensor and a collision bar safety edge component. This avoids the situation where a fault is only handled after it actually occurs, the user contacts maintenance personnel for remote troubleshooting, and the fault is handled only after being located on the robot, which reduces the working efficiency of the robot; the effects of improving the detection efficiency and the working efficiency of the robot are thereby achieved.
Drawings
Fig. 1 is a flowchart of a robot detection method according to a first embodiment of the present invention;
fig. 2 is a flowchart of a robot detection method according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a robot detection device according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a robot according to a fourth embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Example One
Fig. 1 is a flowchart of a robot detection method according to an embodiment of the present invention, where the embodiment is applicable to a situation where a robot performs connectivity detection, the method may be performed by a robot detection device according to the embodiment of the present invention, and the device may be implemented by software and/or hardware. Referring to fig. 1, the method for detecting a robot provided in this embodiment includes:
step 110, before a robot executes a task, determining a first connectivity detection result of a first component to be detected of the robot in response to a detection request of the robot; the first component to be detected comprises a main control algorithm board.
The task executed by the robot is a preset task such as a delivery task or a guiding task. Before the robot executes the task, a detection request of the robot is responded to. The detection request may be issued by a user manually pressing a detection key on the robot, issued by a user remotely operating the robot, generated by the robot itself when it is started, or generated when the control APP of the robot is started and run; this embodiment does not limit how the detection request is issued. Performing the self-check in response to the detection request before the robot executes the task avoids a fault only being discovered while the robot is executing the task, thereby improving task execution efficiency.
When the robot automatically issues a detection request on startup, options such as skipping the detection, automatically exiting when the detection times out, and manual exit by the user during the detection process can be supported, which improves the working efficiency of the robot.
After the detection request is obtained, a first connectivity detection result of a first component to be detected of the robot is determined. Connectivity detection detects the connection state of a component, and the connectivity detection result indicates whether the component is successfully connected. The detection result may be determined by sending a specified detection signal to the component to be detected and evaluating the feedback of the component to that signal; this embodiment does not limit how the result is determined.
The first component to be detected is the component detected first. It comprises the main control algorithm board of the robot, which is a circuit board with the core control function and may include an MCU and the like.
In this embodiment, optionally, determining a first connectivity detection result of the first component to be detected of the robot includes:
determining the first connectivity detection result according to the acquisition state of the component signals of the first component to be detected; wherein the acquisition state includes at least one of acquisition correctness, acquisition timeout, and acquisition error.
The component signal of the first component to be detected is the signal that the first component returns to the signal receiving equipment of the robot in response to the connectivity detection signal sent by the robot.
If the acquisition state is acquisition correct, that is, the acquired component signal conforms to the expected signal, the first connectivity detection result is determined to be connection success.
If the acquisition state is acquisition timeout, that is, the time taken to acquire the component signal exceeds a preset time, the first connectivity detection result is determined to be connection timeout. The cause of a connection timeout may be a fault at the signal transmitting end, a fault of the first component to be detected, a communication fault, and the like. The result is only determined to be connection timeout when the number of acquisition timeouts reaches a preset number, which avoids reporting a connection timeout caused by transient factors such as network fluctuation.
If the acquisition state is acquisition error, that is, the acquired component signal is inconsistent with the expected signal, the first connectivity detection result is determined to be connection failure.
Determining the first connectivity detection result from the acquisition state of the component signal of the first component to be detected allows the type of the detection result to be determined promptly and accurately, which facilitates subsequent analysis and processing; for example, the cause of a timeout can be analyzed for a connection-timeout result, and the first component to be detected can be serviced for a connection-failure result, improving the working efficiency of the robot.
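To make the acquisition-state logic above concrete, the following Python sketch (not part of the patent text) classifies one connectivity check into success, timeout, or failure; the callables send_probe and read_signal, the expected signal, and the timeout and retry constants are all hypothetical assumptions.

```python
# Hypothetical constants; the patent does not specify concrete values.
ACQUIRE_TIMEOUT_S = 2.0   # preset time allowed for acquiring the component signal
MAX_TIMEOUT_COUNT = 3     # preset number of timeouts before reporting CONNECTION_TIMEOUT


def check_connectivity(send_probe, read_signal, expected_signal):
    """Classify connectivity of a component (e.g. the main control algorithm board)
    from the acquisition state of its component signal."""
    timeouts = 0
    while timeouts < MAX_TIMEOUT_COUNT:
        send_probe()                              # send the connectivity detection signal
        signal = read_signal(timeout=ACQUIRE_TIMEOUT_S)
        if signal is None:                        # acquisition timeout
            timeouts += 1                         # retry to rule out transient causes
            continue                              # such as network fluctuation
        if signal == expected_signal:             # acquisition correct
            return "CONNECTION_SUCCESS"
        return "CONNECTION_FAILURE"               # acquisition error: unexpected signal
    return "CONNECTION_TIMEOUT"                   # timeout count reached the preset number
```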
In this embodiment, optionally, after determining the first connectivity detection result of the first component to be detected of the robot, the method further includes:
and if the first connectivity detection result is failure, stopping the determining process of the second connectivity detection result.
When the first connectivity detection result is failure, the first component to be detected may itself be faulty, which would affect the accuracy of the second connectivity detection; the subsequent determination of the second connectivity detection result is therefore stopped, improving detection efficiency. In addition, when the first connectivity detection result is failure, the robot is automatically locked, avoiding further faults or accidents caused by abnormal operation of the robot and ensuring safety.
Step 120, if the first connectivity detection result is successful, determining a second connectivity detection result of at least one second component to be detected of the robot; the second component to be detected comprises at least one of a laser radar, a stereoscopic vision component, an image acquisition component, a chassis control board, a motor drive board, an inertial sensor and a collision bar safety edge component.
If the first connectivity detection result is successful, that is, the connection state of the first component to be detected is normal, a second connectivity detection result of at least one second component to be detected of the robot is determined. A second component to be detected is a component at the layer below the first component to be detected, for example a component controlled by the first component to be detected. The second connectivity detection may check whether the connection between a second component to be detected and the first component to be detected is successful, or whether the connections among second components to be detected are successful. For example, if the first component to be detected is the main control algorithm board and the second component to be detected is the laser radar, the second connectivity detection result of the laser radar may indicate whether the main control algorithm board is connected to the laser radar. If the second components to be detected are the chassis control board and the motor drive board, the second connectivity detection result of the motor drive board may indicate whether the main control algorithm board is connected to the motor drive board and whether the motor drive board is connected to the chassis control board.
When the second connectivity detection result of a certain second component to be detected is failure, the connectivity of the other second components to be detected continues to be detected, and the failure result is reported to the relevant maintenance personnel through an application program or other channels, so that they can obtain the second connectivity failure result and carry out the corresponding fault handling in time.
The second component to be detected comprises at least one of a laser radar, a stereoscopic vision component, an image acquisition component, a chassis control board, a motor drive board, an inertial sensor and a collision bar safety edge component. The stereoscopic vision component among the second components to be detected may be binocular stereoscopic vision equipment; the image acquisition component may be an image acquisition device, which is not limited in this embodiment.
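As a concrete illustration of the two-stage flow described above (check the main control algorithm board first; only on success check the second components, continuing past individual failures and reporting them), here is a minimal Python sketch; the component names and the check_component, lock_robot and notify_maintainer callables are hypothetical.

```python
SECOND_COMPONENTS = [
    "lidar", "stereo_vision", "image_acquisition", "chassis_control_board",
    "motor_drive_board", "inertial_sensor", "collision_bar_safety_edge",
]


def run_connectivity_self_check(check_component, lock_robot, notify_maintainer):
    """Two-stage connectivity self-check performed before the robot executes a task."""
    results = {}

    # Stage 1: first component to be detected (main control algorithm board).
    first = check_component("main_control_algorithm_board")
    results["main_control_algorithm_board"] = first
    if first != "CONNECTION_SUCCESS":
        lock_robot()                     # stop here and lock the robot for safety
        return results

    # Stage 2: second components are only checked once stage 1 succeeded.
    for name in SECOND_COMPONENTS:
        result = check_component(name)
        results[name] = result
        if result != "CONNECTION_SUCCESS":
            # Detection of the remaining components continues; the failure is
            # reported so maintenance personnel can handle it promptly.
            notify_maintainer(name, result)
    return results
```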
Optionally, a detection report is generated from the first connectivity detection result and/or the second connectivity detection result, and may include the final detection result, the detection time, and the like for each component to be detected. For example, if the second connectivity detection result of the stereoscopic vision component is found to be abnormal at 09:57 on June 20, 2019, the abnormal state and the corresponding time are recorded and displayed at a preset position, so that the user can intuitively view the first connectivity detection result and/or the second connectivity detection result. The time of the cloud or of the control system may also be automatically synchronized to the robot at preset moments, for example when the robot is started, to ensure that the time in the detection report is consistent with the standard time.
The detection information generated during the detection process can also be synchronized to the robot log and uploaded to the robot backend. The detection information may include the robot ID, the detected components, the detection start time, the detection end time, the detection type, the detection result and the like, which facilitates subsequent analysis and processing.
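A minimal sketch of one such detection-information entry, assuming a simple dictionary record and ISO-8601 timestamps; the field names are hypothetical and merely mirror the items listed above.

```python
from datetime import datetime, timezone


def make_detection_record(robot_id, component, detection_type, result,
                          start_time, end_time):
    """Build one detection-information entry for the robot log / backend upload."""
    return {
        "robot_id": robot_id,
        "component": component,
        "detection_type": detection_type,   # e.g. "connectivity" or "function"
        "result": result,
        "start_time": start_time.isoformat(),
        "end_time": end_time.isoformat(),
    }


# Example: the stereoscopic vision component found abnormal during connectivity detection.
record = make_detection_record(
    robot_id="R-001",
    component="stereo_vision",
    detection_type="connectivity",
    result="ABNORMAL",
    start_time=datetime.now(timezone.utc),
    end_time=datetime.now(timezone.utc),
)
```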
The detection report and the robot log may be checked remotely or locally, which is not limited in this embodiment.
The second connectivity detection result of the at least one second component to be detected is determined so that, according to each second connectivity detection result, it can be decided whether the robot executes the task or whether further detection and analysis of the robot is carried out.
According to the technical solution provided by this embodiment, before the robot executes a task, a first connectivity detection result of a first component to be detected of the robot is determined in response to a detection request of the robot; the first component to be detected comprises a main control algorithm board; and if the first connectivity detection result is successful, a second connectivity detection result of at least one second component to be detected of the robot is determined. Completing the connectivity fault detection before the robot performs a task helps guarantee its normal operation: when a detection result is failure, the faulty component can be located accurately, and when the detection results are successful, the task can be executed in time. This avoids the situation where a fault is only handled after it actually occurs, the user contacts maintenance personnel for remote troubleshooting, and the fault is handled only after being located on the robot, which reduces the working efficiency of the robot; the effects of improving the detection efficiency and the working efficiency of the robot are thereby achieved.
Example Two
Fig. 2 is a flowchart of a robot detection method according to a second embodiment of the present invention. This technical solution supplements the process after the second connectivity detection result of the at least one second component to be detected of the robot has been determined. Compared with the above solution, this solution is further optimized in that the method additionally comprises: obtaining a component operation result of the second component to be detected;
and determining a component function detection result of the at least one second component to be detected according to the component operation result. Specifically, the flow of the robot detection method is shown in fig. 2:
step 210, before a robot executes a task, determining a first connectivity detection result of a first component to be detected of the robot in response to a detection request of the robot; the first component to be detected comprises a main control algorithm board.
Step 220, if the first connectivity detection result is successful, determining a second connectivity detection result of at least one second component to be detected of the robot; the second component to be detected comprises at least one of a laser radar, a stereoscopic vision component, an image acquisition component, a chassis control board, a motor drive board, an inertial sensor and a collision bar safety edge component.
And 230, obtaining a component operation result of the second component to be detected.
The component operation result is the data or the execution result obtained when the component executes its own function.
For example, the function of the laser radar in the second component to be detected may be positioning; the function of the stereoscopic vision component may be to acquire three-dimensional geometric information of objects; the function of the image acquisition component may be to acquire images; the functions of the chassis control board may include controlling the robot's bottom-layer hardware, such as the chassis; the motor drive board drives the motors, and its functions may include controlling the robot's driving wheels, driven wheels, and the like; the function of the inertial sensor may be to detect and measure acceleration and rotational motion; and the collision bar safety edge component may be used to detect whether a collision with an object has occurred. Taking the laser radar and the collision bar safety edge component as examples, the component operation result of the laser radar may include the position data of the robot, and the component operation result of the collision bar safety edge component may be the collision signal returned when the robot collides with an obstacle.
And 240, determining a component function detection result of at least one second component to be detected according to the component operation result.
The component function detection result indicates whether the function of the component is normal; for example, if the laser radar is used for positioning, its component function detection result indicates whether the positioning function of the laser radar is accurate.
The component function detection result of the at least one second component to be detected may be determined from a single component operation result, or jointly from a plurality of component operation results.
For example, the robot can be controlled to collide with an obstacle and the signal returned by the collision bar safety edge component is obtained; if the signal shows that the robot has collided with the obstacle, the component function detection result of the collision bar safety edge component is determined to be normal.
When the laser radar and the inertial sensor are used for positioning, the robot can be moved to a test site and it is judged whether the positioning information from the laser radar and the inertial sensor matches the known position of the test site; if so, the component function detection results of the laser radar and the inertial sensor are normal.
In this embodiment, optionally, if the second component to be detected includes the laser radar, determining a component function detection result of at least one second component to be detected according to the component operation result includes:
obtaining environmental information of the current position of the robot according to the laser radar operation result;
and determining a component function detection result of the laser radar according to a comparison result of the map information of the current position and the environment information.
The current position is determined through the laser radar, and the environment information of the current position is detected. The environment information is compared with the map information acquired in advance for the current position, and it is determined whether the similarity exceeds a preset percentage threshold. If so, the component function detection result of the laser radar is determined to be normal; if not, it is determined to be abnormal and an abnormality warning is issued.
Determining the component function detection result of the laser radar from the comparison of the map information and the environment information of the current position, before the robot executes the task, yields an accurate component function detection result for the laser radar, avoids an abnormality of the laser radar affecting the execution of the actual task, and improves the accuracy of the robot's task execution and its task completion efficiency.
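The comparison can be sketched as follows in Python; the sketch assumes the environment scan and the pre-acquired map are represented as occupancy grids of equal size (a representation the patent does not prescribe), and the 0.9 threshold is a hypothetical preset percentage threshold.

```python
def lidar_function_check(scan_cells, map_cells, similarity_threshold=0.9):
    """Compare the laser radar's view of the current position with the pre-acquired
    map and decide whether the component function detection result is normal."""
    if not scan_cells or len(scan_cells) != len(map_cells):
        return "ABNORMAL"                       # nothing comparable: treat as abnormal
    matches = sum(1 for s, m in zip(scan_cells, map_cells) if s == m)
    similarity = matches / len(scan_cells)
    # Normal when the similarity exceeds the preset percentage threshold,
    # otherwise abnormal (and an abnormality warning would be raised).
    return "NORMAL" if similarity > similarity_threshold else "ABNORMAL"
```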
In this embodiment, optionally, determining a component function detection result of at least one second component to be detected according to the component operation result includes:
obtaining a component set to be detected according to a preset combination relation of at least two second components to be detected;
and determining the component function detection result of the component set to be detected according to the component operation result set corresponding to the component set to be detected.
The preset combination relation is a combination relation between different second components to be detected and can be determined according to whether those components need to cooperate. For example, the positioning function of the robot requires the laser radar to be combined with the inertial sensor, or the laser radar to be combined with the inertial sensor and the image acquisition component; accordingly, a preset combination relation exists between the laser radar and the inertial sensor, or among the laser radar, the inertial sensor and the image acquisition component.
A component set to be detected is formed from the second components to be detected that have a preset combination relation; the component operation results corresponding to the components in the set are acquired, and the component function detection result of the component set to be detected is determined jointly from them.
For example, if a preset combination relation exists among the laser radar, the inertial sensor and the image acquisition component, the laser radar operation result, the inertial sensor operation result and the image acquisition component operation result are acquired, and the detection result of the robot's positioning function is determined jointly from these three operation results.
Determining the component function detection result of a component set to be detected formed from second components with a preset combination relation allows several component function detection results to be obtained from a single run of the robot, which improves the efficiency of determining component function detection results. Jointly determining the component function detection result of the set from the corresponding set of component operation results also reduces, for functions completed by a combination of several components, the error of judging the corresponding robot function from a single second component to be detected, and thus improves the accuracy of determining the robot function detection results.
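A minimal Python sketch of grouping second components by a preset combination relation and determining one function result per component set; the example mapping and the evaluate_set callable are hypothetical.

```python
# Hypothetical preset combination relations: second components that must cooperate
# to realise one robot function form one component set to be detected.
PRESET_COMBINATIONS = {
    "positioning": ["lidar", "inertial_sensor", "image_acquisition"],
}


def check_component_sets(operation_results, evaluate_set):
    """Determine one component function detection result per component set from the
    corresponding set of component operation results."""
    set_results = {}
    for function_name, members in PRESET_COMBINATIONS.items():
        member_results = {name: operation_results[name] for name in members}
        set_results[function_name] = evaluate_set(function_name, member_results)
    return set_results
```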
In this embodiment, optionally, if the second to-be-detected component set includes the image acquisition assembly and the inertial sensor, determining, according to a component operation result set corresponding to the to-be-detected component set, a component function detection result of the to-be-detected component set includes:
acquiring a starting position of the robot, and controlling the robot to move to a target label;
obtaining a label image of the target label according to the operation result of the image acquisition component so as to determine a label position corresponding to the target label according to the label image;
obtaining a sensor data change result of the robot moving from the starting position to the target tag according to the operation result of the inertial sensor;
and determining the functional detection results of the image acquisition component and the inertial sensor according to the position change results of the starting position and the label position and the sensor data change results.
The start position of the robot may be acquired by starting the robot at a preset position. The robot is then controlled to move to the target tag, which may be the tag closest to the start position; this embodiment does not limit the choice of target tag. The target tag can be located on an indoor ceiling, and the image acquisition component can be located on top of the robot, which makes it convenient to acquire the tag image.
A tag image of the target tag is obtained from the operation result of the image acquisition component, that is, the target tag is photographed. The position information corresponding to the tag image is predetermined, so the tag position corresponding to the target tag can be obtained by performing image recognition on the tag image.
A sensor data change result along the robot's path from the start position to the target tag is obtained from the operation result of the inertial sensor.
The position change result may include the distance and direction from the start position to the tag position and can be calculated from the map coordinates of the start position and of the target tag position. At the same time, an estimated position change result can be calculated from the sensor data change result. The estimated position change result is compared with the position change result to determine whether the difference exceeds a preset threshold. If it does not, the function detection results of the image acquisition component and the inertial sensor can be determined to be normal; if it does, at least one of the image acquisition component and the inertial sensor can be determined to be abnormal, a subsequent single-component check can be carried out, and an abnormality alarm is raised for the abnormal component.
Controlling the robot to move from the start position to the target tag yields the function detection results of the image acquisition component and the inertial sensor at the same time, which improves the efficiency of determining component function detection results. Obtaining the function detection results of the image acquisition component and the inertial sensor jointly from the robot's position change result and the sensor data change result improves the accuracy of determining those results.
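As an illustration of this joint check, the Python sketch below compares the map-based position change (start position to the tag position recognised from the tag image) with the displacement estimated from the inertial sensor data; the 2-D coordinate representation and the 0.2 m threshold are assumptions, not values from the patent.

```python
import math


def camera_imu_function_check(start_pos, tag_pos, imu_displacement,
                              distance_threshold=0.2):
    """Jointly check the image acquisition component and the inertial sensor by
    comparing the map-based position change with the IMU-estimated displacement."""
    # Position change computed from the map coordinates of the start position
    # and of the tag position recognised from the tag image.
    dx_map = tag_pos[0] - start_pos[0]
    dy_map = tag_pos[1] - start_pos[1]
    # Difference between the map-based and the IMU-estimated position change.
    error = math.hypot(dx_map - imu_displacement[0], dy_map - imu_displacement[1])
    if error <= distance_threshold:
        return "NORMAL"        # both components pass the joint function check
    # Otherwise at least one of the two components is abnormal; a follow-up
    # single-component check would locate the faulty one and raise an alarm.
    return "ABNORMAL"
```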
According to this embodiment of the invention, before the robot executes the task and after it has been determined that the robot can operate normally, the component function detection result of at least one second component to be detected is determined according to the component operation result. Whether each second component to be detected can accurately execute its corresponding function is thus determined on the premise that the robot can operate normally, which avoids a second component to be detected causing an abnormal function that affects the execution of the actual task, and improves the accuracy of the robot's task execution and its task completion efficiency.
Example Three
Fig. 3 is a schematic structural diagram of a robot detection device according to a third embodiment of the present invention. The device can be implemented in hardware and/or software, can execute the robot detection method provided by any embodiment of the invention, and has the functional modules and beneficial effects corresponding to the executed method. As shown in fig. 3, the apparatus includes:
a first result determining module 310, configured to determine, before a robot performs a task, a first connectivity detection result of a first component to be detected of the robot in response to a detection request for the robot; the first component to be detected comprises a main control algorithm board;
a second result determining module 320, configured to determine a second connectivity detection result of at least one second component to be detected of the robot if the first connectivity detection result is successful; the second component to be detected comprises at least one of a laser radar, a stereoscopic vision component, an image acquisition component, a chassis control board, a motor drive board, an inertial sensor and a collision bar safety edge component.
According to the technical solution provided by this embodiment, before the robot executes a task, a first connectivity detection result of a first component to be detected of the robot is determined in response to a detection request of the robot; the first component to be detected comprises a main control algorithm board. If the first connectivity detection result is successful, a second connectivity detection result of at least one second component to be detected of the robot is determined; when a detection result is failure, the faulty component can be located accurately, and when the detection results are all successful, the task can be executed in time. This avoids the situation where a fault is only handled after it actually occurs, the user contacts maintenance personnel for remote troubleshooting, and the fault is handled only after being located on the robot, which reduces the working efficiency of the robot; the effects of improving the detection efficiency and the working efficiency of the robot are thereby achieved.
Based on the above technical solutions, optionally, if the second connectivity detection result is successful, the apparatus further includes:
the operation result obtaining module is used for obtaining the component operation result of the second component to be detected after the second result determining module;
and the function detection result determining module is used for determining the component function detection result of at least one second component to be detected according to the component operation result.
On the basis of the above technical solutions, optionally, if the second component to be detected includes the laser radar, the function detection result determining module includes:
the environment information obtaining unit is used for obtaining the environment information of the current position of the robot according to the laser radar operation result;
and the first function detection result determining unit is used for determining the function detection result of the components of the laser radar according to the comparison result of the map information of the current position and the environment information.
On the basis of the above technical solutions, optionally, the function detection result determining module includes:
the component set obtaining unit is used for obtaining a component set to be detected according to a preset combination relation of at least two second components to be detected;
and the second function detection result determining unit is used for determining the component function detection result of the component set to be detected according to the component operation result set corresponding to the component set to be detected.
On the basis of the above technical solutions, optionally, if the second component set to be detected includes the image acquisition component and the inertial sensor, the second function detection result determining unit includes:
the starting position acquisition subunit is used for acquiring the starting position of the robot and controlling the robot to move to the target label;
the label position determining subunit is used for obtaining a label image of the target label according to the operation result of the image acquisition component so as to determine the label position corresponding to the target label according to the label image;
the data change result obtaining subunit is used for obtaining a sensor data change result of the robot moving from the starting position to the target label according to the operation result of the inertial sensor;
and the function detection result determining subunit is used for determining the function detection results of the image acquisition assembly and the inertial sensor according to the position change results of the starting position and the label position and the sensor data change results.
On the basis of the above technical solutions, optionally, the first result determining module includes:
a first result determining unit, configured to determine the first connectivity detection result according to an acquisition state of a component signal of the first component to be detected; wherein the acquisition state includes at least one of acquisition correctness, acquisition timeout, and acquisition error.
On the basis of the above technical solutions, optionally, the apparatus further includes:
and the determining process stopping module is used for stopping the determining process of the second connectivity detection result if the first connectivity detection result is failed after the first result determining module.
Example Four
Fig. 4 is a schematic structural diagram of a robot according to a fourth embodiment of the present invention. As shown in fig. 4, the robot includes a processor 40, a memory 41, an input device 42 and an output device 43; the number of processors 40 in the robot may be one or more, and one processor 40 is taken as an example in fig. 4. The processor 40, the memory 41, the input device 42 and the output device 43 in the robot may be connected by a bus or in another manner; connection by a bus is taken as an example in fig. 4.
The memory 41 is a computer readable storage medium, and may be used to store a software program, a computer executable program, and modules, such as program instructions/modules corresponding to the robot inspection method in the embodiment of the present invention. The processor 40 executes various functional applications and data processing of the robot by running software programs, instructions and modules stored in the memory 41, i.e., implements the above-described robot inspection method.
The memory 41 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for functions; the storage data area may store data created according to the use of the terminal, etc. In addition, memory 41 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, memory 41 may further include memory remotely located relative to processor 40, which may be connected to the robot via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Example Five
A fifth embodiment of the present invention also provides a storage medium containing computer-executable instructions, which when executed by a computer processor, are for performing a robot detection method, the method comprising:
before a robot executes a task, responding to a detection request of the robot, and determining a first connectivity detection result of a first component to be detected of the robot; the first component to be detected comprises a main control algorithm board;
if the first connectivity detection result is successful, determining a second connectivity detection result of at least one second component to be detected of the robot; the second component to be detected comprises at least one of a laser radar, a stereoscopic vision component, an image acquisition component, a chassis control board, a motor drive board, an inertial sensor and a collision bar safety edge component.
Of course, the storage medium containing the computer executable instructions provided in the embodiments of the present invention is not limited to the method operations described above, and may also perform the related operations in the robot detection method provided in any embodiment of the present invention.
From the above description of embodiments, it will be clear to a person skilled in the art that the present invention may be implemented by means of software and necessary general-purpose hardware, and of course also by hardware alone, although in many cases the former is the preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied, in essence or in the part contributing to the prior art, in the form of a software product, which may be stored in a computer-readable storage medium, such as a floppy disk, a read-only memory (ROM), a random access memory (RAM), a flash memory (FLASH), a hard disk or an optical disk of a computer, and which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods according to the embodiments of the present invention.
It should be noted that, in the embodiment of the robot detecting device, each unit and module included are only divided according to the functional logic, but not limited to the above-mentioned division, so long as the corresponding functions can be implemented; in addition, the specific names of the functional units are also only for distinguishing from each other, and are not used to limit the protection scope of the present invention.
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (7)

1. A robot inspection method comprising:
before a robot executes a task, responding to a detection request of the robot, and determining a first connectivity detection result of a first component to be detected of the robot; the first component to be detected comprises a main control algorithm board; the detection request is generated by the robot when the robot is started or the control APP of the robot is started and operated;
if the first connectivity detection result is successful, determining a second connectivity detection result of a second component to be detected of the robot; the second component to be detected is a lower-level component of the first component to be detected; the second component to be detected comprises an image acquisition component and an inertial sensor; the second connectivity detection result comprises detection of whether the connection between the second component to be detected and the first component to be detected is successful or not;
if the second connectivity detection result is successful, acquiring a component operation result of the second component to be detected;
obtaining a component set to be detected according to a preset combination relation of at least two second components to be detected; the preset combination relation is a combination relation among different second components to be detected and is determined by whether the different components to be detected need to cooperate or not;
wherein the component set to be detected comprises the image acquisition component and the inertial sensor; acquiring a starting position of the robot, and controlling the robot to move to a target tag;
obtaining a label image of the target label according to the operation result of the image acquisition component so as to determine a label position corresponding to the target label according to the label image;
obtaining a sensor data change result of the robot moving from the starting position to the target tag according to the operation result of the inertial sensor;
and determining the functional detection results of the image acquisition component and the inertial sensor according to the position change results of the starting position and the label position and the sensor data change results.
2. The method according to claim 1, wherein if the second component to be detected includes a laser radar, determining a component function detection result of the second component to be detected according to the component operation result includes:
obtaining environmental information of the current position of the robot according to the laser radar operation result;
and determining a component function detection result of the laser radar according to a comparison result of the map information of the current position and the environment information.
3. The method of claim 1, wherein determining a first connectivity test result for a first component to be tested of the robot comprises:
determining the first connectivity detection result according to the acquisition state of the component signals of the first component to be detected; wherein the acquisition state includes at least one of acquisition correctness, acquisition timeout, and acquisition error.
4. The method according to claim 1, further comprising, after determining a first connectivity test result of a first component to be tested of the robot:
and if the first connectivity detection result is failure, stopping the determining process of the second connectivity detection result.
5. A robot inspection device, comprising:
the first result determining module is used for determining a first connectivity detection result of a first component to be detected of the robot in response to a detection request of the robot before the robot executes a task; the first component to be detected comprises a main control algorithm board; the detection request is generated by the robot when the robot is started or when the control APP of the robot is started and operated;
the second result determining module is used for determining a second connectivity detection result of a second component to be detected of the robot if the first connectivity detection result is successful; the second component to be detected is a lower-level component of the first component to be detected; the second component to be detected comprises an image acquisition component and an inertial sensor;
the second connectivity detection result comprises detection of whether the connection between the second component to be detected and the first component to be detected is successful or not;
the operation result obtaining module is used for obtaining the component operation result of the second component to be detected if the second connectivity detection result is successful;
the function detection result determining module is used for determining a component function detection result of at least one second component to be detected according to the component operation result;
wherein, the function detection result determining module includes:
the component set obtaining unit is used for obtaining a component set to be detected according to a preset combination relation of at least two second components to be detected; the preset combination relation is a combination relation among different second components to be detected and is determined by whether the different components to be detected need to cooperate or not;
the second function detection result determining unit is used for determining a component function detection result of the component set to be detected according to the component operation result set corresponding to the component set to be detected;
the component set to be detected comprises the image acquisition component and the inertial sensor, and the second function detection result determining unit comprises:
the starting position acquisition subunit is used for acquiring the starting position of the robot and controlling the robot to move to the target label;
the label position determining subunit is used for obtaining a label image of the target label according to the operation result of the image acquisition component so as to determine the label position corresponding to the target label according to the label image;
the data change result obtaining subunit is used for obtaining a sensor data change result of the robot moving from the starting position to the target label according to the operation result of the inertial sensor;
and the function detection result determining subunit is used for determining the function detection results of the image acquisition assembly and the inertial sensor according to the position change results of the starting position and the label position and the sensor data change results.
6. A robot, the robot comprising:
one or more processors;
storage means for storing one or more programs,
which, when executed by the one or more processors, cause the one or more processors to implement the robot detection method of any one of claims 1-4.
7. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the robot detection method according to any one of claims 1-4.
CN202111255933.1A 2021-10-27 2021-10-27 Robot detection method and device, electronic equipment and storage medium Active CN113858208B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111255933.1A CN113858208B (en) 2021-10-27 2021-10-27 Robot detection method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111255933.1A CN113858208B (en) 2021-10-27 2021-10-27 Robot detection method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113858208A CN113858208A (en) 2021-12-31
CN113858208B true CN113858208B (en) 2024-01-23

Family

ID=78997985

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111255933.1A Active CN113858208B (en) 2021-10-27 2021-10-27 Robot detection method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113858208B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130027339A (en) * 2011-09-07 2013-03-15 엘지전자 주식회사 Mobile robot, and system and method for remotely controlling the same
CN109623877A (en) * 2019-01-21 2019-04-16 广州高新兴机器人有限公司 Self checking method, self-checking system and the computer storage medium of robot
CN112536819A (en) * 2020-05-26 2021-03-23 深圳优地科技有限公司 Fault processing method and device, robot and storage medium
CN112549045A (en) * 2020-12-05 2021-03-26 武汉千音科技有限公司 Submarine cable fault detection robot for ocean engineering
CN113447059A (en) * 2021-06-03 2021-09-28 北京百度网讯科技有限公司 Detection method and device for sensor of automatic driving automobile and electronic equipment

Also Published As

Publication number Publication date
CN113858208A (en) 2021-12-31

Similar Documents

Publication Publication Date Title
CN106840242B (en) Sensor self-checking system and multi-sensor fusion system of intelligent driving automobile
US10414452B2 (en) System and method for vehicle inspection
EP3232285B1 (en) Method and arrangement for monitoring and adapting the performance of a fusion system of an autonomous vehicle
CN110834642B (en) Vehicle deviation identification method and device, vehicle and storage medium
CN111881520A (en) Anomaly detection method and device for automatic driving test, computer equipment and storage medium
CN113212453B (en) Automatic driving vehicle fusion navigation decision method in internet environment
US11189113B2 (en) Forward collision avoidance assist performance inspection system and method thereof
CN113110266B (en) Remote control monitoring early warning method for automatic driving vehicle and storage medium
CN112536819A (en) Fault processing method and device, robot and storage medium
US20220264081A1 (en) Sensing performance evaluation and diagnosis system and sensing performance evaluation and diagnosis method for external-environment recognition sensor
CN113879324B (en) Intelligent driving vehicle sensor fault processing method and device
CN114265303A (en) Automatic driving control system and vehicle
CN113237688B (en) Robot, robot obstacle avoidance performance testing method and device and readable storage medium
CN113858208B (en) Robot detection method and device, electronic equipment and storage medium
CN111752279B (en) Multi-sensor fusion self-checking method and system for robot
WO2021035701A1 (en) Sensor detection method and vehicle-mounted control terminal
CN115973180A (en) Steering detection method, apparatus, device and medium for autonomous vehicle
KR20180055433A (en) Autonomous driving system fail-safe utility and method thereof
CN114572138B (en) Automatic driving vehicle accident fault self-checking method, device, equipment and storage medium
CN115390431A (en) Controller and vehicle
CN113485318A (en) Position correction method, device, equipment and storage medium
CN114248766A (en) Apparatus and method for ensuring travel track of own vehicle
CN113884123A (en) Sensor calibration method and device, vehicle and storage medium
CN112367352A (en) Vehicle abnormality detection method and system
CN112631272A (en) Method and equipment for remotely recovering power of robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant