CN115741713A - Robot working state determination method, device, equipment and medium - Google Patents


Publication number
CN115741713A
CN115741713A
Authority
CN
China
Prior art keywords
subtask
current
state
robot
target
Prior art date
Legal status
Pending
Application number
CN202211493348.XA
Other languages
Chinese (zh)
Inventor
万小丽
吴曼玲
刘景亚
Current Assignee
CISDI Engineering Co Ltd
CISDI Research and Development Co Ltd
Original Assignee
CISDI Engineering Co Ltd
CISDI Research and Development Co Ltd
Priority date
Filing date
Publication date
Application filed by CISDI Engineering Co Ltd, CISDI Research and Development Co Ltd filed Critical CISDI Engineering Co Ltd
Priority to CN202211493348.XA priority Critical patent/CN115741713A/en
Publication of CN115741713A publication Critical patent/CN115741713A/en
Pending legal-status Critical Current

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Manipulator (AREA)

Abstract

The invention provides a method, a device, equipment and a medium for determining the operation state of a robot. A target operation task and the current operation information of a target robot are acquired, and the current operation information is matched against the preset operation information corresponding to each target operation subtask to determine the current operation subtask. A plurality of operation characteristic parameters of the current operation subtask are collected and fused into a current fusion feature vector, which is input into a pre-trained time-series analysis model to obtain an estimated operation time. The current operation time is compared with the estimated operation time to obtain an operation time difference, from which the subtask operation state of the current operation subtask is determined; the target task operation state of the target operation task, namely the robot operation state of the target robot, is then determined from the subtask operation states. Because the subtask operation state is judged from the operation characteristic parameters, the current operation time and a preset time-series analysis model, the target task operation state and the robot operation state are obtained.

Description

Robot working state determination method, device, equipment and medium
Technical Field
The application relates to the technical field of intelligent robots, in particular to a method, a device, equipment and a storage medium for determining a robot working state.
Background
With the innovation of new-generation information technology, robot industry planning explicitly calls for promoting the fusion of robots with new technologies such as artificial intelligence, 5G, big data and cloud computing, raising the level of robot intelligence and networking, and strengthening functional safety, network safety and data safety. At present, safety accidents during industrial robot operation still occur. Enterprises generally rely on passive safety measures such as electronic fences or industry specifications to secure the working scene, but such measures suffer from poor flexibility and poor real-time safety. In application scenarios with many complex, uncontrollable factors, the operation state of the robot cannot easily be identified proactively, and forcing operation to continue while the robot's state is abnormal can cause the whole task to run out of control. Therefore, automatically identifying the working state of the robot, turning passive protection into active avoidance, and improving the active safety of industrial robot operation systems is of great significance.
At present, however, state identification mainly relies on acquiring the robot's own signals, such as vibration, pose, speed and torque, and processing them individually or in fusion; whether the working state is normal is judged from changes in the robot's own parameters during operation. Yet a robot's operation task is usually not completed by the robot alone but in cooperation with external devices or systems, so a judgment drawn only from the robot's own parameter information is comparatively inaccurate.
Disclosure of Invention
In view of the above drawbacks of the prior art, the present invention provides a method, an apparatus, a system, a device, and a medium for determining a working state of a robot, so as to solve the technical problem that the working state of the robot cannot be accurately identified only by using the state characteristics of the robot itself.
The invention provides a robot working state determining method, which comprises the following steps: acquiring a target operation task of a target robot and current operation information of the target robot, wherein the current operation information comprises current operation time, the target operation task comprises a plurality of target operation subtasks, and each target operation subtask is correspondingly provided with preset operation information; matching the current operation information with preset operation information corresponding to each target operation subtask to determine a current operation subtask matched with the current operation information from a plurality of target operation subtasks; collecting a plurality of operation characteristic parameters of the current operation subtask, and performing characteristic extraction and fusion processing on the plurality of operation characteristic parameters to obtain a current fusion characteristic vector of the current operation subtask; inputting the current fusion feature vector into a preset time sequence analysis model, obtaining estimated operation time, and determining an operation time difference according to the current operation time and the estimated operation time; and determining the subtask operation state of the current operation subtask according to the operation time difference, determining the target task operation state of the target operation task according to the subtask operation state, and determining the target task operation state as the robot operation state of the target robot.
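The claimed flow can be illustrated with a minimal Python sketch; the preset time windows, the fused feature vector, and the duration model below are all hypothetical stand-ins, not the patented implementation:

```python
# Hypothetical end-to-end sketch of the described method; the preset
# windows, the fused feature vector, and the duration model are all
# illustrative stand-ins, not the patented implementation.

PRESET = {            # subtask -> preset operation time window (s)
    "grab_gun":   (0, 10),
    "plug_probe": (10, 25),
}

def match_subtask(elapsed_s):
    # Match the current operation time against the preset windows.
    for name, (lo, hi) in PRESET.items():
        if lo <= elapsed_s < hi:
            return name
    return None

def duration_model(fused_vec):
    # Stand-in for the pre-trained time-series analysis model: a fixed
    # base estimate adjusted by a summary of the fused feature vector.
    return 8.0 + 0.1 * sum(fused_vec)

def subtask_state(elapsed_s, fused_vec, threshold=2.0):
    # Normal when the elapsed time is within `threshold` seconds of the
    # model's estimated operation time.
    estimate = duration_model(fused_vec)
    return "normal" if abs(elapsed_s - estimate) < threshold else "abnormal"
```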
In an embodiment of the present invention, before acquiring the target operation task of the target robot, the method further includes: acquiring a plurality of preset operation tasks of a target robot; and dividing the preset operation tasks into a plurality of preset operation subtasks according to the priori knowledge, and determining preset operation characteristic parameters to be acquired of the preset operation subtasks and preset operation information of the preset operation subtasks.
In an embodiment of the present invention, matching the current job information with preset job information corresponding to each of the target job subtasks, so as to determine a current job subtask matching the current job information from a plurality of target job subtasks includes: the preset operation information comprises preset operation time and a preset pose state, and the current operation information comprises current operation time and a current pose state; matching the current operation time with the preset operation time to obtain a preset operation subtask which has an association relation with the preset operation time, and determining the preset operation subtask as the current operation subtask; or matching the current pose state with the preset pose state to obtain a preset operation subtask associated with the preset pose state, and determining the preset operation subtask as the current operation subtask.
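The two matching routes described above (by preset job time, or by preset pose state) might be sketched as follows; the subtask names, time windows, joint-angle poses, and tolerance are illustrative assumptions:

```python
import math

SUBTASKS = [
    # (name, preset time window in seconds, preset pose as joint angles in rad)
    ("grab_gun",   (0, 10),  (0.0, 1.2, -0.5)),
    ("plug_probe", (10, 25), (0.4, 0.9, -0.2)),
]

def match_by_time(t):
    # Route 1: the current operation time falls in exactly one preset window.
    for name, (lo, hi), _ in SUBTASKS:
        if lo <= t < hi:
            return name
    return None

def match_by_pose(pose, tol=0.3):
    # Route 2: nearest preset pose within a tolerance (Euclidean distance).
    best, best_d = None, tol
    for name, _, preset in SUBTASKS:
        d = math.dist(pose, preset)
        if d < best_d:
            best, best_d = name, d
    return best
```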
In an embodiment of the present invention, collecting the plurality of operation characteristic parameters of the current operation subtask includes: matching the current operation subtask with a plurality of preset operation subtasks to obtain a candidate preset operation subtask with the highest similarity to the current operation subtask; determining the preset operation characteristic parameter identifiers of each preset operation subtask based on the preset operation characteristic parameters to be acquired for that subtask; determining the candidate preset operation characteristic parameter identifiers of the candidate preset operation subtask according to the association between preset operation subtasks and preset operation characteristic parameter identifiers; determining the candidate preset operation characteristic parameter identifiers as the current operation characteristic parameter identifiers of the current operation subtask; and acquiring the operation characteristic parameters of the current operation subtask based on the current operation characteristic parameter identifiers, wherein the operation characteristic parameters comprise at least two of robot main body parameters, peripheral cooperation device parameters and space environment parameters of the target robot operation.
In an embodiment of the present invention, after obtaining the current fusion feature vector of the current job subtask, the method further includes: inputting the current fusion feature vector into a preset time sequence analysis model corresponding to the current operation subtask, and obtaining model output as estimated operation time; and determining the difference value of the current working time and the estimated working time as a working time difference.
In an embodiment of the present invention, determining the subtask job status of the current job subtask according to the job time difference includes: when the operation time difference of the current operation subtask of the target robot is smaller than a preset threshold value, judging that the subtask operation state of the current operation subtask of the target robot is normal, keeping the current operation state, and entering a next-stage operation subtask after the current operation subtask is completed until the target operation task is completed; and when the operation time difference of the current operation subtask of the target robot is larger than or equal to a preset threshold value, judging that the subtask operation state of the current operation subtask of the target robot is abnormal, sending an abnormal alarm and suspending the current operation subtask.
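A minimal sketch of this threshold rule, with hypothetical names and units:

```python
def subtask_status(current_time_s, estimated_time_s, threshold_s):
    # Normal if the operation time difference stays below the preset
    # threshold; otherwise abnormal, which in the described flow raises
    # an alarm and pauses the current subtask.
    diff = abs(current_time_s - estimated_time_s)
    return ("normal", diff) if diff < threshold_s else ("abnormal", diff)
```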
In one embodiment of the present invention, determining a target task operation state of the target operation task based on the subtask operation state, and determining the target task operation state as the robot operation state of the target robot comprises: when the subtask operation state of the current operation subtask of the target robot is a normal state, entering a next operation subtask; if the subtask operation state of each operation subtask is a normal state, judging that the target task operation state of the target operation task is a normal state, namely the robot operation state of the target robot is a normal state; and if the subtask operation state of at least one operation subtask exists in the abnormal state, judging that the target task operation state of the target operation task is the abnormal state, namely the robot operation state of the target robot is the abnormal state.
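The aggregation rule above reduces to: the task state (and hence the robot state) is normal only if every subtask state is normal. A one-line sketch:

```python
def task_status(subtask_statuses):
    # The target task (and hence the robot) is normal only if every
    # completed subtask was judged normal.
    return "normal" if all(s == "normal" for s in subtask_statuses) else "abnormal"
```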
In an embodiment of the present invention, after determining the job status of the target job task based on the job statuses of the subtasks, the method further includes: when the system judges that the subtask operation state of the current operation subtask is abnormal, the subtask operation state is verified manually based on prior knowledge. If verification confirms that the subtask operation state of the current operation subtask of the target robot is abnormal, that is, the system judged correctly, the current operation subtask of the target robot is stopped. If verification shows that the subtask operation state is actually normal, that is, the system judged wrongly, a state association between the current fusion feature vector and the normal subtask operation state is generated based on the current fusion feature vector of the robot operation subtask and the conclusion that the subtask operation state is normal; the current fusion feature vector and the current operation time are determined as a training sample, and the time-series analysis model is updated and retrained.
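The manual review and sample-collection step could be sketched like this; the operator flag and the return values are illustrative, not part of the claims:

```python
training_samples = []  # (fused feature vector, elapsed time) pairs

def review_alarm(fused_vec, elapsed_s, operator_says_normal):
    # Manual review of a system-raised alarm. A confirmed abnormality
    # stops the subtask; a false alarm is kept as a training sample so
    # the time-series model can later be updated and retrained.
    if operator_says_normal:
        training_samples.append((fused_vec, elapsed_s))
        return "resume"
    return "stop"
```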
The present invention provides a robot working state specifying device including: the information acquisition module is used for acquiring a target operation task of a target robot and current operation information of the target robot, wherein the current operation information comprises current operation time, the target operation task comprises a plurality of target operation subtasks, and each target operation subtask is correspondingly provided with preset operation information; the operation subtask determining module is used for matching the current operation information with preset operation information corresponding to each target operation subtask so as to determine a current operation subtask matched with the current operation information from a plurality of target operation subtasks; the information acquisition module is used for acquiring a plurality of operation characteristic parameters of the current operation subtask, and performing characteristic extraction and fusion processing on the plurality of operation characteristic parameters to obtain a current fusion characteristic vector of the current operation subtask; the operation time difference determining module is used for comparing the current fusion feature vector with a preset fusion feature vector, determining the preset operation time corresponding to the preset feature vector with the maximum similarity as standard operation time, and determining an operation time difference according to the current operation time and the standard operation time; and the robot working state determining module is used for determining the subtask working state of the current working subtask according to the working time difference, determining the target task working state of the target working task according to the subtask working state, and determining the target task working state as the robot working state of the target robot.
The present invention provides an electronic device, including: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the electronic device to implement the robot work state determination method as described above.
The present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor of a computer, causes the computer to execute a robot work state determination method as described above.
The invention has the beneficial effects that: in the provided method, device, system, equipment and medium for determining the operation state of a robot, a plurality of operation characteristic parameters of the current operation subtask of the robot are obtained; features are extracted from the plurality of operation characteristic parameters to obtain a plurality of operation feature codes, and the feature codes are fused to generate the fusion feature vector of the current operation subtask. The fusion feature vector is compared with a preset standard operation state to obtain an operation time difference, and the subtask operation state of the current operation subtask is determined from this operation time difference. The operation state of the target operation task is then determined from the subtask operation states, thereby identifying the operation state of the target robot. Because the operation state of each operation subtask is judged from the operation characteristic parameters, the current operation time and the preset standard operation time, and the operation characteristic parameters include not only the robot's own parameter information but also operation information of the surrounding environment, the accuracy of determining the robot's operation state is greatly improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
fig. 1 is a schematic diagram of an implementation environment for robot work state determination, shown in an exemplary embodiment of the present application;
FIG. 2 is a flow diagram illustrating a robot job status determination in accordance with an exemplary embodiment of the present application;
FIG. 3 is a flowchart illustrating a robot job status determination overall flow diagram in accordance with an exemplary embodiment of the present application;
FIG. 4 is a flowchart illustrating a robot job status determination in accordance with an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of a robot work state determination system shown in an exemplary embodiment of the present application;
fig. 6 is a block diagram illustrating a robot working state determination apparatus according to an exemplary embodiment of the present application;
FIG. 7 illustrates a schematic structural diagram of a computer system suitable for implementing the electronic device of the embodiments of the subject application.
Detailed Description
Other advantages and effects of the present invention will become apparent to those skilled in the art from the disclosure of this specification, which describes embodiments of the invention with reference to the accompanying drawings and preferred embodiments. The invention is capable of other and different embodiments, and its details may be modified in various respects, all without departing from the spirit and scope of the invention. It should be understood that the preferred embodiments are illustrative only and do not limit the scope of the invention.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
In the following description, numerous details are set forth to provide a more thorough explanation of embodiments of the present invention, however, it will be apparent to one skilled in the art that embodiments of the present invention may be practiced without these specific details, and in other embodiments, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring embodiments of the present invention.
Firstly, it should be noted that in active-safety research for industrial robots, identification and judgment of the robot's working state is achieved mainly by acquiring the robot's own signals, such as vibration, pose, speed and torque, processing them individually or in fusion, and forming a state identification model through dimension reduction, clustering, distance evaluation, hyperplane separation and the like. However, because completing a full operation task requires all-round cooperation of peripheral equipment and the spatial environment in addition to the robot itself, an accurate working state cannot be obtained from the robot's own state characteristics alone, and the overall operational safety of the system cannot be guaranteed.
Fig. 1 is a schematic diagram of an implementation environment for determining the working state of a robot according to an exemplary embodiment of the present application.
As shown in fig. 1, the system architecture may include an information acquisition apparatus 101 and a computer device 102. The computer device 102 may be at least one of a desktop Graphics Processing Unit (GPU) computer, a GPU computing cluster, a neural network computer, and the like. Related technicians can acquire the operation characteristic parameters of the target robot under the current operation through the information acquisition device 101, then process the acquired operation characteristic parameters based on the computer equipment 102, and compare the processed information with standard parameters determined according to the prior knowledge, thereby determining the operation state of the robot.
Fig. 2 is a flowchart illustrating a robot work state determination according to an exemplary embodiment of the present application.
As shown in fig. 2, in an exemplary embodiment, the robot working status determining method at least includes steps S210 to S250, which are described in detail as follows:
step S210, a target operation task of the target robot and current operation information of the target robot are obtained, the current operation information comprises current operation time, the target operation task comprises a plurality of target operation subtasks, and preset operation information is correspondingly set in each target operation subtask.
Before the target operation task of the target robot is obtained, a plurality of preset operation tasks of the target robot are obtained; the preset operation tasks are divided into a plurality of preset operation subtasks according to prior knowledge, and the preset operation characteristic parameters to be acquired for each preset operation subtask and the preset operation information of each preset operation subtask are determined.
It should be noted that, before determining the working state of a given operation task of the robot, the task needs to be divided into a plurality of key space-time intervals, that is, the key operation sub-processes that determine whether the task can be completed normally; such sub-processes include, but are not limited to, tool switching, workpiece pick-and-place, plugging and contacting. The time range and space range configured for each key space-time interval within the task may differ. Generally, a complete operation task is divided into a plurality of operation subtasks in time order according to the different operation content at different times, and the robot pose states of the operation subtasks differ because the operation content differs. In addition, the current operation information collected for each operation subtask or at each moment refers to the operation information of the robot and of its cooperative devices at the current moment, including but not limited to the operation time of the robot, the pose state of the robot, and the relative position relationship between the robot and the peripheral cooperative devices.
In an embodiment of the present invention, taking the temperature measurement operation process of the temperature measurement sampling robot as an example, it is first determined that the target operation task of the temperature measurement sampling robot is temperature measurement, and then the target operation task is divided into four operation subtasks of temperature measurement gun tool grabbing, temperature measurement probe plugging, gun falling temperature measurement, and temperature measurement probe scraping and peeling according to prior knowledge, and the operation characteristic parameters of each operation subtask are respectively set as follows: in the sub task of grasping operation by the temperature measuring gun tool, the selected operation characteristic parameters are a robot posture signal, a robot joint torque signal, a quick-change disc device attaching signal and a quick-change disc locking signal; in the sub task of the temperature measuring probe plugging operation, the selected operation characteristic parameters are a robot position and posture signal, a robot moment signal, a temperature measuring gun end approach switch signal and a probe video signal; in the sub task of the gun-off temperature measurement operation, the selected operation characteristic parameters are a robot attitude signal, a robot moment signal, a furnace mouth video signal and a temperature measurement sensor signal; in the subtask of the temperature measuring probe scraping operation, a robot position and posture signal, a robot moment signal and a scraper proximity switch signal are selected.
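The per-subtask parameter lists from this example can be expressed as a simple configuration table; the identifier strings below are invented English stand-ins for the signals named in the text:

```python
# Invented English identifiers standing in for the signals named in the
# temperature-measurement example; each subtask lists the feature
# parameters to be collected for it.
FEATURE_PARAMS = {
    "grab_gun_tool":    ["robot_pose", "joint_torque",
                         "quick_change_attach", "quick_change_lock"],
    "plug_probe":       ["robot_pose", "robot_torque",
                         "gun_end_proximity", "probe_video"],
    "drop_gun_measure": ["robot_pose", "robot_torque",
                         "furnace_mouth_video", "temperature_sensor"],
    "scrape_probe":     ["robot_pose", "robot_torque", "scraper_proximity"],
}

def params_to_collect(subtask):
    # Look up which parameter identifiers to acquire for a subtask.
    return FEATURE_PARAMS[subtask]
```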
Step S220, matching the current job information with the preset job information corresponding to each target job subtask, so as to determine, from the plurality of target job subtasks, the current job subtask matched with the current job information.
Matching the current job information with the preset job information corresponding to each target job subtask, so as to determine the current job subtask from the plurality of target job subtasks, includes: the preset job information comprises a preset job time and a preset pose state, and the current job information comprises the current job time and the current pose state; the current job time is matched with the preset job times to obtain the preset job subtask associated with the matching preset job time, and that preset job subtask is determined as the current job subtask; or the current pose state is matched with the preset pose states to obtain the preset job subtask associated with the matching preset pose state, and that preset job subtask is determined as the current job subtask.
It should be understood that when the robot performs a given job task, the target job task is divided into a plurality of preset job subtasks in time order, so each preset job subtask corresponds uniquely to a job time; given the current job time, the associated preset job subtask can therefore be obtained. Likewise, because the job content of each preset job subtask differs, the robot pose state during each subtask also differs, so the pose state corresponds uniquely to a preset job subtask, and the associated preset job subtask can be obtained from the current pose state.
In an embodiment of the present invention, taking the temperature measurement operation in which the target operation task is the temperature measurement sampling robot as an example, the current operation time of the target robot is firstly collected, and the current operation subtask is obtained as the temperature measurement gun tool grabbing operation subtask according to the current operation time and the preset time relationship between the operation subtasks in the temperature measurement operation process of the temperature measurement sampling robot.
Collecting the plurality of operation characteristic parameters of the current operation subtask includes: determining the current operation subtask as a target operation subtask, and obtaining the current operation characteristic parameter identifiers of the current operation subtask based on the preset operation characteristic parameter identifiers of the target operation subtask; and acquiring the operation characteristic parameters of the current operation subtask based on the current operation characteristic parameter identifiers, wherein the operation characteristic parameters comprise at least two of robot main body parameters, peripheral cooperation device parameters and space environment parameters of the target robot operation.
In an embodiment of the present invention, taking the temperature-measuring-gun tool grabbing subtask in the temperature measurement process of the temperature measurement sampling robot as an example: after the current operation subtask is determined to be the tool grabbing subtask, the operation characteristic parameters of the robot are obtained based on the preset information. These include robot main body parameters and peripheral cooperative device parameters: the robot main body parameters collected are the robot position and posture signal and the robot joint torque signal, and the peripheral cooperative device parameters collected are the quick-change disc attachment signal and the quick-change disc locking signal. These four signals are determined as the current operation characteristic parameters.
Step S230: collect the plurality of operation characteristic parameters of the current operation subtask, and perform feature extraction and fusion processing on the plurality of operation characteristic parameters to obtain the current fusion feature vector of the current operation subtask.
It should be understood that the operation characteristic parameters acquired by the information acquisition module generally suffer from problems such as timing misalignment between parameters of different types, large differences in value ranges, and even substantial noise interference. The acquired characteristic parameters therefore need to be preprocessed by means such as resampling and high-frequency noise filtering to obtain characteristic parameters that are time-aligned and of comparable magnitude, so as to facilitate subsequent processing.
In an embodiment of the invention, taking the temperature measurement gun tool grabbing subtask in the temperature measurement operation process of a temperature measurement sampling robot as an example, the operation characteristic parameters are determined from prior knowledge to be the robot pose signal, the robot joint torque signal, the quick-change disc device attachment signal, and the quick-change disc locking signal. After the operation subtask is determined, the information acquisition device collects these four signals. If the sampling frequencies of the robot joint torque signal and the video signal are inconsistent, the signal with the lower sampling frequency is resampled so that signals belonging to the same operation subtask have the same number of samples and are aligned in time; in addition, since the collected robot joint torque signal generally contains substantial high-frequency noise, the torque signal is filtered by a high-frequency filtering algorithm to obtain a smoother torque signal.
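The resampling and noise-filtering preprocessing described above can be sketched as follows. This is a minimal illustration using linear interpolation and a moving-average smoother as a stand-in for the high-frequency filtering algorithm; the signal contents and lengths are illustrative assumptions, not values from this application.

```python
import numpy as np

def resample_to(signal, target_len):
    """Linearly resample a 1-D signal so that signals sampled at
    different rates have the same number of points (time alignment)."""
    old_x = np.linspace(0.0, 1.0, len(signal))
    new_x = np.linspace(0.0, 1.0, target_len)
    return np.interp(new_x, old_x, signal)

def smooth(signal, window=5):
    """Suppress high-frequency noise with a simple moving average
    (a stand-in for the high-frequency filtering mentioned above)."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

# Illustrative: torque sampled at a low rate, a video-derived signal
# at a higher rate; align the torque signal to the video signal.
torque = np.sin(np.linspace(0, 3, 50)) + 0.1 * np.random.randn(50)
video_signal = np.cos(np.linspace(0, 3, 200))

torque_aligned = smooth(resample_to(torque, len(video_signal)))
```

After this step the two signals have equal sample counts and the torque trace is smoother, which is the precondition for the feature fusion that follows.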
Fig. 3 is a flowchart illustrating a robot work state determination according to an exemplary embodiment of the present application.
As shown in fig. 3, the target operation task of the robot is first divided into a plurality of operation subtasks according to prior knowledge. The current operation subtask of the robot is then determined from the current working time or pose state of the target robot, and the operation characteristic parameters of the current operation subtask are collected. The collected parameters are preprocessed and feature-extracted to obtain the operation feature codes corresponding to each parameter, and the resulting operation feature codes are fused to obtain the fusion feature vector of the operation subtask. The estimated operation time is then obtained from the fusion feature vector and compared with the current operation time to obtain the operation time difference, from which the operation state of the subtask is determined.
In an embodiment of the present invention, taking the temperature measurement gun tool grabbing subtask in the temperature measurement operation process of a temperature measurement sampling robot as an example, after the operation characteristic parameters such as the robot pose signal, the robot joint torque signal, the quick-change disc device attachment signal, and the quick-change disc locking signal are obtained and preprocessed, the robot pose signal and the robot joint torque signal are concatenated into a one-dimensional vector, and the video information of the quick-change disc device attachment signal and the quick-change disc locking signal is processed by an auto-encoder, converting the three-dimensional image signal into a one-dimensional vector. The resulting one-dimensional feature vectors are then concatenated to form the fusion feature vector for the operation subtask state.
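The fusion step above can be sketched in a few lines. This is a hedged illustration: a fixed random projection stands in for the trained auto-encoder that would embed the video frames, and all dimensions are assumed for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D signals from the robot body (illustrative lengths).
pose = rng.normal(size=6)       # robot pose signal
torque = rng.normal(size=6)     # robot joint torque signal
body_vec = np.concatenate([pose, torque])  # serial concatenation

# A video frame of the quick-change disc signals, reduced to a 1-D
# code. A trained auto-encoder would produce this embedding; a fixed
# random projection stands in for it here.
frame = rng.normal(size=(8, 8))              # tiny stand-in image
projection = rng.normal(size=(frame.size, 4))
image_code = frame.reshape(-1) @ projection  # image -> 1-D code

# Concatenate all 1-D codes into the fusion feature vector.
fused = np.concatenate([body_vec, image_code])
```

The fused vector is what the time-series analysis model in step S240 consumes.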
Step S240: input the current fusion feature vector into a preset time-series analysis model and take the model output as the estimated operation time; then determine the operation time difference from the current operation time and the estimated operation time.
In an embodiment of the invention, taking the temperature measurement gun tool grabbing subtask in the temperature measurement operation process of a temperature measurement sampling robot as an example, after determining from the current operation time that the current operation subtask is the temperature measurement gun tool grabbing subtask, the obtained current fusion feature vector is input into the time-series analysis model corresponding to the current operation subtask (namely, the temperature measurement gun tool grabbing time-series analysis model) to obtain the estimated operation time of the subtask; the difference between the current operation time and the estimated operation time of the subtask is then calculated and determined as the operation time difference.
Step S250: determine the subtask operation state of the current operation subtask according to the operation time difference, determine the target task operation state of the target operation task according to the subtask operation state, and determine the target task operation state as the robot operation state of the target robot.
Determining the subtask operation state of the current operation subtask according to the operation time difference comprises the following steps: when the operation time difference of the current operation subtask of the target robot is smaller than a preset threshold value, judging that the subtask operation state of the current operation subtask of the target robot is normal, keeping the current operation state, and entering a next-stage operation subtask after the current operation subtask is completed until the target operation task is completed; and when the operation time difference of the current operation subtask of the target robot is larger than or equal to a preset threshold value, judging that the subtask operation state of the current operation subtask of the target robot is abnormal, sending an abnormal alarm and suspending the current operation subtask.
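The threshold rule above reduces to a small decision function. A minimal sketch, with hypothetical argument names:

```python
def subtask_state(current_time, estimated_time, threshold):
    """Return the subtask operation state from the operation time
    difference: smaller than the preset threshold means normal;
    greater than or equal means abnormal (alarm and pause)."""
    time_diff = abs(current_time - estimated_time)
    return "normal" if time_diff < threshold else "abnormal"

# Example: estimated 10.0 s, observed 10.2 s, threshold 0.5 s -> normal.
state = subtask_state(10.2, 10.0, 0.5)
```

A normal result lets the robot proceed to the next-stage subtask; an abnormal one triggers the alarm-and-pause branch.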
In an embodiment of the present invention, taking the temperature measurement gun tool grabbing subtask in the temperature measurement operation process of a temperature measurement sampling robot as an example, the operation characteristic parameters are processed to obtain the fusion feature vector X for the current operation state, X is input into the preset (trained) time-series model to obtain the estimated operation time Tp, and the preset operation time difference threshold is M. The operation time difference ER1 between the current operation time Tn and the estimated operation time Tp is first determined; if ER1 is smaller than M, the current operation state of the temperature measurement gun tool grabbing subtask is determined to be a normal state.
In an embodiment of the present invention, again taking the temperature measurement gun tool grabbing subtask as an example, the operation characteristic parameters are processed to obtain the fusion feature vector X for the current operation state, X is input into the preset (trained) time-series model to obtain the estimated operation time Tp, and the preset operation time difference threshold is N. The operation time difference ER2 between the current operation time Tn and the estimated operation time Tp is first determined; if ER2 is greater than N, the current operation state of the temperature measurement gun tool grabbing subtask is determined to be an abnormal state.
Fig. 4 is a flowchart illustrating a robot work state determination according to an exemplary embodiment of the present application.
As shown in fig. 4, after the current operation subtask is determined, the subtask operation state of that subtask is determined. When the subtask operation state is normal, it is checked whether the subtask is the last operation subtask of the target operation task to which it belongs: if so, the operation state of the robot during execution of the target operation task is determined to be normal; if not, the next operation subtask is entered and the above determination is repeated. When the subtask operation state is not normal, that is, when it is abnormal, manual intervention takes place: the relevant staff judge the subtask operation state based on prior knowledge. If the subtask operation state is confirmed to be abnormal, the operation state of the robot during execution of the target operation task is determined to be abnormal; if it is instead found to be normal, the related operation data are stored, the next operation subtask is entered, and the robot continues to operate.
Determining the operation state of the target operation task based on the subtask operation states, so as to identify the operation state of the target robot, includes: when the subtask operation state of the current operation subtask of the target robot is abnormal, the operation state of the target robot is judged to be abnormal; when the subtask operation state of the current operation subtask is normal, the operation state of the next-stage operation subtask continues to be identified, and if the operation states of all operation subtasks are normal, the operation state of the target robot is judged to be normal.
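The aggregation rule just stated can be expressed directly: any abnormal subtask makes the whole task abnormal, and the task is normal only when every subtask finished normally. A minimal sketch:

```python
def task_state(subtask_states):
    """Target-task operation state from an ordered list of subtask
    states: abnormal as soon as any subtask is abnormal, normal only
    if every subtask completed in the normal state."""
    for state in subtask_states:
        if state == "abnormal":
            return "abnormal"
    return "normal"
```

In practice the states would arrive one subtask at a time, so evaluation can stop early at the first abnormal result, as the loop above does.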
After determining the target task operation state based on the subtask operation state, the robot operation state determination method further includes: when the system judges the subtask operation state of the current operation subtask to be abnormal, the subtask operation state is manually verified based on prior knowledge. If the verification confirms that the subtask operation state is abnormal, that is, the system judgment was accurate, the current operation subtask of the target robot is stopped. If the verification determines that the subtask operation state is in fact normal, that is, the system judgment was wrong, a state association between the current fusion feature vector and the normal subtask operation state is generated, and the current fusion feature vector and the current operation time are used as training samples to further train the time-series analysis model.
In an embodiment of the invention, taking the temperature measurement task of a temperature measurement sampling robot as an example, when the current operation subtask is determined from the current operation data to be the temperature measurement gun tool grabbing subtask, the subtask operation state of that subtask is normal, and that subtask is not the last subtask of the temperature measurement task, the next subtask, namely the temperature measurement probe plugging subtask, is entered. The above determination is repeated until the temperature measurement probe scraping subtask is entered; when its subtask operation state is also determined to be normal, the operation state of the target robot during the temperature measurement task is determined to be normal.
In an embodiment of the present invention, taking the temperature measurement task of a temperature measurement sampling robot as an example, the current operation subtask is determined from the current operation data to be the temperature measurement gun tool grabbing subtask, and its subtask operation state is judged to be abnormal. The relevant staff intervene and recheck the subtask operation state of the robot; if, based on prior knowledge, they confirm that the subtask operation state of the grabbing subtask is abnormal, the operation state of the target robot during the temperature measurement operation task is judged to be abnormal.
In an embodiment of the present invention, again taking the temperature measurement task of a temperature measurement sampling robot as an example, the current operation subtask is determined from the current operation data to be the temperature measurement gun tool grabbing subtask, and its subtask operation state is judged to be abnormal. The relevant staff intervene and recheck the subtask operation state; if, based on prior knowledge, they determine that the subtask operation state of the grabbing subtask is in fact normal, the automatic judgment is identified as an error. A state association between the current fusion feature vector and the conclusion that the operation state is normal is then generated from the current operation characteristic parameters of the robot, namely the robot pose signal, the robot joint torque signal, the quick-change disc device attachment signal, and the quick-change disc locking signal, and the current fusion feature vector and the current operation time are used as training samples to train the time-series analysis model.
It should be understood that using the current fusion feature vector and the current operation time as training samples to train the time-series analysis model includes: collecting the current fusion feature vectors and current operation times of a plurality of current operation subtasks and labeling them to generate a training sample data set; training an initial time-series analysis model on the training sample data set; and determining the trained model as the subtask operation state time-series analysis model, into which the fusion feature vector of a robot operation subtask is input so that the subtask operation state can be obtained from the output estimated operation time.
In an embodiment of the invention, taking the temperature measurement gun tool grabbing subtask in the temperature measurement operation process of a temperature measurement sampling robot as an example, the operation characteristic parameters of the robot in the grabbing subtask are first collected and fused; the resulting fusion feature vectors are taken as input sample information, the corresponding time values of the robot within the subtask are manually labeled as output sample information, and a sample data set is generated. The time-series analysis model is then trained on this data set to obtain the temperature measurement gun grabbing time-series analysis model. In a subsequent operation state determination, if the acquired operation characteristic parameters are processed into a fusion feature vector Y, Y is input into the temperature measurement gun grabbing time-series analysis model to obtain an estimated operation time value, and the conclusion that the subtask operation state is normal is reached according to the time difference between the estimated time value and the current time value.
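The training loop above — pair fused feature vectors with labeled operation times, fit a model, then use it for estimation — can be sketched as follows. An ordinary least-squares regressor stands in here for the HMM/LSTM time-series model; the data are synthetic and every dimension is an assumption made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Training sample data set: fused feature vectors (input samples)
# paired with manually labeled operation times (output samples).
X = rng.normal(size=(100, 16))               # fused feature vectors
true_w = rng.normal(size=16)
t = X @ true_w + 0.01 * rng.normal(size=100)  # labeled operation times

# "Training": least squares stands in for fitting an HMM/LSTM model.
w, *_ = np.linalg.lstsq(X, t, rcond=None)

def estimate_time(fused_vec):
    """Estimated operation time for a new fusion feature vector Y."""
    return fused_vec @ w

# Determining the state of a new subtask execution from the time
# difference between estimated and current (here: labeled) time.
Y = X[0]
time_diff = abs(estimate_time(Y) - t[0])
```

With real data, the fitted model would be the per-subtask time-series analysis model, and `time_diff` would be compared against the preset threshold of step S250.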
Fig. 5 is a schematic diagram of a robot working state determination system according to an exemplary embodiment of the present application, and as shown, the robot working state determination system includes: the system comprises an operation task management module, an operation subtask management module, an information acquisition and storage module, a state signal processing module, a fusion characteristic vector generation module, a time sequence analysis module and a human-computer interaction module.
The operation task management module is used for managing the operation task set of the target robot and determining one of the operation tasks as the target operation task, where the operation task set includes one or more operation tasks. The operation subtask management module is used for decomposing the target operation task into a plurality of operation subtasks based on prior knowledge and determining the category of operation characteristic parameters for each operation subtask. The information acquisition and storage module is used for acquiring and storing the plurality of operation characteristic parameters of each operation subtask of the target robot. The state signal processing module is used for standardizing the collected operation characteristic parameters to obtain a plurality of operation characteristic standard parameters. The fusion feature vector generation module is used for extracting features from the operation characteristic standard parameters to obtain a plurality of operation feature codes that have a mapping relationship with the standard parameters, and for fusing the operation feature codes to obtain the fusion feature vector of the target robot in the current operation subtask. The time-series analysis module is used for obtaining the estimated operation time from the fusion feature vector, obtaining the difference between the estimated operation time and the current operation time, and judging the operation state of the robot in the current operation subtask according to the operation time difference and a preset threshold. The human-computer interaction module serves as the output of the automatically identified information and the input for manual verification information, so that the identification conclusion can be manually calibrated.
It should be understood that the state signal processing module normalizes the acquired characteristic parameters; that is, acquired parameter information with mismatched timing or a large numerical range is processed by methods including but not limited to resampling and high-frequency filtering, to obtain operation characteristic parameters that are time-aligned and whose values fall within a preset range. The human-computer interaction module outputs automatically identified conclusions such as "subtask operation state abnormal", accepts input after the robot operation state has been manually checked, such as "robot operation state normal", and feeds the manual conclusion into the robot's operation state identification and control center so that the robot can further identify and confirm the current operation state.
In addition, the robot operation state determination system includes a training mode and a recognition mode, between which it can switch freely. In the recognition mode, the system monitors and recognizes the current operation state of the robot in real time; in the training mode, it acquires additional operation tasks and, while each is executed, information such as the operation subtasks and operation characteristic parameters, and feeds the acquired operation information into the robot's operation state identification and control center to improve the accuracy of automatic operation state recognition.
In an embodiment of the present invention, the operation task management module contains operation task 1, operation task 2, and operation task 3; operation task 1 is taken as the target operation task and divided into 4 operation subtasks (i.e., key operation stages) according to prior knowledge, and the operation characteristic parameter category of each operation subtask is specified. The operation characteristic parameters of operation subtask 1 are the torque signal and the quick-change disc signal; those of operation subtask 2 are the pose signal, video signal, and approach signal; those of operation subtask 3 are the torque signal, video signal, and temperature signal; and those of operation subtask 4 are the pose signal and torque signal.
In an embodiment of the present invention, taking the current operation subtask of the target robot to be operation subtask 1 as an example, the signal acquisition and storage module acquires and stores the torque signal and the quick-change disc signal in the current operation state. The state signal processing module processes the two collected signals with methods including but not limited to high-frequency filtering, normalization, and resampling to obtain the torque signal code and the quick-change disc signal code for the current operation state. The fusion feature vector generation module fuses the two codes with methods including but not limited to serial fusion and dimension-reduction fusion to obtain the fusion feature vector for the current operation state, which is input into the time-series analysis model corresponding to the operation subtask (including but not limited to an HMM model or an LSTM model), thereby determining the operation state of operation subtask 1.
It should be understood that the subtask job status determination method for each job subtask is the same, and thus is not set forth herein in more detail.
It should be further noted that after the initial operation characteristic parameters are acquired by the information acquisition and storage module, they need to be preprocessed to obtain standardized operation characteristic parameters for subsequent use; the preprocessing methods include but are not limited to resampling, normalization, and filtering for noise reduction. The standardized operation characteristic parameters are then encoded by methods including but not limited to time-frequency index processing and image compression processing, yielding feature codes including time-frequency feature codes, switch feature codes, and image feature codes. The fusion feature vector generation module obtains the fusion feature vector from the resulting operation feature codes by methods including but not limited to serial fusion or dimension-reduction fusion, and the time-series analysis module determines the estimated operation time value of the current operation subtask from the fusion feature vector through models including but not limited to HMM and LSTM models. The subtask operation state determination processes of the operation subtasks are independent of one another, so the data preprocessing method, characteristic parameter encoding method, feature code fusion method, and time-series analysis model may be the same or different, and an appropriate processing method can be selected based on the data characteristics of each operation characteristic parameter.
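Two of the encoding methods mentioned above can be sketched concretely: a time-frequency code built from FFT magnitudes, and a switch code for on/off signals. Both functions are hypothetical simplifications of the time-frequency index processing and switch feature codes the text names, not the application's actual algorithms.

```python
import numpy as np

def time_frequency_code(signal, n_bins=8):
    """Encode a 1-D signal by the magnitudes of its leading FFT
    components -- one simple form of time-frequency index processing."""
    spectrum = np.abs(np.fft.rfft(signal))
    return spectrum[:n_bins]

def switch_code(signal, threshold=0.5):
    """Encode an on/off (switch-type) signal, such as a quick-change
    disc locking signal, as binary levels."""
    return (np.asarray(signal) > threshold).astype(int)

# Illustrative inputs: a periodic torque trace and a locking signal.
torque = np.sin(np.linspace(0, 4 * np.pi, 64))
code = time_frequency_code(torque)      # time-frequency feature code
locked = switch_code([0.1, 0.9, 0.8])   # switch feature code
```

The resulting 1-D codes are exactly the kind of per-parameter features the fusion feature vector generation module would then concatenate or dimension-reduce.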
Fig. 6 is a block diagram of a robot working state determination device shown in an exemplary embodiment of the present application. The device can be applied to the implementation environment shown in fig. 1 and is specifically configured in the intelligent terminal 102. The apparatus may also be applied to other exemplary implementation environments, and is specifically configured in other devices, and the embodiment does not limit the implementation environment to which the apparatus is applied.
As shown in fig. 6, the exemplary robot working state determining apparatus includes: the robot system comprises an information acquisition module 610, a job subtask determination module 620, an information acquisition module 630, a job time difference determination module 640 and a robot job state determination module 650.
The information acquiring module 610 is configured to acquire the target operation task of the target robot and the current operation information of the target robot, where the current operation information includes the current operation time, the target operation task includes a plurality of target operation subtasks, and each target operation subtask is correspondingly provided with preset operation information. The operation subtask determining module 620 is configured to match the current operation information with the preset operation information corresponding to each target operation subtask, so as to determine, from the plurality of target operation subtasks, the current operation subtask matching the current operation information. The information acquisition module 630 is configured to acquire the plurality of operation characteristic parameters of the current operation subtask and to perform feature extraction and fusion processing on them to obtain the current fusion feature vector of the current operation subtask. The operation time difference determining module 640 is configured to input the current fusion feature vector into a preset time-series analysis model to obtain the estimated operation time, and to determine the operation time difference from the current operation time and the estimated operation time. The robot operation state determining module 650 is configured to determine the subtask operation state of the current operation subtask according to the operation time difference, determine the target task operation state of the target operation task according to the subtask operation state, and determine the target task operation state as the robot operation state of the target robot.
In addition, the robot operation state determining apparatus further includes a human-computer interaction module, which is used to display the operation state judgment obtained by the robot's automatic recognition and to serve as the input for manual verification information.
In one embodiment of the invention, when the robot automatically identifies and confirms that its operation state is abnormal, the information "robot operation state abnormal" is shown on a display screen included in the human-computer interaction module, so that the relevant staff can directly read the current operation state of the robot. After reading this information, the staff manually check the robot's operation state based on prior knowledge; if the result is consistent with the automatic recognition, that is, the operation state is indeed abnormal, the robot's operation task is stopped. If the staff instead determine, based on prior knowledge, that the robot's operation state is normal, the conclusion that the operation state is normal and the robot's current operation characteristic parameters are input to the robot's management center to form a historical database, which serves as reference data for the next operation state identification, thereby improving the accuracy of automatic operation state recognition.
It should be noted that the robot working state determining apparatus provided in the foregoing embodiment and the robot working state determining method provided in the foregoing embodiment belong to the same concept, and specific ways of performing operations by each module and unit have been described in detail in the method embodiment, and are not described herein again. In practical applications, the robot working status determining apparatus provided in the above embodiment may allocate the above functions to different function modules according to needs, that is, divide the internal structure of the apparatus into different function modules to complete all or part of the above described functions, which is not limited herein.
An embodiment of the present application further provides an electronic device, including: one or more processors; a storage device for storing one or more programs, which when executed by the one or more processors, cause the electronic device to implement the robot working state determination method provided in the above-described embodiments.
FIG. 7 illustrates a schematic structural diagram of a computer system suitable for implementing the electronic device of the embodiments of the present application. It should be noted that the computer system 700 of the electronic device shown in fig. 7 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present application.
As shown in fig. 7, the computer system 700 includes a Central Processing Unit (CPU) 701, which can perform various appropriate actions and processes, such as executing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for system operation are also stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An Input/Output (I/O) interface 705 is also connected to the bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage portion 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN (Local area network) card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is mounted on the drive 710 as necessary, so that a computer program read out therefrom is mounted into the storage section 708 as necessary.
In particular, according to embodiments of the application, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated by the flowchart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. When executed by the Central Processing Unit (CPU) 701, the computer program performs the various functions defined in the system of the present application.
It should be noted that the computer-readable medium described in the embodiments of the present application may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer-readable signal medium may comprise a propagated data signal with computer-readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer-readable signal medium may also be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. The computer program embodied on the computer-readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware, and the described units may also be disposed in a processor. The names of the units do not, in some cases, constitute a limitation on the units themselves.
Yet another aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor of a computer, causes the computer to execute the robot work state determination method as described above. The computer-readable storage medium may be included in the electronic device described in the above embodiment, or may exist separately without being incorporated in the electronic device.
Another aspect of the application also provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the robot working state determination method provided in the above embodiments.
The foregoing embodiments are merely illustrative of the principles and effects of the present invention and are not to be construed as limiting the invention. Any person skilled in the art can modify or change the above-described embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical teachings of the present invention are intended to be covered by the claims of the present invention.

Claims (11)

1. A robot working state determination method, characterized by comprising:
acquiring a target operation task of a target robot and current operation information of the target robot, wherein the current operation information comprises a current operation time, the target operation task comprises a plurality of target operation subtasks, and each target operation subtask is provided with corresponding preset operation information;
matching the current operation information with preset operation information corresponding to each target operation subtask to determine a current operation subtask matched with the current operation information from a plurality of target operation subtasks;
collecting a plurality of operation characteristic parameters of the current operation subtask, and performing characteristic extraction and fusion processing on the plurality of operation characteristic parameters to obtain a current fusion characteristic vector of the current operation subtask;
inputting the current fusion feature vector into a preset time sequence analysis model, acquiring estimated operation time, and determining an operation time difference according to the current operation time and the estimated operation time;
and determining the subtask operation state of the current operation subtask according to the operation time difference, determining the target task operation state of the target operation task according to the subtask operation state, and determining the target task operation state as the robot operation state of the target robot.
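Outside the claim language, the overall pipeline of claim 1 can be sketched as follows. This is an illustrative sketch only: the flat feature "fusion", the stand-in time estimator, the threshold value, and all function names are hypothetical and are not part of the disclosure (a real implementation would use the trained time-series analysis model of the claims).

```python
THRESHOLD = 5.0  # hypothetical preset threshold on the operation time difference, seconds

def fuse_features(params):
    """Stand-in for feature extraction and fusion: flatten readings to a vector."""
    return [float(v) for v in params.values()]

def estimate_time(fused_vector):
    """Stand-in for the preset time-series analysis model of the claims."""
    return sum(fused_vector)  # a real model would be trained per subtask

def determine_state(current_time, fused_vector):
    """Subtask state from the operation time difference, per claim 1."""
    diff = abs(current_time - estimate_time(fused_vector))
    return "normal" if diff < THRESHOLD else "abnormal"
```

Given sensor readings that the stand-in model maps to an estimated time of 9.0 s, a current operation time of 10.0 s is within the threshold (normal), while 20.0 s is not (abnormal).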
2. The method of determining a working state of a robot according to claim 1, wherein before acquiring the target working task of the target robot, the method further comprises:
acquiring a plurality of preset operation tasks of a target robot;
and dividing the preset operation tasks into a plurality of preset operation subtasks according to the priori knowledge, and determining preset operation characteristic parameters to be acquired of the preset operation subtasks and preset operation information of the preset operation subtasks.
3. The method of claim 2, wherein matching the current job information with preset job information corresponding to each of the target job subtasks to determine a current job subtask from the plurality of target job subtasks that matches the current job information comprises:
the preset operation information comprises at least one of preset operation time and a preset pose state, and the current operation information comprises at least one of current operation time and a current pose state;
matching the current operation time with the preset operation time to obtain a preset operation subtask which has an association relation with the preset operation time, and determining the preset operation subtask as the current operation subtask;
or,
and matching the current pose state with a preset pose state to obtain a preset operation subtask associated with the preset pose state, and determining the preset operation subtask as the current operation subtask.
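One way to realize the two matching branches of claim 3 can be sketched as follows (the window representation, the pose-to-subtask mapping, and the function names are hypothetical illustrations): the current subtask is the one whose preset time window contains the current operation time, or the one associated with the current pose state.

```python
def match_by_time(subtasks, current_time):
    """subtasks: list of (name, start, end) preset operation time windows."""
    for name, start, end in subtasks:
        if start <= current_time < end:
            return name
    return None  # no preset subtask matches the current operation time

def match_by_pose(pose_to_subtask, current_pose):
    """pose_to_subtask: mapping from preset pose state to subtask name."""
    return pose_to_subtask.get(current_pose)
```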
4. The robot work state determination method of claim 2, wherein collecting a plurality of work characteristic parameters of the current work subtask comprises:
determining the current operation subtask as a target operation subtask, and obtaining a current operation characteristic parameter identifier of the current operation subtask based on a preset operation characteristic parameter identifier of the target operation subtask;
and acquiring the operation characteristic parameters of the current operation subtask based on the current operation characteristic parameter identification, wherein the operation characteristic parameters comprise at least two of robot main body parameters, peripheral cooperation device parameters and space environment parameters of the target robot operation.
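The parameter collection of claim 4 can be sketched as a lookup from subtask to preset parameter identifiers followed by sampling (the identifier names, the `read_sensor` callable, and the example subtask are hypothetical; the claims require at least two of the robot-body, peripheral-cooperation-device, and space-environment parameter groups).

```python
# Hypothetical preset operation characteristic parameter identifiers per subtask.
PRESET_PARAM_IDS = {
    "grasp": ["joint_torque", "gripper_force", "ambient_temp"],
}

def collect_params(subtask, read_sensor):
    """read_sensor: callable mapping a parameter identifier to a reading."""
    return {pid: read_sensor(pid) for pid in PRESET_PARAM_IDS[subtask]}
```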
5. The method of determining a work state of a robot according to claim 1, wherein after obtaining the current fused feature vector for the current work subtask, the method further comprises:
inputting the current fusion feature vector into a preset time sequence analysis model corresponding to the current operation subtask, and obtaining model output as estimated operation time;
and determining the difference value of the current working time and the estimated working time as a working time difference.
6. The robot work state determination method according to claim 1, wherein determining a subtask work state of a current work subtask according to the work time difference comprises:
when the operation time difference of the current operation subtask of the target robot is smaller than a preset threshold value, judging that the subtask operation state of the current operation subtask of the target robot is normal, keeping the current operation state, and entering a next-stage operation subtask after the current operation subtask is completed until the target operation task is completed;
and when the operation time difference of the current operation subtask of the target robot is larger than or equal to a preset threshold value, judging that the subtask operation state of the current operation subtask of the target robot is abnormal, sending an abnormal alarm and suspending the current operation subtask.
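The threshold decision of claim 6 reduces to a single comparison; the sketch below (function name and action labels hypothetical) returns both the judged state and the prescribed action: continue to the next-stage subtask when normal, alarm and pause when abnormal.

```python
def judge_subtask(time_diff, threshold):
    """Claim 6: below threshold is normal; at or above it is abnormal."""
    if time_diff < threshold:
        return ("normal", "continue_to_next_subtask")
    return ("abnormal", "alarm_and_pause")
```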
7. The method according to claim 1, wherein determining the target task operation state of the target operation task according to the subtask operation state, and determining the target task operation state as the robot operation state of the target robot, comprises:
when the subtask operation state of the current operation subtask of the target robot is a normal state, entering a next operation subtask;
if the subtask operation state of each operation subtask is a normal state, judging that the target task operation state of the target operation task is a normal state, namely the robot operation state of the target robot is a normal state;
and if the subtask operation state of at least one operation subtask is an abnormal state, judging that the target task operation state of the target operation task is an abnormal state, that is, the robot operation state of the target robot is an abnormal state.
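The aggregation rule of claim 7 is a conjunction over subtask states; a minimal sketch (function name hypothetical):

```python
def task_state(subtask_states):
    """The task, and hence the robot, is normal only if every subtask was normal."""
    return "normal" if all(s == "normal" for s in subtask_states) else "abnormal"
```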
8. The robot working state determination method according to any one of claims 1 to 7, wherein after determining the target task operation state of the target operation task according to the subtask operation state, the method further comprises:
when the system judges that the subtask operation state of the current operation subtask is an abnormal state, manually checking the subtask operation state of the current operation subtask based on prior knowledge;
if the check determines that the subtask operation state of the current operation subtask of the target robot is an abnormal state, that is, the system judgment is accurate, stopping the current operation subtask of the target robot;
and if the check determines that the subtask operation state of the current operation subtask of the target robot is a normal state, that is, the system judgment is wrong, generating a state association relation between the current fused feature vector and the subtask operation state based on the current fused feature vector of the robot operation subtask and the conclusion that the subtask operation state is normal, determining the current fused feature vector and the current operation time as training samples, and updating and training the time-sequence analysis model.
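The human-in-the-loop feedback of claim 8 can be sketched as follows (all names hypothetical): a manual check that confirms the abnormal judgment stops the subtask, while an overturned judgment (a false alarm) turns the fused feature vector and observed time into a new training sample for updating the time-series model.

```python
def handle_abnormal_judgment(manual_state, fused_vector, current_time, training_samples):
    """manual_state: result of the manual check based on prior knowledge."""
    if manual_state == "abnormal":
        return "stop_subtask"  # system judgment confirmed
    # False alarm: keep the (features, observed time) pair for retraining.
    training_samples.append((fused_vector, current_time))
    return "retrain_model"
```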
9. A robot working state determination device, characterized by comprising:
the information acquisition module is used for acquiring a target operation task of a target robot and current operation information of the target robot, wherein the current operation information comprises current operation time, the target operation task comprises a plurality of target operation subtasks, and each target operation subtask is correspondingly provided with preset operation information;
the operation subtask determining module is used for matching the current operation information with preset operation information corresponding to each target operation subtask so as to determine a current operation subtask matched with the current operation information from a plurality of target operation subtasks;
the information acquisition module is used for acquiring a plurality of operation characteristic parameters of the current operation subtask, and performing characteristic extraction and fusion processing on the plurality of operation characteristic parameters to obtain a current fusion characteristic vector of the current operation subtask;
the operation time difference determining module is used for inputting the current fusion characteristic vector into a preset time sequence analysis model to obtain estimated operation time; determining an operation time difference according to the current operation time and the estimated operation time;
and the robot working state determining module is used for determining the subtask working state of the current working subtask according to the working time difference, determining the target task working state of the target working task according to the subtask working state, and determining the target task working state as the robot working state of the target robot.
10. An electronic device, characterized in that the electronic device comprises:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the electronic device to implement the robot work state determination method according to any one of claims 1 to 8.
11. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor of a computer, causes the computer to execute the robot working state determination method according to any one of claims 1 to 8.
CN202211493348.XA 2022-11-25 2022-11-25 Robot working state determination method, device, equipment and medium Pending CN115741713A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211493348.XA CN115741713A (en) 2022-11-25 2022-11-25 Robot working state determination method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN115741713A true CN115741713A (en) 2023-03-07

Family

ID=85338284

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211493348.XA Pending CN115741713A (en) 2022-11-25 2022-11-25 Robot working state determination method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN115741713A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150127155A1 (en) * 2011-06-02 2015-05-07 Brain Corporation Apparatus and methods for operating robotic devices using selective state space training
WO2019021058A2 (en) * 2017-07-25 2019-01-31 Mbl Limited Systems and methods for operations a robotic system and executing robotic interactions
CN109807903A (en) * 2019-04-10 2019-05-28 博众精工科技股份有限公司 A kind of robot control method, device, equipment and medium
JP2019155574A (en) * 2018-03-16 2019-09-19 日本電産株式会社 Robot control device, network system, robot parameter derivation method and program
US20200073710A1 (en) * 2018-08-30 2020-03-05 Baidu Online Network Technology (Beijing) Co., Ltd. Task scheduling method, apparatus, electronic device and storage medium
CN113568812A (en) * 2021-07-29 2021-10-29 北京奇艺世纪科技有限公司 State detection method and device for intelligent robot
CN113752266A (en) * 2021-11-09 2021-12-07 深圳市烨嘉为技术有限公司 Human-computer cooperation method, system and medium based on cooperative driving and controlling integrated robot
WO2022095616A1 (en) * 2020-11-03 2022-05-12 国网智能科技股份有限公司 On-line intelligent inspection system and method for transformer substation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xu Hao: "Research on control methods for distributed cooperative manipulation of unknown objects by multiple robots", China Master's Theses Full-text Database, Information Science and Technology Series, 1 June 2022 (2022-06-01) *

Similar Documents

Publication Publication Date Title
CN110889339A (en) Head and shoulder detection-based dangerous area grading early warning method and system
WO2021017000A1 (en) Method and apparatus for acquiring meter reading, and memory, processor and terminal
CN115781673A (en) Part grabbing method, device, equipment and medium
CN114612616A (en) Mapping method and device, electronic equipment and storage medium
CN110908789A (en) Visual data configuration method and system for multi-source data processing
CN114331114A (en) Intelligent supervision method and system for pipeline safety risks
CN115741713A (en) Robot working state determination method, device, equipment and medium
CN113297945A (en) Indoor equipment inspection auxiliary method and system based on mixed reality space positioning
CN113688125B (en) Abnormal value detection method and device based on artificial intelligence, electronic equipment and medium
US11947328B2 (en) Control device, control program, and control system
CN115512098A (en) Electronic bridge inspection system and inspection method
CN113608972A (en) Method, device, equipment and storage medium for displaying equipment vibration state
CN113393325A (en) Transaction detection method, intelligent device and computer storage medium
CN113191279A (en) Data annotation method, device, equipment, storage medium and computer program product
CN113158743B (en) Small target real-time detection and positioning method, system and equipment based on priori knowledge
CN114419451B (en) Method and device for identifying inside and outside of elevator, electronic equipment and storage medium
CN113177452B (en) Sample sealing method and device based on image processing and radio frequency technology
CN113393523B (en) Method and device for automatically monitoring computer room image and electronic equipment
CN114926656B (en) Object identification method, device, equipment and medium
CN113361539B (en) Instrument reading method and device of underground inspection robot and electronic equipment
CN115116008B (en) State recognition method and device for target object and storage medium
CN114705148B (en) Road bending point detection method and device based on secondary screening
CN112199418B (en) State identification method, device and equipment for industrial object
CN113554247A (en) Method, device and system for evaluating running condition of automatic guided vehicle
Petrecki et al. A new method for asynchronous mapping, localization and control of vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination