CN116423545B - Mobile cooperative robot integrated control system - Google Patents


Info

Publication number
CN116423545B
CN116423545B (application CN202310375448.0A)
Authority
CN
China
Prior art keywords
robot
execution
target
environment
instruction
Prior art date
Legal status
Active
Application number
CN202310375448.0A
Other languages
Chinese (zh)
Other versions
CN116423545A (en)
Inventor
杨一鸣
刘伟
杨胜体
吴创彬
朱彬能
Current Assignee
Shenzhen Mo Ying Technology Co ltd
Original Assignee
Shenzhen Mo Ying Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Mo Ying Technology Co ltd filed Critical Shenzhen Mo Ying Technology Co ltd
Priority to CN202310375448.0A
Publication of CN116423545A
Application granted
Publication of CN116423545B


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention provides an integrated control system for a mobile cooperative robot, comprising: an acquisition module for acquiring first state information of the robot's operating environment and second state information of the robot; an interaction module for outputting a robot interaction instruction based on the client; a decision module for determining a target execution task of the robot based on the robot interaction instruction, determining execution logic for the target execution task based on the first and second state information, and determining a decision instruction according to that execution logic; and a control module for controlling the robot to perform movement cooperation based on the decision instruction. By integrating the different control items, unified command of all control functions is conveniently realized in a single master controller, guaranteeing the accuracy of integrated robot control, realizing the organic combination of the robot's feet (autonomous movement), hands (multi-axis robotic arm), and eyes (vision), and improving the robot's usability, intelligence, and flexibility.

Description

Mobile cooperative robot integrated control system
Technical Field
The invention relates to the technical field of robot control, in particular to an integrated control system of a mobile cooperative robot.
Background
In recent years, to promote the upgrading and development of China's manufacturing industry, and with the continuing growth of industrial scale and the accelerating adoption of industrial robots, robot technology has been innovating continuously, so the requirements on robots' working efficiency, operating accuracy, and the like keep rising.
However, a traditional industrial robot has a fixed body, so its movement radius is limited by its arm span; and when a mobile industrial robot performs a moving operation, its body and robotic arm cannot be kept coordinated and consistent, leading to larger working errors and reduced working efficiency.
Accordingly, to overcome the above problems, the present invention provides an integrated control system for a mobile cooperative robot.
Disclosure of Invention
The invention provides an integrated control system for a mobile cooperative robot that integrates different control items, so that unified command of all control functions is conveniently realized in a single master controller, ensuring the accuracy of integrated robot control, realizing the organic combination of the robot's feet (autonomous movement), hands (multi-axis robotic arm), and eyes (vision), and improving the robot's usability, intelligence, and flexibility.
The invention provides an integrated control system of a mobile cooperative robot, which comprises the following components:
the acquisition module is used for acquiring first state information of the robot operating environment and acquiring second state information of the robot;
the interaction module is used for outputting a robot interaction instruction based on the client;
the decision module is used for determining a target execution task of the robot based on the robot interaction instruction, determining execution logic for executing the target execution task based on the first state information and the second state information, and determining a decision instruction according to the execution logic;
and the control module is used for controlling the robot to perform movement cooperation based on the decision instruction.
Preferably, a mobile cooperative robot integrated control system, an acquisition module, includes:
the environment video acquisition unit is used for acquiring environment videos of an operation environment based on the robot, reading the environment videos, determining a plurality of video key frames in the environment videos, and picking a plurality of environment images corresponding to the video key frames in the environment videos;
the simulated three-dimensional space generating unit is used for splicing the plurality of environment images based on the extraction result, performing image depth estimation based on the splicing result, and outputting a simulated three-dimensional space of the operation environment on the display terminal according to the depth estimation result;
A first state information obtaining unit, configured to read the simulated three-dimensional space, obtain a spatial structure of the operation environment and an operation type of the operation environment, and determine first state information of the robot operation environment based on the spatial structure of the operation environment and the operation type of the operation environment.
Preferably, an integrated control system of a mobile collaborative robot, an environmental video acquisition unit, includes:
the key frame acquisition subunit is used for acquiring a space boundary point of the operation environment and a working point of the robot in the operation environment, and determining an environment video based on the space boundary point and the working point of the robot;
the initial environment image acquisition subunit is used for positioning in the environment video based on the video key frame and picking an initial environment image corresponding to the video key frame based on a positioning result;
a noise reduction subunit configured to:
noise detection is carried out on the initial environment image, the noise type of the initial environment image is determined, and the noise reduction frequency range of the initial environment image is output in a preset noise reduction model according to the noise type;
acquiring a noise frequency band range of an initial environment image, matching the noise frequency band range in a noise reduction frequency band range based on the noise frequency band range, and determining a target noise reduction frequency band range for noise reduction of the initial environment image in the noise reduction frequency band range;
the environment image acquisition subunit is used for performing noise reduction processing on the initial environment image based on the target noise reduction frequency band range and acquiring the environment image corresponding to the video key frame according to the noise reduction result.
Preferably, a mobile cooperative robot integrated control system, an acquisition module, includes:
a first position information obtaining unit, configured to obtain a movable part of the robot itself, position the movable part based on the robot, obtain first position information of the movable part of the robot, and determine an initial pose of the robot based on the first position information;
a second position information acquisition unit for sensing second position information of the robot in the operation environment based on the robot, and determining an initial position of the robot in the operation environment based on the second position information;
and a second state information acquisition unit configured to determine second state information of the robot based on the initial pose of the robot and the initial position of the robot in the operation environment.
Preferably, an integrated control system of a mobile cooperative robot, an interaction module, includes:
the instruction generation unit is used for acquiring the client requirements based on the client, reading the client requirements, determining a plurality of target characters in them, generating a plurality of interaction instruction elements according to the target characters, and synthesizing those elements into the robot interaction instruction output by the client;
The communication link construction unit is used for acquiring a first communication address of the client, determining a second communication address of the robot control terminal and establishing a data communication link based on the first communication address and the second communication address;
and the instruction transmission unit is used for transmitting the robot interaction instruction from the client terminal to the robot control terminal based on the data communication link.
Preferably, a mobile cooperative robot integrated control system, the decision module includes:
the analysis unit is used for analyzing the robot interaction instruction to obtain a target execution task of the robot, wherein the target execution task includes: a target workbench of the robot and the execution steps of the robot at the target workbench;
a position point determining unit configured to determine a first position point of the target table based on the first state information, and determine a second position point of the robot based on the second state information;
a first execution gesture generating unit for determining a target route pattern from the second position point to the first position point based on the first state information, and determining a first execution gesture of the robot according to the target route pattern;
a second execution posture generation unit for determining a second execution posture of the robot based on an execution step of the robot at the target table;
The execution logic determining unit is used for linking the first execution gesture with the second execution gesture and determining the execution logic of the robot execution target execution task based on the linked result;
and the instruction generation unit is used for generating a decision instruction based on the execution logic.
Preferably, a mobile cooperative robot integrated control system, a control module, includes:
when the robot is controlled to perform movement cooperation based on the decision instruction, the execution gestures of the robot include: robot chassis translation and robot arm movement.
Preferably, a mobile cooperative robot integrated control system, a control module, includes:
a learning model construction unit for:
acquiring a first data set corresponding to a target execution task and a second data set when the robot performs mobile collaboration on the target execution task, and storing the first data set and the second data set;
performing first learning on the stored first data set, determining service characteristics of the target execution task, and generating a first learning model corresponding to the target execution task according to the service characteristics of the target execution task;
performing second learning on the second data set, determining the execution characteristics of the robot moving cooperation process, and constructing a second learning model corresponding to the robot executing the target execution task based on the execution characteristics;
Acquiring an association relation between a target execution task and robot execution operation, associating the first learning model with the second learning model based on the association relation, and generating a third learning model based on an association result;
the instruction generating unit is used for storing the third learning model, inputting the execution task into the third learning model for analysis when the execution task consistent with the execution characteristic of the target execution task exists, and outputting a first decision instruction for controlling the robot based on the analysis result;
a mobile cooperation control unit for:
performing first pre-operation on the execution task based on the first decision instruction to obtain pre-execution data of the robot, comparing the pre-execution data of the robot with preset expectations, and judging whether the first decision instruction is qualified;
when the pre-execution data is consistent with a preset expectation, judging that the first decision instruction is qualified, and controlling the robot to perform mobile cooperation on the execution task based on the first decision instruction;
when the pre-execution data is inconsistent with the preset expectation, the first decision instruction is judged to be unqualified, the first decision instruction is corrected based on the difference data between the pre-execution data and the preset expectation, a second decision instruction is generated, and the robot is controlled to perform mobile cooperation on the execution task based on the second decision instruction.
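The pre-operation check described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the field names, tolerance, and difference-based correction are all assumptions.

```python
def validate_decision(pre_exec, expected, tol=0.05):
    """Compare pre-execution data with the preset expectation. If every value
    is within `tol`, the first decision instruction is qualified; otherwise
    the difference data is returned as the basis for a corrected (second)
    decision instruction. Field names and tolerance are illustrative."""
    diffs = {k: expected[k] - pre_exec[k] for k in expected}
    if all(abs(d) <= tol for d in diffs.values()):
        return {"qualified": True, "correction": None}
    return {"qualified": False, "correction": diffs}

ok = validate_decision({"x": 1.00}, {"x": 1.02})    # within tolerance
bad = validate_decision({"x": 1.00}, {"x": 1.50})   # needs a second instruction
```

In this sketch the correction dictionary plays the role of the "difference data" from which the second decision instruction would be generated.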
Preferably, a mobile cooperative robot integrated control system, a mobile cooperative control unit, includes:
the monitoring subunit is used for monitoring the mobile cooperation process of the robot in real time when the robot executes the task based on the preset monitoring equipment, and transmitting the monitoring result to the preset background terminal in real time;
the anomaly analysis subunit is used for dividing the monitoring results received in real time by the background terminal into process flows to obtain the decomposed action of each execution link, retrieving the standard execution action corresponding to each flow based on the division result, and comparing the similarity between the decomposed action and the standard execution action;
and the alarm subunit is used for judging that the control of the second decision instruction on the robot is unqualified when the similarity value of the decomposition action and the standard execution action is smaller than a preset similarity threshold value, and regenerating the second decision instruction, otherwise, continuously monitoring the movement cooperation process of the robot until the robot completes the execution task.
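A position-by-position match fraction is one simple way to realize the similarity comparison between decomposed and standard actions; the patent does not specify its measure, so the function below is only an assumed stand-in.

```python
def action_similarity(observed, standard):
    # Fraction of decomposed action steps that match the standard sequence,
    # compared position by position (an assumed similarity measure).
    matches = sum(o == s for o, s in zip(observed, standard))
    return matches / max(len(standard), 1)

def monitor_link(observed, standard, threshold=0.8):
    sim = action_similarity(observed, standard)
    # Below the preset similarity threshold the control is judged unqualified
    # and the second decision instruction must be regenerated.
    return {"similarity": sim, "regenerate": sim < threshold}

r = monitor_link(["reach", "grasp", "lift"], ["reach", "grasp", "place"])
# 2 of 3 steps match (2/3 < 0.8), so r["regenerate"] is True
```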
Preferably, a mobile cooperative robot integrated control system, a control module, includes:
the parameter acquisition unit is used for acquiring three-phase stator mutual inductance for driving the robot chassis to translate and the radius of wheels of the robot chassis when the robot translates based on the robot chassis;
The first calculation unit is used for calculating the braking slip rate of the robot chassis based on the three-phase stator mutual inductance driving the robot chassis to translate and the radius of the wheels of the robot chassis;
the second calculation unit is used for constructing a target equation of the driving force and the braking slip rate of the motor of the robot chassis based on the braking slip rate of the robot chassis;
a movement control unit for:
determining a robot chassis motor driving force under the braking slip rate of the current robot chassis based on a target equation, and generating a robot movement control instruction based on the robot chassis motor driving force;
and controlling the robot to drive the robot chassis to drive the robot to move based on the robot movement control instruction.
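The braking slip rate and the "target equation" relating it to chassis motor driving force can be sketched with the classical slip-ratio definition. Note the assumptions: the patent derives wheel behaviour from motor parameters (three-phase stator mutual inductance), whereas here the wheel angular speed is simply taken as an input, and the force-vs-slip curve below is a toy placeholder.

```python
def braking_slip_ratio(vehicle_speed, wheel_omega, wheel_radius):
    """Classical braking slip ratio: lambda = (v - omega * r) / v."""
    if vehicle_speed <= 0:
        raise ValueError("vehicle must be moving")
    return (vehicle_speed - wheel_omega * wheel_radius) / vehicle_speed

def driving_force(slip, mu_peak=0.9, stiffness=10.0, normal_load=500.0):
    # Toy saturating force-vs-slip curve standing in for the patent's
    # unspecified target equation; all parameters are illustrative.
    mu = mu_peak * min(1.0, stiffness * abs(slip))
    return mu * normal_load

lam = braking_slip_ratio(vehicle_speed=2.0, wheel_omega=18.0, wheel_radius=0.1)
print(round(lam, 3))   # -> 0.1
```

The movement control unit would then evaluate the target equation at the current slip ratio to obtain the motor driving force behind the movement control instruction.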
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
The technical scheme of the invention is further described in detail through the drawings and the embodiments.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a diagram of a mobile collaborative robot integrated control system in accordance with an embodiment of the present invention;
FIG. 2 is a diagram showing a structure of an acquisition module in an integrated control system of a mobile collaborative robot according to an embodiment of the present invention;
fig. 3 is a schematic view of inspection of a robot in an integrated control system of a mobile collaborative robot in an embodiment of the invention.
Detailed Description
The preferred embodiments of the present invention will be described below with reference to the accompanying drawings, it being understood that the preferred embodiments described herein are for illustration and explanation of the present invention only, and are not intended to limit the present invention.
Example 1:
the present embodiment provides an integrated control system of a mobile cooperative robot, as shown in fig. 1, including:
the acquisition module is used for acquiring first state information of the robot operating environment and acquiring second state information of the robot;
the interaction module is used for outputting a robot interaction instruction based on the client;
the decision module is used for determining a target execution task of the robot based on the robot interaction instruction, determining execution logic for executing the target execution task based on the first state information and the second state information, and determining a decision instruction according to the execution logic;
and the control module is used for controlling the robot to perform movement cooperation based on the decision instruction.
In this embodiment, the operating environment may be a space where the robot needs to perform work or business.
In this embodiment, the first state information may be a device type contained inside the robot operating environment, an available travel route contained in the operating environment, a distribution condition of obstacles in the operating environment, and the like.
In this embodiment, the second state information may be a current position of the robot in the operating environment and a current working state, and specifically may be a working state or a standby state.
In this embodiment, the outputting of the robot interaction instruction based on the client may be that the user performs touch screen control or voice control on the robot based on the intelligent terminal.
In this embodiment, the target execution task may be an operation required to be performed by the robot, and specifically may be a target workbench of the robot and an execution step of the robot on the target workbench.
In this embodiment, the execution logic may be a representation of the operation steps corresponding to the robot executing the target execution task, the sequence of the operation steps, and the like.
In this embodiment, the decision instruction is determined according to execution logic, and is used to control each component in the robot to execute a corresponding operation according to the execution logic.
The beneficial effects of the technical scheme are as follows: by integrating the different control items, unified command of all control functions is conveniently realized in a single master controller, guaranteeing the accuracy of integrated robot control, realizing the organic combination of the robot's feet (autonomous movement), hands (multi-axis robotic arm), and eyes (vision), and improving the robot's usability, intelligence, and flexibility.
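The four-module flow of Example 1 can be sketched as below. All class names, fields, and command formats are illustrative assumptions; the sketch only shows one master control deriving execution logic from both state items and dispatching it.

```python
from dataclasses import dataclass

@dataclass
class EnvironmentState:      # "first state information" (illustrative fields)
    devices: list
    routes: list
    obstacles: list

@dataclass
class RobotState:            # "second state information"
    position: tuple
    working: bool            # working vs. standby state

class DecisionModule:
    def decide(self, task, env: EnvironmentState, robot: RobotState):
        # Execution logic: ordered operation steps derived from both states
        steps = [("move_to", task["workbench"])] + [("do", s) for s in task["steps"]]
        return {"task": task["name"], "logic": steps}

class ControlModule:
    def execute(self, decision):
        # The single master control dispatches every chassis/arm command
        return [f"exec:{op}:{arg}" for op, arg in decision["logic"]]

env = EnvironmentState(devices=["conveyor"], routes=["A-B"], obstacles=[])
robot = RobotState(position=(0, 0), working=False)
task = {"name": "pick", "workbench": "B", "steps": ["grasp", "place"]}
decision = DecisionModule().decide(task, env, robot)
commands = ControlModule().execute(decision)
print(commands)
```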
Example 2:
on the basis of embodiment 1, this embodiment provides a mobile cooperative robot integrated control system, as shown in fig. 2, an acquisition module, including:
the environment video acquisition unit is used for acquiring environment videos of an operation environment based on the robot, reading the environment videos, determining a plurality of video key frames in the environment videos, and picking a plurality of environment images corresponding to the video key frames in the environment videos;
the simulated three-dimensional space generating unit is used for splicing the plurality of environment images based on the extraction result, performing image depth estimation based on the splicing result, and outputting a simulated three-dimensional space of the operation environment on the display terminal according to the depth estimation result;
a first state information obtaining unit, configured to read the simulated three-dimensional space, obtain a spatial structure of the operation environment and an operation type of the operation environment, and determine first state information of the robot operation environment based on the spatial structure of the operation environment and the operation type of the operation environment.
In this embodiment, the environment video based on the robot capturing the operation environment may be video of the operation environment captured by a camera carried by the robot itself.
In this embodiment, the video key frame may be a picture that can characterize the operating environment characteristics or specific information in the collected environmental video.
In this embodiment, the ambient image may be a still picture corresponding to a video key frame.
In this embodiment, image depth estimation may determine the actual distance between each structure and the camera from the environment images, so as to facilitate construction of the simulated three-dimensional space.
In this embodiment, the simulated three-dimensional space may be determined from the environment image, i.e., a virtual space corresponding to the operating environment.
The beneficial effects of the technical scheme are as follows: the robot collects and analyzes an environment video of the operation space, so the spatial structure and the operation type of the operation environment are accurately and effectively determined from the environment video; this ensures the accuracy of the first state information determined from them, makes it convenient to control the robot to execute the corresponding operation tasks, and guarantees the accuracy of robot operation.
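A frame-difference heuristic is one common way to pick key frames from an environment video; the patent does not name its criterion, so the threshold rule below is an assumption, and frames are reduced to plain pixel lists to stay self-contained.

```python
def select_keyframes(frames, threshold=10.0):
    """Pick frames that differ enough from the last kept key frame.

    `frames` is a list of equal-length grayscale pixel rows (lists of ints);
    the mean absolute pixel difference stands in for the patent's
    unspecified key-frame criterion."""
    if not frames:
        return []
    keyframes = [0]
    for i in range(1, len(frames)):
        ref = frames[keyframes[-1]]
        diff = sum(abs(a - b) for a, b in zip(frames[i], ref)) / len(ref)
        if diff > threshold:
            keyframes.append(i)
    return keyframes

# Three synthetic 4-pixel "frames": the middle one barely changes.
video = [[0, 0, 0, 0], [1, 1, 1, 1], [50, 50, 50, 50]]
kf = select_keyframes(video, threshold=10.0)
print(kf)   # -> [0, 2]
```

The still images at the selected indices would then be the environment images handed to the splicing and depth-estimation stage.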
Example 3:
on the basis of embodiment 2, this embodiment provides a mobile collaborative robot integrated control system, and an environmental video acquisition unit includes:
the key frame acquisition subunit is used for acquiring a space boundary point of the operation environment and a working point of the robot in the operation environment, and determining an environment video based on the space boundary point and the working point of the robot;
the initial environment image acquisition subunit is used for positioning in the environment video based on the video key frame and picking an initial environment image corresponding to the video key frame based on a positioning result;
a noise reduction subunit configured to:
noise detection is carried out on the initial environment image, the noise type of the initial environment image is determined, and the noise reduction frequency range of the initial environment image is output in a preset noise reduction model according to the noise type;
acquiring a noise frequency band range of an initial environment image, matching the noise frequency band range in a noise reduction frequency band range based on the noise frequency band range, and determining a target noise reduction frequency band range for noise reduction of the initial environment image in the noise reduction frequency band range;
the environment image acquisition subunit is used for performing noise reduction processing on the initial environment image based on the target noise reduction frequency band range and acquiring the environment image corresponding to the video key frame according to the noise reduction result.
In this embodiment, the spatial boundary point may be a critical location of the operating environment and the non-operating environment.
In this embodiment, the positioning in the environmental video based on the video key frame may be positioning a picture corresponding to the video key frame from the environmental video.
In this embodiment, the initial ambient image may be an image corresponding to a video key frame extracted from the ambient video, which is unprocessed, i.e., contains image disturbance factors, etc.
In this embodiment, the noise detection of the initial environmental image may be performed by a preset image noise detection method.
In this embodiment, the preset noise reduction model is set in advance, and is used for performing noise reduction processing on the initial environment image.
In this embodiment, the noise reduction frequency band range may be a frequency band size in which the noise reduction model is preset to reduce noise.
In this embodiment, the noise band range may be a noise range contained in the initial environmental image.
In this embodiment, the target noise reduction band range may be a noise range size that needs to reduce noise of the initial environmental image.
The beneficial effects of the technical scheme are as follows: by analyzing the environment video, the video key frames are accurately and effectively determined from it, the corresponding initial environment images are extracted according to those key frames, and the obtained initial environment images are then denoised; this ensures the accuracy and reliability of the final environment images, facilitates construction of the simulated three-dimensional space of the operation environment, guarantees accurate and effective acquisition of the first state information of the operation environment, and thereby guarantees the control effect on the robot.
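The matching of the image's noise frequency band against the noise reduction model's supported bands can be sketched as an interval-overlap search. The band representation and the "largest overlap wins" rule are assumptions; the patent only says the target band is determined within the model's noise reduction band range.

```python
def target_denoise_band(noise_band, model_bands):
    """Match the image's noise band against the model's supported
    noise-reduction bands and return the overlapping (target) band.
    Bands are (low_hz, high_hz) tuples; the widest overlap is chosen."""
    lo, hi = noise_band
    best = None
    for m_lo, m_hi in model_bands:
        o_lo, o_hi = max(lo, m_lo), min(hi, m_hi)
        if o_lo < o_hi and (best is None or o_hi - o_lo > best[1] - best[0]):
            best = (o_lo, o_hi)
    return best

band = target_denoise_band((30, 80), [(0, 40), (50, 120)])
print(band)   # -> (50, 80)
```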
Example 4:
on the basis of embodiment 1, this embodiment provides a mobile collaborative robot integrated control system, and an acquisition module includes:
a first position information obtaining unit, configured to obtain a movable part of the robot itself, position the movable part based on the robot, obtain first position information of the movable part of the robot, and determine an initial pose of the robot based on the first position information;
a second position information acquisition unit for sensing second position information of the robot in the operation environment based on the robot, and determining an initial position of the robot in the operation environment based on the second position information;
and a second state information acquisition unit configured to determine second state information of the robot based on the initial pose of the robot and the initial position of the robot in the operation environment.
In this embodiment, the movable part is located at the bottom of the robot to drive the robot to move.
In this embodiment, the first position information may characterize the specific position of the movable part on the robot.
In this embodiment, the initial pose may be a relative positional relationship between the robot body and the movable part, or the like, so as to facilitate effective determination of the second state information of the robot.
In this embodiment, the second location information may be a specific location that characterizes the robot in the operating environment.
In this embodiment, the initial position may be a starting position of the robot in the operating environment.
The beneficial effects of the technical scheme are as follows: the initial pose of the robot is determined from the relative positional relationship between the robot body and the movable part, the initial position of the robot is then locked by determining its position in the operation environment, and finally the second state information of the robot is accurately and effectively determined from the initial pose and the initial position, making it convenient to control the robot according to its state information and ensuring the accuracy of the integrated control of the robot.
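Combining the two position items into the second state information can be sketched as below; the pose representation (planar offset plus heading) and all names are illustrative assumptions.

```python
import math

def initial_pose(body_xy, part_xy):
    """Relative position of the movable part with respect to the robot body
    (the 'first position information'), reduced here to a planar offset."""
    dx, dy = part_xy[0] - body_xy[0], part_xy[1] - body_xy[1]
    return {"offset": (dx, dy), "heading": math.atan2(dy, dx)}

def second_state(body_xy, part_xy, position_in_env):
    # Second state information = initial pose + initial position in the
    # operation environment (the 'second position information').
    return {"pose": initial_pose(body_xy, part_xy), "position": position_in_env}

state = second_state((0.0, 0.0), (0.0, -0.3), (2.0, 5.0))
```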
Example 5:
on the basis of embodiment 1, this embodiment provides a mobile collaborative robot integrated control system, and an interaction module includes:
the instruction generation unit is used for acquiring the client requirements based on the client, reading the client requirements, determining a plurality of target characters in them, generating a plurality of interaction instruction elements according to the target characters, and synthesizing those elements into the robot interaction instruction output by the client;
The communication link construction unit is used for acquiring a first communication address of the client, determining a second communication address of the robot control terminal and establishing a data communication link based on the first communication address and the second communication address;
and the instruction transmission unit is used for transmitting the robot interaction instruction from the client terminal to the robot control terminal based on the data communication link.
In this embodiment, the customer requirements may be the type of operation the customer needs the robot to perform and the effect that needs to be achieved.
In this embodiment, the target character may be a keyword in text content obtained after text conversion of the client's requirement.
In this embodiment, the interaction instruction element is a part of the interaction instruction and is a basic component of the interaction instruction.
In this embodiment, the robot interaction instruction is generated according to the needs of the customer, and is used to control the robot to perform the operations required by the customer.
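The instruction-generation flow above — reading the customer requirement, extracting target characters, mapping them to interaction instruction elements, and composing the robot interaction instruction — can be sketched minimally in Python. The keyword table and element names here are hypothetical illustrations; the patent does not specify a concrete extraction algorithm.

```python
import re

# Hypothetical keyword-to-element table; the patent does not define one.
ELEMENT_TABLE = {
    "move": "MOVE", "pick": "PICK", "place": "PLACE", "table": "TARGET_TABLE",
}

def generate_interaction_instruction(requirement_text):
    """Read a customer requirement, extract target characters (keywords),
    map them to interaction instruction elements, and assemble the robot
    interaction instruction from those elements in order."""
    words = re.findall(r"[a-z]+", requirement_text.lower())
    elements = [ELEMENT_TABLE[w] for w in words if w in ELEMENT_TABLE]
    # The assembled instruction is the ordered composition of its elements.
    return ";".join(elements)

print(generate_interaction_instruction("Please move to the table and pick the part"))
```

A richer implementation would use proper keyword extraction over the converted text, but the element-composition step would look the same.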
In this embodiment, the first communication address may be communication address information of the client, so as to facilitate corresponding interaction with the robot.
In this embodiment, the second communication address may be communication address information of the robot, so as to facilitate corresponding interaction with the client.
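The communication-link construction and instruction transmission performed by the two units above can be sketched with standard sockets: the robot control terminal listens on its communication address, the client connects to it, and the interaction instruction travels over the resulting link. The loopback address and message format are placeholders, not the patent's implementation.

```python
import socket
import threading

def transmit_instruction(instruction, control_host="127.0.0.1"):
    """Establish a data communication link between the client and the
    robot control terminal, then transmit the robot interaction
    instruction over it. Loopback sketch with an ephemeral port."""
    # The control terminal binds its communication address (second address).
    server = socket.create_server((control_host, 0))
    second_addr = server.getsockname()
    received = []

    def control_terminal():
        conn, _ = server.accept()
        with conn:
            received.append(conn.recv(1024).decode())

    t = threading.Thread(target=control_terminal)
    t.start()
    # The client side of the link (first communication address) connects and sends.
    with socket.create_connection(second_addr) as link:
        link.sendall(instruction.encode())
    t.join()
    server.close()
    return received[0]

print(transmit_instruction("MOVE;TARGET_TABLE;PICK"))
```

In a deployed system the two addresses would belong to separate machines and the link would carry framed, authenticated messages; the sketch shows only the link-then-transmit structure.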
The beneficial effects of the technical scheme are as follows: through carrying out accurate effectual analysis to the customer demand that the customer end submitted to be convenient for generate corresponding robot interaction instruction according to the customer demand, and construct corresponding data communication link according to the communication address of customer end and robot, realize with the effectual transmission of robot interaction instruction to the robot, thereby realize carrying out effectual integrated control to the robot, improved the ease of use, intelligent and the flexibility of robot.
Example 6:
On the basis of embodiment 1, this embodiment provides a mobile collaborative robot integrated control system in which the decision-making module includes:
the analysis unit is used for analyzing the robot interaction instruction to obtain a target execution task of the robot, wherein the target execution task comprises the following steps: a target workbench of the robot and an execution step of the robot on the target workbench;
a position point determining unit configured to determine a first position point of the target table based on the first state information, and determine a second position point of the robot based on the second state information;
a first execution gesture generating unit for determining a target route map from the second position point to the first position point based on the first state information, and determining a first execution gesture of the robot according to the target route map;
A second execution posture generation unit for determining a second execution posture of the robot based on an execution step of the robot at the target table;
the execution logic determining unit is used for linking the first execution gesture with the second execution gesture and determining the execution logic of the robot execution target execution task based on the linked result;
and the instruction generation unit is used for generating a decision instruction based on the execution logic.
In this embodiment, the first location point may be specific location information characterizing where the target workstation is located in the operating environment.
In this embodiment, the second location point may be location information where the robot is currently located in the operating environment.
In this embodiment, the target route map is a route map characterizing the movement of the robot from its current position to the target workbench.
In this embodiment, the first execution pose may be a representation of the movement of the robot during the movement.
In this embodiment, the second execution pose may be a representation of the execution of the robot when executing the respective step, etc.
In this embodiment, the robot may patrol according to the target route map, and when a control instruction is received during the patrol, the corresponding operation is performed; the patrol route of the robot is shown in fig. 3.
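The decision flow above — locating the two position points, planning a target route map from the robot's position to the workbench, and linking the movement gesture with the workbench execution steps into execution logic — might be sketched as follows. The grid route planner and the tuple encoding of gestures are illustrative assumptions, not the patent's method.

```python
def plan_route(start, goal):
    """Toy route planner: a Manhattan (axis-aligned) route from the
    robot's second position point to the workbench's first position point."""
    route = [start]
    x, y = start
    while x != goal[0]:
        x += 1 if goal[0] > x else -1
        route.append((x, y))
    while y != goal[1]:
        y += 1 if goal[1] > y else -1
        route.append((x, y))
    return route

def build_execution_logic(first_point, second_point, execution_steps):
    """Link the first execution gesture (motion along the target route map)
    with the second execution gesture (steps at the target workbench) into
    one ordered execution plan."""
    route = plan_route(second_point, first_point)
    first_gesture = [("MOVE_TO", p) for p in route[1:]]
    second_gesture = [("EXECUTE", s) for s in execution_steps]
    # Linking both gestures yields the execution logic for the task.
    return first_gesture + second_gesture

logic = build_execution_logic((2, 1), (0, 0), ["grasp part", "place part"])
print(logic)
```

The decision instruction would then be generated from this ordered plan; a real planner would of course account for obstacles in the first state information.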
The beneficial effects of the technical scheme are as follows: the interactive instruction is analyzed, so that the target execution task is accurately and effectively confirmed, the motion gesture of the robot and the gesture when the operation steps are executed are effectively confirmed according to the target execution task, and finally, the target execution task and the gesture are associated and linked, so that the execution logic of the robot for executing the target execution task is effectively analyzed, the robot is accurately and reliably controlled in an integrated mode, the accuracy and the effect of the integrated control are guaranteed, and the intelligence of the robot is improved.
Example 7:
On the basis of embodiment 1, this embodiment provides a mobile cooperative robot integrated control system in which the control module includes:
when controlling the robot to perform movement cooperation based on the decision instruction, the execution gesture of the robot comprises: the robot chassis translates and the robot arm moves.
Example 8:
On the basis of embodiment 1, this embodiment provides a mobile cooperative robot integrated control system in which the control module includes:
a learning model construction unit for:
acquiring a first data set corresponding to a target execution task and a second data set when the robot performs mobile collaboration on the target execution task, and storing the first data set and the second data set;
Performing first learning on the stored first data set, determining service characteristics of the target execution task, and generating a first learning model corresponding to the target execution task according to the service characteristics of the target execution task;
performing second learning on the second data set, determining the execution characteristics of the robot moving cooperation process, and constructing a second learning model corresponding to the robot executing the target execution task based on the execution characteristics;
acquiring an association relation between a target execution task and robot execution operation, associating the first learning model with the second learning model based on the association relation, and generating a third learning model based on an association result;
the instruction generating unit is used for storing the third learning model, inputting an execution task into the third learning model for analysis when an execution task consistent with the execution characteristics of the target execution task exists, and outputting a first decision instruction for controlling the robot based on the analysis result;
a mobile cooperation control unit for:
performing first pre-operation on the execution task based on the first decision instruction to obtain pre-execution data of the robot, comparing the pre-execution data of the robot with preset expectations, and judging whether the first decision instruction is qualified;
When the pre-execution data is consistent with a preset expectation, judging that the first decision instruction is qualified, and controlling the robot to perform mobile cooperation on the execution task based on the first decision instruction;
when the pre-execution data is inconsistent with the preset expectation, the first decision instruction is judged to be unqualified, the first decision instruction is corrected based on the difference data between the pre-execution data and the preset expectation, a second decision instruction is generated, and the robot is controlled to perform mobile cooperation on the execution task based on the second decision instruction.
In this embodiment, the first data set may be specific data corresponding to the target execution task, including a task type of the target execution task, a specific target of each target execution task, and the like.
In this embodiment, the second data set may be the data generated by the robot when performing mobile collaboration on the target execution task, specifically including data such as the moving distance and the extension length of the mechanical arm.
In this embodiment, the first learning may be to analyze the first data set, so as to effectively determine the features of the target execution task, so as to generate a first learning model corresponding to the target execution task.
In this embodiment, the service characteristics may be a service type of the target execution task, a characteristic of the current service that is different from other services, and the like.
In this embodiment, the first learning model may be a model that is capable of analyzing the execution task, thereby facilitating determination of task characteristics of the execution task.
In this embodiment, the second learning may be an analysis of the second data set, thereby facilitating an efficient determination of the characteristics of the robot movement collaboration process, and also facilitating the generation of a second learning model that processes the movement collaboration data.
In this embodiment, the execution characteristic may be a cooperative relationship among links when the robot executes a task, or the like.
In this embodiment, the second learning model may be capable of analyzing the execution characteristics of the robot, so as to facilitate determination of the association relationship between the robot and the execution task.
In this embodiment, the third learning model may be obtained by associating the first learning model and the second learning model, so as to facilitate generation of a decision instruction for the robot.
In this embodiment, the first decision instruction may be an instruction for controlling the robot determined according to the result of the third learning model analysis.
In this embodiment, the first pre-operation may be to control the robot to perform the simulation operation through the obtained first decision instruction in advance, so as to check whether the first decision instruction is wrong.
In this embodiment, the pre-execution data may be operation data generated by controlling the robot to perform the corresponding operation through the first decision instruction.
In this embodiment, the preset expectations are set in advance, and are used to measure whether the control of the first decision instruction on the robot is qualified.
In this embodiment, the second decision instruction is a final decision instruction obtained by correcting the first decision instruction when the first decision instruction is not qualified.
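The pre-operation check described above — comparing the robot's pre-execution data against the preset expectation and, on failure, correcting the first decision instruction by the difference data to obtain the second decision instruction — can be sketched as below. The dictionary encoding of instructions and data is a hypothetical simplification.

```python
def validate_decision_instruction(pre_execution_data, preset_expectation, tolerance=0.0):
    """A first decision instruction is qualified only when every value in
    the pre-execution data matches the preset expectation within tolerance."""
    return all(
        abs(pre_execution_data[k] - preset_expectation[k]) <= tolerance
        for k in preset_expectation
    )

def correct_instruction(instruction, pre_execution_data, preset_expectation):
    """Correct the first decision instruction by the difference data
    between pre-execution data and expectation, yielding the second
    decision instruction."""
    return {
        k: instruction.get(k, 0.0) + (preset_expectation[k] - pre_execution_data[k])
        for k in preset_expectation
    }

expected = {"reach_mm": 350.0, "travel_mm": 1200.0}   # preset expectation
observed = {"reach_mm": 348.0, "travel_mm": 1200.0}   # pre-execution data
first = {"reach_mm": 350.0, "travel_mm": 1200.0}      # first decision instruction
if not validate_decision_instruction(observed, expected):
    second = correct_instruction(first, observed, expected)
    print(second)
```

Real pre-execution data would come from a simulated run of the task; the additive correction is one plausible reading of "corrected based on the difference data".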
The beneficial effects of the technical scheme are as follows: the method has the advantages that the corresponding data sets are analyzed when the target execution task and the robot perform mobile collaboration on the target execution task, the corresponding learning model is constructed according to the analysis result, and finally, the learning model is associated according to the association relation between the target execution task and the robot, so that the execution task can be accurately and effectively analyzed in time when the execution task is received, the decision instruction of the robot can be accurately acquired, and the accuracy and reliability of mobile collaboration control of the robot are guaranteed.
Example 9:
On the basis of embodiment 8, this embodiment provides a mobile cooperative robot integrated control system in which the mobile cooperation control unit includes:
The monitoring subunit is used for monitoring the mobile cooperation process of the robot in real time when the robot executes the task based on the preset monitoring equipment, and transmitting the monitoring result to the preset background terminal in real time;
the abnormal analysis subunit is used for dividing the flow of the monitoring result received in real time based on the background terminal to obtain a decomposition action of each execution link, calling a standard execution action corresponding to each flow based on the division result, and comparing the similarity of the decomposition action with the standard execution action;
and the alarm subunit is used for judging that the control of the second decision instruction on the robot is unqualified when the similarity value of the decomposition action and the standard execution action is smaller than a preset similarity threshold value, and regenerating the second decision instruction, otherwise, continuously monitoring the movement cooperation process of the robot until the robot completes the execution task.
In this embodiment, the preset monitoring device is set in advance, and is used for monitoring the action condition of the robot in the mobile collaboration process.
In this embodiment, the flow dividing may divide a continuous action according to the flow, that is, split into independent execution steps, so as to facilitate checking whether each execution link meets the expected requirement.
In this embodiment, the decomposing action may be to obtain an action condition corresponding to each execution link of the robot after performing flow division on the monitoring result.
In this embodiment, the standard execution action is set in advance, and is used to measure whether the decomposition action of the robot is qualified.
In this embodiment, the preset similarity threshold is set in advance and serves as the adjustable reference for determining whether a decomposition action is similar to the standard execution action.
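The similarity comparison between decomposition actions and standard execution actions might look like the following sketch, assuming each action is encoded as a numeric parameter vector and using cosine similarity; the patent does not name a specific similarity measure, so this choice is an assumption.

```python
import math

def action_similarity(decomposed, standard):
    """Cosine similarity between a decomposition action and its standard
    execution action, each encoded as a joint-parameter vector."""
    dot = sum(a * b for a, b in zip(decomposed, standard))
    norm = math.hypot(*decomposed) * math.hypot(*standard)
    return dot / norm if norm else 0.0

def check_execution(decomposed_actions, standard_actions, threshold=0.95):
    """Return the indices of execution links whose similarity falls below
    the preset similarity threshold; any flagged link means the second
    decision instruction must be regenerated."""
    return [
        i for i, (d, s) in enumerate(zip(decomposed_actions, standard_actions))
        if action_similarity(d, s) < threshold
    ]

failed = check_execution([[1.0, 0.0], [0.6, 0.8]], [[1.0, 0.0], [1.0, 0.0]], 0.95)
print(failed)
```

An empty result means monitoring simply continues until the robot completes the execution task.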
The beneficial effects of the technical scheme are as follows: through the removal cooperation process to the robot is monitored to be convenient for in time carrying out accurate effectual assurance and understanding to the executive condition of robot, thereby be convenient for when appearing unusual, in time adjust the running condition of robot, thereby ensured the integration control effect to the robot, improved the rate of accuracy of control.
Example 10:
On the basis of embodiment 1, this embodiment provides a mobile cooperative robot integrated control system in which the control module includes:
the parameter acquisition unit is used for acquiring, when the robot translates on its chassis, the three-phase stator mutual inductance driving the robot chassis to translate and the radius of the robot chassis wheels;
The first calculation unit is used for calculating the braking slip rate of the robot chassis based on the three-phase stator mutual inductance driving the robot chassis to translate and the radius of the wheels of the robot chassis;
wherein λ represents the braking slip rate of the robot chassis; φ represents the wheel rotation speed of the robot chassis; r represents the radius of the robot chassis wheels; M represents the three-phase stator mutual inductance driving the robot chassis to translate; t0 represents the initial time point of the robot chassis movement; t1 represents the termination time point of the robot chassis movement; F0 represents the tangential force of the robot chassis wheels; f1 represents the rolling resistance of the robot chassis wheels; f2 represents the bearing friction resistance of the robot chassis; dt denotes differentiation with respect to time;
the second calculation unit is used for constructing a target equation of the driving force and the braking slip rate of the motor of the robot chassis based on the braking slip rate of the robot chassis;
wherein F(λ) represents the target equation relating the motor driving force of the robot chassis to the braking slip rate; ζ represents the vertical load of the robot chassis; m represents the weight of the robot chassis wheels; v represents the moving speed of the robot; s represents the moving distance of the robot;
a movement control unit for:
determining a robot chassis motor driving force under the braking slip rate of the current robot chassis based on a target equation, and generating a robot movement control instruction based on the robot chassis motor driving force;
And controlling the robot to drive the robot chassis to drive the robot to move based on the robot movement control instruction.
In this embodiment, the robot movement control command may be generated based on a motor driving force of the robot chassis, and is used to control the movement driving force of the robot chassis, so as to ensure the stable operation of the robot.
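The patent's own slip-rate formula (involving the stator mutual inductance terms listed in the symbol legend above) is not reproduced in this text, so only the symbol legend survives. As a hedged illustration, the textbook definition of braking slip rate can be sketched as follows; it is not the patent's equation.

```python
def braking_slip_rate(vehicle_speed, wheel_speed, wheel_radius):
    """Textbook braking slip rate: lambda = (v - r * omega) / v, where v is
    the chassis speed, omega the wheel angular speed, r the wheel radius.
    This stands in for the patent's (unreproduced) mutual-inductance formula."""
    if vehicle_speed == 0:
        return 0.0  # no slip defined at standstill
    return (vehicle_speed - wheel_radius * wheel_speed) / vehicle_speed

# Example: chassis moving at 1.0 m/s, wheels of radius 0.1 m at 8 rad/s.
lam = braking_slip_rate(1.0, 8.0, 0.1)
print(lam)
```

The control module would then evaluate the target equation F(λ) at this slip rate to obtain the motor driving force for the movement control instruction.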
The beneficial effects of the technical scheme are as follows: the target equation of the driving force and the braking slip rate of the motor of the robot chassis can be effectively constructed by accurately calculating the braking slip rate of the robot chassis, so that the driving force of the motor of the robot is controlled through the driving equation, the good power of the robot is effectively ensured, and the stable movement of the robot is realized.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (7)

1. An integrated control system for a mobile cooperative robot, comprising:
the acquisition module is used for acquiring first state information of the robot operating environment and acquiring second state information of the robot;
The interaction module is used for outputting a robot interaction instruction based on the client;
the decision module is used for determining a target execution task of the robot based on the robot interaction instruction, determining execution logic for executing the target execution task based on the first state information and the second state information, and determining a decision instruction according to the execution logic;
the control module is used for controlling the robot to perform movement cooperation based on the decision instruction;
an acquisition module comprising:
the environment video acquisition unit is used for acquiring environment videos of an operation environment based on the robot, reading the environment videos, determining a plurality of video key frames in the environment videos, and picking a plurality of environment images corresponding to the video key frames in the environment videos;
the simulated three-dimensional space generating unit is used for splicing a plurality of environment images based on the extraction result, carrying out image deepening based on the splicing result, and outputting a simulated three-dimensional space of the operation environment on the display terminal according to the image deepening result;
a first state information obtaining unit, configured to read the simulated three-dimensional space, obtain a spatial structure of the operation environment and an operation type of the operation environment, and determine first state information of the robot operation environment based on the spatial structure of the operation environment and the operation type of the operation environment;
An environmental video acquisition unit comprising:
the key frame acquisition subunit is used for acquiring a space boundary point of the operation environment and a working point of the robot in the operation environment, and determining an environment video based on the space boundary point and the working point of the robot;
the initial environment image acquisition subunit is used for positioning in the environment video based on the video key frame and picking an initial environment image corresponding to the video key frame based on a positioning result;
a noise reduction subunit configured to:
noise detection is carried out on the initial environment image, the noise type of the initial environment image is determined, and the noise reduction frequency range of the initial environment image is output in a preset noise reduction model according to the noise type;
acquiring a noise frequency band range of an initial environment image, matching the noise frequency band range in a noise reduction frequency band range based on the noise frequency band range, and determining a target noise reduction frequency band range for noise reduction of the initial environment image in the noise reduction frequency band range;
the environment image acquisition subunit is used for carrying out noise reduction processing on the initial environment image based on the target noise reduction frequency band range and acquiring an environment image corresponding to the video environment frame according to the noise reduction processing result;
an acquisition module comprising:
a first position information obtaining unit, configured to obtain a movable part of the robot itself, position the movable part based on the robot, obtain first position information of the movable part of the robot, and determine an initial pose of the robot based on the first position information;
A second position information acquisition unit for sensing second position information of the robot in the operation environment based on the robot, and determining an initial position of the robot in the operation environment based on the second position information;
and a second state information acquisition unit configured to determine second state information of the robot based on the initial pose of the robot and the initial position of the robot in the operation environment.
2. The mobile collaborative robot integrated control system of claim 1, wherein the interaction module includes:
the instruction generation unit is used for acquiring the client requirements based on the client, reading the client requirements, determining a plurality of target characters in the client requirements, generating a plurality of interaction instruction elements according to the target characters, and comprehensively acquiring robot interaction instructions output by the client from the interaction instruction elements;
the communication link construction unit is used for acquiring a first communication address of the client, determining a second communication address of the robot control terminal and establishing a data communication link based on the first communication address and the second communication address;
and the instruction transmission unit is used for transmitting the robot interaction instruction from the client terminal to the robot control terminal based on the data communication link.
3. The mobile collaborative robot integrated control system of claim 1, wherein the decision-making module comprises:
the analysis unit is used for analyzing the robot interaction instruction to obtain a target execution task of the robot, wherein the target execution task comprises the following steps: a target workbench of the robot and an execution step of the robot on the target workbench;
a position point determining unit configured to determine a first position point of the target table based on the first state information, and determine a second position point of the robot based on the second state information;
a first execution gesture generating unit for determining a target route map from the second position point to the first position point based on the first state information, and determining a first execution gesture of the robot according to the target route map;
a second execution posture generation unit for determining a second execution posture of the robot based on an execution step of the robot at the target table;
the execution logic determining unit is used for linking the first execution gesture with the second execution gesture and determining the execution logic of the robot execution target execution task based on the linked result;
and the instruction generation unit is used for generating a decision instruction based on the execution logic.
4. The mobile collaborative robot integrated control system of claim 1, wherein the control module includes:
when controlling the robot to perform movement cooperation based on the decision instruction, the execution gesture of the robot comprises: the robot chassis translates and the robot arm moves.
5. The mobile collaborative robot integrated control system of claim 1, wherein the control module includes:
a learning model construction unit for:
acquiring a first data set corresponding to a target execution task and a second data set when the robot performs mobile collaboration on the target execution task, and storing the first data set and the second data set;
performing first learning on the stored first data set, determining service characteristics of the target execution task, and generating a first learning model corresponding to the target execution task according to the service characteristics of the target execution task;
performing second learning on the second data set, determining the execution characteristics of the robot moving cooperation process, and constructing a second learning model corresponding to the robot executing the target execution task based on the execution characteristics;
acquiring an association relation between a target execution task and robot execution operation, associating the first learning model with the second learning model based on the association relation, and generating a third learning model based on an association result;
The instruction generating unit is used for storing the third learning model, inputting the execution task into the third learning model for analysis when the execution task consistent with the execution characteristic of the target execution task exists, and outputting a first decision instruction for controlling the robot based on the analysis result;
a mobile cooperation control unit for:
performing first pre-operation on the execution task based on the first decision instruction to obtain pre-execution data of the robot, comparing the pre-execution data of the robot with preset expectations, and judging whether the first decision instruction is qualified;
when the pre-execution data is consistent with a preset expectation, judging that the first decision instruction is qualified, and controlling the robot to perform mobile cooperation on the execution task based on the first decision instruction;
when the pre-execution data is inconsistent with the preset expectation, the first decision instruction is judged to be unqualified, the first decision instruction is corrected based on the difference data between the pre-execution data and the preset expectation, a second decision instruction is generated, and the robot is controlled to perform mobile cooperation on the execution task based on the second decision instruction.
6. The mobile cooperative robot integrated control system of claim 5, wherein the mobile cooperative control unit comprises:
The monitoring subunit is used for monitoring the mobile cooperation process of the robot in real time when the robot executes the task based on the preset monitoring equipment, and transmitting the monitoring result to the preset background terminal in real time;
the abnormal analysis subunit is used for dividing the flow of the monitoring result received in real time based on the background terminal to obtain a decomposition action of each execution link, calling a standard execution action corresponding to each flow based on the division result, and comparing the similarity of the decomposition action with the standard execution action;
and the alarm subunit is used for judging that the control of the second decision instruction on the robot is unqualified when the similarity value of the decomposition action and the standard execution action is smaller than a preset similarity threshold value, and regenerating the second decision instruction, otherwise, continuously monitoring the movement cooperation process of the robot until the robot completes the execution task.
7. The mobile collaborative robot integrated control system of claim 1, wherein the control module includes:
the parameter acquisition unit is used for acquiring three-phase stator mutual inductance for driving the robot chassis to translate and the radius of wheels of the robot chassis when the robot translates based on the robot chassis;
The first calculation unit is used for calculating the braking slip rate of the robot chassis based on the three-phase stator mutual inductance driving the robot chassis to translate and the radius of the wheels of the robot chassis;
the second calculation unit is used for constructing a target equation of the driving force and the braking slip rate of the motor of the robot chassis based on the braking slip rate of the robot chassis;
a movement control unit for:
determining a robot chassis motor driving force under the braking slip rate of the current robot chassis based on a target equation, and generating a robot movement control instruction based on the robot chassis motor driving force;
and controlling the robot to drive the robot chassis to drive the robot to move based on the robot movement control instruction.
CN202310375448.0A 2023-03-30 2023-03-30 Mobile cooperative robot integrated control system Active CN116423545B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310375448.0A CN116423545B (en) 2023-03-30 2023-03-30 Mobile cooperative robot integrated control system


Publications (2)

Publication Number Publication Date
CN116423545A (en) 2023-07-14
CN116423545B (en) 2024-04-12

Family

ID=87079189

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310375448.0A Active CN116423545B (en) 2023-03-30 2023-03-30 Mobile cooperative robot integrated control system

Country Status (1)

Country Link
CN (1) CN116423545B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108297098A (en) * 2018-01-23 2018-07-20 上海大学 The robot control system and method for artificial intelligence driving
CN113325837A (en) * 2021-04-23 2021-08-31 北京启安智慧科技有限公司 Control system and method for multi-information fusion acquisition robot
WO2022188379A1 (en) * 2021-03-12 2022-09-15 国网智能科技股份有限公司 Artificial intelligence system and method serving electric power robot




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant