CN116713992A - Electrical control system, method and device for humanoid robot - Google Patents

Electrical control system, method and device for humanoid robot

Info

Publication number
CN116713992A
Authority
CN
China
Prior art keywords
humanoid robot
robot
unit
instruction
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310697602.6A
Other languages
Chinese (zh)
Inventor
姚运昌
朱世强
谢安桓
孔令雨
华强
周伟刚
程超
陈星宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Lab
Original Assignee
Zhejiang Lab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Lab filed Critical Zhejiang Lab
Priority to CN202310697602.6A priority Critical patent/CN116713992A/en
Publication of CN116713992A publication Critical patent/CN116713992A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention discloses an electrical control system for a humanoid robot comprising a robot body and a driving mechanism for driving the robot to move. The electrical control system comprises a positioning fusion module, an understanding decision module, a motion control module and a joint driving module, and efficient cooperation among the modules is achieved through a multi-core heterogeneous processing architecture and a private 5G network. The invention also provides an electrical control method and device for the humanoid robot, which realize multi-scene, highly reliable and highly real-time motion control of the humanoid robot.

Description

Electrical control system, method and device for humanoid robot
Technical Field
The invention belongs to the technical field of robot control, and particularly relates to an electrical control system, method and device for a humanoid robot.
Background
At present, industrial robots play an important role in many industries. With the continuous development of computer technology and the continuous progress of artificial intelligence, robots are gradually spreading from industrial fields into service, entertainment, education and other fields. The high degree of intelligence of humanoid robots gives them great market potential in service, entertainment and similar fields; many countries are actively pursuing research and development of humanoid robots, a large number of research institutes and enterprises at home and abroad have launched related products, and within a few years humanoid robots will be applied in many areas of human life.
The working environment of a humanoid robot is characterized by multiple scenes, diversity, dynamics, uncertainty and complexity, which places high demands on scene understanding, positioning accuracy, interaction capability and stability. To meet these demands, the humanoid robot must carry a variety of perception sensors such as positioning, vision, voice and touch, and must at the same time provide highly real-time motion control to complete complex tasks. A robot electrical control system with a high degree of intelligence, strong computing capability, good stability, fast response, high real-time performance and high data bandwidth is therefore needed to achieve stable and reliable control of the robot. At present, most humanoid robots build the electrical system of the whole machine on a general-purpose CPU architecture, which cannot handle the large-scale computing tasks involving perception sensors such as vision, voice and positioning navigation. In addition, most electrical control systems place the robot's non-real-time tasks and real-time tasks in a single system and cannot effectively meet the robot's high real-time control requirements.
Patent document CN110666820A discloses a high-performance industrial robot controller comprising a processor board, a machine vision unit, a teach pendant, an external IO unit, a gripper unit, external sensors and a motor driver. The processor board comprises an ARM processor unit, an FPGA unit, a power supply module, Ethernet interfaces A, B, C and D, an IO interface and a CAN interface. That system uses the FPGA unit to process all sensor data, and a stand-alone FPGA unit would need to host a large and complex design to complete the data processing for perception sensors such as vision, voice and positioning navigation.
Patent document CN115599024A discloses a high-precision turntable control system based on an EtherCAT bus, comprising a high-performance motion controller, high-precision servo drivers, a touch screen, a filter and a control power supply. The high-performance motion controller communicates with a plurality of high-precision servo drivers over the EtherCAT bus, the servo drivers are connected to the filter, the touch screen is connected to the motion controller, and the control power supply is connected to the motion controller and the servo drivers respectively. The device connects multiple servo drivers through the EtherCAT bus, but it does not propose a concrete way to apply this approach to a humanoid robot control system.
Disclosure of Invention
The invention aims to provide an electrical control system, method and device for a humanoid robot that realize multi-scene, highly reliable and highly real-time motion control of the humanoid robot.
To achieve the first object, the present invention provides an electrical control system for a humanoid robot, the humanoid robot comprising a robot body and a driving mechanism for driving the robot to move, the electrical control system comprising a positioning fusion module, an understanding decision module, a motion control module and a joint driving module.
The positioning fusion module is used for acquiring the direction and coordinate information of the robot body so as to generate positioning information.
The understanding decision module is used for acquiring environmental data of the scene where the robot body is located, uploading the environmental data to the cloud for decision making to generate an interaction instruction and an action instruction, completing, on the robot body, the interactive action based on the interaction instruction, and generating a motion instruction based on the action instruction and the positioning information.
The motion control module comprises a control calculation unit and an EtherCAT master station. The control calculation unit performs kinematics and dynamics analysis on the robot body according to the generated motion instruction and the real-time state data of each joint to obtain a joint instruction for each joint, and the EtherCAT master station is used for acquiring the real-time state data of each joint and sending each joint instruction to the corresponding joint driving module.
The joint driving module comprises a driving unit and an EtherCAT slave station with a plurality of state sensors. The driving unit receives the joint instructions from the EtherCAT master station and adjusts the output of the driving mechanism, while the EtherCAT slave station transmits the real-time state data acquired by the state sensors to the EtherCAT master station.
Preferably, the direction and coordinate information from the plurality of sensors are fused and cross-checked to generate the positioning information of the robot; fusing and cross-checking the data eliminates errors between the sensors and yields more accurate positioning information.
Specifically, the understanding decision module comprises a visual unit and a voice unit for acquiring environmental data, a cloud decision unit for outputting instructions and a perception planning unit;
the visual unit is used for acquiring image information of an external environment and an obstacle in a scene where the robot body is located;
the voice unit is used for acquiring voice information of an external environment;
the cloud decision unit generates corresponding interaction instructions and action instructions according to the issued task instructions and the received environment data;
the perception planning unit is used for uploading data and analyzing the data, the data uploading comprises the step of compressing and sending image information and voice information to the cloud decision unit as environment data, the data analyzing comprises the step of generating a motion instruction according to the motion instruction and the positioning information, and the step of generating the interaction action of the humanoid robot based on the interaction instruction.
Specifically, the cloud decision unit comprises a debugging terminal and an understanding decision end, wherein the debugging terminal is used for acquiring real-time state data of each joint in the EtherCAT master station to obtain the working state of the robot body, the understanding decision end is used for analyzing according to the issued task instruction and combining with environment data and the working state to obtain corresponding interaction instruction and action instruction, and the task instruction is issued by an operator or is automatically distributed according to a working plan.
Specifically, the action instruction includes a target location, a moving speed, and an action gesture.
Specifically, the interactive actions include audio output control and video display of the humanoid robot.
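The patent does not fix a concrete data layout for these instructions. As a minimal illustrative sketch (in Python, with hypothetical field names not taken from the patent), the action instruction and interaction instruction could be represented as simple records:

```python
from dataclasses import dataclass

@dataclass
class ActionInstruction:
    """Hypothetical action instruction: target location, moving speed, action gesture."""
    target_xy: tuple     # target location in the map frame, metres
    speed_mps: float     # desired moving speed, m/s
    gesture: str         # named action gesture, e.g. "wave" or "grasp"

@dataclass
class InteractionInstruction:
    """Hypothetical interaction instruction: audio output control and video display."""
    audio_text: str      # text to synthesise and play through the loudspeaker
    video_clip: str      # identifier of the clip shown on the face display screen
```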
Preferably, the electrical control system uses a private 5G network for data transmission between the functional units of the understanding decision module and the motion control module. This ensures that the heterogeneous multi-core processing units and the cloud decision unit cooperate in handling the robot's multiple tasks, provides high-performance computing and analysis capability while keeping the power consumption and cost of the robot body low, guarantees efficient operation of the robot in complex environments, and also enables cooperative operation among multiple robots.
In order to achieve the second object, the present invention provides an electrical control method for a humanoid robot, which is implemented by the above electrical control system, comprising the steps of:
step 1, powering up the robot, and after the power-up is successful, collecting information by a state sensor and uploading the information to a debugging terminal of a cloud decision unit so as to feed back the working state of the whole humanoid robot.
And 2, acquiring environmental data in a scene where the humanoid robot is located through a voice unit and a visual unit, and compressing and uploading the environmental data through a perception planning unit.
And 3, performing task analysis on the task instruction by an understanding decision end of the cloud decision unit, performing scene understanding algorithm processing on the environment data, and generating corresponding interaction instructions and action instructions according to the task analysis result, the scene understanding result and the working state of the overall humanoid robot.
And 4, acquiring positioning information of the humanoid robot through a positioning fusion module, and sending the positioning information to a perception planning unit.
Step 5, the perception planning unit executes audio output control and video display of the humanoid robot according to the interaction instruction;
meanwhile, the perception planning unit performs path planning according to the positioning information and the action instruction to generate a corresponding motion instruction.
Step 6, the control calculation unit performs dynamics and kinematics solution by combining the motion instruction and the real-time state data of each joint to obtain the corresponding joint instructions, which are sent to the joint driving modules through the EtherCAT master station.
Step 7, the driving unit adjusts the output of the driving mechanism according to the received joint instructions so that the humanoid robot executes the task; meanwhile, the EtherCAT slave station sends the real-time state data acquired by the state sensors to the EtherCAT master station.
Specifically, the electrical control method further comprises a self-check of the humanoid robot: the debugging terminal evaluates the working state of the whole humanoid robot; when the evaluation result is that the humanoid robot cannot work normally, a maintenance prompt is issued, and when the evaluation result is that the humanoid robot can work normally, a prompt that tasks can be executed is issued.
To achieve the third object, the present invention provides an electrical control device for a humanoid robot, comprising a computer memory, a sensor, a computer processor and a computer program stored in the computer memory and executable on the computer processor, the computer processor employing the electrical control system for a humanoid robot described above.
The computer processor, when executing the computer program, performs the following step: according to the input task instruction, the corresponding interaction instruction and action instruction are generated in real time through the electrical control system to realize task execution and human-machine interaction of the humanoid robot.
Compared with the prior art, the invention has the beneficial effects that:
the electrical control system is divided into a positioning fusion module, an understanding decision module, a motion control module and a joint driving module, and efficient coordination among the modules is realized through a multi-core heterogeneous processing mode and a private 5G network, so that the motion control of the humanoid robot with multiple scenes, high reliability and high real-time performance is realized.
Drawings
Fig. 1 is a schematic diagram of an electrical control system according to the present embodiment;
fig. 2 is a flowchart of an electrical control method according to the present embodiment;
fig. 3 is a schematic diagram of an electrical control device according to the present embodiment;
Fig. 4 is a schematic diagram of the EtherCAT conversion circuit logic according to the present embodiment.
Detailed Description
The objects and effects of the present invention will become more apparent from the following detailed description of the preferred embodiments with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit it.
An electrical control system is provided for a humanoid robot that includes a robot body and a driving mechanism driving the robot to move.
As shown in fig. 1, the electrical control system includes a localization fusion module, an understanding decision module, a motion control module, and a joint driving module.
The positioning fusion module is used for obtaining the position information of the robot: the direction and position information of the robot are obtained through a plurality of positioning sensors such as UWB, laser radar, IMU, odometer and GPS, data fusion is performed by an FPGA processor, and errors in the crossed data are eliminated to obtain accurate positioning information.
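How the FPGA fuses the sensors is not specified beyond data fusion and cross-data error elimination; one common choice, shown here only as an illustrative sketch with invented variance figures, is inverse-variance weighting of the independent position estimates:

```python
def fuse_positions(estimates):
    """Inverse-variance weighted fusion of independent (x, y, variance) estimates.

    `estimates` is a list of (x, y, variance) tuples, one per positioning source
    (e.g. UWB, laser-radar odometry, GPS). Sources with lower variance get higher
    weight, which damps the error contributed by any single sensor.
    """
    wx = wy = wsum = 0.0
    for x, y, var in estimates:
        w = 1.0 / var
        wx += w * x
        wy += w * y
        wsum += w
    return wx / wsum, wy / wsum

# Example: UWB, odometer and GPS disagree slightly; the fused estimate settles in between.
print(fuse_positions([(1.02, 2.01, 0.01), (0.95, 1.98, 0.04), (1.20, 2.10, 0.25)]))
```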
The understanding decision module comprises a visual unit, a voice unit, a cloud decision unit and a perception planning unit.
The vision unit is used for acquiring image information of the robot's external environment and obstacles, and comprises a depth camera and a display screen. The depth camera is used to identify obstacles, persons and the like in the environment, and the display screen can output interactive video.
The voice unit is used for acquiring voice information of the external environment and comprises a microphone array and a loudspeaker. The microphone array collects and processes voice commands, and the loudspeaker outputs voice to achieve human-machine interaction.
The cloud decision unit generates corresponding interaction instructions and action instructions according to the issued task instructions and the received environment data.
The perception planning unit is used for uploading data and analyzing the data:
the data uploading includes compressing and sending image information and voice information as environment data to a cloud decision unit.
The data analysis includes generating motion instructions from the action instructions and the positioning information, and generating the interactive actions of the humanoid robot based on the interaction instructions.
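The patent does not disclose the planning algorithm used in this step. As a hedged sketch of how the perception planning unit might turn an action instruction and the fused pose into a motion instruction (the gain and field names below are assumptions), a simple steer-toward-target rule looks like this:

```python
import math

def plan_motion(pose_xyyaw, target_xy, speed_mps, gesture="idle"):
    """Produce a motion instruction from an action instruction and the fused pose.

    A deliberately simple stand-in for the path planning step: steer toward the
    target location at the commanded speed with a proportional heading correction.
    """
    x, y, yaw = pose_xyyaw
    tx, ty = target_xy
    heading = math.atan2(ty - y, tx - x)
    yaw_error = (heading - yaw + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
    distance = math.hypot(tx - x, ty - y)
    return {
        "linear_mps": min(speed_mps, distance),  # slow down as the goal is approached
        "angular_rps": 1.5 * yaw_error,          # hypothetical turn-rate gain
        "gesture": gesture,
    }

# Example: robot at the origin facing +x, target 2 m ahead and 1 m to the left.
print(plan_motion((0.0, 0.0, 0.0), (2.0, 1.0), 0.5))
```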
In addition, the cloud decision unit comprises a debugging terminal and an understanding decision end. The debugging terminal is used for acquiring the real-time state data of each joint from the EtherCAT master station to obtain the working state of the robot body; the understanding decision end analyzes the received task instruction in combination with the environmental data and the working state to obtain the corresponding interaction instruction and action instruction. The task instruction is issued by an operator or assigned automatically according to a work plan.
The cloud decision unit has abundant computing power, can provide computing support for several robots at the same time and coordinate their cooperation, and moving the analysis of complex task instructions to the cloud decision unit reduces the computing load on the humanoid robot.
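The decision logic itself is not specified in the patent. A minimal sketch of the understanding/decision step, assuming hypothetical dictionary-based inputs and outputs, might look like this:

```python
def decide(task_instruction, scene, working_state):
    """Hypothetical understanding/decision step run on the cloud side.

    `task_instruction` is the text issued by an operator or work plan, `scene` is
    the result of the scene-understanding algorithm (e.g. detected obstacles and a
    goal position), and `working_state` summarises the joint health fed back via
    the debugging terminal. Returns an (interaction, action) pair of dictionaries.
    """
    if not working_state.get("all_joints_ok", False):
        # Robot cannot work normally: return only an interaction prompting maintenance.
        return {"audio_text": "I need maintenance before taking new tasks."}, None

    interaction = {"audio_text": f"Starting task: {task_instruction}"}
    action = {
        "target_xy": scene.get("goal_xy", (0.0, 0.0)),
        "speed_mps": 0.3 if scene.get("obstacles") else 0.6,  # slower in cluttered scenes
        "gesture": "idle",
    }
    return interaction, action
```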
The motion control module comprises a control calculation unit and an EtherCAT master station. The control calculation unit performs kinematics and dynamics analysis on the robot body according to the generated motion instructions and the real-time state data of each joint to obtain the joint instructions of all joints, and the EtherCAT master station acquires the real-time state data of each joint and sends the joint instructions to the corresponding joint driving modules.
The joint driving module comprises a driving unit and an EtherCAT slave station with a plurality of state sensors. The driving unit receives the joint instructions from the EtherCAT master station and adjusts the output of the driving mechanism, while the EtherCAT slave station transmits the real-time state data acquired by the state sensors to the EtherCAT master station.
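The patent states only that the control calculation unit performs kinematics and dynamics analysis to obtain joint instructions. As a kinematics-only stand-in, a two-link planar inverse-kinematics routine with assumed link lengths illustrates how a desired end-point position from the motion instruction could be turned into joint angle commands for one EtherCAT cycle:

```python
import math

L1, L2 = 0.40, 0.38  # assumed link lengths of a two-link planar limb, metres

def joint_commands(foot_x, foot_z):
    """Two-link planar inverse kinematics: turn a desired end-point position
    into hip and knee angle commands (radians) for the current control cycle."""
    d2 = foot_x**2 + foot_z**2
    cos_knee = (d2 - L1**2 - L2**2) / (2 * L1 * L2)
    cos_knee = max(-1.0, min(1.0, cos_knee))   # clamp against numeric drift
    knee = math.acos(cos_knee)                 # knee flexion angle
    hip = math.atan2(foot_z, foot_x) - math.atan2(L2 * math.sin(knee),
                                                  L1 + L2 * math.cos(knee))
    return hip, knee

# Example: foot 0.30 m forward and 0.55 m below the hip.
print(joint_commands(0.30, -0.55))
```

A real implementation would also incorporate the dynamics model and the real-time joint state data mentioned in the description; this sketch only shows the geometric part of the calculation.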
In addition, the understanding decision module and the motion control module exchange data through a high-bandwidth, high-security private 5G network, which provides low-latency interconnection, ensures that the heterogeneous multi-core processing units and the cloud decision unit cooperate in handling the robot's multiple tasks, provides high-performance computing and analysis capability while keeping the power consumption and cost of the robot body low, guarantees efficient operation of the robot in complex environments, and also enables cooperative operation among multiple robots.
As shown in fig. 2, the electrical control method for a humanoid robot provided in this embodiment is implemented by the electrical control system described in the above embodiment and comprises the following steps:
Step 1, the robot is powered up; after power-up succeeds, the state sensors collect information and upload it to the debugging terminal of the cloud decision unit to feed back the working state of the whole humanoid robot. The debugging terminal evaluates this working state: when the evaluation result is that the humanoid robot cannot work normally, a maintenance prompt is issued; when the evaluation result is that the humanoid robot can work normally, a prompt that tasks can be executed is issued.
Step 2, environmental data in the scene where the humanoid robot is located are acquired through the voice unit and the visual unit, and are compressed and uploaded by the perception planning unit.
Step 3, the understanding decision end of the cloud decision unit performs task analysis on the task instruction and applies the scene understanding algorithm to the environmental data, and the corresponding interaction instruction and action instruction are generated according to the task analysis result, the scene understanding result and the working state of the whole humanoid robot.
Step 4, the positioning information of the humanoid robot is acquired through the positioning fusion module and sent to the perception planning unit.
Step 5, the perception planning unit executes the audio output control and video display of the humanoid robot according to the interaction instruction.
Meanwhile, the perception planning unit performs path planning according to the positioning information and the action instruction to generate a corresponding motion instruction.
Step 6, the control calculation unit performs dynamics and kinematics solution by combining the motion instruction and the real-time state data of each joint to obtain the corresponding joint instructions and sends them to the joint driving modules through the EtherCAT master station.
Step 7, the driving unit adjusts the output of the driving mechanism according to the received joint instructions so that the humanoid robot executes the task; meanwhile, the EtherCAT slave station sends the real-time state data acquired by the state sensors to the EtherCAT master station.
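The drive-side update in step 7 is likewise unspecified; a common pattern, sketched here with assumed gains and a torque limit, is a PD law that tracks the joint position command received over EtherCAT:

```python
def drive_output(cmd_position, state, kp=120.0, kd=2.5, torque_limit=60.0):
    """Hypothetical drive-unit update: track the joint position command from the
    EtherCAT master with a PD law and clamp the commanded torque.

    `state` carries the joint's real-time data (position, velocity) that the
    slave station also reports back to the master each cycle."""
    error = cmd_position - state["position"]
    torque = kp * error - kd * state["velocity"]
    return max(-torque_limit, min(torque_limit, torque))

# Example: joint 0.05 rad behind its command and nearly at rest.
print(drive_output(1.00, {"position": 0.95, "velocity": 0.02}))
```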
As shown in fig. 3, the electrical control device for a humanoid robot according to this embodiment includes a communication device, an FPGA-based data calculation unit, an ARM-based perception planning unit, an X86-based control calculation unit, sensors and a driving device.
Wherein the sensor comprises:
the system comprises a GPS module, an IMU sensor, a UWB sensor, an odometer and a laser radar, wherein the GPS module, the IMU sensor, the UWB sensor, the odometer and the laser radar are used together with a data calculation unit with an FPGA, so that state information and positioning information of the robot are obtained.
The system comprises a microphone array, a loudspeaker, an audio acquisition card, an audio power amplifier, a depth vision camera and a face display screen, wherein the microphone array is matched with a perception planning unit with ARM;
a status sensor for use with a control computing unit having an X86.
The communication device comprises a private 5G network and an EtherCAT network.
The driving device includes a power supply battery and a BMS system for managing the battery power supply.
As shown in fig. 4, the EtherCAT conversion circuit logic specifically comprises the following steps:
and acquiring the original data of each sensor according to a specified sampling sequence.
And intercepting and converting the original data into data with a unified structure.
And after the data with the uniform structure is sent to the EtherCAT network through the EtherCAT slave station, the data is received and processed by the EtherCAT master station.
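A software analogue of this conversion logic, assuming a hypothetical frame layout of four float32 fields (the actual sampling order and structure are defined by the circuit in fig. 4), could be sketched as:

```python
import struct

# Assumed unified frame layout for the conversion logic:
# position (float32), velocity (float32), torque (float32), temperature (float32)
FRAME = struct.Struct("<4f")

def build_process_data(read_position, read_velocity, read_torque, read_temperature):
    """Sample each status sensor in the specified order and pack the raw values
    into one fixed-layout frame for the EtherCAT slave's process data area."""
    raw = (read_position(), read_velocity(), read_torque(), read_temperature())
    return FRAME.pack(*raw)

def parse_process_data(frame):
    """Master-side counterpart: recover the unified structure from the frame."""
    position, velocity, torque, temperature = FRAME.unpack(frame)
    return {"position": position, "velocity": velocity,
            "torque": torque, "temperature": temperature}

# Example with stand-in sensor readers.
frame = build_process_data(lambda: 1.57, lambda: 0.0, lambda: 3.2, lambda: 41.5)
print(parse_process_data(frame))
```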
According to the above embodiments, the invention processes multi-task data through the cooperation of different functional units, achieves high-performance computing and solving capability at low power consumption, realizes non-real-time scene understanding together with highly reliable execution of real-time motion control tasks, enables the robot body to work efficiently in different environments with low on-board computing requirements, and reduces cost and system complexity. The low-latency, high-concurrency data processing unit acquires and fuses position and direction data; the low-power perception planning unit compresses voice and visual data, performs human-machine interaction, runs the path planning and obstacle avoidance algorithms and outputs motion instructions; the powerful computing resources of the cloud decision unit process voice and visual information, realize semantic understanding of complex and diverse scene environments, analyze the task instructions sent by the debugging terminal and output interaction instructions; and the high-performance, highly compatible control calculation unit implements the motion control algorithm calculation and the EtherCAT master station network, achieving a stable real-time network.
It will be appreciated by persons skilled in the art that the foregoing is only a preferred embodiment of the invention and is not intended to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or substitute equivalents for some of their features. Any modification, equivalent replacement or improvement made within the spirit and principles of the invention shall be included within the scope of protection of the invention.

Claims (10)

1. An electrical control system for a humanoid robot, the humanoid robot comprising a robot body and a driving mechanism for driving the robot to move, the electrical control system comprising:
the positioning fusion module is used for acquiring the direction and coordinate information of the robot body so as to generate positioning information;
the understanding decision module is used for acquiring environmental data of a scene where the robot body is located, uploading the environmental data to a cloud end for decision making to generate an interaction instruction and an action instruction, completing the interaction action of the robot body based on the interaction instruction in the robot body, and generating a motion instruction based on the action instruction and positioning information;
the motion control module comprises a control calculation unit and an EtherCAT master station, wherein the control calculation unit performs kinematics and dynamics analysis on the robot body according to the motion instruction and the real-time state data of each joint to obtain the joint instruction of each joint, and the EtherCAT master station is used for acquiring the real-time state data of each joint and sending each joint instruction to the corresponding joint driving module;
the joint driving module comprises a driving unit and an EtherCAT slave station with a plurality of state sensors, wherein the driving unit receives joint instructions of the EtherCAT master station, adjusts the output of the driving mechanism, and simultaneously transmits real-time state data acquired by the state sensors to the EtherCAT master station.
2. The electrical control system for a humanoid robot of claim 1, wherein the direction and coordinate information from the plurality of sensors are processed by data fusion and cross-checking to generate the positioning information of the robot.
3. The electrical control system for a humanoid robot according to claim 1, wherein the understanding decision module comprises a visual unit and a voice unit for acquiring environmental data, a cloud decision unit for instruction output, and a perception planning unit;
the visual unit is used for acquiring image information of an external environment and an obstacle in a scene where the robot body is located;
the voice unit is used for acquiring voice information of an external environment;
the cloud decision unit generates corresponding interaction instructions and action instructions according to the issued task instructions and the received environment data;
the perception planning unit is used for data uploading and data analysis: the data uploading comprises compressing the image information and voice information and sending them to the cloud decision unit as environmental data; the data analysis comprises generating a motion instruction according to the action instruction and the positioning information, and generating the interactive action of the humanoid robot based on the interaction instruction.
4. The electrical control system for a humanoid robot according to claim 1 or 3, wherein the cloud decision unit comprises a debugging terminal and an understanding decision end, the debugging terminal is used for acquiring the real-time state data of each joint from the EtherCAT master station to obtain the working state of the robot body, and the understanding decision end analyzes the issued task instruction in combination with the environmental data and the working state to obtain the corresponding interaction instruction and action instruction.
5. An electrical control system for a humanoid robot as claimed in claim 1 or 3, characterized in that the action instructions comprise a target location, a movement speed and an action pose.
6. An electrical control system for a humanoid robot as claimed in claim 1 or 3, characterised in that the interactive actions comprise audio output control and video display of the humanoid robot.
7. The electrical control system for a humanoid robot of claim 1, wherein the electrical control system employs a private 5G network for data transmission between functional units of the understanding decision module and the motion control module.
8. An electrical control method for a humanoid robot, realized by an electrical control system for a humanoid robot as claimed in any one of claims 1 to 7, comprising:
step 1, powering up a robot, wherein after the robot is powered up successfully, a state sensor acquires information and uploads the information to a debugging terminal of a cloud decision unit so as to feed back the working state of the whole robot;
step 2, acquiring environmental data in a scene where the humanoid robot is located through a voice unit and a visual unit, and compressing and uploading the environmental data through a perception planning unit;
step 3, the understanding decision end of the cloud decision unit performs task analysis on the task instruction and applies the scene understanding algorithm to the environmental data, and the corresponding interaction instructions and action instructions are generated according to the task analysis result, the scene understanding result and the working state of the whole humanoid robot;
step 4, positioning information of the humanoid robot is obtained through a positioning fusion module and is sent to a perception planning unit;
step 5, the perception planning unit executes audio output control and video display of the humanoid robot according to the interaction instruction;
meanwhile, the perception planning unit performs path planning according to the positioning information and the action instruction to generate a corresponding motion instruction;
step 6, the control calculation unit performs dynamics and kinematics solution by combining the motion instruction and the real-time state data of each joint to obtain the joint instruction of the corresponding joint and sends it to the joint driving module through the EtherCAT master station;
and step 7, the driving unit adjusts the output of the driving mechanism according to the received joint instruction so that the humanoid robot executes the task; meanwhile, the EtherCAT slave station sends the real-time state data acquired by the state sensors to the EtherCAT master station.
9. The electrical control method for a humanoid robot according to claim 8, further comprising a self-check of the humanoid robot: the debugging terminal evaluates the working state of the whole humanoid robot; when the evaluation result is that the humanoid robot cannot work normally, a maintenance prompt is issued, and when the evaluation result is that the humanoid robot can work normally, a prompt that tasks can be executed is issued.
10. An electrical control device for a humanoid robot, comprising a computer memory, a sensor, a computer processor and a computer program stored in the computer memory and executable on the computer processor, characterized in that the computer processor employs an electrical control system for a humanoid robot according to any one of claims 1-7;
the computer processor, when executing the computer program, performs the steps of: and generating corresponding interaction instructions and action instructions in real time through the electrical control system according to the input task instructions so as to realize task execution and man-machine interaction of the humanoid robot.
CN202310697602.6A 2023-06-12 2023-06-12 Electrical control system, method and device for humanoid robot Pending CN116713992A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310697602.6A CN116713992A (en) 2023-06-12 2023-06-12 Electrical control system, method and device for humanoid robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310697602.6A CN116713992A (en) 2023-06-12 2023-06-12 Electrical control system, method and device for humanoid robot

Publications (1)

Publication Number Publication Date
CN116713992A true CN116713992A (en) 2023-09-08

Family

ID=87864246

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310697602.6A Pending CN116713992A (en) 2023-06-12 2023-06-12 Electrical control system, method and device for humanoid robot

Country Status (1)

Country Link
CN (1) CN116713992A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106849765A (en) * 2016-12-05 2017-06-13 重庆华数机器人有限公司 A kind of direct current common bus servo drive based on EtherCAT
CN108297098A (en) * 2018-01-23 2018-07-20 上海大学 The robot control system and method for artificial intelligence driving
CN108858195A (en) * 2018-07-16 2018-11-23 睿尔曼智能科技(北京)有限公司 A kind of Triple distribution control system of biped robot
CN110303471A (en) * 2018-03-27 2019-10-08 清华大学 Assistance exoskeleton control system and control method
CN110666820A (en) * 2019-10-12 2020-01-10 合肥泰禾光电科技股份有限公司 High-performance industrial robot controller
CN112394701A (en) * 2020-12-10 2021-02-23 之江实验室 Multi-robot cloud control system based on cloud-edge-end hybrid computing environment
CN113867334A (en) * 2021-09-07 2021-12-31 华侨大学 Unmanned path planning method and system for mobile machinery
CN115599024A (en) * 2022-11-25 2023-01-13 九江精密测试技术研究所(Cn) High-precision turntable control system based on EtherCAT bus
CN115933387A (en) * 2022-11-25 2023-04-07 之江实验室 Robot control method, device and medium based on visual language pre-training model

Similar Documents

Publication Publication Date Title
CN107589752B (en) Method and system for realizing cooperative formation of unmanned aerial vehicle and ground robot
CN101592951B (en) Common distributed control system for humanoid robot
US20180345504A1 (en) Autonomous localization and navigation equipment, localization and navigation method, and autonomous localization and navigation system
CN109471435B (en) Multi-heterogeneous mobile robot control system
CN110765635A (en) Collaboration method, system, electronic device, and medium for digital twin system
CN111739170B (en) Visual platform of industrial robot workstation
Liang et al. Real-time state synchronization between physical construction robots and process-level digital twins
CN109454641B (en) Multi-task division and data interaction method for motion controller
Wu et al. HydraMini: An FPGA-based affordable research and education platform for autonomous driving
CN214751405U (en) Multi-scene universal edge vision motion control system
CN114707304A (en) Virtual-real combined multi-unmanned aerial vehicle perception avoidance verification system and method
CN112800606A (en) Digital twin production line construction method and system, electronic device and storage medium
CN106774178B (en) Automatic control system and method and mechanical equipment
CN116713992A (en) Electrical control system, method and device for humanoid robot
CN114227719A (en) Mobile robot remote control system, method and medium
CN116244905A (en) Method, system, terminal and medium for monitoring state in real time in robot production process
Ruiz et al. Implementation of a sensor fusion based robotic system architecture for motion control using human-robot interaction
CN115284247A (en) Live working robot master-slave teleoperation method and system based on heterogeneous master hand
Newman et al. Embedded mobile ROS platform for SLAM application with RGB-D cameras
CN208323396U (en) A kind of hardware platform of intelligent robot
Galvan et al. FPGA-based controller for haptic devices
Li et al. A new teaching system for arc welding robots with auxiliary path point generation module
CN114603551A (en) Control method and electronic equipment
CN113325733A (en) AR visual interactive simulation system based on cooperative robot
CN111643189A (en) Surgical robot master-slave control experimental system and method based on force feedback technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination