CN112847292A - Robot control method, control system and modular robot - Google Patents


Info

Publication number
CN112847292A
CN112847292A (application CN202110011833.8A)
Authority
CN
China
Prior art keywords
robot
motion
speed
information
turn
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110011833.8A
Other languages
Chinese (zh)
Inventor
杨健勃
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Keyi Technology Co Ltd
Original Assignee
Beijing Keyi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Keyi Technology Co Ltd filed Critical Beijing Keyi Technology Co Ltd
Priority to CN202110011833.8A priority Critical patent/CN112847292A/en
Publication of CN112847292A publication Critical patent/CN112847292A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00 Manipulators mounted on wheels or on carriages
    • B25J5/007 Manipulators mounted on wheels or on carriages mounted on wheels
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/08 Programme-controlled manipulators characterised by modular constructions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to the field of robots, and in particular to a robot control method, a control system, and a modular robot. The method comprises the following steps. T1: providing a robot that comprises at least one wheel and has at least one motion posture. T2: adjusting the robot to the corresponding motion posture, storing the motion posture information corresponding to that posture, and pushing the robot to walk on the road surface in the corresponding posture to obtain the speed of the wheels; preset action control information is generated from the speed and the motion posture information. T3: constructing an operation model from the preset action control information. T4: the operation model outputs actual motion control information according to the user's input to control the robot to execute the motion. Because the motion control information can be set simply by the user pushing the robot along the ground, the operation is quick and simple, the user experience is improved, the motion of the robot can be set to the user's requirements, and different user needs are met.

Description

Robot control method, control system and modular robot
[ Technical field ]
The invention relates to the field of robots, in particular to a robot control method, a control system and a modular robot.
[ background of the invention ]
Robots are widely used in daily life and in industry: in teaching, for example, they are used to train students' developmental thinking, and in automated production they perform operations such as welding, spraying, assembling, and carrying. Although a robot, as an execution system, offers great flexibility and can complete different work tasks, an existing robot typically has a single main function aimed at a specific purpose and setting; its degrees of freedom and configuration are fixed, and it lacks functional expandability and configurational reconfigurability. In addition, the cost of developing a dedicated robot for each field and each application is very high, which seriously restricts the popularization of robots. Reconfigurable robots arose in response.
A reconfigurable robot is usually obtained by combining a main module with a plurality of basic modules whose external structures are identical and which carry connecting surfaces for assembly. However, a user cannot verify whether the combined structure is correct during assembly, which causes a large amount of repeated assembly work and a poor user experience. Many reconfigurable robots are also provided with wheels; the wheels produce different motion states and drive the main module to different places, helping people complete specified tasks such as shooting or detection. At present, while adjusting the motion state of such a robot, the main module and a basic module, or a wheel and a sub-module, are generally adjusted one pair at a time; adjusting the main module and the wheels simultaneously is difficult, so the achievable postures are limited and diverse user requirements are hard to meet. Alternatively, the procedure for setting control action information is tedious, the adjustment is slow and complex, and the user experience suffers badly.
[ summary of the invention ]
In order to solve the problems, the invention provides a robot control method, a control system and a modular robot.
To solve the above technical problem, the invention provides a robot control method comprising the following steps. T1: providing a robot that comprises at least one wheel and has at least one motion posture. T2: adjusting the robot to the corresponding motion posture, storing the motion posture information corresponding to that posture, and pushing the robot to walk on the road surface in the corresponding posture to obtain the speed of the wheels; preset action control information is generated from the speed and the motion posture information. T3: constructing an operation model from the preset action control information. T4: the operation model outputs actual motion control information according to the user's input to control the robot to execute the motion.
Preferably, at least two wheels are provided; the motion postures comprise a forward posture, a left-turn posture, and a right-turn posture, and the corresponding motion posture information comprises forward posture information, left-turn posture information, and right-turn posture information. Step T2 comprises: T21: controlling the robot to move forward, obtaining a forward speed ratio for each wheel, and determining forward preset action control information from the forward posture information and the forward speed ratio of each wheel; T22: controlling the robot to perform a left-turn motion, obtaining a left-turn speed ratio for each wheel, and determining left-turn preset action control information from the left-turn posture information and the left-turn speed ratio of each wheel; and T23: controlling the robot to perform a right-turn motion, obtaining a right-turn speed ratio for each wheel, and determining right-turn preset action control information from the right-turn posture information and the right-turn speed ratio of each wheel.
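The per-wheel speed ratios in steps T21 to T23 can be modelled as normalized wheel speeds sampled while the user pushes the robot. The following Python sketch is illustrative only; the function name and the normalization against the fastest wheel are assumptions, not prescribed by the patent:

```python
def speed_ratios(wheel_speeds):
    """Normalize measured wheel speeds to ratios of the fastest wheel.

    While the user pushes the robot in a given posture, each wheel's
    speed is sampled; storing the ratio (speed / fastest speed) lets the
    preset action be played back later at any overall speed while
    preserving the recorded turning behaviour.
    """
    fastest = max(wheel_speeds)
    if fastest == 0:
        return [0.0] * len(wheel_speeds)
    return [s / fastest for s in wheel_speeds]

# A pushed left turn: the right-side wheels travel faster than the left.
print(speed_ratios([0.3, 0.6, 0.3, 0.6]))  # [0.5, 1.0, 0.5, 1.0]
```

Replaying the preset then only requires multiplying these ratios by one commanded speed.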
Preferably, step T2 comprises: T24: controlling the robot to move forward, and storing the forward posture information and the maximum forward speed of the robot; determining forward preset action control information from the forward posture information and the maximum forward speed and/or a user-defined speed; T25: controlling the robot to perform a left-turn motion, and storing the left-turn posture information and the maximum left-turn speed; determining left-turn preset action control information from the left-turn posture information and the maximum left-turn speed and/or a user-defined speed; and T26: controlling the robot to perform a right-turn motion, and storing the right-turn posture information and the maximum right-turn speed; determining right-turn preset action control information from the right-turn posture information and the maximum right-turn speed and/or a user-defined speed.
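One plausible reading of "maximum speed and/or user-defined speed" in T24 to T26 is that a user-defined speed, when given, is used but capped by the maximum speed measured while pushing. This is an assumption; the patent does not fix the combination rule:

```python
def preset_speed(max_speed, custom_speed=None):
    """Pick the playback speed for a preset (T24-T26 sketch).

    Uses the user-defined speed when supplied, clamped to the maximum
    speed recorded while the user pushed the robot; otherwise falls back
    to the recorded maximum. The clamping rule is a hypothetical choice.
    """
    if custom_speed is None:
        return max_speed
    return min(custom_speed, max_speed)

print(preset_speed(1.2))                     # 1.2
print(preset_speed(1.2, custom_speed=0.5))   # 0.5
```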
Preferably, at least one of the forward speed ratio, the left-turn speed ratio, and the right-turn speed ratio is adjusted according to a preset adjustment ratio.
Preferably, at least one of the forward speed ratio, the left-turn speed ratio, and the right-turn speed ratio is adjusted by direct editing or direct entry.
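The two adjustment routes in the preceding paragraphs, scaling by a preset adjustment ratio and direct editing or entry, can be sketched as follows. The clamping to [0, 1] is an added assumption (so no wheel is commanded beyond its recorded maximum), not something the patent states:

```python
def scale_ratios(ratios, adjust=1.0):
    """Scale recorded speed ratios by a preset adjustment ratio,
    clamping each result to [0, 1] (assumed safety bound)."""
    return [min(1.0, max(0.0, r * adjust)) for r in ratios]

def edit_ratio(ratios, index, value):
    """Directly edit or enter one wheel's speed ratio."""
    out = list(ratios)
    out[index] = value
    return out

print(scale_ratios([0.5, 1.0], adjust=0.8))  # [0.4, 0.8]
print(edit_ratio([0.5, 1.0], 0, 0.9))        # [0.9, 1.0]
```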
To solve the above technical problem, the invention also provides a robot control system comprising: a robot assembled from a main body and at least one wheel connected to the main body, the robot having an initial solid structure; and a memory in communication with the main body, in which one or more programs are stored, the one or more programs being executed to perform the robot control method described above.
Preferably, the main body comprises a plurality of module units, and the robot control system further comprises a controller that can exchange signals with the robot. The controller comprises a display screen that presents at least an operation disc, and the user controls the motion of the robot by manipulating the operation disc.
Preferably, the main body comprises a plurality of module units, and the robot control system further comprises a controller that can exchange signals with the robot. The controller comprises a display screen that presents at least a proportion bar, and the user adjusts at least one of the forward speed ratio, the left-turn speed ratio, and the right-turn speed ratio by manipulating the proportion bar.
Preferably, a first editing button and a second editing button are provided on the display screen; the first editing button starts the setting of the robot to a corresponding motion posture, and the second editing button starts the setting of the wheel speed.
In order to solve the technical problem, the present invention further provides a modular robot for performing the robot control method as described above.
Compared with the prior art, the robot control method, robot control system, and modular robot provided by the invention have the following beneficial effects:
1. The robot control method comprises the steps T1: providing a robot that comprises at least one wheel and has at least one motion posture; T2: adjusting the robot to the corresponding motion posture, storing the corresponding motion posture information, pushing the robot to walk on the road surface in that posture to obtain the speed of the wheels, and generating preset action control information from the speed and the motion posture information; T3: constructing an operation model from the preset action control information; and T4: the operation model outputting actual motion control information according to the user's input to control the robot to execute the motion. Because the motion control information can be set simply by the user pushing the robot along the ground, the operation is quick and simple, the user experience is improved, the motion of the robot can be customized to the user's requirements, different user needs are met, and the user's enjoyment of playing with the robot is increased.
2. Step T2 comprises T21: controlling the robot to move forward, obtaining a forward speed ratio for each wheel, and determining forward preset action control information from the forward posture information and the forward speed ratios; T22: controlling the robot to perform a left-turn motion, obtaining a left-turn speed ratio for each wheel, and determining left-turn preset action control information from the left-turn posture information and the left-turn speed ratios; and T23: controlling the robot to perform a right-turn motion, obtaining a right-turn speed ratio for each wheel, and determining right-turn preset action control information from the right-turn posture information and the right-turn speed ratios. Because the forward, left-turn, and right-turn speed ratios are all obtained in real time by pushing the robot, the preset action control information set from them is better adapted to the robot, improving the fit of the robot's motion and yielding a better motion state.
3. At least one of the forward speed ratio, the left-turn speed ratio, and the right-turn speed ratio is adjusted according to a preset adjustment ratio, so the ratios can be adapted to different running environments, further improving the stability and flexibility of the robot's travel and giving the user a better experience.
4. At least one of the forward speed ratio, the left-turn speed ratio, and the right-turn speed ratio is adjusted by direct editing or direct entry. In relatively poor terrain, adjusting the speeds obtained by pushing the robot further improves their adaptability and the stability and coordination of travel.
The robot control system and the modular robot provided by the invention offer the same advantages.
[ description of the drawings ]
Fig. 1 is a flowchart illustrating a robot control method according to a first embodiment of the present invention.
FIG. 2A is a schematic view of the robot of the first embodiment of the present invention in a straight-ahead position;
FIG. 2B is a schematic view of the robot of the first embodiment of the present invention in a right turn attitude;
Fig. 3 is a sub-flowchart of step T10 in the robot control method according to the first embodiment of the present invention.
Fig. 4 is a sub-flowchart of step T2 in the robot control method according to the first embodiment of the present invention.
Fig. 5 is a schematic view of a sub-flow of step T2 in a modified embodiment of the robot control method according to the first embodiment of the present invention.
Fig. 6 is a schematic diagram of the proportion bar when the speed is set by sliding the proportion bar in the robot control method according to the first embodiment of the present invention.
Fig. 7 is a sub-flowchart illustrating the velocity obtained by the custom method in the robot control method according to the first embodiment of the present invention.
Fig. 8 is a schematic view of a sector area formed in the robot control method of the first embodiment of the present invention.
Fig. 9 is a schematic diagram of a circular area formed in the robot control method of the first embodiment of the present invention.
Fig. 10 is a schematic block diagram of a robot control system according to a second embodiment of the present invention.
Fig. 11 is a schematic block configuration diagram of a robot control system according to a third embodiment of the present invention.
Fig. 12A is an interface schematic diagram of a display screen of a controller in a robot control system according to a third embodiment of the present invention;
fig. 12B is another interface schematic of a display screen of a controller in a robot control system according to a third embodiment of the present invention;
fig. 12C is a schematic view of another interface of a display screen of a controller in the robot control system according to the third embodiment of the present invention.
[ Detailed description of the embodiments ]
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
First embodiment
Referring to fig. 1, a first embodiment of the present invention provides a robot control method for controlling a robot to execute corresponding actions. The robot includes at least one wheel and a body coupled to the wheel. The robot has at least one motion gesture.
In some other embodiments, the robot is a modular robot whose main body comprises at least two module units. One of the module units is connected to a wheel; each module unit comprises at least one sub-module that is connected to a wheel and can rotate relative to it. That is, when there is one sub-module and one wheel, one end of the sub-module is connected to the wheel and the two can rotate relative to each other; when there is one sub-module and two wheels, the two ends of the sub-module are each connected to a wheel. When a module unit comprises at least two sub-modules, any two adjacent sub-modules are connected and can rotate relative to each other. Each module unit preferably consists of an upper and a lower hemisphere that can rotate relative to each other, with the two ends of one hemisphere connected to the two wheels. Each sub-module comprises at least one docking part; each docking part has an interface with unique interface identification information, and module units are connected to each other through their docking parts. It will be understood that when each sub-module comprises at least two docking parts, two module units are connected through one docking part each, a virtual connecting surface is formed at their connecting position, the two module units can rotate about this virtual connecting surface, and the plane of at least one other docking part of at least one of the two module units intersects the virtual connecting surface.
Similarly, each wheel is provided with a docking part carrying an interface; each wheel has corresponding marking information, each docking interface has unique interface identification information, and a module unit and a wheel are connected through their docking interfaces.
For convenience of later explanation, the following definitions are made. The configuration information includes, but is not limited to, one or more of module type information, position information, module number information, and initial angle information between two sub-modules, together with one or more of wheel type information, position information, wheel number information, and initial angle information between a sub-module and a wheel. For example, when there are two wheels, the wheel type information may distinguish left and right wheels; with four wheels it may distinguish front-left, front-right, rear-left, and rear-right wheels. The wheel type information may of course be named differently, as long as each wheel can be marked and identified. The configuration information defines the connection relationships between adjacent module units and/or between a wheel and a sub-module. The position information records the interface identification information of the two docking parts that connect adjacent module units, and of the two docking parts that connect a wheel and a sub-module; because the interface identification information of a docking part indicates its position on its module unit, or the position of the wheel relative to its sub-module, the position information represents the absolute position of the docking part within a three-dimensional or planar configuration. Module units of the same category carry the same module type identifier: when the cell monomers fall into several categories, each category shares one module type identifier and different categories have different identifiers, so the module type information of a unit can be learned by reading its identifier. The initial angle information between two sub-modules is the relative angle between the upper and lower sub-modules of a module unit, and the module number information indicates the number of module units. Identifying the interface identification information of the two mutually connected docking parts of two adjacent module units is called surface recognition, and performing it yields the position information of the module units. These definitions apply equally to the other embodiments in this specification.
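The configuration information defined above can be pictured as a small data structure. The field names below are illustrative assumptions chosen to mirror the definitions, not names used in the patent:

```python
from dataclasses import dataclass, field

@dataclass
class DockingLink:
    """One connection: the interface IDs of the two joined docking parts."""
    parent_interface: str
    child_interface: str

@dataclass
class ConfigurationInfo:
    """Configuration information per the definitions above (sketch)."""
    module_types: dict = field(default_factory=dict)   # unit id -> module type identifier
    links: list = field(default_factory=list)          # DockingLink position records
    module_angles: dict = field(default_factory=dict)  # unit id -> angle between its sub-modules
    wheel_angles: dict = field(default_factory=dict)   # wheel id -> angle to its sub-module

    @property
    def module_count(self):
        # Module number information is implied by the recorded units.
        return len(self.module_types)

cfg = ConfigurationInfo(module_types={"u1": "cell", "u2": "cell"})
cfg.links.append(DockingLink("u1:iface3", "u2:iface0"))
print(cfg.module_count)  # 2
```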
The robot control method includes the steps of:
T1: providing a robot, the robot comprising at least one wheel and having at least one motion posture;
T2: adjusting the robot to a motion posture, storing the motion posture information corresponding to that posture, pushing the robot to obtain the speed of the wheels, and generating preset action control information from the speed and the motion posture information;
T3: constructing an operation model from the preset action control information; and
T4: the operation model outputting actual motion control information according to the user's input to control the robot to execute the motion.
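The four steps above can be sketched as a minimal control pipeline. All names here (`PresetAction`, `build_operation_model`, and so on) are hypothetical; the patent describes the steps but not an implementation:

```python
from dataclasses import dataclass

@dataclass
class PresetAction:
    """Preset action control information: a saved posture plus wheel speeds."""
    pose_info: dict      # motion posture information stored in T2
    wheel_speeds: list   # wheel speeds recorded while the user pushes the robot

def record_preset(pose_info, wheel_speeds):
    """T2: pair the stored posture with the speeds measured during pushing."""
    return PresetAction(pose_info=pose_info, wheel_speeds=list(wheel_speeds))

def build_operation_model(presets):
    """T3: build an operation model mapping a user command to control info."""
    return {p.pose_info["name"]: p for p in presets}

def actual_control(model, user_input):
    """T4: output the actual motion control information for the user's input."""
    return model[user_input]

# Example: one preset recorded by pushing the robot forward.
forward = record_preset({"name": "forward"}, [0.5, 0.5])
model = build_operation_model([forward])
print(actual_control(model, "forward").wheel_speeds)  # [0.5, 0.5]
```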
It is understood that in step T1 the robot may communicate with a remote terminal. Alternatively, when the robot can perform the subsequent steps without communicating with a remote terminal, a module unit of the robot itself performs them.
It can be understood that in step T2 the robot is adjusted to the motion posture, the motion posture information corresponding to that posture is saved, and the preset action control information is generated from the speed and the motion posture information. By definition, the motion posture of the robot comprises any one or more of a forward posture, a left-turn posture, a right-turn posture, a backward posture, a front-left posture, a front-right posture, a rear-left posture, a rear-right posture, and a stop. The corresponding motion posture information comprises forward posture information, left-turn posture information, and right-turn posture information.
Referring to figs. 2A and 2B, the motion postures are illustrated here for a robot with four wheels. A coordinate system with mutually perpendicular X and Y axes is established. The forward and backward postures are defined as those in which the line connecting the centres of the two wheels in the same row coincides with the X axis (fig. 2A); the right-turn and front-right postures are those in which the included angle V between that centre line and the X axis is acute (fig. 2B); and the left-turn and front-left postures are those in which the angle is obtuse (not shown). The rear-left and rear-right postures can likewise be defined through the deflection angle of the wheels. Note that this is only one possible definition of the motion postures; the user may define them differently. When the robot has several motion postures, it must be regulated to the corresponding posture before being controlled to move. The robot control method further comprises the following step:
T10: acquiring the motion posture information of the robot. Step T10 lies between steps T1 and T2. In step T10 the remote terminal obtains the initial virtual configuration information of the robot in the motion posture, which represents the robot's motion posture information in that posture; alternatively, at least one of the module units or at least one wheel acquires and stores the initial virtual configuration information instead of transmitting it to the remote terminal. The initial virtual configuration information comprises one or more of position information, module type information, module number information, initial angle information between upper and lower sub-modules, wheel type information, wheel number information, and initial angle information between a sub-module and a wheel, together with any other information defining the connection relationships between adjacent module units. A module unit or wheel transmits its module type information to the cell body wirelessly; once all module units and wheels have transmitted their position information to the remote terminal, the remote terminal knows the module number information and wheel number information of the robot. Each module unit detects the initial angle between its upper and lower sub-modules and transmits it wirelessly to the remote terminal, and likewise detects and transmits the initial angle between a wheel and its sub-module. It can be understood that the initial virtual configuration information corresponds to the posture information of the initial virtual configuration.
When the robot needs to be adjusted to a corresponding motion posture, the wheels and/or the main module are rotated into that posture. The corresponding motion posture information likewise comprises one or more of position information, module type information, module number information, initial angle information between upper and lower sub-modules, wheel type information, wheel number information, and initial angle information between a sub-module and a wheel, together with any other information defining the connection relationships between adjacent module units.
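The coordinate-system definition of the motion postures above (angle V between the wheels' centre line and the X axis) lends itself to a simple classifier. This is a hypothetical helper; the patent defines the postures but no such function:

```python
import math

def classify_pose(angle_deg):
    """Classify a motion posture from the included angle V (in degrees)
    between the centre line of two same-row wheels and the X axis.

    Per the definition above: 0 (centre line on the X axis) gives the
    forward/backward family, an acute angle the right-turn family, and
    an obtuse angle the left-turn family.
    """
    if math.isclose(angle_deg % 180, 0.0):
        return "forward"
    return "right_turn" if angle_deg % 180 < 90 else "left_turn"

print(classify_pose(0))    # forward
print(classify_pose(30))   # right_turn
print(classify_pose(150))  # left_turn
```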
Referring to fig. 3, the module units of the robot may be identical or different, and at least one of them can communicate with the remote terminal.
For example, in some embodiments the module units comprise a cell body and at least one cell monomer; that is, the robot comprises a cell body and at least one cell monomer. The signal transmission between them proceeds as follows:
the cell body is used for communicating with a remote terminal, the cell monomer directly connected with the cell body is defined as a primary cell monomer, the cell monomer connected with the primary cell monomer is a secondary cell monomer, the cell monomer connected with the M-level cell monomer is an (M +1) -level cell monomer, and M is an integer greater than or equal to 1. The method for acquiring the initial virtual configuration information of the robot specifically comprises the following steps:
T101: the cell body transmits a signal through its docking part to the primary cell monomer connected to it;
T102: on receiving the signal, the primary cell monomer performs surface recognition to obtain the interface identification information of the docking part from which the cell body sent the signal, and transmits it, together with the interface identification information of its own docking part that received the signal, to the cell body, which thereby obtains the position information of the primary cell monomer;
t103: signaling the (M +1) -grade cell monomer to the M-grade cell monomer; and
T104: on receiving the signal, the (M+1)-level cell monomer performs surface recognition to obtain the interface identification information of the docking part from which the M-level cell monomer sent the signal, and transmits it, together with the interface identification information of its own docking part that received the signal, to the cell body.
It is understood that the signals transmitted from the cell body to the primary cell monomer, and from an M-level cell monomer to an (M+1)-level cell monomer, are preferably electrical signals, though they may also be wireless. When the robot contains only the cell body and primary cell monomers, steps T103 and T104 may be omitted.
When the module units of the robot are all identical, one of them is designated the main module unit, i.e. the cell body; the module unit directly connected to it is a primary cell monomer, the module unit connected to a primary cell monomer is a secondary cell monomer, and the module unit connected to an M-level cell monomer is an (M+1)-level cell monomer, where M is an integer greater than or equal to 1, and steps T101 to T104 are performed as above. As a variation, the multi-level cell monomers may transmit their position information directly to the remote terminal instead of to the main module unit.
In summary, the position information of the module units is acquired as follows: a module unit identifies the interface identification information of the docking part of the adjacent module unit connected to it, and obtains its own position information from that interface identification information together with the interface identification information of its own docking part connected to that adjacent unit.
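The level-by-level identification of steps T101 to T104 behaves like a breadth-first traversal rooted at the cell body. The following sketch models that traversal only; the link representation and names are assumptions for illustration:

```python
from collections import deque

def identify_positions(links, root="body"):
    """Level-by-level surface recognition (T101-T104), sketched as a BFS.

    `links` maps a unit to (child, sending interface, receiving interface)
    tuples: the interface ID of the docking part that sent the signal and
    the interface ID of the docking part that received it. Each reached
    unit reports both IDs back; together they form its position information.
    """
    positions = {}
    queue = deque([root])
    while queue:
        unit = queue.popleft()
        for child, sent_iface, recv_iface in links.get(unit, []):
            positions[child] = (sent_iface, recv_iface)  # reported to the cell body
            queue.append(child)  # the child becomes a level-(M+1) sender in turn
    return positions

links = {
    "body": [("m1", "body:i2", "m1:i0")],  # primary cell monomer
    "m1":   [("m2", "m1:i3", "m2:i1")],    # secondary cell monomer
}
print(identify_positions(links))
# {'m1': ('body:i2', 'm1:i0'), 'm2': ('m1:i3', 'm2:i1')}
```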
It is understood that the process of acquiring the position information of the wheels of the robot may be the same as the process of acquiring the position information of the plurality of module units. That is, one of the wheels may be defined as a cell body, and the module unit directly connected to the cell body or the wheel is a primary cell monomer, and the specific identification process is consistent with the above and will not be described herein again.
It will also be appreciated that, when the robot is a non-modular robot, the position information and angle information between the wheels and the body of the robot may likewise be obtained by the method used between the module units, so as to represent the pose information of the robot.
In some other embodiments of the present invention, the remote terminal may also directly obtain angle information, position information, and the like of each module unit and wheel, and the remote terminal processes the corresponding angle information and position information to obtain the current configuration information of the modular robot, so as to complete the identification of the current entity configuration of the modular robot.
In addition, the following step is performed before or simultaneously with step T101:
Step T100: the module unit or wheel sends out a broadcast signal informing the individual cell units to prepare for surface recognition. It will be appreciated that the module units may communicate wirelessly, for example by Wi-Fi, Bluetooth or ZigBee communication, preferably ZigBee communication. The module unit or wheel first broadcasts a signal informing the other connected module units to enter a surface-recognition preparation state, and the surface recognition action is carried out after the other module units receive the signal.
In step T101, each docking portion on the cell body sends a different electrical signal to the plurality of primary cell monomers. In step T102, each primary cell monomer obtains the interface identification information of the docking portion of the cell body connected to it from the difference in the received electrical signals, and returns to the cell body both the interface identification information of the docking portion from which the cell body sent the signal and the interface identification information of its own docking portion that received the signal; the cell body obtains the position information of that primary cell monomer by calculation, and after all primary cell monomers have performed the same operation, the cell body holds the position information of all of them. Similarly, in steps T103 and T104, each docking portion on an M-level cell monomer sends a different electrical signal to the plurality of (M +1)-level cell monomers; each (M +1)-level cell monomer obtains the interface identification information of the docking portion of the M-level cell monomer connected to it from the difference in the received electrical signals, and returns to the cell body both the interface identification information of the docking portion from which the M-level cell monomer sent the signal and the interface identification information of its own docking portion that received the signal; the cell body obtains the position information of the (M +1)-level cell monomers by calculation, and after all (M +1)-level cell monomers have performed the same action, the cell body holds the position information of all of them.
After this series of surface recognitions, the cell body obtains the position information of all cell monomers, and thereby the configuration information of the robot, that is, the motion attitude information of the robot.
It can be understood that, when the cell body or a cell monomer simultaneously sends different electrical signals to a plurality of next-level cell monomers, those next-level cell monomers reply their position information to the cell body in a time-shared sequence determined by the interface identification information of the docking portions through which the different electrical signals were sent; alternatively, the cell body or cell monomer may transmit the same or different electrical signals to the plurality of next-level cell monomers in time sequence, and the next-level cell monomers reply their position information to the cell body in the order in which the signals were received. For example: the cell body has two docking portions whose interface identification information is defined as 1 and 2; it simultaneously sends two different electrical signals to the two primary cell monomers connected to it, the primary cell monomer on docking portion 1 is set to reply its position information first, and after 10T (the specific time can be adjusted) the primary cell monomer on docking portion 2 replies its position information.
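The time-shared reply order in the example above can be sketched as follows. This is a minimal illustration under an assumed scheme: replies are simply ordered by the interface identification number of the docking portion that sent the signal, with a fixed gap between them (the patent's example uses a 10T gap and notes the time is adjustable); the function name is invented.

```python
T = 1.0  # one time unit; the example in the text uses a 10T gap (adjustable)

def reply_schedule(docking_ids, gap=10 * T):
    """Return (docking_id, reply_time) pairs, earliest interface ID first."""
    ordered = sorted(docking_ids)
    return [(d, i * gap) for i, d in enumerate(ordered)]

# Two primary cell monomers, on docking portions 2 and 1 of the cell body:
print(reply_schedule([2, 1]))  # [(1, 0.0), (2, 10.0)]
```

Staggering the replies this way keeps the monomers' position reports from colliding on the shared link to the cell body.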
In addition, between the steps T102 and T103 the method further includes:
Step T102a: the cell body stops sending electrical signals and notifies the primary cell monomers directly connected to it to send electrical signals to the secondary cell monomers connected to them. It is understood that, in step T102a, the cell body preferably notifies the primary cell monomers by means of a broadcast signal. It can also be understood that, before an M-level cell monomer sends its electrical signals, the cell body broadcasts the interface identification information of the plurality of docking portions of that M-level cell monomer, thereby controlling it to send electrical signals to the plurality of (M +1)-level cell monomers in time sequence; the electrical signals sent to the (M +1)-level cell monomers may be the same or different, and preferably the docking portions of the M-level cell monomer send different electrical signals.
In addition, in the steps T102 and T104, the cell body receives the position information transmitted by the cell monomers, numbers each cell monomer individually, and stores the position information of each cell monomer in association with its number. When the cell body communicates with the remote terminal, it transmits the position information of each cell monomer together with its number to the remote terminal. After the remote terminal sends action control information to the cell body, the cell body decomposes the control information by number and transmits each decomposed piece to the corresponding cell monomer.
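The numbering and decomposition just described can be sketched as below. The function names and data shapes are invented for illustration; the sketch only shows the bookkeeping: position reports are numbered in arrival order, and a control message keyed by those numbers is split for dispatch.

```python
# Hypothetical sketch of the cell body's bookkeeping (names assumed).

def register_monomers(position_reports):
    """Number monomers in arrival order; store position info keyed by number."""
    return {number: info for number, info in enumerate(position_reports, start=1)}

def decompose_control(control_info, registry):
    """Split action control information (keyed by monomer number) for dispatch."""
    return {number: control_info[number]
            for number in registry if number in control_info}

registry = register_monomers([{"pos": "portion 1"}, {"pos": "portion 2"}])
commands = decompose_control({1: "rotate 30", 2: "rotate -15"}, registry)
print(commands)  # {1: 'rotate 30', 2: 'rotate -15'}
```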
It is understood that, in the step T10, the initial virtual configuration of the robot is generated, according to the initial virtual configuration information, by the remote terminal or by a module unit storing the initial virtual configuration information. The remote terminal generates the initial virtual configuration from the obtained information through three-dimensional simulation, three-dimensional modeling or the like. The initial virtual configuration can be displayed at the remote terminal so that the user can conveniently check the motion posture of the robot at any time.
In the above step T2, the speed is related to the motion speed of the robot and/or is obtained by user customization. The speed obtained by driving the robot to move in the corresponding motion posture is related to the motion speed of the robot, while a user-defined speed is one edited, entered or otherwise defined on the operation interface. It should be noted that, when the number of wheels is at least two, the speeds of some wheels may be obtained by movement while those of others are obtained by customization.
In the above step T2, the test actual speed of the wheels is measured by manually pushing the robot to walk on the road surface, which is called the test driving mode. Its purpose is to obtain the speed ratio between the wheels; the user then sets the actual speed of the wheels during actual walking according to this speed ratio. Obtaining the speed ratio of each wheel through the test driving mode allows the wheels to achieve better balance during actual movement. To obtain the test actual speed during the test, a speed sensor may be mounted on each wheel, by which the actual speed of each wheel during travel is measured.
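The speed-ratio extraction in the test driving mode can be sketched as follows. This is a minimal sketch under an assumption the patent does not fix: the ratio is expressed by normalizing each measured wheel speed against the fastest wheel; the function name and sensor readings are invented.

```python
# Minimal sketch (sensor interface assumed): in the test driving mode each
# wheel's speed sensor reports an actual speed while the robot is pushed;
# what is kept is the ratio between wheels, not the absolute values.

def speed_ratio(test_speeds):
    """Normalize measured wheel speeds against the fastest wheel."""
    top = max(test_speeds)
    return [s / top for s in test_speeds]

# Four wheels measured while the robot is pushed straight ahead:
ratio = speed_ratio([1.0, 0.9, 0.8, 0.9])
print(ratio)  # [1.0, 0.9, 0.8, 0.9]
```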
Of course, as a variant, the robot may also be moved by a push operation in the APP software operating interface.
As an example, when the speed is obtained by manually pushing the robot to walk on the road surface, the robot control method includes the following steps:
T1: providing a robot, the robot comprising at least one wheel, the robot having at least one motion posture;
T2: adjusting the robot to a corresponding motion posture, storing the motion attitude information corresponding to the motion posture, and pushing the robot to walk on the road surface in the corresponding motion posture to obtain the speed of the wheels; generating preset action control information according to the speed and the motion attitude information;
T3: constructing and forming an operation model according to the preset action control information; and
T4: the operation model outputs actual motion control information according to the user's input so as to control the robot to perform the motion.
Referring to fig. 4, the step T2 includes the following steps:
T21: controlling the robot to move forward, obtaining the forward speed ratio of each wheel, and determining the forward preset action control information according to the forward attitude information and the forward speed ratio of each wheel;
T22: controlling the robot to make a left-turn motion, obtaining the left-turn speed ratio of each wheel, and determining the left-turn preset action control information according to the left-turn attitude information and the left-turn speed ratio of each wheel; and
T23: controlling the robot to make a right-turn motion, obtaining the right-turn speed ratio of each wheel, and determining the right-turn preset action control information according to the right-turn attitude information and the right-turn speed ratio of each wheel.
In the above steps T21, T22 and T23, controlling the robot to make the forward motion, the left-turn motion and the right-turn motion each includes pushing the robot to walk on the road surface or controlling the robot via the APP.
The speed obtained by controlling the robot to move in the corresponding motion posture may be realized as follows. When the number of wheels is at least two, the forward speed ratio may be obtained by first obtaining the test actual speed of each wheel, and then calculating the speed ratio between the wheels from those test actual speeds.
Specifically, in the above step T21, after the forward speed ratio is obtained, the actual speed of one wheel may be given, and the speeds of all the wheels are then calculated from the speed ratio between the wheels and used as the speed for generating the preset action control information.
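This step can be sketched as below: fixing the actual speed of one reference wheel, together with the known ratio, determines the speeds of all the others. The function name, units and the choice of reference wheel are assumptions for illustration.

```python
# Hedged sketch: once the forward speed ratio is known, fixing one wheel's
# actual speed determines the speeds of all the others.

def wheel_speeds(ratio, ref_index, ref_speed):
    """Scale the whole ratio so wheel `ref_index` runs at `ref_speed`."""
    scale = ref_speed / ratio[ref_index]
    return [r * scale for r in ratio]

# Ratio 10:9:8:9; wheel 0 is given an actual speed of 50 (arbitrary units):
print(wheel_speeds([10, 9, 8, 9], 0, 50))  # [50.0, 45.0, 40.0, 45.0]
```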
In addition, the left-turn speed ratio and the right-turn speed ratio are obtained in the same manner as the forward speed ratio, and are not described herein again.
Referring to fig. 5, in some other embodiments, the step T2 includes the following steps:
T24: controlling the robot to move forward, and storing the forward posture information and the maximum forward speed of the robot; determining advance preset action control information according to the advance attitude information, the maximum advance speed and/or the user-defined speed;
T25: controlling the robot to make a left-turn motion, and storing the left-turn attitude information and the maximum left-turn speed of the robot; determining the left-turn preset action control information according to the left-turn attitude information, the maximum left-turn speed and/or the user-defined speed; and
T26: controlling the robot to make a right-turn motion, and storing the right-turn attitude information and the maximum right-turn speed of the robot; determining the right-turn preset action control information according to the right-turn attitude information, the maximum right-turn speed and/or the user-defined speed.
In the above steps T24, T25 and T26, controlling the robot to make the forward motion, the left-turn motion and the right-turn motion each includes pushing the robot to walk on the road surface or controlling the robot via the APP. The speed obtained by controlling the robot to move in the corresponding motion posture may be realized as follows: the speed of one wheel is set as its actual maximum speed during actual movement, and the actual maximum speeds of the remaining wheels are converted according to the speed ratio, thereby obtaining the maximum forward speed, maximum left-turn speed and maximum right-turn speed of all the wheels. Of course, the maximum speed may also be obtained directly from the actual travel speed of the pushed robot.
In the above steps T21, T22 and T23, once the speed ratio between the wheels has been calculated, the actual speed of each wheel can be entered or edited on the operation interface according to the speed ratio together with the user's actual experience.
Referring to fig. 6, in some other embodiments, a visual scale bar Z may be provided on the operation interface, with a control point F that slides along the scale bar Z between points N and M, where point M represents the maximum speed ratio of each wheel. When the control point slides from point M toward point N, the speed ratio of each wheel is scaled according to the fraction of the length MN it has covered. For example, if the maximum speed ratio is 10:9:8:9, the speed ratio of each wheel when the control point is slid to the midpoint of the slider bar becomes 5:4.5:4:4.5.
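The scale bar behaviour, including the worked example above, can be sketched as follows. Linear scaling along the bar is an assumption consistent with the 10:9:8:9 → 5:4.5:4:4.5 example; the function name is invented.

```python
# Sketch of the scale bar Z (linear scaling assumed): sliding the control
# point F from M toward N scales every wheel's speed ratio uniformly.

def scaled_ratio(max_ratio, slider_fraction):
    """slider_fraction = 1.0 at point M (full speed), 0.0 at point N."""
    return [r * slider_fraction for r in max_ratio]

# Maximum ratio 10:9:8:9; slider at the midpoint of the bar:
print(scaled_ratio([10, 9, 8, 9], 0.5))  # [5.0, 4.5, 4.0, 4.5]
```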
Referring to fig. 7, in the step T2, when there are at least two wheels, in the step T24, T25 or T26, the manner of obtaining the speed in a user-defined manner may be performed by the steps of:
step S21, setting the maximum actual speed of one wheel in the actual running process and the speed ratio of each wheel; and
step S22, the actual maximum speed of the remaining wheels is converted according to the speed ratio.
In the step S21, the maximum actual speed of one wheel during actual movement and the speed ratio of each wheel may be set on a controller in signal communication with the robot. The controller may be an APP loaded on a remote terminal, and the remote terminal may be a mobile phone, an iPad, a computer or another device. Of course, it will be appreciated that the speed of one wheel may be adjusted individually after the speed of each wheel has been set according to the speed ratio; alternatively, the speed of each wheel may be set directly, without setting the speed ratio of each wheel.
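Steps S21 and S22 can be sketched as below. The function name and units are invented; the sketch assumes the conversion is a simple proportional scaling by the previously obtained speed ratio, with the individual adjustment mentioned above applied afterwards if desired.

```python
# Hedged sketch of steps S21/S22 (names assumed).

def max_speeds(ratio, set_index, set_max):
    """S21 fixes wheel `set_index` at `set_max`; S22 converts the rest by ratio."""
    return [set_max * r / ratio[set_index] for r in ratio]

# Speed ratio 10:8:10:8; the first wheel's maximum is set to 100 (arbitrary units):
speeds = max_speeds([10, 8, 10, 8], 0, 100)
print(speeds)   # [100.0, 80.0, 100.0, 80.0]
speeds[1] = 78  # one wheel may still be adjusted individually afterwards
```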
Referring to fig. 8, a coordinate system is constructed, comprising an X-axis and a Y-axis perpendicular to each other. A circular area, or a fan-shaped area including areas located in the first quadrant and the second quadrant, is defined on the coordinate system with the origin as the center to form a manipulation area. The fan-shaped area in fig. 8 is the sector OPQ.
When the robot is in the forward posture and is pushed from the circle center O to the vertex A, the robot moves forward; the speed at the circle center O is 0, and the speed at the vertex A is the maximum forward speed, defined as V1.
When the robot is in the left-turn posture, the robot is pushed in a left-turn motion from the vertex A along the circular arc to the left endpoint Q, and the speed on reaching the left endpoint Q is the maximum left-turn speed, defined as V2.
When the robot is in the right-turn posture, the robot is pushed in a right-turn motion from the vertex A along the circular arc to the right endpoint P, and the speed on reaching the right endpoint P is the maximum right-turn speed, defined as V3. It should be noted that describing the forward, left-turn and right-turn motions by constructing a coordinate system is only for intuitive presentation; the motion trajectory may deviate slightly from the defined trajectory, or another customized trajectory may be used, set mainly according to the user's requirements.
Referring to fig. 8, step T3 includes:
T31: constructing a coordinate system, and defining a circular area, or a fan-shaped area including areas located in the first quadrant and the second quadrant, on the coordinate system with the origin as the center to form a manipulation area; the manipulation area is taken as the sector OPQ by way of example.
T32: mapping the forward preset action control information, the left-turn preset action control information and the right-turn preset action control information to the upper vertex, the left endpoint and the right endpoint of the manipulation area, respectively. That is, the preset action control information at each vertex includes the magnitude of the speed, the attitude information, and the like.
Step T3 further includes the steps of:
T33: the user's input corresponds to a control point in the coordinate system. When the control point is located in the first quadrant, it is a front-right movement control point, and the front-right preset action control information is obtained by interpolation between the forward preset action control information and the right-turn preset action control information;
when the control point is located in the second quadrant, it is a front-left movement control point, and the front-left preset action control information is obtained by interpolation between the forward preset action control information and the left-turn preset action control information;
a control point on the Y-axis is converted from the forward preset action control information according to its distance from the circle center, a control point on the arc AQ is converted from the left-turn preset action control information according to the arc length between it and the vertex A, and a control point on the arc AP is converted from the right-turn preset action control information according to the arc length between it and the vertex A.
In this step, the control point X1 located in the second quadrant is taken as an example. The speed calculation process is as follows: first, the speed value V_A0 at the arc point A0 is obtained by conversion according to the arc length AA0; then V_A0 is scaled according to the distance of the control point X1 from the circle center O. The attitude information is deflected slowly and gradually from the straight-ahead attitude according to the angle of included angle 1.
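The two-stage conversion for a second-quadrant control point can be sketched as below. Linear blending is an assumption (the patent does not fix the exact interpolation law), and all names and the numeric inputs are invented for illustration: first blend the forward and left-turn maximum speeds by the arc fraction toward Q, then scale by the point's radial distance from the center O.

```python
import math

def front_left_speed(v_forward_max, v_left_max,
                     angle_from_A, quarter_arc, radius, max_radius):
    # Arc-length conversion: speed at A0, the arc point level with the control point.
    frac = angle_from_A / quarter_arc                  # 0 at vertex A, 1 at endpoint Q
    v_a0 = (1 - frac) * v_forward_max + frac * v_left_max
    # Radial conversion: scale by the control point's distance from the center O.
    return v_a0 * (radius / max_radius)

# V1 = 100, V2 = 60; control point a third of the way toward Q, at half radius:
v = front_left_speed(100, 60, math.pi / 6, math.pi / 2, 0.5, 1.0)
print(round(v, 2))  # 43.33
```

The attitude component would be blended analogously, deflecting gradually from the straight-ahead attitude as the included angle grows.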
With continuing reference to FIG. 8, step T3 further includes the steps of:
T34: interpolating the forward preset action control information to the lower vertex of the manipulation area to form the backward preset action control information;
interpolating the backward preset action control information and the left-turn preset action control information into the third quadrant of the manipulation area to form the rear-left preset action control information; and
interpolating the backward preset action control information and the right-turn preset action control information into the fourth quadrant of the manipulation area to form the rear-right preset action control information.
Referring to fig. 9, the fan-shaped area further includes areas located in the third quadrant and the fourth quadrant, and there are blank areas QOQ1 and POP1 between the first and fourth quadrants and between the second and third quadrants of the fan-shaped area, in which no control point lies. The user cannot operate the robot in these blank areas.
With continued reference to fig. 9, the user operation interface corresponds to an operation disk, which corresponds to a circular area or a sector area on the coordinate system;
the step T4 includes:
T41: operating on the operation disk to form an input; and
T42: the operation model calculates the actual motion control information from the control point corresponding to the input, so as to control the robot to execute the motion.
In the above step T41, the user may form the manipulation signal by touching the circular or fan-shaped manipulation area on the operation disk. An angle sensor and a speed sensor are arranged on each cell unit and each wheel.
Second embodiment
Referring to fig. 10, a robot control system 30 for controlling a robot formed by splicing at least one wheel and a main body according to a second embodiment of the present invention includes:
the storage module 31 is configured to store motion posture information corresponding to a motion posture of the robot and preset motion control information;
and the model generating module 33 is configured to construct and form an operation model according to the preset action control information.
The robot control system 30 further comprises a speed sensor provided on each wheel. The speed sensor may be a code-wheel encoder for measuring the actual speed of the wheel, which includes the wheel's linear speed and angular speed.
Third embodiment
Referring to fig. 11, a robot control system 40 according to a third embodiment of the present invention is further provided, where the robot control system 40 further includes:
a robot 41 assembled from a main body and at least one wheel connected to the main body, having an initial solid structure;
a memory 43, and one or more programs, wherein the one or more programs are stored in the memory, the memory is in signal communication with the main body, and the programs contain instructions for executing the following steps:
storing motion attitude information corresponding to the motion attitude, and generating preset action control information according to the speed and the motion attitude information; and
constructing and forming an operation model according to the preset action control information; and
the operation model outputs actual motion control information according to the input of the user to control the robot to perform the motion.
In some other embodiments, the robot is a modular robot, and the body includes a plurality of modular units with wheels coupled to the modular units.
In addition, the plurality of module units include a cell main body and at least one cell monomer, each docking portion has unique interface identification information, and the cell monomer directly connected to the cell main body is defined as a primary cell monomer, and the acquiring of the initial virtual configuration information of the robot specifically includes the following steps:
the cell body transmits a signal through its docking portion to the primary cell monomer connected to it; and
after the primary cell monomer receives the signal, it performs surface recognition to obtain the interface identification information of the docking portion from which the cell body sent the signal, and transmits that information, together with the interface identification information of its own docking portion that received the signal, to the cell body, so as to obtain the position information of the primary cell monomer.
The robot control system 40 further includes a controller 45, and the controller 45 and the robot 41 can perform signal transmission therebetween.
The controller 45 may be provided on a remote mobile terminal. The remote mobile terminal includes electronic equipment such as a mobile phone, a tablet computer or a computer. Referring to fig. 12A, 12B and 12C, the controller 45 includes a display screen 451, and the user edits, saves or sets the preset action control information in the corresponding motion posture by operating the display screen 451. The display screen 451 is provided with a motion posture setting area A corresponding to a first editing button, a speed setting area B corresponding to a second editing button, a virtual configuration display area C, an operation video playing area D and an editable operation disk E. The operation is described below with reference to the display interface of the display screen 451. When the preset action control information of the forward posture needs to be set, the motion posture setting area A is clicked first, the interface shown in fig. 12B is entered, and the entity configuration is then adjusted according to the tutorial in the operation video playing area D so that the robot is in the forward posture; after the adjustment is completed, the motion attitude information corresponding to the forward posture is stored. After the posture is stored, the speed setting area B is clicked, and the robot is pushed forward to obtain the motion speed, or the speed is obtained in a user-defined manner. After the speed adjustment is finished, the speed is stored. The first editing button is used to initiate setting of the robot to the corresponding motion posture, and the second editing button is used to initiate setting of the wheel speed.
The left turn attitude and the right turn attitude are set in the same manner as the forward attitude, and are not described herein again. The formation of the preset motion control information for the front left gesture, the front right gesture, the rear left gesture, and the rear right gesture is the same as that provided in the first embodiment, and is not described herein again. Referring to fig. 12C, when the preset motion control information corresponding to the motion gesture is edited, an operation disc corresponding to a circular area or a sector area on the coordinate system is provided on the controller 45. The user operates on the operation disc to form input; meanwhile, the operation model calculates and obtains actual motion control information according to the control points corresponding to the input so as to control the robot to execute the motion.
The display screen 451 presents at least one scale bar, and the user manipulates the scale bar to adjust at least one of the forward speed ratio, the left-turn speed ratio and the right-turn speed ratio.
Fourth embodiment
A modular robot configured to perform the robot control method as provided in the first embodiment and its variants.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit of the present invention are intended to be included within the scope of the present invention.

Claims (10)

1. A robot control method is characterized by comprising the following steps:
T1: providing a robot, the robot comprising at least one wheel, the robot having at least one kinematic pose;
T2: adjusting the robot to a corresponding motion attitude, storing motion attitude information corresponding to the motion attitude, and pushing the robot to walk on the road surface under the corresponding motion attitude to obtain the speed of the wheels; generating preset action control information according to the speed and the motion attitude information;
T3: constructing and forming an operation model according to the preset action control information; and
T4: the operation model outputs actual motion control information according to the input of the user to control the robot to perform the motion.
2. The robot control method according to claim 1, wherein the number of the wheels is at least 2, the motion attitudes include forward attitudes, left turn attitudes and right turn attitudes, and the corresponding motion attitude information includes forward attitude information, left turn attitude information and right turn attitude information;
the step T2 includes:
T21: controlling the robot to move forward, obtaining a forward speed ratio of each wheel, and determining forward preset action control information according to forward attitude information and the forward speed ratio of each wheel;
T22: controlling the robot to do left-turning motion to obtain a left-turning speed ratio of each wheel, and determining left-turning preset action control information according to left-turning attitude information and the left-turning speed ratio of each wheel; and
T23: controlling the robot to do right-turn motion to obtain the right-turn speed ratio of each wheel, and determining right-turn preset action control information according to the right-turn attitude information and the right-turn speed ratio of each wheel.
3. The robot control method according to claim 1, wherein the step T2 includes the steps of:
T24: controlling the robot to move forward, and storing the forward posture information and the maximum forward speed of the robot; determining advance preset action control information according to the advance attitude information, the maximum advance speed and/or the user-defined speed;
T25: controlling the robot to do left-turning motion, and storing left-turning attitude information and the maximum left-turning speed of the robot; determining left turning preset action control information according to the left turning attitude information, the maximum left turning speed and/or the user-defined speed; and
T26: controlling the robot to do right-turn motion, and storing the right-turn attitude information and the maximum right-turn speed of the robot; and determining the preset right turning motion control information according to the right turning attitude information, the maximum right turning speed and/or the user-defined speed.
4. The robot control method according to claim 2, wherein at least one of the forward speed ratio, the left turn speed ratio, and the right turn speed ratio is adjusted in accordance with a preset adjustment ratio.
5. The robot control method of claim 2, wherein at least one of the forward speed ratio, the left turn speed ratio, and the right turn speed ratio is adjusted by direct editing or entering.
6. A robot control system characterized by comprising:
a robot assembled from a main body and at least one wheel connected to the main body, having an initial solid structure;
a memory, and one or more programs, wherein one or more of the programs are stored in the memory, the memory being in communication with the body, the program being for executing the robot control method of any of claims 2-5.
7. The robot control system of claim 6, wherein the body comprises a plurality of modular units, the robot control system further comprising a controller, the controller being in communication with the robot, the controller comprising a display screen, the display screen presenting at least one operating disk, the user operating the operating disk to control the movement of the robot.
8. The robot control system of claim 6, wherein the main body comprises a plurality of modular units, the robot control system further comprises a controller in signal communication with the robot, the controller comprises a display screen, the display screen presents at least one scale bar, and a user manipulates the scale bar to adjust at least one of the forward speed ratio, the left turn speed ratio, and the right turn speed ratio.
9. The robot control system of claim 6, wherein a first editing button and a second editing button are disposed on the display screen, the first editing button being used to initiate setting of a corresponding motion posture of the robot, and the second editing button being used to initiate setting of the wheel speed.
10. A modular robot, characterized in that the robot is configured to perform the robot control method of any of claims 1-5.
CN202110011833.8A 2021-01-06 2021-01-06 Robot control method, control system and modular robot Pending CN112847292A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110011833.8A CN112847292A (en) 2021-01-06 2021-01-06 Robot control method, control system and modular robot


Publications (1)

Publication Number Publication Date
CN112847292A true CN112847292A (en) 2021-05-28

Family

ID=76004069

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110011833.8A Pending CN112847292A (en) 2021-01-06 2021-01-06 Robot control method, control system and modular robot

Country Status (1)

Country Link
CN (1) CN112847292A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100152897A1 (en) * 2008-12-16 2010-06-17 MULLER Jeffrey Method & apparatus for controlling the attitude of a camera associated with a robotic device
CN111190387A (en) * 2020-01-07 2020-05-22 北京可以科技有限公司 Robot control method and system


Similar Documents

Publication Publication Date Title
AU2020201554B2 (en) System and method for robot teaching based on RGB-D images and teach pendant
CN111190387B (en) Robot control method and system
CN108241339B (en) Motion solving and configuration control method of humanoid mechanical arm
CN103345285B (en) A kind of quadruped robot remote control thereof
US20170348858A1 (en) Multiaxial motion control device and method, in particular control device and method for a robot arm
CN110125944B (en) Mechanical arm teaching system and method
JP2730915B2 (en) Robot controller
CN107220099A (en) A kind of robot visualization virtual teaching system and method based on threedimensional model
CN103853133A (en) Robot system calibration method
CN108356806B (en) Modular robot control method and system
KR20170024769A (en) Robot control apparatus
CN109531577A (en) Mechanical arm calibration method, device, system, medium, controller and mechanical arm
CN110193816B (en) Industrial robot teaching method, handle and system
CN108326846B (en) Modular robot and module unit position calculation method thereof
CN114227681A (en) Robot off-line virtual teaching programming method based on infrared scanning tracking
Angelopoulos et al. Drone brush: Mixed reality drone path planning
CN112847292A (en) Robot control method, control system and modular robot
CN204525481U (en) A kind of unpowered articulated arm teaching machine
CN114839990A (en) Cluster robot experiment platform
CN109531579B (en) Mechanical arm demonstration method, device, system, medium, controller and mechanical arm
CN103009388B (en) Light wave transmitter as well as robot track locating system and robot track locating method
CN202943639U (en) Light wave emitter and robot trajectory seeking system
CN210589293U (en) Arm teaching device
CN117648042B (en) Industrial robot dragging teaching movement control method and system
Banda et al. Investigations on collaborative remote control of virtual robotic manipulators by using a Kinect v2 sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination