CN114839990A - Cluster robot experiment platform - Google Patents
Cluster robot experiment platform
- Publication number
- CN114839990A (application number CN202210472124.4A)
- Authority
- CN
- China
- Prior art keywords
- robot
- cluster
- upper computer
- positioning
- experimental
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0253—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0223—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses a cluster robot experiment platform and belongs to the technical field of robots. The experiment platform comprises cluster robots, an experimental site, a communication system, a positioning system and an upper computer. Each cluster robot is a desktop mobile robot with two-wheel differential drive and moves within the experimental site. The communication system mounts a wireless communication module on the upper computer and on each robot to form a wireless ad hoc network. The positioning system comprises a depth camera and a positioning program; the depth camera is installed directly above the center of the experimental site, and the positioning program calculates the position of each cluster robot. The upper computer sends control commands to a single robot or to the whole cluster and displays the running state of the robots. The invention integrates the robots, the communication system, the positioning system and the upper computer, provides the conditions for centralized and distributed clusters to complete experiments such as formation control and area coverage, facilitates porting and verification of cluster control algorithms, and is beneficial to improving and evaluating cluster control algorithms.
Description
Technical Field
The invention belongs to the technical field of robots, and particularly relates to a cluster robot experiment platform.
Background
At present, cluster robotics has become one of the hot research fields in robotics, and many cluster self-organizing control algorithms have emerged. In view of the cost, time consumption and complexity of physical experiments, especially for large-scale clusters and underwater clusters, most researchers can usually verify the effectiveness of a control algorithm only by theoretical proof and simulation. However, it is difficult for theoretical calculation and simulation to accurately reproduce the real interaction between robots and the environment, or the influence of environmental noise and interference, sensor errors, communication delays and other factors on the performance of the control method. Therefore, in order to realize experimental verification of cluster control algorithms, it is necessary to provide a cluster robot experiment platform.
Disclosure of Invention
In order to remedy the deficiencies of the prior art, the invention provides a cluster robot experiment platform which has functions such as cluster positioning, communication, control and state information display, supports experiments with centralized and distributed robot clusters, and is beneficial to verifying the effectiveness of cluster control algorithms.
The cluster robot experiment platform provided by the invention comprises cluster robots, an experimental site, a communication system, a positioning system and an upper computer. Each cluster robot is a desktop mobile robot with two-wheel differential drive and moves within the experimental site; the robot itself carries a distance sensor. The communication system is formed by mounting a wireless communication module on the upper computer and on each cluster robot, so that they form a wireless ad hoc network. The positioning system comprises a depth camera and a positioning program; the depth camera is installed directly above the center of the experimental site and connected with the upper computer, and the positioning program calculates the pose information of each cluster robot and transmits it to the upper computer. The upper computer issues control commands to a single robot or to the whole cluster and displays the running state of the robots.
In the positioning system, the positioning program uses the images captured by the depth camera to perform external auxiliary positioning of the robots.
In the positioning system, the positioning program also provides two robot self-positioning modes: in the first, the position of the robot at the current moment is obtained by accumulating the angle and speed output by the gyroscope and encoders carried by the robot; in the second, when the heading of the robot is parallel to the X axis or Y axis of the global coordinate system, the current position is obtained by measuring the distance to the enclosure of the experimental site with the distance sensor carried by the robot.
The upper computer consists of an individual control module and a cluster control module. In the individual control module, the upper computer establishes communication with any single robot in the cluster and sends it a target pose, a tracking path and a target speed, where the tracking path is represented by the six coefficients of a quadratic equation in two variables. In the cluster control module, the upper computer communicates with all robots in the cluster and sends cluster control commands, robot motion states and environment information to all robots. The cluster control commands include target position setting, target formation setting and target area setting. The target area is either a polygonal area, represented by the set of coordinates of all vertices of the polygon, or a circular area, represented by the center coordinates and the radius.
The upper computer sends the command in the format: robot address + function code + data.
The advantages and positive effects of the invention are:
(1) The cluster robot experiment platform integrates the robots, the communication system, the positioning system and the upper computer, provides the conditions for centralized and distributed clusters to complete experiments such as formation control and area coverage, and facilitates porting and verification of cluster control algorithms.
(2) The cluster robot experiment platform combines external auxiliary positioning with robot self-positioning, and the upper computer can store the motion paths of the cluster and monitor its running state, which is beneficial to improving and evaluating cluster control algorithms.
(3) In the cluster robot experiment platform, the tracking path of a robot is described by the coefficients of a quadratic equation in two variables, a target polygonal area is described by the set of polygon vertex coordinates, and a target circular area is described by the center coordinates and radius, which effectively reduces the data volume of control commands and lowers the communication load.
Drawings
FIG. 1 is a schematic diagram of a clustered robot experimental platform according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the robot motions of an embodiment of the present invention;
FIG. 3 is a schematic diagram of a robot self-positioning using a distance sensor according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of functional modules implemented by an upper computer according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of the formation of the cluster robot according to the embodiment of the present invention;
FIG. 6 is a flowchart of the operation of the experimental platform for the clustered robot according to the embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
As shown in fig. 1, the experimental platform for the clustered robot in the embodiment of the present invention includes: the system comprises an upper computer 1, an experimental site 2, a cluster robot 3, a communication system 4 and a positioning system 5. The cluster robot 3 is a desktop mobile robot driven by two wheels in a differential mode and moves in the experimental field 2; the communication system 4 consists of a plurality of wireless ad hoc network communication modules, and the wireless communication modules are carried on the upper computer 1 and each cluster robot 3, so that mutual remote data transmission can be realized; the positioning system 5 comprises a depth camera and a positioning program, the depth camera is installed right above the center of the experimental site 2 and connected with the upper computer 1, and the positioning program calculates the pose information of each cluster robot 3 according to the obtained pictures and transmits the pose information to the upper computer 1; the upper computer 1 can issue a control command of a certain robot or the whole cluster and display the running state information of the robot or the whole cluster.
The cluster robot 3 adopts a modular program design, comprising a decision control module, a driving module, a communication module and a sensing module, which makes it easy to port a cluster control algorithm. The decision control module can call the other modules to control the motion of the robot. Before a cluster experiment is carried out, the cluster control algorithm is written into the decision control module and burned into the main controller of the robot. The driving module drives the robot's motion. The communication module is a wireless communication module and communicates with the other wireless communication modules in the ad hoc network. Each robot carries a gyroscope, encoders and distance sensors. The sensing module acquires the rotation angle of the robot through the gyroscope, acquires pulse counts through the encoders of the left and right wheel motors to further calculate the robot speed, and measures the distance to the surroundings through the distance sensors.
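As an illustration of this modular division, a minimal program skeleton is sketched below; the class and method names are assumptions made for the example and are not taken from the patent.

```python
from typing import Optional, Tuple, List

# Minimal sketch of the modular robot program described above.
# All class and method names are illustrative assumptions.

class SensingModule:
    """Wraps the gyroscope, wheel encoders and distance sensors."""
    def read_heading(self) -> float: ...                  # rotation angle from gyroscope (deg)
    def read_encoder_pulses(self) -> Tuple[int, int]: ... # (n_left, n_right) pulses since last call
    def read_distances(self) -> List[float]: ...          # distance-sensor readings (mm)

class DriveModule:
    """Drives the two differential wheels."""
    def set_wheel_speeds(self, v_left: float, v_right: float) -> None: ...

class CommModule:
    """Wireless ad hoc network link to the upper computer and other robots."""
    def send(self, payload: bytes) -> None: ...
    def receive(self) -> Optional[bytes]: ...

class DecisionControl:
    """Holds the cluster control algorithm burned into the main controller."""
    def __init__(self, sense: SensingModule, drive: DriveModule, comm: CommModule):
        self.sense, self.drive, self.comm = sense, drive, comm

    def step(self) -> None:
        cmd = self.comm.receive()          # individual/cluster command, if any
        heading = self.sense.read_heading()
        # ... the cluster control law computes wheel speeds from cmd and the pose ...
        self.drive.set_wheel_speeds(0.0, 0.0)
```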
The floor of the experimental site 2 is inkjet-printed cloth, and the enclosure around it is made of KT board; the size of the site can be changed freely by lengthening or shortening the enclosure.
The communication system 4 consists of several wireless communication modules based on the ZigBee protocol, which have a long transmission range and can work stably for long periods. The address of each wireless communication module is unique; after power-on, the modules automatically form a multi-hop mesh network with the surrounding modules, so that data can be sent to any node in the network. The upper computer 1 and the robots 3, each connected to a wireless communication module, are all nodes in the network. On top of this network, the communication system realizes three communication modes: one-to-one, one-to-many and many-to-many. In the one-to-one mode, data sent (or received) by one node (the upper computer or a robot) can only be received (or sent) by one other specific node; in the one-to-many mode, data sent (or received) by one node can be received (or sent) by multiple nodes; in the many-to-many mode, every node can send data to the other nodes and can also receive data sent by the other nodes.
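To make the three communication modes concrete, the sketch below shows how they could be selected through the destination address of a frame in a ZigBee-style mesh; the 16-bit node addresses and the 0xFFFF broadcast address are assumptions for illustration and are not specified in the patent.

```python
# Sketch of destination-address filtering for the three communication modes.
BROADCAST = 0xFFFF   # assumed broadcast address

def accepts(node_addr: int, dst_addr: int) -> bool:
    """A node keeps a frame addressed to itself or to the broadcast address."""
    return dst_addr == node_addr or dst_addr == BROADCAST

# one-to-one: the upper computer addresses a single robot
assert accepts(0x0003, 0x0003) and not accepts(0x0004, 0x0003)
# one-to-many / many-to-many: any node may address the broadcast address,
# and every other node (upper computer or robot) accepts the frame
assert accepts(0x0003, BROADCAST) and accepts(0x0004, BROADCAST)
```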
The positioning system 5 comprises a robot self-positioning mode and an external auxiliary positioning mode, wherein the robot has two self-positioning modes.
The first self-positioning method obtains the current position of the robot by accumulating the angle and speed data output by the gyroscope and encoders carried by the robot, as shown in fig. 2. A point at the lower left corner of the experimental site 2 is taken as the origin to establish a global coordinate system. The pose of the robot is represented by (x, y, θ), i.e. the x coordinate and y coordinate of the robot center and the robot heading angle θ; its motion state is represented by (v, ω), i.e. the forward velocity v and the angular velocity ω. The sensing module outputs the number of pulses n_l generated by the left wheel encoder within the time Δt, the number of pulses n_r generated by the right wheel encoder, and the robot heading angle θ. For one full wheel revolution the encoder of each of the left and right motors generates N pulses; the diameter of the driving wheels is d and the wheel spacing is L. Let (x_i, y_i, θ_i) and (v_i, ω_i) be the pose and motion state of the robot at time i; then:
v_i = (v_l + v_r)/2, ω_i = (v_r - v_l)/L
θ_i = θ, x_i = x_{i-1} + v_i·Δt·cos(θ_i), y_i = y_{i-1} + v_i·Δt·sin(θ_i)
where v_l and v_r are the speeds of the left and right wheels of the robot in mm/s, obtained from the encoder pulse counts as v_l = π·d·n_l/(N·Δt) and v_r = π·d·n_r/(N·Δt), and x_{i-1}, y_{i-1} is the coordinate position of the robot at time i-1.
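A minimal sketch of this dead-reckoning update is given below; the function name and the default values of N, d and L are assumptions, while the pulse-to-speed conversion follows from the quantities defined above.

```python
import math

def dead_reckoning_step(x_prev, y_prev, n_l, n_r, theta_deg, dt,
                        N=1024, d=42.0, L=90.0):
    """One odometry update as described above.

    x_prev, y_prev : position at time i-1 (mm)
    n_l, n_r       : encoder pulses of left/right wheel during dt
    theta_deg      : heading from the gyroscope (degrees)
    dt             : sampling period (s)
    N, d, L        : pulses per wheel revolution, wheel diameter (mm),
                     wheel spacing (mm) -- example values, not from the patent
    """
    v_l = math.pi * d * n_l / (N * dt)     # left wheel speed (mm/s)
    v_r = math.pi * d * n_r / (N * dt)     # right wheel speed (mm/s)
    v_i = (v_l + v_r) / 2.0                # forward speed
    w_i = (v_r - v_l) / L                  # angular speed (rad/s)
    theta_i = math.radians(theta_deg)
    x_i = x_prev + v_i * dt * math.cos(theta_i)
    y_i = y_prev + v_i * dt * math.sin(theta_i)
    return x_i, y_i, theta_i, v_i, w_i
```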
The second self-positioning mode obtains the current position by measuring the distance to the enclosure of the experimental site with the robot's own distance sensors when the robot heading is parallel to the X axis or Y axis of the global coordinate system, i.e. when the heading angle is 0° (360°), 90°, 180° or 270°, as shown in fig. 3. If the distance values measured by the two distance sensors facing the left wall and the lower wall of the experimental site are l_1 and l_2, the current robot position is obtained as x_i = l_1, y_i = l_2.
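A sketch of this wall-referenced correction, for the configuration of fig. 3 in which two sensors face the left and lower walls, could look as follows; the heading tolerance is an assumed parameter.

```python
def wall_referenced_position(theta_deg, l1, l2, tol_deg=2.0):
    """Return (x, y) from the left/lower wall distances l1, l2 when the heading
    is close enough to 0/90/180/270 degrees, else None.
    tol_deg is an illustrative assumption, not taken from the patent."""
    theta = theta_deg % 360.0
    if min(theta % 90.0, 90.0 - theta % 90.0) > tol_deg:
        return None          # heading not axis-aligned, skip the correction
    return l1, l2            # x from the left wall, y from the lower wall
```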
The accuracy of the pose calculated by these two self-positioning methods depends heavily on the precision of the robot's sensors: noise in the encoders and distance sensors and drift of the gyroscope are inevitable, a non-negligible error accumulates as the robot runs longer, and the initial pose of the robot cannot be obtained this way. Therefore, the positioning system also provides an external auxiliary positioning mode to obtain robot poses with high precision and without accumulated error. In this mode, the depth camera installed above the center of the experimental site is used, the pose of the circular robot (i.e. the robot coordinate system) in the camera coordinate system is identified with a Hough transform function, and the pose of the robot in the global coordinate system is then calculated as follows:
P_i = ^i_j T · P_j
where P_j is the pose of the robot in the depth camera coordinate system, ^i_j T is the pose of the depth camera coordinate system relative to the global coordinate system, and P_i is the pose of the robot in the global coordinate system.
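A sketch of how the positioning program could detect the circular robots with OpenCV's Hough transform and map them into the global frame is given below; the camera-to-global transform, the pixel-to-millimeter scale and all detector parameters are assumptions, and heading estimation (e.g. from a marker on the robot) is omitted.

```python
import cv2
import numpy as np

# Assumed fixed 2D homogeneous transform of the camera frame relative to the
# global frame; in practice it would come from a one-time calibration.
T_cam_to_global = np.eye(3)

def locate_robots(gray_image, mm_per_pixel=1.0):
    """Detect circular robots in a grayscale overhead image and return their
    positions in the global coordinate system (mm). Parameter values are
    illustrative assumptions, not taken from the patent."""
    blurred = cv2.GaussianBlur(gray_image, (9, 9), 2)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=40,
                               param1=100, param2=30, minRadius=15, maxRadius=60)
    positions = []
    if circles is not None:
        for cx, cy, _r in circles[0]:
            p_cam = np.array([cx * mm_per_pixel, cy * mm_per_pixel, 1.0])
            p_global = T_cam_to_global @ p_cam      # P_i = T · P_j
            positions.append((p_global[0], p_global[1]))
    return positions
```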
The upper computer 1 consists of an individual control module and a cluster control module, and provides the functions of issuing individual or cluster control commands and displaying state information, as shown in fig. 4. In the individual control module, the upper computer can establish communication with any robot in the cluster and send it a target pose, a tracking path and a target speed. The target pose is represented by (x_d, y_d, θ_d) and the target speed by (v_d, ω_d); the tracking path is represented by the six coefficients (a, b, c, d, e, f) of a quadratic equation in two variables, with the path expression:
ax² + by² + cxy + dx + ey + f = 0
wherein x and y are position coordinates.
For example, the straight path y = x can be represented by (0, 0, 0, -1, 1, 0), and a circular path with center (5, 5) and radius 10 can be represented by (1, 1, 0, -10, -10, -50).
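The sketch below shows how such coefficient tuples could be generated and evaluated; the helper function names are illustrative, and the two assertions reproduce the examples above.

```python
def line_path(p1, p2):
    """Coefficients (a, b, c, d, e, f) of the line through p1 and p2."""
    (x1, y1), (x2, y2) = p1, p2
    # (y2 - y1)·x - (x2 - x1)·y + (x2·y1 - x1·y2) = 0
    return (0.0, 0.0, 0.0, y2 - y1, -(x2 - x1), x2 * y1 - x1 * y2)

def circle_path(cx, cy, r):
    """Coefficients of the circle with center (cx, cy) and radius r."""
    return (1.0, 1.0, 0.0, -2.0 * cx, -2.0 * cy, cx * cx + cy * cy - r * r)

def path_error(coeffs, x, y):
    """Value of the path expression at (x, y); zero means the point lies on the path."""
    a, b, c, d, e, f = coeffs
    return a * x * x + b * y * y + c * x * y + d * x + e * y + f

assert circle_path(5, 5, 10) == (1.0, 1.0, 0.0, -10.0, -10.0, -50.0)
assert path_error(line_path((0, 0), (1, 1)), 3, 3) == 0.0
```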
The format of the commands sent by the upper computer 1 is: robot address + function code + data. After receiving the transmitted data, a robot executes the corresponding operation according to its cluster control program. In addition, the upper computer 1 displays the set target points and paths graphically and updates the robot poses in real time.
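As an illustration of the "robot address + function code + data" format, a minimal encoder/decoder is sketched below; the field widths and the specific function-code values are assumptions, since they are not specified here.

```python
import struct

# Assumed function codes -- the patent does not enumerate them.
FC_TARGET_POSE  = 0x01   # data: x_d, y_d, theta_d
FC_TRACK_PATH   = 0x02   # data: six path coefficients a..f
FC_TARGET_SPEED = 0x03   # data: v_d, omega_d

def encode_command(robot_addr: int, func_code: int, values) -> bytes:
    """Pack a command as: 2-byte robot address + 1-byte function code + float data."""
    return struct.pack(">HB", robot_addr, func_code) + struct.pack(f">{len(values)}f", *values)

def decode_command(frame: bytes):
    """Inverse of encode_command."""
    addr, fc = struct.unpack(">HB", frame[:3])
    n = (len(frame) - 3) // 4
    return addr, fc, struct.unpack(f">{n}f", frame[3:3 + 4 * n])

cmd = encode_command(0x0005, FC_TRACK_PATH, (1, 1, 0, -10, -10, -50))  # the circle example above
print(decode_command(cmd))
```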
In the cluster control module, the upper computer needs to establish communication with all robots in the cluster and transmits control commands, robot motion states, environment information and so on. First, a cluster communication mode must be selected: centralized communication or distributed communication. In centralized communication, each robot can obtain global environment information from the upper computer, including the pose and speed information of all other robots in the cluster and the external auxiliary positioning information of the robot itself. In distributed communication, each robot can only obtain from the upper computer local environment information within a set range around it, i.e. the pose and speed information of the other robots within that range; it must rely on self-positioning alone and cannot obtain its external auxiliary positioning information. The cluster control commands include target position setting, target formation setting and target area setting. The target position is represented by (x_sd, y_sd). The target formation is one of two types, triangle and diamond, and is represented by (type, D, α), where type 0 is a triangle formation, type 1 is a diamond formation, D is the distance between individuals in the formation, and α is the angle between individuals in the formation, as shown in fig. 5. The target area is either a polygonal area, represented by the set of all vertex coordinates of the polygon, i.e. {(x_1, y_1), (x_2, y_2), (x_3, y_3), …}, or a circular area, represented by the center coordinates and the radius, i.e. (x_o, y_o, r). The state display module can display the poses of all individuals in the cluster, the target formation and the target coverage area, which is helpful for monitoring the state of the cluster and evaluating the experimental results.
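For illustration, the cluster control commands described above could be held in containers such as the following; the field names are assumptions, while the parameterizations (x_sd, y_sd), (type, D, α), the vertex set and (x_o, y_o, r) follow the text.

```python
from dataclasses import dataclass
from typing import List, Tuple, Union

@dataclass
class TargetPosition:
    x_sd: float
    y_sd: float

@dataclass
class TargetFormation:
    type: int        # 0 = triangle formation, 1 = diamond formation
    D: float         # distance between individuals in the formation
    alpha: float     # angle between individuals in the formation

@dataclass
class PolygonArea:
    vertices: List[Tuple[float, float]]   # {(x_1, y_1), (x_2, y_2), (x_3, y_3), ...}

@dataclass
class CircleArea:
    x_o: float
    y_o: float
    r: float

ClusterCommand = Union[TargetPosition, TargetFormation, PolygonArea, CircleArea]

cmd: ClusterCommand = TargetFormation(type=0, D=300.0, alpha=60.0)  # example values only
```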
In the prior art, a path or target area is generally described by a sequence of equally spaced points lying on its contour. The accuracy of this description depends on the spacing of the points, and as the path length or the target area grows, the length of the point sequence grows as well, increasing the data volume of the control command. The invention describes the target polygonal area by the set of polygon vertex coordinates and the target circular area by the center coordinates and radius, so that the data volume of the control command is unrelated to the path length or the size of the target area and depends only on the shape, which reduces the data volume while maintaining accuracy.
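The saving can be made concrete with a rough byte count, assuming 4-byte floating-point values and an example sampling interval of 10 mm for the point-sequence description; the numbers are illustrative only.

```python
import math

def point_sequence_bytes(circumference_mm, spacing_mm=10.0, bytes_per_float=4):
    """Bytes needed to describe a contour as equally spaced (x, y) points."""
    n_points = math.ceil(circumference_mm / spacing_mm)
    return n_points * 2 * bytes_per_float

def parametric_circle_bytes(bytes_per_float=4):
    """Bytes needed for the (x_o, y_o, r) description used by the platform."""
    return 3 * bytes_per_float

r = 500.0                                     # example radius in mm
print(point_sequence_bytes(2 * math.pi * r))  # ~2520 bytes at 10 mm spacing
print(parametric_circle_bytes())              # 12 bytes, independent of r
```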
The working process of the cluster robot experiment platform is shown in fig. 6. First, the cluster control program is burned into all cluster robots; when this is finished, the robots are placed in the experimental site and powered on. The upper computer interface is opened, the wireless communication module is started, and an ad hoc communication network is established with all robots. The external positioning module of the positioning system sends the positioning result to each robot through the upper computer to complete the initialization of the robot poses. The upper computer then sends individual/cluster control commands; after receiving a command, each robot executes its task according to its control program, updates its own pose through the self-positioning module during task execution, and reports the pose to the upper computer; the upper computer displays the individual/cluster poses. After completing the task, the robots wait for the next command.
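An illustrative upper-computer main loop following this workflow is sketched below; network, positioning and gui are hypothetical stand-ins for the communication system, the positioning program and the display, and their method names are assumptions.

```python
import time

def run_experiment(network, positioning, gui, cluster_command, duration_s=60.0):
    """Sketch of the fig. 6 workflow on the upper-computer side."""
    network.start_ad_hoc_network()                       # join the PC and all robots
    for addr, pose in positioning.external_poses().items():
        network.send(addr, ("init_pose", pose))          # pose initialization per robot
    network.broadcast(("cluster_command", cluster_command))
    t_end = time.time() + duration_s
    while time.time() < t_end:
        for addr, pose in network.collect_reported_poses().items():
            gui.update_pose(addr, pose)                  # real-time state display
        time.sleep(0.1)
```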
Technical features not described in detail in this specification are known to those skilled in the art. Descriptions of well-known components and techniques are omitted so as not to unnecessarily obscure the present invention. The embodiments described above do not represent all embodiments consistent with the present application, and modifications or variations that those skilled in the art can make on the basis of the technical solution of the invention without inventive effort still fall within the protection scope of the invention.
Claims (8)
1. A clustered robot experimental platform, comprising: the system comprises a cluster robot, an experimental site, a communication system, a positioning system and an upper computer;
the cluster robot is a desktop mobile robot driven by two wheels in a differential mode and moves in the experimental site; the robot carries a distance sensor; the communication system is provided with a wireless communication module on the upper computer and each cluster robot to form a wireless ad hoc network; the positioning system comprises a depth camera and a positioning program, the depth camera is installed right above the center of the experimental site and connected with the upper computer, and the positioning program calculates the pose information of each cluster robot and transmits the pose information to the upper computer; and the upper computer issues a control command of a single robot or the whole cluster robot and displays the running state of the robot.
2. The clustered robot experimental platform of claim 1, wherein the clustered robot has a modular design of functions, comprising a decision control module, a driving module, a communication module and a sensing module; before a cluster experiment is carried out, a cluster control algorithm is written into the decision control module; the driving module drives the robot to act; the communication module is used for wireless communication; the sensing module acquires output data from a gyroscope, an encoder and a distance sensor carried by the robot and sends the output data to the positioning system.
3. The clustered robot experimental platform of claim 1, wherein, in the positioning system, the positioning program performs external auxiliary positioning of the robot by:
acquiring an image shot by the depth camera, identifying the position of the robot in the camera coordinate system with a Hough transform function, and calculating the pose of the robot in the global coordinate system as:
P_i = ^i_j T · P_j
where P_j is the pose of the robot in the depth camera coordinate system, ^i_j T is the pose of the depth camera coordinate system relative to the global coordinate system, and P_i is the pose of the robot in the global coordinate system; the global coordinate system is established with a point at the lower left corner of the experimental site as the origin.
4. The clustered robot experimental platform of claim 1, 2 or 3, wherein, in the positioning system, the positioning program performs robot self-positioning by the following method:
setting a point at the lower left corner of the experimental site as the origin and establishing a global coordinate system; accumulating the angle and speed output by the gyroscope and encoders carried by the robot to obtain the position of the robot at the current moment, which specifically includes:
for one full wheel revolution the encoders of the left and right driving wheel motors generate N pulses; the diameter of each driving wheel is d and the wheel spacing is L; the coordinate position of the robot at time i-1 is denoted x_{i-1}, y_{i-1}; the number of pulses n_l generated by the left wheel encoder of the robot within the time Δt, the number of pulses n_r generated by the right wheel encoder, and the robot heading angle θ are acquired; and the pose of the robot at time i is obtained as:
v_i = (v_l + v_r)/2, ω_i = (v_r - v_l)/L
θ_i = θ, x_i = x_{i-1} + v_i·Δt·cos(θ_i), y_i = y_{i-1} + v_i·Δt·sin(θ_i)
where v_l, v_r respectively represent the speeds of the left and right driving wheels of the robot; v_i is the forward speed of the robot at time i; ω_i is the angular velocity of the robot at time i; x_i, y_i is the coordinate position of the robot at time i, and θ_i is the heading angle of the robot at time i.
5. The clustered robot experimental platform of claim 1, wherein, in the positioning system, the positioning program performs robot self-positioning by:
setting a point at the lower left corner of an experimental site as an origin, and establishing a global coordinate system;
when the direction of the robot is parallel to the X axis or the Y axis of the global coordinate system, the distance between the robot and the enclosure of the experimental site is measured through a distance sensor carried by the robot to obtain the current position.
6. The clustered robot experimental platform of claim 1 or 3 wherein the upper computer is composed of an individual control module and a cluster control module;
the upper computer in the individual control module establishes communication with any robot in the cluster and sends a target pose, a tracking path and a target speed to the robot; wherein the tracking path is expressed by the six coefficients (a, b, c, d, e, f) of a quadratic equation in two variables, with the path expression:
ax² + by² + cxy + dx + ey + f = 0
wherein x and y are position coordinates;
the upper computer in the cluster control module establishes communication with all robots in the cluster and sends cluster control commands, robot motion states and environmental information to all the robots; the cluster control commands comprise target position setting, target formation setting and target area setting; the target area is either a polygonal area, represented by the set of coordinates of all vertices of the polygon, or a circular area, represented by the coordinates of the circle center and the radius.
7. The clustered robot experimental platform of claim 6 wherein said clustered control module is configured with two clustered communication modes comprising: centralized communication and distributed communication; under a centralized communication mode, each robot can obtain global environment information from an upper computer, wherein the global environment information comprises the poses of all other robots in a cluster and external auxiliary positioning information of the robot; under the distributed communication mode, each robot obtains local environment information in a set range of the robot from an upper computer, the local environment information comprises poses of other robots in the set range, and external auxiliary positioning information of the robot cannot be obtained.
8. The clustered robot experimental platform of claim 6 wherein the upper computer sends commands in the format of: robot address + function code + data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210472124.4A CN114839990A (en) | 2022-04-29 | 2022-04-29 | Cluster robot experiment platform |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210472124.4A CN114839990A (en) | 2022-04-29 | 2022-04-29 | Cluster robot experiment platform |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114839990A true CN114839990A (en) | 2022-08-02 |
Family
ID=82567260
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210472124.4A Pending CN114839990A (en) | 2022-04-29 | 2022-04-29 | Cluster robot experiment platform |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114839990A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117009089A (en) * | 2023-09-28 | 2023-11-07 | 南京庆文信息科技有限公司 | Robot cluster supervision and management system based on distributed computing and UWB positioning |
CN117009089B (en) * | 2023-09-28 | 2023-12-12 | 南京庆文信息科技有限公司 | Robot cluster supervision and management system based on distributed computing and UWB positioning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |