CN116430871A - Multi-scene building robot collaborative construction planning control method
- Publication number
- CN116430871A (application CN202310492240.7A)
- Authority
- CN
- China
- Prior art keywords
- robot
- working
- construction
- robots
- central server
- Prior art date
- Legal status: Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0219—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The invention discloses a multi-scene building robot collaborative construction planning control method, which comprises the following steps: selecting a working scene; selecting the types, numbers and parameters of the robots; acquiring a working environment map and uploading it to a central server; marking the working environment map; the central server divides working areas according to task requirements and BIM information and assigns working areas with different task requirements to the robot types that meet those requirements; the central server then subdivides each type's working area into subareas according to the number of robots of that type and sends the resulting working subareas to each onboard computer; each robot's onboard computer generates, within its assigned working subarea, a working path that meets its own working requirements; and the central server and each onboard computer exchange information so that working subareas can be repartitioned after a robot fails. The invention can improve construction efficiency and quality and ensure the continuity and reliability of the construction process.
Description
Technical Field
The invention relates to the field of construction machinery, and in particular to a multi-scene building robot collaborative construction planning control method.
Background
With the continuous development and progress of technology, the building construction industry is undergoing constant transformation and innovation. Applying robot technology to the field of building construction has therefore become a highly promising technical means.
However, robotics has not been fully utilized in conventional construction processes. On the one hand, the particularity and complexity of the construction industry itself place many restrictions on the application of robotics; on the other hand, the limitations of robot technology itself also restrict its application in building construction. Collaboration among robots across multiple scenes is a necessary technical approach to overcoming these challenges.
Realizing collaborative construction by multi-scene building robots can not only improve construction efficiency and quality but also reduce risks to personnel and robots, and is of great significance for promoting the development and innovation of building construction.
Disclosure of Invention
In view of the above problems, the invention aims to provide a multi-scene building robot collaborative construction planning control method.
The invention provides a multi-scene building robot collaborative construction planning control method, which comprises the following steps: S1, selecting an indoor or outdoor working environment in a central server, and reading the types of building robots suitable for that working environment according to the selected environment;
S2, inputting the types and numbers of the robots;
S3, acquiring map information of the working environment and uploading it to the central server, wherein the map information is acquired in one of two modes depending on whether it is already known:
when the map information is known, it is converted into a grid map and uploaded directly to the central server;
when the map information is unknown, map point cloud information is obtained by a SLAM robot scanning the working environment while moving, converted into a grid map, and uploaded to the central server (a minimal sketch of this conversion is given after the step list below);
the map information is then combined with BIM technology to generate a working environment map;
S4, marking the working environment map with the construction areas of the different types of robots, the charging pile positions, the material replenishment positions and the like, and initializing the positions of the robots;
S5, the central server divides the working areas according to task requirements and BIM information, and assigns working areas with different task requirements to the robot types that meet those requirements;
S6, the central server subdivides each type's working area into subareas according to the number of robots of that type, and issues each robot's working subarea to that robot's onboard computer;
S7, each robot's onboard computer generates, within its allocated working subarea, a working path that meets the robot's own working requirements; the robot carries a lidar and a GPS, and the onboard computer exchanges information with the central server to prevent collisions with other robots while moving;
S8, the central server exchanges information with each robot, and when a robot fails, its working area is redistributed to other robots.
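For illustration of step S3 only, the following is a minimal Python sketch of one way SLAM point cloud data could be converted into the grid map that is uploaded to the central server; the patent does not prescribe an algorithm, and the resolution, height band and function names here are assumptions.

```python
import numpy as np

def point_cloud_to_grid(points, resolution=0.10, height_band=(0.05, 1.8)):
    """Convert a SLAM point cloud (N x 3 array of x, y, z in metres) into a
    2-D occupancy grid: 1 = occupied cell, 0 = free or unknown cell.
    Points outside the height band (floor and ceiling returns) are ignored.
    All numeric values are illustrative assumptions."""
    pts = points[(points[:, 2] > height_band[0]) & (points[:, 2] < height_band[1])]
    if len(pts) == 0:
        return np.zeros((1, 1), dtype=np.uint8), (0.0, 0.0)
    origin = pts[:, :2].min(axis=0)                      # map origin in world coordinates
    cells = np.floor((pts[:, :2] - origin) / resolution).astype(int)
    grid = np.zeros(cells.max(axis=0) + 1, dtype=np.uint8)
    grid[cells[:, 0], cells[:, 1]] = 1                   # cells containing obstacle points
    return grid, tuple(origin)

# Usage: grid, origin = point_cloud_to_grid(scan_points); the grid is then
# combined with BIM information and uploaded to the central server.
```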
In a preferred scheme of the multi-scene building robot collaborative construction planning control method, the central server and each robot's onboard computer exchange information, including the robot's position information, its assigned working area information, robot fault information and the like; a sketch of such messages follows.
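The patent leaves the message content at the level of "position information, assigned working area information and fault information"; as a sketch only, these items could be carried in simple structured records such as the following (field names and units are hypothetical):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class RobotStatus:
    """Periodic report from a robot's onboard computer to the central server.
    Field names and units are illustrative assumptions."""
    robot_id: str
    robot_type: str                  # e.g. "plastering"
    position: Tuple[float, float]    # GPS position in the site frame, metres
    battery: float                   # 0.0 .. 1.0
    fault_code: int = 0              # 0 = healthy

@dataclass
class AreaAssignment:
    """Assignment sent by the central server to one onboard computer."""
    robot_id: str
    subarea_cells: List[Tuple[int, int]] = field(default_factory=list)  # grid cells of the work subarea
```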
In a preferred scheme, when BIM technology is integrated into the acquisition of the working environment map information, multiple links such as design drawings, construction plans, material management and real-time monitoring of the construction site can be integrated in some construction cases, enabling information sharing and collaboration; applying BIM technology further improves the construction efficiency and precision of the intelligent robots while reducing waste and errors in the construction process, thereby improving the quality and efficiency of the whole building construction.
In a preferred scheme, marking the working environment map in the central server includes: marking the construction areas of the different types of robots, the charging pile positions, the material replenishment positions and the like; in order to meet acceptance standards and ensure construction quality, areas that must not be disturbed for a short time after construction are treated as restricted obstacles during and after construction, so that other robots are prevented from moving into them; this information is processed by the central server and then sent to each onboard computer, as sketched below.
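A minimal sketch of how these markings could be written onto the grid map, assuming an integer label per cell (the label values and helper names are hypothetical, not taken from the patent):

```python
FREE, OBSTACLE, CHARGING_PILE, MATERIAL_POINT, RESTRICTED = 0, 1, 2, 3, 4
# RESTRICTED marks freshly built areas that must not be entered for a while.

def mark_cells(grid, cells, label):
    """Write a label into the listed (row, col) cells of the working-environment map."""
    for r, c in cells:
        grid[r][c] = label
    return grid

def release_restricted(grid, cells):
    """Once the curing or drying period has passed, restricted cells become free again."""
    for r, c in cells:
        if grid[r][c] == RESTRICTED:
            grid[r][c] = FREE
    return grid
```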
In a preferred scheme, because a building construction environment is complex in practice, the division of the working subareas is performed with the Boustrophedon (cellular decomposition) method, sketched below.
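The patent names the Boustrophedon method without spelling out an implementation; the following is a much-simplified sketch of a Boustrophedon-style cellular decomposition on the grid map (cells split and merge where obstacles begin and end), intended only to illustrate the idea:

```python
import numpy as np

def boustrophedon_cells(grid):
    """Simplified Boustrophedon cellular decomposition of a 2-D grid
    (0 = free, nonzero = obstacle). Returns a map of the same shape in which
    each free cell carries the id of the decomposition cell it belongs to
    (0 = obstacle). Connectivity handling is deliberately simplified."""
    rows, cols = grid.shape
    cell_map = np.zeros((rows, cols), dtype=int)
    next_id = 1
    prev_segments = []                              # [row_start, row_end, cell_id]

    for c in range(cols):
        # maximal free vertical segments in this column
        segments, r = [], 0
        while r < rows:
            if grid[r, c] == 0:
                start = r
                while r < rows and grid[r, c] == 0:
                    r += 1
                segments.append([start, r - 1, None])
            else:
                r += 1

        for seg in segments:
            overlaps = [p for p in prev_segments
                        if not (p[1] < seg[0] or p[0] > seg[1])]
            # continue the previous cell only on a clean one-to-one overlap,
            # otherwise a critical point was crossed and a new cell starts
            if len(overlaps) == 1 and sum(
                    1 for s in segments
                    if not (overlaps[0][1] < s[0] or overlaps[0][0] > s[1])) == 1:
                seg[2] = overlaps[0][2]
            else:
                seg[2] = next_id
                next_id += 1
            cell_map[seg[0]:seg[1] + 1, c] = seg[2]
        prev_segments = segments
    return cell_map
```

The resulting cells can then be grouped into one working subarea per robot of the given type, for example by balancing the number of cells assigned to each robot.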
In a preferred scheme, the lidar and the GPS carried by the robot are used together: the GPS information is exchanged with the central server, and combining GPS information with lidar information increases the robustness of real-time obstacle avoidance; a simple combination sketch follows.
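As an illustration of combining the two sensing channels (not the patented procedure), a robot could pause whenever either its lidar sees a close obstacle or the GPS positions reported by the central server place another robot nearby; the safety radii are assumed values:

```python
import math

def obstacle_too_close(lidar_ranges, safety_radius=0.8):
    """True if any lidar return lies inside the safety radius (metres)."""
    return len(lidar_ranges) > 0 and min(lidar_ranges) < safety_radius

def robot_too_close(own_gps, other_gps_positions, safety_radius=1.5):
    """Cross-check against the positions of other robots received from the central server."""
    ox, oy = own_gps
    return any(math.hypot(ox - x, oy - y) < safety_radius
               for x, y in other_gps_positions)

def should_pause(lidar_ranges, own_gps, other_gps_positions):
    """Pause motion when either channel flags a hazard; relying on both is what
    makes real-time obstacle avoidance more robust when one sensor degrades."""
    return obstacle_too_close(lidar_ranges) or robot_too_close(own_gps, other_gps_positions)
```

In this reading, the lidar covers obstacles the map does not know about, while the GPS positions shared through the central server cover robots the lidar cannot yet see.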
In a preferred scheme, the working path meeting the robot's own requirements in step S7 includes: a construction path that satisfies the robot's construction principles and kinematic requirements; the shortest motion path between working subareas when the robot's allocated subareas are not connected; a path to the nearest charging pile when the robot's battery is low; and a path to a material replenishment point when construction material is insufficient (a shortest-path sketch on the grid map follows).
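For the "path to the nearest charging pile or material point" case, a minimal sketch on the grid map could be a breadth-first search to the nearest cell carrying the wanted label; the cell labels follow the illustrative encoding sketched earlier, and the patent does not fix a search algorithm:

```python
from collections import deque

def shortest_path_to_nearest(grid, start, goal_label, blocked=(1, 4)):
    """Breadth-first search on the grid map from `start` (row, col) to the
    nearest cell carrying `goal_label`, e.g. a charging pile or material point.
    Moves are 4-connected; obstacle (1) and restricted (4) cells are not
    traversable. Returns the path as a list of cells, or None."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if grid[cell[0]][cell[1]] == goal_label:
            path = []
            while cell is not None:          # walk back to the start
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and nxt not in parent and grid[nxt[0]][nxt[1]] not in blocked):
                parent[nxt] = cell
                queue.append(nxt)
    return None
```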
In a preferred scheme, the fault handling in step S8 includes: the failed robot uploads its fault information to the central server; the central server reassigns the work subareas belonging to the failed robot to one or more robots of the same type according to the shortest-working-time principle (sketched below); and each robot that receives a repartitioned work subarea replans a work path that meets its own working requirements.
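One possible reading of the shortest-working-time principle, shown here as a greedy sketch only (the patent does not commit to a specific algorithm), assigns each subarea of the failed robot to the same-type robot whose projected finishing time stays lowest:

```python
def reassign_subareas(failed_subareas, remaining_work, est_time):
    """Greedy reassignment of a failed robot's subareas.

    failed_subareas: list of subarea ids left unfinished by the failed robot
    remaining_work:  {robot_id: minutes of work already queued on that robot}
    est_time(robot_id, subarea_id): estimated minutes (work plus travel) for
    that robot to finish the subarea. Returns {robot_id: [subarea ids]}."""
    load = dict(remaining_work)
    assignment = {rid: [] for rid in remaining_work}
    # hand out the largest jobs first so the greedy balancing works better
    for sub in sorted(failed_subareas,
                      key=lambda s: -min(est_time(r, s) for r in load)):
        best = min(load, key=lambda r: load[r] + est_time(r, sub))
        assignment[best].append(sub)
        load[best] += est_time(best, sub)
    return assignment
```

Each robot that receives additional subareas then replans its working path as in step S7.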
The beneficial effects are that:
the invention can realize the collaborative construction of the multi-scene building robot and generate the most economical task allocation and working path according to the map information, thereby improving the construction efficiency and quality. In addition, the invention can save labor cost, reduce engineering accident risk, adapt to complex construction environment, and simultaneously cope with the problem that the robot breaks down, thereby ensuring the continuity and reliability of the construction process.
Drawings
Fig. 1 is a flowchart of a collaborative construction planning control method.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely; obviously, the described embodiments are only some, not all, of the embodiments of the present invention.
As shown in Fig. 1, an embodiment of the multi-scene building robot collaborative construction planning control method is provided: a combined construction control embodiment with brick wall robots, plastering robots, quality detection robots and cleaning robots.
S1, selecting an outdoor working environment in the central server;
S2, selecting brick wall robots, plastering robots, quality detection robots and cleaning robots as the robot types and inputting their numbers;
S3, acquiring map information of the working environment and uploading it to the central server, wherein the map information is acquired in one of two modes depending on whether it is already known:
when the map information is known, it is converted into a grid map and uploaded directly to the central server;
when the map information is unknown, map point cloud information is obtained by a SLAM robot scanning the working environment while moving, converted into a grid map, and uploaded to the central server;
the map information is then combined with BIM technology to generate a working environment map;
S4, marking the working environment map with the construction areas of the brick wall robots, the plastering robots and the cleaning robots, the charging pile positions, the material replenishment positions and the like, and initializing the positions of the robots;
S5, the central server divides the working areas according to task requirements and BIM information into working areas for the brick wall robots, the plastering robots, the quality detection robots and the cleaning robots;
S6, the working area of each robot type is subdivided into subareas according to the numbers of brick wall robots, plastering robots, quality detection robots and cleaning robots, and each robot's working subarea is issued to that robot's onboard computer;
S7, each robot's onboard computer generates, within its allocated working subarea, a working path that meets the robot's own working requirements; the robot carries a lidar and a GPS, and the onboard computer exchanges information with the central server to prevent collisions with other robots while moving;
S8, the central server exchanges information with each robot, and when a robot fails, its working area is redistributed to other robots.
In this embodiment, the working area of the brick wall robot consists of wall sections that have not yet been built. The working area of the plastering robot consists of walls that have dried sufficiently. The quality detection robot works over the whole construction area, so quality problems can be detected quickly during construction and corrected in time. The cleaning robot helps clean the unobstructed parts of the construction area to reduce dust and contamination during construction.
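Purely as an illustration of the sequencing just described, a planner could gate a wall section into the plastering robot's working area only after the brick wall robot has finished it and an assumed drying period has elapsed; the 24-hour figure and the field name are placeholders, not values from the patent:

```python
from datetime import datetime, timedelta

DRYING_TIME = timedelta(hours=24)   # assumed placeholder value

def ready_for_plastering(wall_section, now=None):
    """A wall section joins the plastering robot's working area only after the
    brick wall robot has finished it and it has dried for long enough."""
    now = now or datetime.now()
    finished_at = wall_section.get("bricklaying_finished_at")   # None if not yet built
    return finished_at is not None and now - finished_at >= DRYING_TIME
```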
Claims (9)
1. A multi-scene building robot collaborative construction planning control method, characterized by comprising the following steps:
S1, selecting an indoor or outdoor working environment in a central server, and reading the types of building robots suitable for the working environment according to the selected environment;
S2, inputting the types, numbers and parameters of the robots;
S3, acquiring map information of the working environment and uploading it to the central server, wherein the map information is acquired in one of two modes depending on whether it is already known:
when the map information is known, it is converted into a grid map and uploaded directly to the central server;
when the map information is unknown, map point cloud information is obtained by a SLAM robot scanning the working environment while moving, converted into a grid map, and uploaded to the central server;
the map information is then combined with BIM technology to generate a working environment map;
S4, marking the working environment map with the construction areas of the different types of robots, the charging pile positions and the material replenishment positions, and initializing the positions of the robots;
S5, the central server divides the working areas according to task requirements and BIM information, and assigns working areas with different task requirements to the robot types that meet those requirements;
S6, the central server subdivides each type's working area into subareas according to the number of robots of that type, and issues each robot's working subarea to that robot's onboard computer;
S7, each robot's onboard computer generates, within its allocated working subarea, a working path that meets the robot's own working requirements; the robot carries a lidar and a GPS, and the onboard computer exchanges information with the central server to prevent collisions with other robots while moving;
S8, the central server exchanges information with each robot, and when a robot fails, its working area is redistributed to other robots.
2. The multi-scene building robot collaborative construction planning control method according to claim 1, characterized in that the central server is communicatively connected with the onboard computer and GPS of each robot.
3. The multi-scene building robot collaborative construction planning control method according to claim 1, characterized in that different construction scenes use different building robots, and the building robots are selected according to the specific construction tasks, environmental factors, construction flow and other factors, so as to realize an efficient, safe and reliable construction process.
4. The multi-scene building robot collaborative construction planning control method according to claim 1, characterized in that, in order to meet acceptance standards and ensure construction quality, an area that must not be disturbed for a short time after construction is regarded as an obstacle during and after construction, and other robots are prevented from moving into that area; this information is processed by the central server and then sent to each onboard computer.
5. The method according to claim 1, characterized in that the working path meeting the robot's own requirements in step S7 comprises: a construction path that satisfies the robot's construction principles and kinematic requirements and is collision-free; the shortest motion path between working subareas when the robot's allocated subareas are not connected; a path to the nearest charging pile when the robot's battery is low; and a path to a material replenishment point when construction material is insufficient.
6. The multi-scene building robot collaborative construction planning control method according to claim 1, characterized in that the division of the working subareas in step S6 is performed with the Boustrophedon method.
7. The multi-scene building robot collaborative construction planning control method according to claim 1, characterized in that in step S7 the central server receives the environment map information, exchanges information with each robot's onboard computer, obtains the real-time position of each robot, and sends this information to each onboard computer for real-time obstacle avoidance.
8. The multi-scene building robot collaborative construction planning control method according to claim 1, characterized in that the central server exchanges information with each robot's onboard computer; when a robot fails, the fault information is uploaded to the central server, and the central server distributes the work subareas belonging to that robot to other robots according to the shortest-working-time principle.
9. The multi-scene building robot collaborative construction planning control method according to claim 1, characterized in that when the construction areas of different types of robots overlap, a robot moving in the overlapping area performs static and dynamic obstacle avoidance cooperatively by means of the information exchanged between its onboard computer and the central server together with its own lidar and GPS (global positioning system); meanwhile, whether the operations of the two robots affect each other is considered in order to decide whether they are performed sequentially or simultaneously.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310492240.7A (CN116430871A) | 2023-05-04 | 2023-05-04 | Multi-scene building robot collaborative construction planning control method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310492240.7A (CN116430871A) | 2023-05-04 | 2023-05-04 | Multi-scene building robot collaborative construction planning control method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116430871A (en) | 2023-07-14 |
Family
ID=87087294
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310492240.7A (CN116430871A, pending) | Multi-scene building robot collaborative construction planning control method | 2023-05-04 | 2023-05-04 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116430871A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117590816A * | 2023-12-14 | 2024-02-23 | 湖南比邻星科技有限公司 | Multi-robot cooperative control system and method based on Internet of things |
CN117590816B * | 2023-12-14 | 2024-05-17 | 湖南比邻星科技有限公司 | Multi-robot cooperative control system and method based on Internet of things |
Similar Documents
Publication | Title |
---|---|
Liang et al. | Human–robot collaboration in construction: Classification and research trends |
EP3612906B1 | Method and system for environment map generation and alignment |
CN111103887A | Multi-sensor-based multi-mobile-robot scheduling system design method |
Elattar | Automation and robotics in construction: opportunities and challenges |
US20240069569A1 | Mobile robot platoon driving system and control method thereof |
CN116430871A | Multi-scene building robot collaborative construction planning control method |
Helm et al. | In-situ robotic fabrication: advanced digital manufacturing beyond the laboratory |
Oleari et al. | Industrial AGVs: Toward a pervasive diffusion in modern factory warehouses |
CN110849366A | Navigation method and system based on fusion of vision and laser radar |
KR20220061916A | Mobility platform for autonomous navigation of construction sites |
CN110834963A | Black light operation management system and method for stacker-reclaimer in bulk material yard |
JP2022531566A | Self-propelled printing robot and printing method with line printing route optimization |
Barberá et al. | I-Fork: a flexible AGV system using topological and grid maps |
Xia et al. | Decentralized coordination of autonomous AGVs for flexible factory automation in the context of Industry 4.0 |
Beinschob et al. | Advances in 3d data acquisition, mapping and localization in modern large-scale warehouses |
CN113654558A | Navigation method and device, server, equipment, system and storage medium |
CN114700944B | Heterogeneous task-oriented double-robot cooperative path planning method |
Lutz et al. | Towards a robot fleet for intra-logistic tasks: Combining free robot navigation with multi-robot coordination at bottlenecks |
CN110531725B | Cloud-based map sharing method |
Mansouri et al. | Multi vehicle routing with nonholonomic constraints and dense dynamic obstacles |
Prieto et al. | A guide for construction practitioners to integrate robotic systems in their construction applications |
Dersten et al. | An analysis of a layered system architecture for autonomous construction vehicles |
Oleari et al. | Improving AGV systems: Integration of advanced sensing and control technologies |
Lange et al. | Two autonomous robots for the DLR SpaceBot Cup: lessons learned from 60 minutes on the moon |
KR100929927B1 | Route planning method of mobile cargo ship wall robot |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |