CN114633256A - Autonomous real-time live-action three-dimensional reconstruction and detection robot system - Google Patents
- Publication number
- CN114633256A (application number CN202210294671.8A)
- Authority
- CN
- China
- Prior art keywords
- capability
- dimensional reconstruction
- action
- robot system
- detection robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
Abstract
The invention discloses an autonomous real-time live-action three-dimensional reconstruction and detection robot system comprising a mechanical hardware subsystem and a software subsystem. The mechanical hardware subsystem provides the whole system with three-dimensional spatial motion, three-dimensional spatial information acquisition, and data transmission capabilities. The software subsystem collects and processes the three-dimensional spatial information and provides the whole system with decision-making capability, including but not limited to navigation, obstacle avoidance, three-dimensional reconstruction, and control of data transmission. The invention can work in scenes unsuitable for existing oblique photography technology, such as indoor environments or special scenes such as mine tunnels, fires, and earthquakes. It can detect and three-dimensionally reconstruct a three-dimensional space in real time, offering better real-time performance than oblique photography.
Description
Technical Field
The invention relates to the field of robot systems, and in particular to an autonomous real-time live-action three-dimensional reconstruction and detection robot system.
Background
In recent years, the demand for real-time detection and three-dimensional reconstruction of environmental information, both in special scenes and in industrial production and daily life, has grown steadily. Most existing real-time environment detection methods target outdoor environments and reconstruct them using oblique photography performed by an unmanned aerial vehicle carrying a camera. Oblique photogrammetry is a technology developed in recent years in the field of surveying, mapping, and remote sensing. It senses and reconstructs complex scenes comprehensively, over a wide range, with high precision and high definition, and the data products generated by efficient acquisition equipment and a professional processing pipeline intuitively reflect attributes of ground features such as appearance, position, and height. In oblique photography, several sensors are mounted on the same flying platform; a five-lens camera is currently common. Images are captured simultaneously from five angles, one vertical and four oblique, yielding more complete and accurate information about ground features. The image captured perpendicular to the ground forms a vertically downward view called the nadir image, while the four images captured by lenses tilted at a fixed angle toward the east, south, west, and north are called oblique images. After shooting, the nadir and oblique images are imported offline into three-dimensional reconstruction software, and the three-dimensional reconstruction of the environment completes only after several hours.
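The five-lens capture geometry described above can be sketched in a few lines; the function name and the 45-degree tilt are illustrative assumptions, not values taken from the patent or any specific camera product.

```python
import math

def five_lens_directions(tilt_deg=45.0):
    """Unit view-direction vectors (x east, y north, z up) for a
    five-lens oblique rig: one nadir lens looking straight down plus
    four lenses tilted by tilt_deg toward the four cardinal directions."""
    t = math.radians(tilt_deg)
    return {
        "nadir": (0.0, 0.0, -1.0),
        "north": (0.0, math.sin(t), -math.cos(t)),
        "south": (0.0, -math.sin(t), -math.cos(t)),
        "east": (math.sin(t), 0.0, -math.cos(t)),
        "west": (-math.sin(t), 0.0, -math.cos(t)),
    }
```

Each oblique vector keeps a downward component, which is what lets the four tilted views see building facades that the nadir view misses.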
Because it depends on an unmanned aerial vehicle, the oblique photography method can only be applied to large outdoor spaces; it is unsuitable for indoor, mine tunnel, fire, or earthquake environments, and it cannot achieve real-time detection. In these scenes, a camera is often mounted on a mobile platform and simply used to transmit a live picture. This achieves real-time operation, but detection then relies on human eyes, and three-dimensional reconstruction cannot be performed.
Therefore, simply transmitting a live picture with a camera cannot meet the production and living requirements of special scenes and industry, which gives rise to the requirement for combined real-time detection and three-dimensional reconstruction. Real-time detection means that a changing scene can be rapidly detected and its scene coverage updated at low time cost while three-dimensional reconstruction of the scene is completed at the same time; the reconstructed model can then be observed with various three-dimensional software, making it convenient for personnel to take their next actions according to the three-dimensional environmental information.
Disclosure of Invention
The invention aims to provide an autonomous real-time live-action three-dimensional reconstruction and detection robot system comprising a mechanical hardware subsystem and a software subsystem. The mechanical hardware subsystem provides the whole system with three-dimensional spatial motion, three-dimensional spatial information acquisition, and data transmission capabilities. The software subsystem collects and processes the three-dimensional spatial information and provides the whole system with decision-making capability, including but not limited to navigation, obstacle avoidance, three-dimensional reconstruction, and control of data transmission.
Furthermore, the mechanical hardware subsystem is composed of a mobile platform, a mechanical arm and a scanner.
Furthermore, a mechanical arm is arranged at the front end above the moving platform, and the upper part of the mechanical arm is connected with the scanner through a flange.
Further, the scanner includes the following data sensors: camera, radar.
Further, the scanner comprises the following light sources: infrared light, structured light.
Further, the mechanical arm includes, but is not limited to, six degrees of freedom of movement.
Further, the moving mode of the moving platform includes, but is not limited to, a crawler-type and a wheel-type motion mode.
Further, the scanner acquires information in modalities including, but not limited to, picture information, depth information, and point cloud information.
Further, the type and number of the scanner sensors are not limited.
Further, the type and number of the light sources of the scanner are not limited.
Compared with the prior art, the invention has the following beneficial effects:
1. The invention can work in scenes where the existing oblique photography technology is not applicable, such as indoor environments or special scenes such as mine tunnels, fires, and earthquakes.
2. The invention can perform real-time detection and three-dimensional reconstruction of three-dimensional space, offering better real-time performance than oblique photography.
Drawings
FIG. 1 is a schematic diagram of the system architecture of the present invention;
FIG. 2 is a schematic diagram of the mechanical hardware subsystem of the present invention.
The reference numerals and names in the figures are as follows:
1. mobile platform; 2. mechanical arm; 3. scanner; 4. camera; 5. radar; 6. structured light; 7. infrared sensor.
Detailed Description
As shown in fig. 1, the autonomous real-time live-action three-dimensional reconstruction and detection robot system provided by the present invention includes a mechanical hardware subsystem and a software subsystem. The mechanical hardware subsystem provides the entire system with three-dimensional spatial motion, three-dimensional spatial information acquisition, and data transmission capabilities. The software subsystem collects and processes the three-dimensional spatial information and provides the entire system with decision-making capability, including but not limited to navigation, obstacle avoidance, three-dimensional reconstruction, and control of data transmission.
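The decision capabilities listed above (navigation, obstacle avoidance, reconstruction, transmission) suggest a sense-decide-act loop. The following is a minimal toy sketch under assumed interfaces; `GridMap`, `next_free_cell`, and `run_step` are hypothetical names not taken from the patent, and a real implementation would use SLAM and motion planning rather than a set of cells.

```python
class GridMap:
    """Toy occupancy map: the set of (x, y) cells observed as occupied."""
    def __init__(self):
        self.cells = set()

    def integrate(self, frame):
        """Fuse one scan (an iterable of occupied cells) into the map."""
        self.cells |= set(frame)

def next_free_cell(world_map, candidates):
    """Obstacle avoidance in miniature: pick the first candidate cell
    not observed as occupied, or None if all are blocked."""
    for c in candidates:
        if c not in world_map.cells:
            return c
    return None

def run_step(world_map, frame, candidates):
    """One sense-decide-act iteration: integrate a scan into the
    reconstruction, then choose an unoccupied cell to move toward."""
    world_map.integrate(frame)
    return next_free_cell(world_map, candidates)
```

The same skeleton generalizes: `integrate` stands in for incremental three-dimensional reconstruction, and the returned cell stands in for the navigation decision transmitted to the mobile platform.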
As shown in fig. 2, the mechanical hardware subsystem is composed of a mobile platform 1, a mechanical arm 2 and a scanner 3. The mechanical arm 2 is arranged at the front end above the mobile platform 1, and its upper end is connected with the scanner 3 through a flange. The connections between the mobile platform 1 and the mechanical arm 2, and between the mechanical arm 2 and the scanner 3, are rigid: when one member is displaced or stressed, the member connected to it undergoes no displacement or deformation relative to it. The scanner 3 includes the following data sensors: a camera 4 and a radar 5. The scanner 3 includes the following light sources: infrared light emitted by an infrared sensor 7, and structured light 6.
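Because the platform-arm and arm-scanner connections are rigid, the scanner's pose in the world frame is a fixed composition of the platform pose, the arm mounting offset, and the flange pose. A planar (SE(2)) sketch with assumed illustrative frames; a real system would use full 4x4 homogeneous transforms:

```python
import math

def compose(a, b):
    """Compose two planar rigid poses (x, y, theta):
    apply b in the coordinate frame of a."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

def scanner_pose(platform, arm_mount, arm_flange):
    """World pose of the scanner: platform pose, then the fixed arm
    mounting offset, then the arm's flange pose, all rigid links."""
    return compose(compose(platform, arm_mount), arm_flange)
```

The rigidity assumption in the text is exactly what makes `arm_mount` a constant here: displacing the platform moves the whole chain without changing the relative offsets.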
Specifically, the mechanical arm 2 includes, but is not limited to, six degrees of freedom of movement.
Specifically, the moving mode of the moving platform 1 includes, but is not limited to, a crawler-type and a wheel-type motion mode.
Specifically, the scanner 3 acquires information in modalities including, but not limited to, picture information, depth information, and point cloud information.
Specifically, the camera 4, the radar 5 and the structured light 6 are connected into a whole.
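The picture, depth, and point cloud modalities above could be carried in a single per-capture record; the following dataclass is a hypothetical sketch whose field names are assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ScanFrame:
    """One multi-modal capture from the scanner: an RGB picture, a
    per-pixel depth map, and a 3D point cloud. Any modality may be
    absent, matching the 'not limited to' wording of the text."""
    timestamp: float
    picture: Optional[List[List[Tuple[int, int, int]]]] = None  # RGB rows
    depth: Optional[List[List[float]]] = None                   # metres
    points: List[Tuple[float, float, float]] = field(default_factory=list)

    def modalities(self) -> List[str]:
        """List which modalities this frame actually carries."""
        present = []
        if self.picture is not None:
            present.append("picture")
        if self.depth is not None:
            present.append("depth")
        if self.points:
            present.append("point_cloud")
        return present
```

Keeping the modalities optional in one record lets downstream reconstruction consume whatever the sensor set produced for a given scan.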
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "upper", "lower", "inner", "outer", "front", "rear", "both ends", "one end", "the other end", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "disposed," "connected," and the like are to be construed broadly; for example, "connected" may mean fixedly connected, detachably connected, or integrally connected; mechanically or electrically connected; directly connected, indirectly connected through an intermediate medium, or in internal communication between two elements. Those skilled in the art can understand the specific meanings of the above terms in the present invention as the case may be.
Claims (8)
1. An autonomous real-time live-action three-dimensional reconstruction and detection robot system is characterized in that: the system comprises a mechanical hardware subsystem and a software subsystem, wherein the mechanical hardware subsystem provides three-dimensional space motion capability, three-dimensional space information acquisition capability and data transmission capability for the whole system, the software subsystem is used for collecting three-dimensional space information and processing the three-dimensional space information, and the software subsystem provides decision-making capability for the whole system, wherein the decision-making capability comprises but is not limited to navigation capability, obstacle avoidance capability, three-dimensional reconstruction capability and control data transmission capability.
2. The autonomous real-time live-action three-dimensional reconstruction and detection robot system according to claim 1, wherein: the mechanical hardware subsystem is composed of a mobile platform, a mechanical arm and a scanner.
3. The autonomous real-time live-action three-dimensional reconstruction and detection robot system according to claim 2, wherein: the front end above the moving platform is provided with a mechanical arm, and the upper part of the mechanical arm is connected with the scanner through a flange.
4. The autonomous real-time live-action three-dimensional reconstruction and detection robot system according to claim 2, wherein: the scanner includes the following data sensors: camera, radar.
5. The autonomous real-time live-action three-dimensional reconstruction and detection robot system according to claim 2, wherein: the scanner includes the following light sources: infrared light, structured light.
6. The autonomous real-time live-action three-dimensional reconstruction and detection robot system according to claim 2, wherein: the robotic arm includes, but is not limited to, six degrees of freedom of movement.
7. The autonomous real-time live-action three-dimensional reconstruction and detection robot system according to claim 2, wherein: the moving mode of the moving platform includes, but is not limited to, a crawler-type and a wheel-type motion mode.
8. The autonomous real-time live-action three-dimensional reconstruction and detection robot system according to claim 2, wherein: the scanner acquires information in modalities including, but not limited to, picture information, depth information, and point cloud information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210294671.8A CN114633256A (en) | 2022-03-23 | 2022-03-23 | Autonomous real-time live-action three-dimensional reconstruction and detection robot system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210294671.8A CN114633256A (en) | 2022-03-23 | 2022-03-23 | Autonomous real-time live-action three-dimensional reconstruction and detection robot system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114633256A true CN114633256A (en) | 2022-06-17 |
Family
ID=81949437
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210294671.8A Pending CN114633256A (en) | 2022-03-23 | 2022-03-23 | Autonomous real-time live-action three-dimensional reconstruction and detection robot system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114633256A (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200001458A1 (en) * | 2018-06-27 | 2020-01-02 | Abb Schweiz Ag | Method and system to generate a 3d model for a robot scene |
CN110640730A (en) * | 2018-06-27 | 2020-01-03 | Abb瑞士股份有限公司 | Method and system for generating three-dimensional model for robot scene |
CN109352621A (en) * | 2018-10-19 | 2019-02-19 | 飞码机器人私人有限公司 | A kind of construction quality detection robot system and method |
CN109434830A (en) * | 2018-11-07 | 2019-03-08 | 宁波赛朗科技有限公司 | A kind of industrial robot platform of multi-modal monitoring |
CN110842940A (en) * | 2019-11-19 | 2020-02-28 | 广东博智林机器人有限公司 | Building surveying robot multi-sensor fusion three-dimensional modeling method and system |
CN112710233A (en) * | 2020-12-18 | 2021-04-27 | 南京航空航天大学 | Large-scale aircraft skin point cloud obtaining equipment and method based on laser point cloud |
CN113352334A (en) * | 2021-05-26 | 2021-09-07 | 南开大学 | Mobile flexible scanning robot system |
Non-Patent Citations (1)
Title |
---|
XU Li et al., "Development of an integrated experimental platform for a mobile flexible scanning robot", Experimental Technology and Management (《实验技术与管理》), vol. 39, no. 8, 31 August 2022 (2022-08-31) *
Legal Events
Date | Code | Title | Description
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||