CN112964196B - Three-dimensional scanning method, system, electronic device and computer equipment - Google Patents
Three-dimensional scanning method, system, electronic device and computer equipment Download PDFInfo
- Publication number
- CN112964196B CN112964196B CN202110162495.8A CN202110162495A CN112964196B CN 112964196 B CN112964196 B CN 112964196B CN 202110162495 A CN202110162495 A CN 202110162495A CN 112964196 B CN112964196 B CN 112964196B
- Authority
- CN
- China
- Prior art keywords
- scanning
- head
- coordinate system
- tracking
- robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0234—Control of position or course using optical markers or beacons
- G05D1/0236—Control of position or course using optical markers or beacons in combination with a laser
- G05D1/0238—Control of position or course using obstacle or wall sensors
- G05D1/024—Control of position or course using obstacle or wall sensors in combination with a laser
- G05D1/0246—Control of position or course using a video camera in combination with image processing means
- G05D1/0251—Control of position or course using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
- G05D1/0276—Control of position or course using signals provided by a source external to the vehicle
Abstract
The present application provides a three-dimensional scanning method, a three-dimensional scanning system, an electronic device, and a computer device. A movable device carrying a robot and a scanner is controlled to move; when it reaches a target scanning area, point cloud data of the scanned object's surface are acquired in the scanning head coordinate system and then converted into a global coordinate system according to the relative pose of the scanning head and the tracking head. This yields the point cloud data of the scanned object in the target scanning area under the global coordinate system, completing both the scanning of the object and the stitching of the data, and thereby addressing the limited scanning range and low stitching accuracy of existing three-dimensional scanning processes.
Description
Technical Field
The present application relates to the field of robotic three-dimensional scanning, and in particular to a three-dimensional scanning method, a three-dimensional scanning system, an electronic device, and a computer device.
Background
When a robot carries a scanner to scan a workpiece, the limits of the robot's working space and its singular postures make it difficult to meet scanning requirements once the workpiece exceeds the robot's reachable scanning range.
No effective solution has yet been proposed for the problems of limited scanning range and low stitching accuracy in current three-dimensional scanning.
Disclosure of Invention
Embodiments of the present application provide a three-dimensional scanning method, a three-dimensional scanning system, an electronic device, and a computer device, aiming at least to solve the problems of limited scanning range and low stitching accuracy in three-dimensional scanning in the related art.
In a first aspect, an embodiment of the present application provides a three-dimensional scanning method, which is used in a three-dimensional scanning system, where the three-dimensional scanning system includes a scanner, a robot, and a movable device, the scanner includes a scanning head and a tracking head, the scanning head is mounted at a distal end of the robot, and the robot is mounted on the movable device, and the method includes the following steps:
when a movable device carrying the robot moves to a target scanning area, point cloud data of the surface of a scanned object in the target scanning area under a scanning head coordinate system are acquired;
and converting, according to the relative pose of the scanning head and the tracking head, the point cloud data of the surface of the scanned object under the scanning head coordinate system into the global coordinate system of the three-dimensional scanning system, to obtain the point cloud data of the surface of the scanned object in the target scanning area under the global coordinate system.
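The conversion in these steps is a rigid-body (pose) transform. As a minimal sketch of that general technique, not the patent's implementation, a pose can be represented as a 4x4 homogeneous matrix and applied to an (N, 3) point cloud:

```python
import numpy as np

def make_pose(R, t):
    """Pack a 3x3 rotation R and a 3-vector translation t into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, points):
    """Apply the 4x4 rigid transform T to an (N, 3) point cloud."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homogeneous @ T.T)[:, :3]
```

The same two helpers cover every coordinate-system conversion described in this document; only the pose matrix changes.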
In some of these embodiments, the tracking head is mounted on the movable device carrying the robot, and there is at least one of each of the tracking head, the robot, and the movable device.
In some embodiments, there are a plurality of movable devices; at least one of them carries the robot, and at least one of the remaining movable devices, which do not carry a robot, carries the tracking head.
In some embodiments, the converting the point cloud data of the scanned object surface in the scanning head coordinate system into the global coordinate system of the three-dimensional scanning system according to the relative poses of the scanning head and the tracking head includes the following steps:
converting point cloud data of the surface of the scanned object under a scanning head coordinate system into the tracking head coordinate system according to the relative poses of the scanning head and the tracking head;
and calculating the relative pose of the tracking head and the global coordinate system, and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the global coordinate system.
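These two steps amount to composing two poses. A hedged sketch, with illustrative frame names not taken from the patent (the convention here is that `T_a_b` maps coordinates from frame `b` into frame `a`):

```python
import numpy as np

def scan_to_global(T_track_scan, T_global_track, points_scan):
    """Map an (N, 3) point cloud from the scanning-head frame to the global frame.

    T_track_scan:   pose of the scanning head in the tracking-head frame.
    T_global_track: pose of the tracking head in the global frame.
    """
    # Compose the chain scan-head -> tracking-head -> global.
    T_global_scan = T_global_track @ T_track_scan
    homogeneous = np.hstack([points_scan, np.ones((points_scan.shape[0], 1))])
    return (homogeneous @ T_global_scan.T)[:, :3]
```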
In some embodiments, the three-dimensional scanning system includes a global tracking device, and calculating the relative pose of the tracking head and the global coordinate system and converting the point cloud data of the scanned object's surface from the tracking head coordinate system into the global coordinate system according to that relative pose includes the following steps:
acquiring the relative pose of the tracking head and an identifier on the tracking head;
acquiring the pose of the identifier under a global coordinate system of the global tracking device;
and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the identifier and the pose of the identifier under the global coordinate system.
In some of these embodiments, the three-dimensional scanning system includes an industrial camera, and calculating the relative pose of the tracking head and the global coordinate system and converting the point cloud data of the scanned object's surface from the tracking head coordinate system into the global coordinate system according to that relative pose further includes the following steps:
acquiring, through the industrial camera, all marker point data in the target scanning area under the global coordinate system;
acquiring part of the marker point data in the target scanning area under the tracking head coordinate system;
and comparing all the marker point data under the global coordinate system with the partial marker point data under the tracking head coordinate system, and calculating the relative pose of the tracking head in the global coordinate system.
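One standard way to recover a pose from corresponding marker points is a least-squares rigid fit (the Kabsch/SVD method). The sketch below illustrates that general technique under the assumption of known point correspondences; the patent does not specify this particular algorithm:

```python
import numpy as np

def estimate_pose(markers_local, markers_global):
    """Least-squares rigid transform (Kabsch/SVD) mapping (N, 3) marker
    coordinates in the tracking-head frame onto their global coordinates.
    Requires at least three non-collinear corresponding markers."""
    mu_l = markers_local.mean(axis=0)
    mu_g = markers_global.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (markers_local - mu_l).T @ (markers_global - mu_g)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_g - R @ mu_l
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T
```

In practice the correspondence between the camera's global marker set and the tracking head's partial marker set would first have to be established, e.g. by geometric matching.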
In some of these embodiments, the tracking head is stationary relative to the scanned object, and the tracking head coordinate system is the global coordinate system.
In some embodiments, after the point cloud data of the scanned object surface in the target scanning area is converted into the global coordinate system, the movable device is controlled to move into the next scanning area, and the next scanning area is taken as the target scanning area.
In some embodiments, the movable device is an AGV cart that moves along a predetermined travel path.
In a second aspect, an embodiment of the present application provides a three-dimensional scanning system including a scanner, a robot, and a movable device. The scanner includes a scanning head and a tracking head; the scanning head is mounted at the end of the robot, and the robot is mounted on the movable device. The movable device includes a motion control module and a data acquisition and processing module, and the motion control module controls the motion of the robot and the movable device through a network;
the scanning head scans along with the motion of the robot and is used for acquiring point cloud data of the surface of a scanned object in a target scanning area under a scanning head coordinate system when a movable device carrying the robot moves to the target scanning area;
the tracking head is used for tracking the pose of the scanning head;
the data acquisition and processing module is used for calculating the relative pose of the tracking head and the scanning head, and converting the point cloud data of the surface of the scanned object under the scanning head coordinate system into the global coordinate system of the three-dimensional scanning system according to that relative pose, to obtain the point cloud data of the surface of the scanned object in the target scanning area under the global coordinate system.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the three-dimensional scanning method according to the first aspect when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and the processor implements the three-dimensional scanning method as described in the first aspect when executing the computer program.
According to the three-dimensional scanning method, system, electronic device, and computer device above, the movable device carrying the robot and the scanner is controlled to move; when it reaches a target scanning area, point cloud data are acquired in the scanning head coordinate system and converted into the global coordinate system according to the relative pose of the scanning head and the tracking head, yielding the point cloud data of the scanned object in the target scanning area under the global coordinate system. Scanning and data stitching are thus completed, solving the problems of limited scanning range and low stitching accuracy in three-dimensional scanning.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below; other features, objects, and advantages of the application will become apparent from the description, the drawings, and the claims.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a first application environment diagram of a three-dimensional scanning method according to an embodiment of the present application;
fig. 2 is a second application environment diagram of a three-dimensional scanning method according to an embodiment of the present application;
fig. 3 is a third application environment diagram of a three-dimensional scanning method according to an embodiment of the present application;
fig. 4 is a flow chart of a three-dimensional scanning method according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present application more apparent, the present application is described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of, and not restrictive on, the broader application. All other embodiments obtained by a person of ordinary skill in the art, based on the embodiments provided in the present application and without any inventive step, fall within the scope of protection of the present application. Moreover, while such a development effort might be complex and tedious, it would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is to be expressly and implicitly understood by one of ordinary skill in the art that the embodiments described herein may be combined with other embodiments without conflict.
Unless otherwise defined, technical or scientific terms referred to herein should have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar words throughout this application are not to be construed as limiting in number, and may refer to the singular or the plural. The use of the terms "including," "comprising," "having," and any variations thereof herein, is meant to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Reference to "connected," "coupled," and the like in this application is not intended to be limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference herein to "a plurality" means greater than or equal to two. "and/or" describes an association relationship of associated objects, meaning that three relationships may exist, for example, "A and/or B" may mean: a exists alone, A and B exist simultaneously, and B exists alone. Reference herein to the terms "first," "second," "third," and the like, are merely to distinguish similar objects and do not denote a particular ordering for the objects.
Fig. 1 is a first application environment diagram of a three-dimensional scanning method according to an embodiment of the present application. As shown in fig. 1, the scanner includes two parts: a scanning head 101 and a tracking head 102. The scanning head 101 is installed at the end of a robot 103, while the tracking head 102 and the robot 103 are installed on the same movable device 104 and move with it. When the movable device 104 moves to a target scanning area, the end of the robot 103 holds the scanning head 101 to scan an object 106 to be scanned, and a fixedly installed global tracking device 105 captures the tracking head 102. The scanning head 101 and the tracking head 102 may be the scanning head and the tracking instrument of a tracking scanner, respectively; the robot 103 may be an industrial robot arm; the movable device 104 may be an AGV (Automated Guided Vehicle) cart capable of omnidirectional travel; and the global tracking device 105 may be a laser tracker.
The scanning head, tracking head, and robot can be grouped into scanning combinations, each comprising one or more scanning heads, tracking heads, and robots. Each scanning combination is deployed on one movable device, multiple movable devices carry the scanning combinations for scanning, and the global tracking device tracks the tracking heads of all the scanning combinations.
Fig. 2 is a second application environment diagram of the three-dimensional scanning method in an embodiment provided by the present application. As shown in fig. 2, this environment further includes another movable device 107. The scanner includes two parts: a scanning head 101 and a tracking head 102. The scanning head 101 is installed at the end of the robot 103, the robot 103 is installed on the movable device 104, and the tracking head 102 is installed on the other movable device 107, which carries no robot. When the movable device 104 moves to a target scanning area, the end of the robot 103 holds the scanning head 101 to scan an object 106 to be scanned; the fixedly installed global tracking device 105 captures the tracking head 102, and the tracking head 102 computes the relative pose between the scanning head 101 and itself by capturing the marker points on the scanning head 101.
Fig. 3 is a third application environment diagram of a three-dimensional scanning method according to an embodiment of the present application. As shown in fig. 3, the scanner includes two parts: a scanning head 101 and a tracking head 102. The scanning head 101 is installed at the end of a robot 103, and the robot 103 is installed on a movable device 104 and moves with it. When the movable device 104 moves to a target scanning area, the end of the robot 103 holds the scanning head 101 to scan an object 106 to be scanned. Here the tracking head 102 is fixed relative to the object 106, and the coordinate system of the tracking head 102 serves as the global coordinate system.
The embodiment provides a three-dimensional scanning method, as shown in fig. 4, including the following steps:
step S201, when the movable device carrying the robot moves to the target scanning area, acquiring point cloud data of the surface of the scanned object in the target scanning area under the scanning head coordinate system.
Specifically, the movable device may be a smart mobile device, such as an AGV cart, which can move omnidirectionally and be controlled by motion commands. When the AGV cart moves to a designated target scanning area, the robot carried on it holds the scanner's scanning head and moves it within the scanning range, scanning the surface of the scanned object in the target scanning area to obtain point cloud data of that surface under the scanning head coordinate system. The scanned surface may be a local surface reachable by the scanning head within a fixed scanning range in the target scanning area, or the entire surface of one or more scanned objects in that area. The target scanning area may be designated in advance according to the characteristics of the scanned object, the inherent properties of the scanner, the motion characteristics of the movable device, and so on.
In the case where the size of the scanned object exceeds the scanning range of the scanner, the entire scanning area where the scanned object is located needs to be divided into a plurality of scanning areas, and the target scanning area may be any one of these scanning areas.
Furthermore, a motion control module in the control device on the AGV cart can instruct the controllers of the AGV cart and of the robot it carries, thereby controlling the movement of both; the motion control module is connected to these controllers through a network. When the AGV cart reaches the target scanning area, it is held stationary while the robot holding the scanning head scans the surface of the scanned object. Coordinating the AGV cart's movement with the robot's movement in this way enlarges the working range of the robot.
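The area-by-area workflow above can be sketched as a simple control loop. All of the interfaces here (`move_to`, `hold`, `scan_area`) are hypothetical placeholders standing in for whatever the motion control module and scanner actually expose:

```python
def scan_workpiece(agv, robot, scanner, scan_areas):
    """Drive the AGV through each target scanning area in turn and collect the
    per-area point clouds (still in the scanning-head coordinate system)."""
    clouds = []
    for area in scan_areas:
        agv.move_to(area)   # motion control module commands the AGV over the network
        agv.hold()          # keep the AGV fixed while the robot moves the scanning head
        clouds.append(scanner.scan_area(robot, area))
    return clouds
```

Each returned cloud would then be converted into the global coordinate system before stitching, as described in step S202.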
Step S202, according to the relative pose of the scanning head and the tracking head, the point cloud data of the surface of the scanned object under the scanning head coordinate system is converted into the global coordinate system of the three-dimensional scanning system, obtaining the point cloud data of the surface of the scanned object in the target scanning area under the global coordinate system.
The scanner comprises a tracking head and a scanning head. The scanning head scans the surface of the scanned object to acquire point cloud data of that surface under the scanning head coordinate system; the tracking head photographs within its visual range and computes the relative pose of the scanning head and the tracking head in real time by capturing the marker points on the scanning head. The robot may be an industrial robot arm; after the arm moves the scanning head to acquire the point cloud data of the scanned surface within the target scanning range, that data can be converted from the scanning head coordinate system into the global coordinate system according to the computed relative pose of the scanning head and the tracking head.
Further, because of the limited scanning range, the surface data of the scanned object may not be obtainable completely within a fixed target scanning area; only the data of the part of the surface lying in the target scanning area can be obtained, in a local coordinate system, that is, the point cloud data of the scanned surface under the scanning head coordinate system in the target scanning area. To unify the coordinate systems of point cloud data obtained in different scanning areas, a global coordinate system must be specified, and the point cloud data in each target scanning area must be converted from the scanning head coordinate system into this unified global coordinate system.
The global coordinate system is a reference coordinate system fixed relative to the scanned object. It may be established by a global tracking device in the three-dimensional scanning system, such as a laser tracker; by pasting marker points across the whole scanned area and obtaining the coordinates of these global marker points; or by fixing the tracking head relative to the scanned object and using the tracking head's coordinate system as the global coordinate system. Once the global coordinate system is determined, the point cloud data of the scanned surface in the target scanning area can be transferred from the scanning head coordinate system into the global coordinate system through the relative pose of the scanning head and the tracking head. Determining a single, unified global coordinate system improves the stitching accuracy of the scanned surface data.
The movable device carrying the robot and the scanner is controlled to move; when it reaches a target scanning area, point cloud data are acquired in the scanning head coordinate system and converted into the global coordinate system according to the relative pose of the scanning head and the tracking head, yielding the point cloud data of the scanned object in the target scanning area under the global coordinate system. Scanning and data stitching are thus completed, solving the problems of limited scanning range and low stitching accuracy in three-dimensional scanning.
Further, in one embodiment, based on steps S201 to S202 above, the tracking head is mounted on the movable device carrying the robot, and there is at least one of each of the tracking head, the robot, and the movable device.
In addition, in an embodiment based on steps S201 to S202 above, there are a plurality of movable devices; at least one of them carries a robot, and at least one of the remaining movable devices, which carry no robot, carries a tracking head.
Across the entire scanning area, the robot and the tracking head may be mounted on different movable devices; that is, the scanning head and the tracking head of the scanner may each be mounted on a different movable device, and there may be several movable devices carrying robots as well as several carrying tracking heads.
Whether the tracking head is mounted on the movable device carrying the robot or on another movable device carrying no robot, converting the point cloud data of the scanned surface from the scanning head coordinate system into the global coordinate system of the three-dimensional scanning system according to the relative pose of the scanning head and the tracking head comprises the following steps:
and S301, converting the point cloud data of the surface of the scanned object under the scanning head coordinate system into the point cloud data under the tracking head coordinate system according to the relative position and posture of the scanning head and the tracking head.
And S302, calculating the relative position and posture of the tracking head and the global coordinate system, and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative position and posture of the tracking head and the global coordinate system.
Since the tracking head moves with the movable device, the tracking head coordinate system is still a local coordinate system relative to the scanned object. After the point cloud data of the scanned surface in the tracking head coordinate system is obtained in step S301, the point cloud data of the scanned surface in the target scanning area under the global coordinate system is obtained through the relative pose between the tracking head and the global coordinate system.
The global tracking device may be a device capable of tracking and capturing the position of the scanner, such as a laser tracker, which is fixed at a certain position and does not change position with the movement of the movable device, so that the reference coordinate system of the laser tracker can be used as the global coordinate system of the whole scanning process.
In addition, a data acquisition and processing module can be installed in the control device; it receives the point cloud data of the surface of the scanned object in the tracking head coordinate system over a network and/or Bluetooth, calculates the relative pose of the tracking head and the global coordinate system, and converts the point cloud data of the surface of the scanned object into the global coordinate system.
Furthermore, when the three-dimensional scanning system includes a global tracking device, step S302 (calculating the relative pose between the tracking head and the global coordinate system, and converting the point cloud data of the surface of the scanned object from the tracking head coordinate system into the global coordinate system according to that relative pose) comprises the following steps:
Step S401: acquiring the relative pose between the tracking head and the identifier mounted on the tracking head.
The relative pose of the identifier and the tracking head is fixed and can be determined by calibration; the identifier can be, for example, a laser target ball.
Specifically, a calibration board carrying a series of mark points can be placed at a fixed position in the target scanning area (the ground or a wall in the target scanning area can also serve as the calibration board). The relative pose of the identifier and the tracking head is then solved from the relative pose of the identifier and the global tracking device at the fixed position, the relative pose of the tracking head and the calibration board, and the known relative pose of the calibration board and the global tracking device.
Step S402, acquiring the pose of the identifier in the global coordinate system of the global tracking device.
Since the identifier can be captured by the global tracking device, the relative pose of the identifier and the global tracking device can be acquired in real time during the movement of the identifier along with the movable device.
Step S403: converting the point cloud data of the surface of the scanned object from the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the identifier and the pose of the identifier in the global coordinate system.
In space, the transformation between two coordinate systems is determined by their relative pose. Therefore, once the relative pose of the tracking head and the identifier and the pose of the identifier in the reference coordinate system of the global tracking device are known, the relative pose of the tracking head and the reference coordinate system can be calculated indirectly, and the point cloud data of the surface of the scanned object in the tracking head coordinate system can be converted into the reference coordinate system of the global tracking device.
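The indirect calculation described above is a composition of two poses: the calibrated tracking-head-to-identifier pose and the live identifier-to-global pose. A minimal sketch, assuming 4x4 homogeneous transforms and illustrative placeholder values:

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Step S401: pose of the tracking head in the identifier's frame (from calibration).
T_id_track = np.eye(4)
T_id_track[:3, :3] = rot_z(np.pi / 2)
T_id_track[:3, 3] = [0.1, 0.0, 0.0]

# Step S402: pose of the identifier in the global tracker's frame (measured live).
T_global_id = np.eye(4)
T_global_id[:3, 3] = [5.0, 2.0, 1.0]

# Step S403: indirect pose of the tracking head in the global frame.
T_global_track = T_global_id @ T_id_track

# Convert one point from the tracking-head frame into the global frame.
p_track = np.array([1.0, 0.0, 0.0, 1.0])
p_global = T_global_track @ p_track
```

Since `T_global_id` is refreshed at each tracker measurement while `T_id_track` stays fixed, only one matrix product per frame is needed to keep `T_global_track` current.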
Additionally, in one embodiment in which the three-dimensional scanning system comprises an industrial camera, calculating the relative pose of the tracking head and the global coordinate system and converting the point cloud data of the scanned object surface from the tracking head coordinate system into the global coordinate system according to that relative pose comprises the following steps:
step S501, all mark point data in the target scanning area under the global coordinate system are obtained through the industrial camera.
In this embodiment, the global coordinate system is determined by pasting mark points on and around the scanned object and acquiring the data of all mark points within the entire scanning area. Specifically, a photogrammetric device such as an industrial camera can be used to obtain the three-dimensional coordinates and directions of all the mark points in the global coordinate system.
Step S502, obtaining part of mark point data under the tracking head coordinate system in the target scanning area.
The mark points can be used to calibrate the relative pose between the tracking head and the global coordinate system. Therefore, while acquiring the relative pose between itself and the scanning head, the tracking head also acquires data for the part of the mark points in the target scanning area that fall within the tracking head coordinate system; specifically, this data can be the three-dimensional coordinates and directions of those mark points in the tracking head coordinate system.
Step S503: comparing all the mark point data in the global coordinate system with the part of the mark point data in the tracking head coordinate system, and calculating the relative pose of the tracking head in the global coordinate system.
The data of the part of the mark points in the tracking head coordinate system expresses the three-dimensional positional relationship of those mark points in the target scanning area, and the coordinates and directions of the mark points in the target scanning area in the global coordinate system are obtained through step S501. Comparing the part of the mark points in the tracking head coordinate system with all the mark points in the global coordinate system therefore determines the relative pose of the global coordinate system and the tracking head.
Specifically, the relative positional relationship between all the mark points is obtained together with the data of all the mark points in the global coordinate system; similarly, the relative positional relationship between the part of the mark points is obtained together with their data in the tracking head coordinate system. Using the relative positional relationship between the part of the mark points together with their coordinates and directions, the corresponding mark points are matched among all the mark points in the global coordinate system, yielding the positions of the part of the mark points in the global coordinate system.
For example, let the set of all the mark points in the global coordinate system be A (x1, x2, x3, …, xn), and let the set of the part of the mark points in the tracking head coordinate system be B (x1, x2, x3). Set B is a subset of set A, but the coordinate system of the mark points in set B differs from that of the mark points in set A. Matching the coordinates and directions of the mark points in set B, and the relative positional relationship among them, against those of the mark points in set A yields the pose of the mark points in set B in the global coordinate system.
Finally, the pose of the tracking head in the global coordinate system is calculated from the pose of the part of the mark points in the global coordinate system and their data in the tracking head coordinate system.
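Once the corresponding mark points have been matched, the pose relating the two coordinate systems can be solved as a least-squares rigid transform. One standard way to do this (a sketch, not necessarily the method used in the patent) is the Kabsch/SVD algorithm; the marker coordinates below are illustrative:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch): finds R, t with dst ~ R @ src + t.

    src, dst: (N, 3) arrays of corresponding mark-point coordinates.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Part of the mark points as seen in the tracking-head frame (set B), and their
# matched counterparts among all mark points in the global frame (subset of set A).
markers_track = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])   # 90-degree z rotation
markers_global = markers_track @ R_true.T + np.array([3.0, 1.0, 0.5])

R, t = rigid_transform(markers_track, markers_global)
# (R, t) gives the pose of the tracking head's mark-point set in the global frame.
```

At least three non-collinear correspondences are needed for the rotation to be determined; with noisy measurements, more correspondences improve the least-squares estimate.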
In one embodiment, based on the above steps S201 to S202, the tracking head is fixed relative to the scanned object, and the tracking head coordinate system is a global coordinate system.
Specifically, the tracking head may be fixed in the overall scanning area and kept stationary relative to the scanned object while the movable device carrying the robot moves along a preset moving path. In this case, the tracking head coordinate system can serve as the global coordinate system, and the point cloud data of the surface of the scanned object in the scanning head coordinate system is converted directly into the global coordinate system through the relative pose of the tracking head and the scanning head, completing the splicing of the point cloud data of the scanned object in the global coordinate system.
In one embodiment, after the point cloud data of the scanned object surface in the target scanning area is converted into the global coordinate system, the movable device is controlled to move to the next scanning area, and the next scanning area is taken as the target scanning area.
After the point cloud data of the surface of the scanned object in the target scanning area has been converted into the global coordinate system, the scanning and global-coordinate splicing of the local surface of the scanned object in that area is complete. The movable device is then controlled to move into the next scanning area, that area becomes the target scanning area, and steps S201 to S502 are repeated until the scanning and splicing of the whole scanned object are finished. In this way, the surface data of the whole scanned object, or of all the scanned objects in the entire scanning area, is obtained in a unified global coordinate system.
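The area-by-area scan-and-splice loop described above can be sketched as follows. All the callables (`move_to`, `acquire_point_cloud`, `to_global`) are hypothetical placeholders standing in for the motion control, scanning, and coordinate-conversion steps of the method:

```python
import numpy as np

def scan_object(scan_areas, acquire_point_cloud, to_global, move_to):
    """Drive the movable device through each scanning area and stitch the
    per-area point clouds into one cloud in the global coordinate system."""
    merged = []                                 # accumulated global-frame points
    for area in scan_areas:
        move_to(area)                           # move the movable device (e.g. an AGV)
        local = acquire_point_cloud(area)       # (N, 3) points, scanning-head frame
        merged.append(to_global(local, area))   # convert into the global frame
    return np.vstack(merged)
```

The loop body mirrors steps S201 to S502: acquire in the local frame, convert to global, then advance the target scanning area until the whole object is covered.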
In one embodiment, the movable device is an AGV that moves according to a predetermined travel path. The preset moving path may be determined according to parameters such as the size of the scanned object, the inherent properties of the AGV, and the scanning range of the scanner.
In the above steps, the movable device carrying the robot and the scanner is controlled to move, and point cloud data in the scanning head coordinate system is acquired when the movable device reaches the target scanning area. This expands the scanning range of the robot-mounted scanner and solves the problem of the limited scanning range of a robot during scanning. When the tracking head is mounted on a movable device, the point cloud data of the scanned object in the target scanning area is first converted into the tracking head coordinate system according to the relative pose of the scanning head and the tracking head; the global coordinate system is then determined, the relative pose of the tracking head and the global coordinate system of the three-dimensional scanning system is calculated, and the point cloud data of the scanned object in the target scanning area in the global coordinate system is obtained. When the tracking head is fixed relative to the scanned object, the tracking head coordinate system is used directly as the global coordinate system to complete the conversion of the point cloud data of the scanned object into the global coordinate system. After the scanning and splicing in the target scanning area are completed, the surfaces of the scanned objects in the other scanning areas are scanned and spliced in turn until the whole scanned object is finished, which solves the problem of low splicing accuracy caused by the accumulation of splicing errors.
In one embodiment, a three-dimensional scanning system is provided, which comprises a scanner, a robot and a movable device, wherein the scanner comprises a scanning head and a tracking head, the scanning head is arranged at the tail end of the robot, the robot is arranged on the movable device, the movable device comprises a motion control module and a data acquisition and processing module, and the motion control module controls the motion of the robot and the movable device through a network;
the scanning head scans along with the movement of the robot and is used for acquiring point cloud data of the surface of a scanned object in a target scanning area under a scanning head coordinate system when a movable device carrying the robot moves to the target scanning area;
the tracking head is used for tracking the pose of the scanning head;
the data acquisition processing module is used for calculating the relative position and posture of the tracking head and the scanning head, and converting the point cloud data of the surface of the scanned object under the coordinate system of the scanning head into the global coordinate system of the three-dimensional scanning system according to the relative position and posture to obtain the point cloud data of the surface of the scanned object in the target scanning area under the global coordinate system.
According to the three-dimensional scanning system, the movable device carrying the robot and the scanner is controlled to move. When it reaches the target scanning area, point cloud data in the scanning head coordinate system is acquired and converted into the global coordinate system according to the relative pose of the scanning head and the tracking head, yielding the point cloud data of the scanned object in the target scanning area in the global coordinate system. The scanning and data splicing of the scanned object are thus completed, solving the problems of a limited scanning range and low splicing precision in three-dimensional scanning.
For specific limitations of embodiments of the three-dimensional scanning system, reference may be made to the above limitations of the three-dimensional scanning method, which are not described herein again. The various modules in the three-dimensional scanning system described above may be implemented in whole or in part by software, hardware, and combinations thereof. The modules can be embedded in a hardware form or independent of a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, as shown in FIG. 5, an electronic device is provided that includes a memory and a processor. The memory stores a computer program, and the processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor implements the following steps when executing the computer program:
when a movable device carrying the robot moves to a target scanning area, point cloud data of the surface of a scanned object in the target scanning area under a scanning head coordinate system are acquired;
and according to the relative position of the scanning head and the tracking head, converting the point cloud data of the surface of the scanned object under the scanning head coordinate system into the global coordinate system of the three-dimensional scanning system to obtain the point cloud data of the surface of the scanned object in the target scanning area under the global coordinate system.
In one embodiment, the tracking head is mounted on a movable device carrying the robot, wherein the number of the tracking head, the robot and the movable device is at least one.
In one embodiment, there are a plurality of mobile devices, at least one of which carries a robot, and at least one of the remaining mobile devices which do not carry a robot has a tracking head mounted thereon.
In one embodiment, the processor when executing the computer program further performs the steps of:
converting point cloud data of the surface of the scanned object under a scanning head coordinate system into point cloud data under a tracking head coordinate system according to the relative position and posture of the scanning head and the tracking head;
and calculating the relative pose of the tracking head and the global coordinate system, and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the global coordinate system.
In one embodiment, the processor when executing the computer program further performs the steps of:
acquiring the relative poses of the tracking head and the identifier on the tracking head;
acquiring the pose of the identifier under the global coordinate system of the global tracking device;
and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative position of the tracking head and the identifier and the position of the identifier under the global coordinate system.
In one embodiment, the processor when executing the computer program further performs the steps of:
acquiring all mark point data under a global coordinate system in a target scanning area through an industrial camera;
acquiring partial mark point data under a tracking head coordinate system in a target scanning area;
and comparing all the mark point data under the global coordinate system with part of the mark point data under the tracking head coordinate system, and calculating to obtain the relative pose of the tracking head under the global coordinate system.
In one embodiment, the tracking head is stationary relative to the object being scanned and the tracking head coordinate system is a global coordinate system.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
and after the point cloud data of the surface of the scanned object in the target scanning area is converted into the global coordinate system, controlling the movable device to move into the next scanning area, and taking the next scanning area as the target scanning area.
In one embodiment, the movable device is an AGV cart that moves along a predetermined travel path.
It should be noted that, for specific examples in this embodiment, reference may be made to the examples described in the foregoing embodiment and optional implementation manners, and details of this embodiment are not described herein again.
In one embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 6. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer device is used for storing a preset configuration information set. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement the three-dimensional scanning method described above.
In one embodiment, a computer device is provided, which may be a terminal. The computer device comprises a processor, a memory, a network interface, a display screen and an input device which are connected through a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a three-dimensional scanning method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on a shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 6 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply; a particular computing device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
All possible combinations of the technical features in the above embodiments may not be described for the sake of brevity, but should be considered as being within the scope of the present disclosure as long as there is no contradiction between the combinations of the technical features.
The above examples express only several embodiments of the present application; their description is specific and detailed, but it is not to be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the scope of protection of the present application. Therefore, the protection scope of the present patent application shall be subject to the appended claims.
Claims (8)
1. A three-dimensional scanning method for a three-dimensional scanning system, characterized in that the three-dimensional scanning system comprises a scanner, a robot, movable devices and a global tracking device, wherein the scanner comprises a scanning head and a tracking head, the scanning head is mounted at the tail end of the robot, the robot is mounted on a movable device, the scanning head, the tracking head and the robot form scanning combinations in groups, one scanning combination comprises at least one scanning head, one tracking head and one robot, each scanning combination is allocated one movable device, and the three-dimensional scanning system comprises a plurality of movable devices carrying scanning combinations for scanning; the method comprises the following steps:
when a movable device carrying the robot moves to a target scanning area, point cloud data of the surface of a scanned object in the target scanning area under a scanning head coordinate system are acquired;
converting point cloud data of the surface of the scanned object under a scanning head coordinate system into a tracking head coordinate system according to the relative position and posture of the scanning head and the tracking head;
acquiring the relative pose of the tracking head and the identifier on the tracking head;
acquiring the pose of the identifier under a global coordinate system of the global tracking device;
and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative position and posture of the tracking head and the identifier and the position and posture of the identifier under the global coordinate system to obtain the point cloud data of the surface of the scanned object in the target scanning area under the global coordinate system.
2. The method of claim 1, wherein the movable device on which the robot is mounted is further mounted with a tracking head, wherein the number of the tracking head, the robot, and the movable device is at least one.
3. The method of claim 1, wherein the three-dimensional scanning system comprises an industrial camera, the method further comprising:
acquiring all mark point data under the global coordinate system in the target scanning area through the industrial camera;
acquiring partial mark point data under the tracking head coordinate system in the target scanning area;
and comparing all mark point data under the global coordinate system with part of mark point data under the tracking head coordinate system, and calculating to obtain the relative pose of the tracking head under the global coordinate system.
4. The method according to claim 1, wherein after the point cloud data of the scanned object surface in the target scanning area is converted into the global coordinate system, the movable device is controlled to move into a next scanning area, and the next scanning area is taken as the target scanning area.
5. The method of claim 1, wherein the movable device is an AGV cart that moves according to a preset moving path.
6. A three-dimensional scanning system is characterized by comprising a scanner, a robot, a movable device and a global tracking device, wherein the scanner comprises a scanning head and a tracking head, the scanning head is arranged at the tail end of the robot, the robot is arranged on the movable device, the movable device comprises a motion control module and a data acquisition and processing module, and the motion control module controls the motion of the robot and the movable device through a network;
the scanning head, the tracking head and the robot form scanning combinations according to groups, one scanning combination comprises at least one scanning head, one tracking head and one robot, each scanning combination is distributed with one movable device, and the three-dimensional scanning system comprises a plurality of movable devices carrying the scanning combinations for scanning;
the scanning head scans along with the motion of the robot and is used for acquiring point cloud data of the surface of a scanned object in a target scanning area under a scanning head coordinate system when a movable device carrying the robot moves to the target scanning area;
the tracking head is used for tracking the pose of the scanning head;
the data acquisition processing module is used for calculating the relative position and posture of the tracking head and the scanning head and converting the point cloud data of the surface of the scanned object under the scanning head coordinate system into a tracking head coordinate system according to the relative position and posture of the scanning head and the tracking head; acquiring the relative pose of the tracking head and an identifier on the tracking head; acquiring the pose of the identifier under a global coordinate system of the global tracking device; and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative position and posture of the tracking head and the identifier and the position and posture of the identifier under the global coordinate system to obtain the point cloud data of the surface of the scanned object in the target scanning area under the global coordinate system.
7. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the three-dimensional scanning method according to any one of claims 1 to 5.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor performs the three-dimensional scanning method of any one of claims 1 to 5.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110162495.8A CN112964196B (en) | 2021-02-05 | 2021-02-05 | Three-dimensional scanning method, system, electronic device and computer equipment |
PCT/CN2021/085230 WO2022165973A1 (en) | 2021-02-05 | 2021-04-02 | Three-dimensional scanning method and system, electronic device, and computer equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112964196A CN112964196A (en) | 2021-06-15 |
CN112964196B true CN112964196B (en) | 2023-01-03 |
Family
ID=76274687
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110162495.8A Active CN112964196B (en) | 2021-02-05 | 2021-02-05 | Three-dimensional scanning method, system, electronic device and computer equipment |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112964196B (en) |
WO (1) | WO2022165973A1 (en) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113432561B (en) * | 2021-08-02 | 2023-10-13 | 思看科技(杭州)股份有限公司 | Data processing method and three-dimensional scanning system |
CN113670202B (en) * | 2021-08-25 | 2024-09-06 | 思看科技(杭州)股份有限公司 | Three-dimensional scanning system and three-dimensional scanning method |
CN113865506B (en) * | 2021-09-09 | 2023-11-24 | 武汉惟景三维科技有限公司 | Automatic three-dimensional measurement method and system without mark point splicing |
CN113884021A (en) * | 2021-09-24 | 2022-01-04 | 上海飞机制造有限公司 | Scanning system, calibration device and calibration method of scanning system |
CN114234838B (en) * | 2021-11-19 | 2023-09-08 | 武汉尺子科技有限公司 | 3D scanning method and device |
CN114111627B (en) * | 2021-12-07 | 2024-10-08 | 深圳市中图仪器股份有限公司 | Scanning system and scanning method based on laser tracker |
CN114739405B (en) * | 2022-02-28 | 2024-10-18 | 思看科技(杭州)股份有限公司 | Scanning path adjustment method, device, automatic scanning system and computer equipment |
CN115493512B (en) * | 2022-08-10 | 2023-06-13 | 思看科技(杭州)股份有限公司 | Data processing method, three-dimensional scanning system, electronic device and storage medium |
CN115979164A (en) * | 2022-11-25 | 2023-04-18 | 杭州天远三维检测技术有限公司 | Scanning processing method, device, equipment and medium |
CN115830550A (en) * | 2022-12-08 | 2023-03-21 | 亿咖通(湖北)技术有限公司 | Method and device for detecting motion state of target |
CN115661369B (en) * | 2022-12-14 | 2023-03-14 | 思看科技(杭州)股份有限公司 | Three-dimensional scanning method, three-dimensional scanning control system and electronic device |
CN115830249B (en) * | 2023-02-22 | 2023-06-13 | 思看科技(杭州)股份有限公司 | Three-dimensional scanning method, three-dimensional measuring method, three-dimensional scanning system and electronic device |
CN116593490B (en) * | 2023-04-21 | 2024-02-02 | 无锡中车时代智能装备研究院有限公司 | Nondestructive testing method and system for surface defects of soft rubber mold of wing wallboard |
CN116550990B (en) * | 2023-04-28 | 2023-12-08 | 中国长江电力股份有限公司 | Mobile laser additive processing method and device for top cover of large-sized water turbine |
CN116476070B (en) * | 2023-05-22 | 2023-11-10 | 北京航空航天大学 | Method for adjusting a robot scanning measurement path for local features of large barrel parts |
CN116437016B (en) * | 2023-06-13 | 2023-10-10 | 武汉中观自动化科技有限公司 | Object scanning method, device, electronic equipment and storage medium |
CN116858127A (en) * | 2023-07-18 | 2023-10-10 | 航天规划设计集团有限公司 | Initiating explosive device appearance measurement method, system and device |
CN116781837B (en) * | 2023-08-25 | 2023-11-14 | 中南大学 | Automated laser three-dimensional scanning system |
CN117948915B (en) * | 2024-03-18 | 2024-05-31 | 杭州非白三维科技有限公司 | Multi-tracking-head optical tracking three-dimensional scanning method and system |
CN118080205B (en) * | 2024-04-24 | 2024-07-23 | 四川吉埃智能科技有限公司 | Automatic spraying method and system based on vision |
CN118293136B (en) * | 2024-06-06 | 2024-08-13 | 新创碳谷集团有限公司 | Automatic bonding control method for modularized blade box body |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106846488A (en) * | 2017-01-11 | 2017-06-13 | 江苏科技大学 | Large-object three-dimensional modeling system and method based on multiple three-dimensional tracking devices |
CN106918300A (en) * | 2017-01-11 | 2017-07-04 | 江苏科技大学 | Large-object three-dimensional measurement data stitching method based on multiple three-dimensional tracking devices |
CN109613519A (en) * | 2019-01-11 | 2019-04-12 | 清华大学 | Docking attitude-adjustment method based on a multi-laser-tracker measurement field |
CN109732600A (en) * | 2018-12-29 | 2019-05-10 | 南京工程学院 | Fully automatic sequential multi-station measurement system and measurement method |
CN110260786A (en) * | 2019-06-26 | 2019-09-20 | 华中科技大学 | Robot vision measurement system based on external tracking and calibration method thereof |
CN111133395A (en) * | 2019-07-19 | 2020-05-08 | 爱佩仪测量设备有限公司 | Intelligent manufacturing system |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19957366C1 (en) * | 1999-11-29 | 2001-04-05 | Daimler Chrysler Ag | Measuring position determination method for object measurement has triangular grid defined by reference points on object surface with measuring point assigned to each triangle |
JP5322050B2 (en) * | 2008-07-28 | 2013-10-23 | 独立行政法人日本原子力研究開発機構 | Marker three-dimensional position measuring method and system |
JP2011237296A (en) * | 2010-05-11 | 2011-11-24 | Nippon Telegr & Teleph Corp <Ntt> | Three dimensional shape measuring method, three dimensional shape measuring device, and program |
TWI439660B (en) * | 2010-08-19 | 2014-06-01 | China Steel Corp | Shipboard bending measurement method |
CN104424630A (en) * | 2013-08-20 | 2015-03-18 | 华为技术有限公司 | Three-dimensional reconstruction method and device, and mobile terminal |
US9964398B2 (en) * | 2015-05-06 | 2018-05-08 | Faro Technologies, Inc. | Three-dimensional measuring device removably coupled to robotic arm on motorized mobile platform |
CN106841206B (en) * | 2016-12-19 | 2018-07-24 | 大连理工大学 | Non-contact online inspection method for chemical milling cutting of heavy parts |
CN107543495B (en) * | 2017-02-17 | 2019-02-22 | 北京卫星环境工程研究所 | Spacecraft equipment autocollimation measuring system, alignment method and measurement method |
CN106959080B (en) * | 2017-04-10 | 2019-04-05 | 上海交通大学 | Optical measurement system and method for three-dimensional topography of large complex curved components |
CN107782240B (en) * | 2017-09-27 | 2020-06-05 | 首都师范大学 | Two-dimensional laser scanner calibration method, system and device |
CN108444383B (en) * | 2018-03-08 | 2019-06-28 | 大连理工大学 | Integrated in-process measurement method based on combined vision and laser |
CN108871209B (en) * | 2018-07-27 | 2020-11-03 | 复旦大学 | Large-size workpiece moving measurement robot system and method |
CN108801142B (en) * | 2018-07-27 | 2020-10-16 | 复旦大学 | Double-movement measuring robot system and method for super-large-size workpiece |
CN109990701B (en) * | 2019-03-04 | 2020-07-10 | 华中科技大学 | Robot mobile measurement system and method for three-dimensional topography of large complex curved surfaces |
CN109916333A (en) * | 2019-04-04 | 2019-06-21 | 大连交通大学 | AGV-based high-precision three-dimensional reconstruction system and method for large-scale targets |
CN110672029A (en) * | 2019-08-30 | 2020-01-10 | 合肥学院 | Flexible robot measurement system for three-dimensional topography of large complex curved surfaces |
CN110686592B (en) * | 2019-09-04 | 2021-04-20 | 同济大学 | Combined measuring method for large-size target object |
CN110906880A (en) * | 2019-12-12 | 2020-03-24 | 中国科学院长春光学精密机械与物理研究所 | Object automatic three-dimensional laser scanning system and method |
CN111325723A (en) * | 2020-02-17 | 2020-06-23 | 杭州鼎热科技有限公司 | Hole site detection method, device and equipment |
CN111238375B (en) * | 2020-03-16 | 2022-06-03 | 北京卫星制造厂有限公司 | Laser tracker-based appearance reconstruction method for large-scale component of mobile detection robot |
CN111678459B (en) * | 2020-06-09 | 2021-10-08 | 杭州思看科技有限公司 | Three-dimensional scanning method, three-dimensional scanning system, and computer-readable storage medium |
CN111879235A (en) * | 2020-06-22 | 2020-11-03 | 杭州思看科技有限公司 | Three-dimensional scanning detection method and system for bent pipe and computer equipment |
2021
- 2021-02-05 CN CN202110162495.8A patent/CN112964196B/en active Active
- 2021-04-02 WO PCT/CN2021/085230 patent/WO2022165973A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2022165973A1 (en) | 2022-08-11 |
CN112964196A (en) | 2021-06-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112964196B (en) | Three-dimensional scanning method, system, electronic device and computer equipment | |
CN109285190B (en) | Object positioning method and device, electronic equipment and storage medium | |
CN114012731B (en) | Hand-eye calibration method and device, computer equipment and storage medium | |
US20170136626A1 (en) | Facilitating robot positioning | |
CN114739405B (en) | Scanning path adjustment method, device, automatic scanning system and computer equipment | |
EP1584426A1 (en) | Tool center point calibration system | |
CN110722558B (en) | Origin correction method and device for robot, controller and storage medium | |
CN111590593B (en) | Calibration method, device and system of mechanical arm and storage medium | |
CN114139857B (en) | Workpiece finishing working procedure correction method, system, storage medium and device | |
CN110815205A (en) | Calibration method, system and device of mobile robot | |
CN109636783B (en) | Method and device for determining arm length of robot, computer equipment and storage medium | |
CN112659129B (en) | Robot positioning method, device and system and computer equipment | |
CN114310901B (en) | Coordinate system calibration method, device, system and medium for robot | |
JPWO2018043524A1 (en) | Robot system, robot system control apparatus, and robot system control method | |
CN112102375A (en) | Method and device for detecting reliability of point cloud registration and mobile intelligent equipment | |
CN113781558B (en) | Robot vision locating method with decoupling gesture and position | |
CN113302027A (en) | Work coordinate generation device | |
CN117490633A (en) | Station transferring method, three-dimensional scanning method and three-dimensional scanning system of tracking equipment | |
JP2022152845A (en) | Calibration device for controlling robot | |
CN116019562A (en) | Robot control system and method | |
CN114929436B (en) | System and method for controlling robot, electronic device, and computer-readable medium | |
CN111812613A (en) | Mobile robot positioning monitoring method, device, equipment and medium | |
CN113515112B (en) | Robot moving method, apparatus, computer device and storage medium | |
CN109615658B (en) | Method and device for taking articles by robot, computer equipment and storage medium | |
CN112847350A (en) | Hand-eye calibration method, system, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||