CN112964196A - Three-dimensional scanning method, system, electronic device and computer equipment - Google Patents

Three-dimensional scanning method, system, electronic device and computer equipment Download PDF

Info

Publication number
CN112964196A
Authority
CN
China
Prior art keywords
coordinate system
head
scanning
tracking
global coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110162495.8A
Other languages
Chinese (zh)
Other versions
CN112964196B (en)
Inventor
王江峰
梅振
蒋鑫巍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Silidi Technology Co Ltd
Original Assignee
Hangzhou Silidi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Silidi Technology Co Ltd filed Critical Hangzhou Silidi Technology Co Ltd
Priority to CN202110162495.8A priority Critical patent/CN112964196B/en
Priority to PCT/CN2021/085230 priority patent/WO2022165973A1/en
Publication of CN112964196A publication Critical patent/CN112964196A/en
Application granted granted Critical
Publication of CN112964196B publication Critical patent/CN112964196B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0234Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
    • G05D1/0236Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides a three-dimensional scanning method, a three-dimensional scanning system, an electronic device and a computer device. A movable device carrying a robot and a scanner is controlled to move; when the movable device carrying the robot moves to a target scanning area, point cloud data in the scanning head coordinate system are acquired, and the point cloud data of the scanned object surface are converted into a global coordinate system according to the relative pose of the scanning head and the tracking head, so that point cloud data of the scanned object in the target scanning area in the global coordinate system are obtained. Scanning and data splicing of the scanned object are thereby completed, which solves the problems of limited scanning range and low splicing precision in the three-dimensional scanning process.

Description

Three-dimensional scanning method, system, electronic device and computer equipment
Technical Field
The present application relates to the field of three-dimensional scanning technology for robots, and in particular, to a three-dimensional scanning method, a three-dimensional scanning system, an electronic apparatus, and a computer device.
Background
When a robot carries a scanner to scan a workpiece whose size exceeds the scanning range, the limited working space and the singular postures of the robot make it difficult to meet the requirements of the scanning work. At present, scanning of such a workpiece can be achieved by introducing an external axis, but in this approach the movement range of the robot is limited by the length of the guide rail, so the scanning range remains limited, and the splicing precision of the workpiece scanning data obtained by the robot moving along the guide rail is low.
No effective solution has yet been proposed for the problems of limited scanning range and low splicing precision in the current three-dimensional scanning process.
Disclosure of Invention
The embodiment of the application provides a three-dimensional scanning method, a three-dimensional scanning system, an electronic device and computer equipment, and aims to at least solve the problems of limited scanning range and low splicing precision in the three-dimensional scanning process in the related technology.
In a first aspect, an embodiment of the present application provides a three-dimensional scanning method, which is used in a three-dimensional scanning system, where the three-dimensional scanning system includes a scanner, a robot, and a movable device, the scanner includes a scanning head and a tracking head, the scanning head is mounted at a distal end of the robot, and the robot is mounted on the movable device, and the method includes the following steps:
when a movable device carrying the robot moves to a target scanning area, point cloud data of the surface of a scanned object in the target scanning area under a scanning head coordinate system are acquired;
and according to the relative position and posture of the scanning head and the tracking head, converting the point cloud data of the surface of the scanned object under the scanning head coordinate system into the global coordinate system of the three-dimensional scanning system to obtain the point cloud data of the surface of the scanned object in the target scanning area under the global coordinate system.
In some of these embodiments, the tracking head is mounted on the movable device carrying the robot, wherein the number of the tracking head, the robot and the movable device is at least one.
In some embodiments, there are a plurality of movable apparatuses, at least one of the movable apparatuses is provided with the robot, and at least one of the other movable apparatuses not provided with the robot is provided with the tracking head.
In some embodiments, the converting the point cloud data of the scanned object surface in the scanning head coordinate system into the global coordinate system of the three-dimensional scanning system according to the relative poses of the scanning head and the tracking head includes the following steps:
converting point cloud data of the surface of the scanned object under a scanning head coordinate system into the tracking head coordinate system according to the relative poses of the scanning head and the tracking head;
and calculating the relative pose of the tracking head and the global coordinate system, and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the global coordinate system.
In some embodiments, the three-dimensional scanning system includes a global tracking device, and the method calculates the relative pose of the tracking head and the global coordinate system, and converts the point cloud data of the scanned object surface under the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the global coordinate system, and includes the following steps:
acquiring the relative pose of the tracking head and an identifier on the tracking head;
acquiring the pose of the identifier under a global coordinate system of the global tracking device;
and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the identifier and the pose of the identifier under the global coordinate system.
In some of these embodiments, the three-dimensional scanning system includes an industrial camera, and the calculating the relative pose of the tracking head and the global coordinate system and the translating the point cloud data of the scanned object surface under the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the global coordinate system further includes the following steps:
acquiring all mark point data under the global coordinate system in the target scanning area through the industrial camera;
acquiring partial mark point data under the tracking head coordinate system in the target scanning area;
and comparing all mark point data under the global coordinate system with part of mark point data under the tracking head coordinate system, and calculating to obtain the relative pose of the tracking head under the global coordinate system.
In some of these embodiments, the tracking head is stationary relative to the scanned object, and the tracking head coordinate system is the global coordinate system.
In some embodiments, after the point cloud data of the scanned object surface in the target scanning area is converted into the global coordinate system, the movable device is controlled to move into the next scanning area, and the next scanning area is taken as the target scanning area.
In some embodiments, the movable device is an AGV cart that moves along a predetermined travel path.
In a second aspect, an embodiment of the present application provides a three-dimensional scanning system, including a scanner, a robot, and a movable device, where the scanner includes a scanning head and a tracking head, the scanning head is mounted at a tail end of the robot, the robot is mounted on the movable device, the movable device includes a motion control module and a data acquisition and processing module, and the motion control module controls motions of the robot and the movable device through a network;
the scanning head scans along with the motion of the robot and is used for acquiring point cloud data of the surface of a scanned object in a target scanning area under a scanning head coordinate system when a movable device carrying the robot moves to the target scanning area;
the tracking head is used for tracking the pose of the scanning head;
the data acquisition processing module is used for calculating the relative position and posture of a tracking head and a scanning head, and converting the point cloud data of the surface of the scanned object under the scanning head coordinate system into the global coordinate system of the three-dimensional scanning system according to the relative position and posture to obtain the point cloud data of the surface of the scanned object in the target scanning area under the global coordinate system.
In a third aspect, an embodiment of the present application provides an electronic apparatus, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and the processor implements the three-dimensional scanning method according to the first aspect when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and the processor implements the three-dimensional scanning method as described in the first aspect when executing the computer program.
According to the three-dimensional scanning method, system, electronic device and computer equipment, the movable device carrying the robot and the scanner is controlled to move; when the movable device carrying the robot moves to the target scanning area, point cloud data in the scanning head coordinate system are obtained, and the point cloud data of the scanned object surface are converted into the global coordinate system according to the relative pose of the scanning head and the tracking head, so that point cloud data of the scanned object in the target scanning area in the global coordinate system are obtained. Scanning and data splicing of the scanned object are thereby completed, which solves the problems of limited scanning range and low splicing precision in the three-dimensional scanning process.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
Fig. 1 is a first application environment diagram of a three-dimensional scanning method according to an embodiment of the present invention;
Fig. 2 is a second application environment diagram of a three-dimensional scanning method according to an embodiment of the present invention;
Fig. 3 is a third application environment diagram of a three-dimensional scanning method according to an embodiment of the present invention;
Fig. 4 is a flow chart of a three-dimensional scanning method according to an embodiment of the invention;
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the invention;
Fig. 6 is a schematic structural diagram of a computer device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms referred to herein shall have the ordinary meaning as understood by those of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar words throughout this application are not to be construed as limiting in number, and may refer to the singular or the plural. The present application is directed to the use of the terms "including," "comprising," "having," and any variations thereof, which are intended to cover non-exclusive inclusions; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Reference to "connected," "coupled," and the like in this application is not intended to be limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference herein to "a plurality" means greater than or equal to two. "and/or" describes an association relationship of associated objects, meaning that three relationships may exist, for example, "A and/or B" may mean: a exists alone, A and B exist simultaneously, and B exists alone. Reference herein to the terms "first," "second," "third," and the like, are merely to distinguish similar objects and do not denote a particular ordering for the objects.
Fig. 1 is a first application environment diagram of a three-dimensional scanning method according to an embodiment of the present disclosure. As shown in fig. 1, in this application environment the scanner includes two parts, a scanning head 101 and a tracking head 102. The scanning head 101 is mounted at the end of a robot 103, and the tracking head 102 and the robot 103 are mounted on the same movable device 104 and move along with the movement of the movable device 104. When the movable device 104 moves to a certain target scanning area, the end of the robot 103 holds the scanning head 101 to scan an object 106 to be scanned, and a fixedly mounted global tracking device 105 captures the tracking head 102. The scanning head 101 and the tracking head 102 may be the scanning head and the tracker of a tracking scanner, respectively, the robot 103 may specifically be an industrial robot arm, the movable device 104 may be an AGV (Automated Guided Vehicle) cart having an omnidirectional traveling function, and the global tracking device 105 may be a laser tracker.
The scanning head, the tracking head and the robot can be grouped into scanning combinations, and one scanning combination may include one or more scanning heads, tracking heads and robots. Each scanning combination is arranged on one movable device, a plurality of movable devices can be provided to carry the scanning combinations for scanning, and the global tracking device tracks the tracking heads of the plurality of scanning combinations.
Fig. 2 is a second application environment diagram of the three-dimensional scanning method in an embodiment provided by the present application. As shown in fig. 2, this application environment further includes another movable device 107. The scanner includes two parts, a scanning head 101 and a tracking head 102, wherein the scanning head 101 is mounted at the end of the robot 103, the robot 103 is mounted on the movable device 104, and the tracking head 102 is mounted on the other movable device 107 that does not carry the robot 103. When the movable device 104 moves to a certain target scanning area, the end of the robot 103 holds the scanning head 101 to scan an object 106 to be scanned, the fixedly mounted global tracking device 105 captures the tracking head 102, and the tracking head 102 calculates the relative pose between the scanning head 101 and the tracking head 102 by capturing the mark points on the scanning head 101.
Fig. 3 is a third application environment diagram of a three-dimensional scanning method according to an embodiment of the present invention. As shown in fig. 3, in this application environment the scanner includes two parts, a scanning head 101 and a tracking head 102, wherein the scanning head 101 is mounted at the end of a robot 103, and the robot 103 is mounted on a movable device 104 and moves along with the movement of the movable device 104. When the movable device 104 moves to a certain target scanning area, the end of the robot 103 holds the scanning head 101 to scan an object 106 to be scanned; the tracking head 102 is fixed relative to the object 106 to be scanned, and the coordinate system of the tracking head 102 serves as the global coordinate system.
The embodiment provides a three-dimensional scanning method, as shown in fig. 4, including the following steps:
step S201, when the movable device carrying the robot moves to the target scanning area, acquiring point cloud data of the surface of the scanned object in the target scanning area under the scanning head coordinate system.
Specifically, the movable device may be a smart mobile device, such as an AGV, having an omnidirectional movement function and capable of being controlled by a motion command. When the AGV trolley moves to a specified target scanning area, the robot carried on the AGV trolley can clamp the scanning head of the scanner to move in a scanning range, the surface of a scanned object in the target scanning area is scanned, and point cloud data of the surface of the scanned object under a scanning head coordinate system is obtained. The surface of the scanned object may be a local surface that can be scanned by the scanning head within a fixed scanning range in the target scanning area, or may be the entire surface of one or more scanned objects in the target scanning area. The target scanning area may be a scanning area that is previously designated according to the characteristics of the scanned object, the inherent properties of the scanner, the motion characteristics of the movable device, and the like.
In the case where the size of the scanned object exceeds the scanning range of the scanner, the entire scanning area where the scanned object is located needs to be divided into a plurality of scanning areas, and the target scanning area may be any one of these scanning areas.
Furthermore, the motion control module in the control device on the AGV trolley can send instructions to the controllers of the AGV trolley and of the robot carried on it, so as to control the movement of the AGV trolley and the robot; the motion control module is in signal connection with the controller of the AGV trolley or of the robot through a network. When the AGV trolley reaches the target scanning area, the AGV trolley is kept fixed while the robot holding the scanning head is controlled to scan the surface of the scanned object. Scanning through the coordinated movement of the AGV trolley and the robot enlarges the working range of the robot.
Step S202, according to the relative pose of the scanning head and the tracking head, the point cloud data of the scanned object surface in the scanning head coordinate system are converted into the global coordinate system of the three-dimensional scanning system, and the point cloud data of the scanned object surface in the target scanning area in the global coordinate system are obtained.
The scanner includes a tracking head and a scanning head. The scanning head scans the surface of the scanned object to acquire point cloud data of the scanned object surface in the scanning head coordinate system, while the tracking head takes pictures within its visual range and calculates the relative pose of the scanning head and the tracking head in real time by capturing the mark points on the scanning head. The robot may be an industrial robot arm; after the robot arm drives the scanning head to acquire the point cloud data of the scanned object surface within the target scanning range, the point cloud data in the scanning head coordinate system can be converted into point cloud data in the global coordinate system according to the calculated relative pose of the scanning head and the tracking head.
Further, due to the limitation of the scanning range, the surface data of the scanned object may not be completely obtained within a fixed target scanning area; only the data of the part of the scanned object surface that lies in the target scanning area can be obtained in a local coordinate system, that is, the point cloud data of the scanned object surface in the scanning head coordinate system within the target scanning area. In order to unify the coordinate systems of the point cloud data obtained in different scanning areas, a global coordinate system needs to be specified, and the point cloud data in the target scanning area in the scanning head coordinate system need to be converted into this unified global coordinate system.
The global coordinate system is a reference coordinate system that is fixed relative to the scanned object. It may be determined by a global tracking device in the three-dimensional scanning system, such as a laser tracker; or by pasting mark points over the whole scanned area and obtaining the coordinates of the global mark points; or by fixing the tracking head relative to the scanned object and using the coordinate system of the tracking head as the global coordinate system. After the global coordinate system is determined, the point cloud data of the scanned object surface in the target scanning area in the scanning head coordinate system can be transferred to the global coordinate system through the relative pose of the scanning head and the tracking head. Determining a unified global coordinate system in this way can improve the splicing precision of the scanned object surface data.
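As a worked formulation of this conversion (a sketch using homogeneous transforms; the notation below is illustrative and does not appear in the patent text): writing T^A_B for the 4x4 pose of frame B expressed in frame A, a surface point p_scan measured in the scanning head coordinate system is mapped to the global coordinate system by chaining the pose of the scanning head in the tracking head frame with the pose of the tracking head in the global frame.

```latex
% Illustrative notation: T^{A}_{B} is the homogeneous pose of frame B expressed in frame A.
p_{\mathrm{global}}
  = T^{\mathrm{global}}_{\mathrm{track}}\; T^{\mathrm{track}}_{\mathrm{scan}}\; p_{\mathrm{scan}},
\qquad
T^{A}_{B} = \begin{bmatrix} R^{A}_{B} & t^{A}_{B} \\ 0 & 1 \end{bmatrix}.
```

Here T^track_scan is the relative pose computed by the tracking head from the mark points on the scanning head, and T^global_track follows from whichever of the three ways of determining the global coordinate system described above is used.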
The movable device carrying the robot and the scanner is controlled to move; when the movable device carrying the robot moves to the target scanning area, point cloud data in the scanning head coordinate system are obtained, and the point cloud data of the scanned object surface are converted into the global coordinate system according to the relative pose of the scanning head and the tracking head, so that point cloud data of the scanned object in the target scanning area in the global coordinate system are obtained. Scanning and data splicing of the scanned object are thereby completed, which solves the problems of limited scanning range and low splicing precision in the three-dimensional scanning process.
Further, in one embodiment, based on the above steps S201 to S202, the tracking head is mounted on the movable device carrying the robot, wherein the number of the tracking head, the robot, and the movable device is at least one.
In addition, in one embodiment, based on the above steps S201 to S202, there are a plurality of movable devices, wherein at least one of the movable devices has a robot mounted thereon, and at least one of the other movable devices that has no robot mounted thereon has a tracking head mounted thereon.
In the entire scanning area, the robot and the tracking head may be mounted on different movable devices, that is, the scanning head and the tracking head of the scanner may be mounted on different movable devices, respectively, wherein the movable device carrying the robot and the movable device mounted with the tracking head may be plural.
When the tracking head is mounted on the movable device carrying the robot, or on another movable device that does not carry a robot, converting the point cloud data of the scanned object surface in the scanning head coordinate system into the global coordinate system of the three-dimensional scanning system according to the relative pose of the scanning head and the tracking head includes the following steps:
step S301, according to the relative position of the scanning head and the tracking head, point cloud data of the surface of the scanned object under a scanning head coordinate system is converted into a tracking head coordinate system.
And step S302, calculating the relative position and posture of the tracking head and the global coordinate system, and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative position and posture of the tracking head and the global coordinate system.
Since the tracking head moves along with the movable device, the tracking head coordinate system is still a local coordinate system relative to the scanned object. After the point cloud data of the scanned object surface in the tracking head coordinate system are obtained in step S301, the point cloud data of the scanned object surface in the target scanning area in the global coordinate system are obtained through the relative pose between the tracking head and the global coordinate system.
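A minimal numerical sketch of these two steps, assuming each pose is available as a 4x4 homogeneous matrix; the function and variable names below are illustrative and not part of the patent:

```python
import numpy as np

def transform_points(T, points):
    """Apply a 4x4 homogeneous transform T to an (N, 3) array of points."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])  # (N, 4)
    return (homogeneous @ T.T)[:, :3]

# Poses assumed to be provided by the system (illustrative placeholders):
# T_track_scan  : pose of the scanning head frame in the tracking head frame,
#                 computed by the tracking head from the mark points on the scanning head.
# T_global_track: pose of the tracking head frame in the global coordinate system.
def scan_points_to_global(points_scan, T_track_scan, T_global_track):
    # Step S301: scanning head frame -> tracking head frame
    points_track = transform_points(T_track_scan, points_scan)
    # Step S302: tracking head frame -> global coordinate system
    points_global = transform_points(T_global_track, points_track)
    return points_global
```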
The global tracking device may specifically be a device capable of tracking and capturing the position of the scanner, such as a laser tracker, which is fixed at a certain position and does not change position with the movement of the movable device, so that the reference coordinate system of the laser tracker can be used as the global coordinate system of the whole scanning process.
In addition, a data acquisition and processing module can be installed in the control device; this module receives the point cloud data of the scanned object surface in the tracking head coordinate system through a network and/or Bluetooth, calculates the relative pose of the tracking head and the global coordinate system, and converts the point cloud data of the scanned object surface into the global coordinate system.
Furthermore, in an embodiment in which the three-dimensional scanning system includes a global tracking device, based on the above step S302, calculating the relative pose of the tracking head and the global coordinate system and converting the point cloud data of the scanned object surface in the tracking head coordinate system into the global coordinate system according to that relative pose includes the following steps:
step S401, the relative position and posture of the tracking head and the identifier on the tracking head are obtained.
The relative pose of the identifier and the tracking head is fixed and can be determined by calibration; the identifier may be, for example, a laser target ball.
Specifically, a calibration board carrying a series of mark points can be arranged at a fixed position in the target scanning area; the calibration board can also be replaced by the ground or a wall in the target scanning area. The relative pose of the identifier and the tracking head is then solved from the relative pose of the identifier and the global tracking device at the fixed position, the relative pose of the tracking head and the calibration board, and the known relative pose of the calibration board and the global tracking device.
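One way to write this calibration chain, under the same illustrative notation as above (T^A_B is the pose of frame B in frame A; the frame names are assumptions for the sketch): the tracking head observes the calibration board, the board pose in the global tracking device frame is known, and the global tracking device observes the identifier, so

```latex
T^{\mathrm{global}}_{\mathrm{track}}
  = T^{\mathrm{global}}_{\mathrm{board}}\,\bigl(T^{\mathrm{track}}_{\mathrm{board}}\bigr)^{-1},
\qquad
T^{\mathrm{track}}_{\mathrm{id}}
  = \bigl(T^{\mathrm{global}}_{\mathrm{track}}\bigr)^{-1}\,T^{\mathrm{global}}_{\mathrm{id}}.
```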
Step S402, acquiring the pose of the identifier in the global coordinate system of the global tracking device.
Since the identifier can be captured by the global tracking device, the relative pose of the identifier and the global tracking device can be acquired in real time during the movement of the identifier along with the movable device.
And S403, converting the point cloud data of the scanned object surface in the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the identifier and the pose of the identifier in the global coordinate system.
In space, the transformation between two coordinate systems is determined by their relative pose. Therefore, after the relative pose of the tracking head and the identifier and the pose of the identifier in the reference coordinate system of the global tracking device are obtained, the relative pose of the tracking head and the reference coordinate system can be calculated indirectly, and the point cloud data of the scanned object surface in the tracking head coordinate system can be converted into the reference coordinate system of the global tracking device.
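During scanning this composition can be evaluated directly. A brief sketch under the same illustrative notation (T_track_id comes from the calibration above, T_global_id is the identifier pose measured by the global tracking device; all names are placeholders):

```python
import numpy as np

def tracking_head_pose_in_global(T_global_id, T_track_id):
    """Pose of the tracking head in the global frame: T_global_track = T_global_id @ inv(T_track_id)."""
    return T_global_id @ np.linalg.inv(T_track_id)

def track_points_to_global(points_track, T_global_id, T_track_id):
    # Convert an (N, 3) point cloud from the tracking head frame into the global frame.
    T_global_track = tracking_head_pose_in_global(T_global_id, T_track_id)
    homogeneous = np.hstack([points_track, np.ones((len(points_track), 1))])
    return (homogeneous @ T_global_track.T)[:, :3]
```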
Additionally, in one embodiment, the three-dimensional scanning system includes an industrial camera, and calculating the relative pose of the tracking head and the global coordinate system and converting the point cloud data of the scanned object surface in the tracking head coordinate system into the global coordinate system according to that relative pose includes the following steps:
step S501, all mark point data in the target scanning area under the global coordinate system are obtained through the industrial camera.
In this embodiment, the global coordinate system is determined by pasting mark points on the object to be scanned and its surroundings and acquiring the data of all the mark points within the entire scanning area. Specifically, a photogrammetric device, such as an industrial camera, may be used to obtain the three-dimensional coordinates and directions of all the mark points in the global coordinate system.
Step S502, obtaining part of mark point data under the tracking head coordinate system in the target scanning area.
The mark points can be used to calibrate the relative pose between the tracking head and the global coordinate system. Therefore, while obtaining the relative pose between itself and the scanning head, the tracking head also acquires mark point data of part of the mark points in the target scanning area in the tracking head coordinate system; specifically, this data may be the three-dimensional coordinates and directions of those mark points in the tracking head coordinate system.
And S503, comparing all the mark point data in the global coordinate system with part of the mark point data in the tracking head coordinate system, and calculating to obtain the relative pose of the tracking head in the global coordinate system.
Because the data of the part of the mark points in the tracking head coordinate system expresses the three-dimensional positional relationship, in the tracking head coordinate system, of the mark points in the target scanning area, and the coordinates and directions of these mark points in the global coordinate system can be obtained through step S501, comparing the part of the mark points in the tracking head coordinate system with all the mark points in the global coordinate system determines the relative pose of the global coordinate system and the tracking head.
Specifically, the relative positional relationships among all the mark points are obtained together with the data of all the mark points in the global coordinate system; similarly, the relative positional relationships among the part of the mark points are obtained together with the data of the part of the mark points in the tracking head coordinate system. Using the relative positional relationships among the part of the mark points in the tracking head coordinate system and their coordinates and directions, the corresponding mark points are matched among all the mark points in the global coordinate system, so as to obtain the positions of the part of the mark points in the global coordinate system.
For example, the set of all the mark points in the global coordinate system is A = {x1, x2, x3, ..., xn}, and the set of the part of the mark points in the tracking head coordinate system is B = {x1, x2, x3}; set B is a subset of set A, but the coordinate system to which the mark points in set B belong differs from the coordinate system to which the mark points in set A belong. The coordinates and directions of the mark points in set B and the relative positional relationships among them are matched against the coordinates, directions and relative positional relationships of the mark points in set A to obtain the poses of the mark points of set B in the global coordinate system.
Finally, the pose of the tracking head in the global coordinate system is calculated from the poses of the part of the mark points in the global coordinate system and their data in the tracking head coordinate system.
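A compact sketch of this pose computation, assuming point-to-point correspondences between the subset seen by the tracking head and the global mark point set have already been established (for example by comparing inter-point distances); the SVD-based rigid fit below is a standard least-squares alignment and not necessarily the exact algorithm used in the patent:

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform mapping src points onto dst points.
    src, dst: (N, 3) arrays of corresponding mark point coordinates."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T                          # pose of the tracking head frame in the global frame

# markers_track : coordinates of the matched subset B in the tracking head frame
# markers_global: coordinates of the same mark points taken from set A (global frame)
# T_global_track = rigid_fit(markers_track, markers_global)
```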
In one embodiment, based on the above steps S201 to S202, the tracking head is fixed relative to the scanned object, and the tracking head coordinate system is a global coordinate system.
Specifically, the tracking head may be fixed in the entire scanning area and kept stationary relative to the scanned object, while the movable device carrying the robot moves along a preset movement path. In this case, the tracking head coordinate system can be used as the global coordinate system, and the point cloud data of the scanned object surface in the scanning head coordinate system are directly converted into the global coordinate system through the relative pose of the tracking head and the scanning head, so as to complete the splicing of the point cloud data of the scanned object in the global coordinate system.
In one embodiment, after the point cloud data of the scanned object surface in the target scanning area is converted into the global coordinate system, the movable device is controlled to move to the next scanning area, and the next scanning area is taken as the target scanning area.
After the conversion of the point cloud data of the scanned object surface in the target scanning area into the global coordinate system is completed, that is, after the scanning of the local surface of the scanned object in the target scanning area and its splicing in the global coordinate system are achieved, the movable device is controlled to move into the next scanning area, the next scanning area is taken as the target scanning area, and steps S201 to S502 are repeated until the scanning and splicing of the whole scanned object are completed. In this way, the surface data of the whole scanned object, or of all the scanned objects in the entire scanning area, are obtained in a unified global coordinate system.
In one embodiment, the movable device is an AGV cart that moves along a preset travel path. The preset travel path may be determined according to parameters such as the size of the scanned object, the inherent properties of the AGV, and the scanning range of the scanner.
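The overall scan-and-splice loop over scanning areas can then be summarized as follows. This is a schematic control flow with hypothetical helper names (move_to, scan_area, and so on), not the API of any real AGV or scanner SDK:

```python
import numpy as np

def scan_object(scan_areas, agv, scan_area, to_global):
    """Schematic loop: visit each preset scanning area, scan, and splice in the global frame.
    scan_areas : ordered list of target scanning areas along the preset travel path
    agv.move_to: hypothetical call that drives the AGV cart to an area and holds it still
    scan_area  : hypothetical callable returning point cloud data in the scanning head frame
    to_global  : callable converting that data into the global coordinate system"""
    spliced = []
    for area in scan_areas:               # each area becomes the target scanning area in turn
        agv.move_to(area)                 # the AGV stays fixed while the robot scans
        points_scan = scan_area(area)     # point cloud in the scanning head coordinate system
        spliced.append(to_global(points_scan, area))
    return np.vstack(spliced)             # surface data of the whole object in one global frame
```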
In the above steps, the movable device carrying the robot and the scanner is controlled to move, and when the movable device moves to the target scanning area, point cloud data in the scanning head coordinate system are obtained; this expands the scanning range of the robot-carried scanner and solves the problem of a limited scanning range during robot scanning. When the tracking head is mounted on a movable device, the point cloud data of the scanned object in the target scanning area in the tracking head coordinate system are first obtained according to the relative pose of the scanning head and the tracking head; a global coordinate system is then determined, the relative pose of the tracking head and the global coordinate system of the three-dimensional scanning system is calculated, and the point cloud data of the scanned object in the target scanning area in the global coordinate system are obtained. When the tracking head is fixed relative to the scanned object, the tracking head coordinate system is directly used as the global coordinate system to complete the conversion of the point cloud data of the scanned object into the global coordinate system. After the scanning and splicing in the target scanning area are finished, the surfaces of the scanned objects in the other scanning areas continue to be scanned and their data spliced until the scanning and data splicing of the scanned objects are finished, which solves the problem of low splicing precision caused by the accumulation of splicing errors.
In one embodiment, a three-dimensional scanning system is provided, which comprises a scanner, a robot and a movable device, wherein the scanner comprises a scanning head and a tracking head, the scanning head is arranged at the tail end of the robot, the robot is arranged on the movable device, the movable device comprises a motion control module and a data acquisition and processing module, and the motion control module controls the motion of the robot and the movable device through a network;
the scanning head scans along with the movement of the robot and is used for acquiring point cloud data of the surface of a scanned object in a target scanning area under a scanning head coordinate system when a movable device carrying the robot moves to the target scanning area;
the tracking head is used for tracking the pose of the scanning head;
the data acquisition processing module is used for calculating the relative position and posture of the tracking head and the scanning head, and converting the point cloud data of the surface of the scanned object under the coordinate system of the scanning head into the global coordinate system of the three-dimensional scanning system according to the relative position and posture to obtain the point cloud data of the surface of the scanned object in the target scanning area under the global coordinate system.
According to the three-dimensional scanning system, the movable device carrying the robot and the scanner is controlled to move; when the movable device carrying the robot moves to the target scanning area, point cloud data in the scanning head coordinate system are obtained, and the point cloud data of the scanned object surface are converted into the global coordinate system according to the relative pose of the scanning head and the tracking head, so that point cloud data of the scanned object in the target scanning area in the global coordinate system are obtained. Scanning and data splicing of the scanned object are thereby completed, which solves the problems of limited scanning range and low splicing precision in the three-dimensional scanning process.
For specific limitations of embodiments of the three-dimensional scanning system, reference may be made to the above limitations of the three-dimensional scanning method, which are not described herein again. The various modules in the three-dimensional scanning system described above may be implemented in whole or in part by software, hardware, and combinations thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, as shown in FIG. 5, an electronic device is provided that includes a memory and a processor. The processor of the electronic device is configured to provide computing and control capabilities, and the memory stores a computer program. The memory of the electronic device includes a nonvolatile storage medium and an internal memory. The nonvolatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the nonvolatile storage medium.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor, when executing the computer program, implements the following steps:
when a movable device carrying the robot moves to a target scanning area, point cloud data of the surface of a scanned object in the target scanning area under a scanning head coordinate system are acquired;
and according to the relative pose of the scanning head and the tracking head, converting the point cloud data of the surface of the scanned object under the scanning head coordinate system into the global coordinate system of the three-dimensional scanning system to obtain the point cloud data of the surface of the scanned object in the target scanning area under the global coordinate system.
In one embodiment, the tracking head is mounted on a movable device carrying the robot, wherein the number of tracking heads, robots and movable devices is at least one.
In one embodiment, there are a plurality of movable devices, at least one of which has a robot mounted thereon, and at least one of the remaining movable devices which has no robot mounted thereon has a tracking head mounted thereon.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
converting point cloud data of the surface of the scanned object under a scanning head coordinate system into a tracking head coordinate system according to the relative position and posture of the scanning head and the tracking head;
and calculating the relative pose of the tracking head and the global coordinate system, and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the global coordinate system.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring the relative poses of the tracking head and the identifier on the tracking head;
acquiring the pose of the identifier under the global coordinate system of the global tracking device;
and converting the point cloud data of the surface of the scanned object under the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the identifier and the pose of the identifier under the global coordinate system.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring all mark point data under a global coordinate system in a target scanning area through an industrial camera;
acquiring partial mark point data under a tracking head coordinate system in a target scanning area;
and comparing all mark point data under the global coordinate system with part of mark point data under the tracking head coordinate system, and calculating to obtain the relative pose of the tracking head under the global coordinate system.
In one embodiment, the tracking head is stationary relative to the scanned object and the tracking head coordinate system is a global coordinate system.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
and after the point cloud data of the surface of the scanned object in the target scanning area is converted into the global coordinate system, controlling the movable device to move into the next scanning area, and taking the next scanning area as the target scanning area.
In one embodiment, the movable device is an AGV cart that moves along a predetermined travel path.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In one embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 6. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer device is used for storing a preset configuration information set. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement the three-dimensional scanning method described above.
In one embodiment, a computer device is provided, which may be a terminal. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a three-dimensional scanning method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 6 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware instructions of a computer program, which can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above examples only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (12)

1. A three-dimensional scanning method for use in a three-dimensional scanning system, the three-dimensional scanning system comprising a scanner, a robot and a movable device, the scanner comprising a scanning head and a tracking head, the scanning head being mounted at a distal end of the robot, and the robot being mounted on the movable device, the method comprising:
when the movable device carrying the robot moves to a target scanning area, acquiring point cloud data of a surface of a scanned object in the target scanning area in a scanning head coordinate system; and
converting, according to a relative pose of the scanning head and the tracking head, the point cloud data of the surface of the scanned object from the scanning head coordinate system into a global coordinate system of the three-dimensional scanning system, to obtain point cloud data of the surface of the scanned object in the target scanning area in the global coordinate system.
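For illustration only (the names and conventions below are assumptions, not part of the claimed method): with T_global_scanner denoting a 4x4 homogeneous pose that maps scanning-head coordinates into the global coordinate system, the conversion above reduces to one rigid transform applied to the measured cloud. A minimal NumPy sketch:

import numpy as np

def transform_points(T, points):
    # Apply a 4x4 homogeneous rigid transform T to an (N, 3) point cloud.
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    return (T @ homogeneous.T).T[:, :3]

# Placeholder data: a random cloud in the scanning head coordinate system and an
# identity pose standing in for the scanner-to-global transform derived from the
# relative pose of the scanning head and the tracking head.
points_scanner = np.random.rand(1000, 3)
T_global_scanner = np.eye(4)
points_global = transform_points(T_global_scanner, points_scanner)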
2. The method of claim 1, wherein the tracking head is mounted on the movable device on which the robot is mounted, and wherein there is at least one of each of the tracking head, the robot, and the movable device.
3. The method according to claim 1, wherein there are a plurality of the movable devices, the robot is mounted on at least one of the movable devices, and the tracking head is mounted on at least one movable device on which no robot is mounted.
4. The method according to claim 2 or 3, wherein converting the point cloud data of the surface of the scanned object from the scanning head coordinate system into the global coordinate system of the three-dimensional scanning system according to the relative pose of the scanning head and the tracking head comprises:
converting the point cloud data of the surface of the scanned object from the scanning head coordinate system into a tracking head coordinate system according to the relative pose of the scanning head and the tracking head; and
calculating a relative pose of the tracking head and the global coordinate system, and converting the point cloud data of the surface of the scanned object from the tracking head coordinate system into the global coordinate system according to that relative pose.
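A minimal sketch of the two-step route above, assuming 4x4 homogeneous poses T_tracker_scanner (scanning head expressed in the tracking head frame) and T_global_tracker (tracking head expressed in the global frame); all names are illustrative, not the patented implementation:

import numpy as np

def chain_to_global(points_scanner, T_tracker_scanner, T_global_tracker):
    # Step 1: scanning head coordinate system -> tracking head coordinate system.
    homogeneous = np.hstack([points_scanner, np.ones((points_scanner.shape[0], 1))]).T
    points_tracker = T_tracker_scanner @ homogeneous
    # Step 2: tracking head coordinate system -> global coordinate system.
    points_global = T_global_tracker @ points_tracker
    return points_global.T[:, :3]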
5. The method of claim 4, wherein the three-dimensional scanning system comprises a global tracking device, and wherein calculating the relative pose of the tracking head and the global coordinate system and converting the point cloud data of the surface of the scanned object from the tracking head coordinate system into the global coordinate system according to that relative pose comprises:
acquiring a relative pose of the tracking head and an identifier on the tracking head;
acquiring a pose of the identifier in a global coordinate system of the global tracking device; and
converting the point cloud data of the surface of the scanned object from the tracking head coordinate system into the global coordinate system according to the relative pose of the tracking head and the identifier and the pose of the identifier in the global coordinate system.
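Under the assumed convention that T_B_A maps coordinates from frame A to frame B, the chain above can be written as T_global_tracker = T_global_identifier · inv(T_tracker_identifier). A sketch under that assumption, with illustrative names only:

import numpy as np

def tracker_pose_in_global(T_global_identifier, T_tracker_identifier):
    # T_tracker_identifier: pose of the identifier expressed in the tracking head frame.
    # T_global_identifier:  pose of the identifier reported by the global tracking device.
    # The returned 4x4 pose maps tracking-head coordinates into the global frame and is
    # then applied to the tracker-frame point cloud.
    return T_global_identifier @ np.linalg.inv(T_tracker_identifier)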
6. The method of claim 4, wherein the three-dimensional scanning system comprises an industrial camera, and wherein calculating the relative pose of the tracking head and the global coordinate system and converting the point cloud data of the surface of the scanned object from the tracking head coordinate system into the global coordinate system according to that relative pose comprises:
acquiring, through the industrial camera, data of all marker points in the target scanning area in the global coordinate system;
acquiring data of some of the marker points in the target scanning area in the tracking head coordinate system; and
comparing the data of all the marker points in the global coordinate system with the data of the some of the marker points in the tracking head coordinate system, and calculating the pose of the tracking head in the global coordinate system.
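Once the marker points measured by the tracking head have been matched to their counterparts in the global set, the pose sought above is a standard least-squares rigid fit (Kabsch/Umeyama). A sketch that assumes the correspondences are already established (the matching step itself is not shown, and all names are illustrative):

import numpy as np

def rigid_fit(markers_tracker, markers_global):
    # Least-squares 4x4 transform mapping tracker-frame marker points onto the
    # corresponding global-frame marker points (rows are matched one-to-one).
    c_t = markers_tracker.mean(axis=0)
    c_g = markers_global.mean(axis=0)
    H = (markers_tracker - c_t).T @ (markers_global - c_g)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_g - R @ c_t
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T  # pose of the tracking head in the global coordinate system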
7. The method of claim 1, wherein the tracking head is stationary relative to the scanned object and the tracking head coordinate system is the global coordinate system.
8. The method according to claim 1, wherein after the point cloud data of the surface of the scanned object in the target scanning area are converted into the global coordinate system, the movable device is controlled to move to a next scanning area, and the next scanning area is taken as the target scanning area.
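A hypothetical station-to-station loop corresponding to this step; move_to, acquire_cloud, and scanner_pose_in_global are placeholder callables assumed for illustration, not APIs of the described system:

import numpy as np

def scan_all_areas(planned_areas, move_to, acquire_cloud, scanner_pose_in_global):
    # Visit each planned scanning area in turn; once a cloud has been expressed in
    # the global coordinate system, the next area becomes the target scanning area.
    merged = []
    for area in planned_areas:
        move_to(area)                                    # drive the movable device
        points_scanner = acquire_cloud(area)             # (N, 3) cloud, scanner frame
        T_global_scanner = scanner_pose_in_global(area)  # 4x4 pose from the tracking head
        homogeneous = np.hstack([points_scanner, np.ones((len(points_scanner), 1))])
        merged.append((T_global_scanner @ homogeneous.T).T[:, :3])
    return np.vstack(merged)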
9. The method of claim 1, wherein the movable device is an automated guided vehicle (AGV) that moves along a preset travel path.
10. A three-dimensional scanning system, comprising a scanner, a robot, and a movable device, wherein the scanner comprises a scanning head and a tracking head, the scanning head is mounted on a distal end of the robot, the robot is mounted on the movable device, the movable device comprises a motion control module and a data acquisition and processing module, and the motion control module controls the motion of the robot and the movable device through a network;
the scanning head scans as the robot moves and is configured to acquire point cloud data of a surface of a scanned object in a target scanning area in a scanning head coordinate system when the movable device carrying the robot moves to the target scanning area;
the tracking head is configured to track the pose of the scanning head; and
the data acquisition and processing module is configured to calculate a relative pose of the tracking head and the scanning head, and to convert the point cloud data of the surface of the scanned object from the scanning head coordinate system into a global coordinate system of the three-dimensional scanning system according to the relative pose, to obtain point cloud data of the surface of the scanned object in the target scanning area in the global coordinate system.
11. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the three-dimensional scanning method according to any one of claims 1 to 9.
12. A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the three-dimensional scanning method according to any one of claims 1 to 9.
CN202110162495.8A 2021-02-05 2021-02-05 Three-dimensional scanning method, system, electronic device and computer equipment Active CN112964196B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110162495.8A CN112964196B (en) 2021-02-05 2021-02-05 Three-dimensional scanning method, system, electronic device and computer equipment
PCT/CN2021/085230 WO2022165973A1 (en) 2021-02-05 2021-04-02 Three-dimensional scanning method and system, electronic device, and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110162495.8A CN112964196B (en) 2021-02-05 2021-02-05 Three-dimensional scanning method, system, electronic device and computer equipment

Publications (2)

Publication Number Publication Date
CN112964196A 2021-06-15
CN112964196B CN112964196B (en) 2023-01-03

Family

ID=76274687

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110162495.8A Active CN112964196B (en) 2021-02-05 2021-02-05 Three-dimensional scanning method, system, electronic device and computer equipment

Country Status (2)

Country Link
CN (1) CN112964196B (en)
WO (1) WO2022165973A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115830550A (en) * 2022-12-08 2023-03-21 亿咖通(湖北)技术有限公司 Method and device for detecting motion state of target
CN115830249B (en) * 2023-02-22 2023-06-13 思看科技(杭州)股份有限公司 Three-dimensional scanning method, three-dimensional measuring method, three-dimensional scanning system and electronic device
CN116593490B (en) * 2023-04-21 2024-02-02 无锡中车时代智能装备研究院有限公司 Nondestructive testing method and system for surface defects of soft rubber mold of wing wallboard
CN116550990B (en) * 2023-04-28 2023-12-08 中国长江电力股份有限公司 Mobile laser additive processing method and device for top cover of large-sized water turbine
CN116476070B (en) * 2023-05-22 2023-11-10 北京航空航天大学 Method for adjusting scanning measurement path of large-scale barrel part local characteristic robot
CN116437016B (en) * 2023-06-13 2023-10-10 武汉中观自动化科技有限公司 Object scanning method, device, electronic equipment and storage medium
CN116781837B (en) * 2023-08-25 2023-11-14 中南大学 Automatic change laser three-dimensional scanning system
CN118080205B (en) * 2024-04-24 2024-07-23 四川吉埃智能科技有限公司 Automatic spraying method and system based on vision
CN118293136B (en) * 2024-06-06 2024-08-13 新创碳谷集团有限公司 Automatic bonding control method for modularized blade box body

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19957366C1 (en) * 1999-11-29 2001-04-05 Daimler Chrysler Ag Measuring position determination method for object measurement has triangular grid defined by reference points on object surface with measuring point assigned to each triangle
US9964398B2 (en) * 2015-05-06 2018-05-08 Faro Technologies, Inc. Three-dimensional measuring device removably coupled to robotic arm on motorized mobile platform
CN108871209B (en) * 2018-07-27 2020-11-03 复旦大学 Large-size workpiece moving measurement robot system and method
CN110672029A (en) * 2019-08-30 2020-01-10 合肥学院 Flexible measuring system of large-scale complex curved surface three-dimensional shape robot
CN111238375B (en) * 2020-03-16 2022-06-03 北京卫星制造厂有限公司 Laser tracker-based appearance reconstruction method for large-scale component of mobile detection robot

Patent Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010032282A (en) * 2008-07-28 2010-02-12 Japan Atomic Energy Agency Method and system for measuring three-dimensional position of marker
JP2011237296A (en) * 2010-05-11 2011-11-24 Nippon Telegr & Teleph Corp <Ntt> Three dimensional shape measuring method, three dimensional shape measuring device, and program
TW201209374A (en) * 2010-08-19 2012-03-01 China Steel Corp Ship plate bending measurement method
CN104424630A (en) * 2013-08-20 2015-03-18 华为技术有限公司 Three-dimension reconstruction method and device, and mobile terminal
CN106841206A (en) * 2016-12-19 2017-06-13 大连理工大学 Untouched online inspection method is cut in heavy parts chemical milling
CN106846488A (en) * 2017-01-11 2017-06-13 江苏科技大学 A kind of large-sized object three-dimensional modeling and method based on many three-dimensional tracking devices
CN106918300A (en) * 2017-01-11 2017-07-04 江苏科技大学 A kind of large-sized object three-dimensional Measured data connection method based on many three-dimensional tracking devices
CN107543495A (en) * 2017-02-17 2018-01-05 北京卫星环境工程研究所 Spacecraft equipment autocollimation measuring system, alignment method and measuring method
CN106959080A (en) * 2017-04-10 2017-07-18 上海交通大学 A kind of large complicated carved components three-dimensional pattern optical measuring system and method
CN107782240A (en) * 2017-09-27 2018-03-09 首都师范大学 A kind of two dimensional laser scanning instrument scaling method, system and device
CN108444383A (en) * 2018-03-08 2018-08-24 大连理工大学 The box-like process integral measurement method of view-based access control model laser group
CN108801142A (en) * 2018-07-27 2018-11-13 复旦大学 A kind of super workpiece double-movement measurement robot system and method
CN109732600A (en) * 2018-12-29 2019-05-10 南京工程学院 A kind of Full-automatic sequential multi-drop measuring system and measurement method
CN109613519A (en) * 2019-01-11 2019-04-12 清华大学 Pairing attitude-adjusting method based on more laser trackers measurement field
CN109990701A (en) * 2019-03-04 2019-07-09 华中科技大学 A kind of large complicated carved three-dimensional appearance robot traverse measurement system and method
CN109916333A (en) * 2019-04-04 2019-06-21 大连交通大学 A kind of large scale target with high precision three-dimensional reconstruction system and method based on AGV
CN110260786A (en) * 2019-06-26 2019-09-20 华中科技大学 A kind of robot vision measuring system and its scaling method based on external trace
CN111133395A (en) * 2019-07-19 2020-05-08 爱佩仪测量设备有限公司 Intelligent manufacturing system
CN110686592A (en) * 2019-09-04 2020-01-14 同济大学 Combined measuring method for large-size target object
CN110906880A (en) * 2019-12-12 2020-03-24 中国科学院长春光学精密机械与物理研究所 Object automatic three-dimensional laser scanning system and method
CN111325723A (en) * 2020-02-17 2020-06-23 杭州鼎热科技有限公司 Hole site detection method, device and equipment
CN111678459A (en) * 2020-06-09 2020-09-18 杭州思看科技有限公司 Three-dimensional scanning method, three-dimensional scanning system, and computer-readable storage medium
CN111879235A (en) * 2020-06-22 2020-11-03 杭州思看科技有限公司 Three-dimensional scanning detection method and system for bent pipe and computer equipment

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113432561A (en) * 2021-08-02 2021-09-24 杭州思看科技有限公司 Data processing method and three-dimensional scanning system
CN113432561B (en) * 2021-08-02 2023-10-13 思看科技(杭州)股份有限公司 Data processing method and three-dimensional scanning system
CN113670202A (en) * 2021-08-25 2021-11-19 杭州思看科技有限公司 Three-dimensional scanning system and three-dimensional scanning method
CN113865506A (en) * 2021-09-09 2021-12-31 武汉惟景三维科技有限公司 Automatic three-dimensional measurement method and system for non-mark point splicing
CN113865506B (en) * 2021-09-09 2023-11-24 武汉惟景三维科技有限公司 Automatic three-dimensional measurement method and system without mark point splicing
CN113884021A (en) * 2021-09-24 2022-01-04 上海飞机制造有限公司 Scanning system, calibration device and calibration method of scanning system
CN114234838B (en) * 2021-11-19 2023-09-08 武汉尺子科技有限公司 3D scanning method and device
CN114234838A (en) * 2021-11-19 2022-03-25 武汉尺子科技有限公司 3D scanning method and device
CN114111627A (en) * 2021-12-07 2022-03-01 深圳市中图仪器股份有限公司 Scanning system and scanning method based on laser tracker
CN114739405A (en) * 2022-02-28 2022-07-12 杭州思看科技有限公司 Scanning path adjusting method and device, automatic scanning system and computer equipment
CN115493512B (en) * 2022-08-10 2023-06-13 思看科技(杭州)股份有限公司 Data processing method, three-dimensional scanning system, electronic device and storage medium
CN115493512A (en) * 2022-08-10 2022-12-20 思看科技(杭州)股份有限公司 Data processing method, three-dimensional scanning system, electronic device, and storage medium
WO2024109795A1 (en) * 2022-11-25 2024-05-30 杭州先临天远三维检测技术有限公司 Scanning processing method and apparatus, device, and medium
CN115661369A (en) * 2022-12-14 2023-01-31 思看科技(杭州)股份有限公司 Three-dimensional scanning method, three-dimensional scanning control system and electronic device
CN116858127A (en) * 2023-07-18 2023-10-10 航天规划设计集团有限公司 Initiating explosive device appearance measurement method, system and device
CN117948915A (en) * 2024-03-18 2024-04-30 杭州非白三维科技有限公司 Multi-tracking-head optical tracking three-dimensional scanning method and system
CN117948915B (en) * 2024-03-18 2024-05-31 杭州非白三维科技有限公司 Multi-tracking-head optical tracking three-dimensional scanning method and system

Also Published As

Publication number Publication date
WO2022165973A1 (en) 2022-08-11
CN112964196B (en) 2023-01-03

Similar Documents

Publication Publication Date Title
CN112964196B (en) Three-dimensional scanning method, system, electronic device and computer equipment
CN109829953B (en) Image acquisition device calibration method and device, computer equipment and storage medium
CN114012731B (en) Hand-eye calibration method and device, computer equipment and storage medium
US20170136626A1 (en) Facilitating robot positioning
CN110856932B (en) Interference avoidance device and robot system
JP2005326944A (en) Device and method for generating map image by laser measurement
CN112659129B (en) Robot positioning method, device and system and computer equipment
CN114739405A (en) Scanning path adjusting method and device, automatic scanning system and computer equipment
CN111590593B (en) Calibration method, device and system of mechanical arm and storage medium
CN110722558B (en) Origin correction method and device for robot, controller and storage medium
CN114139857B (en) Workpiece finishing working procedure correction method, system, storage medium and device
CN109636783B (en) Method and device for determining arm length of robot, computer equipment and storage medium
JPWO2018043524A1 (en) Robot system, robot system control apparatus, and robot system control method
CN112894758A (en) Robot cleaning control system, method and device and computer equipment
CN114098980B (en) Camera pose adjustment method, space registration method, system and storage medium
CN114929436B (en) System and method for controlling robot, electronic device, and computer-readable medium
CN117490633A (en) Station transferring method, three-dimensional scanning method and three-dimensional scanning system of tracking equipment
CN116019562A (en) Robot control system and method
CN112847350B (en) Hand-eye calibration method, system, computer equipment and storage medium
CN113302027A (en) Work coordinate generation device
CN112923889B (en) Scanning method, device, three-dimensional scanning system, electronic device and storage medium
EP4094043A1 (en) Method and apparatus for estimating system state
CN116330303B (en) SCARA robot motion control method, SCARA robot motion control device, SCARA robot motion control terminal equipment and SCARA robot motion control medium
KR102424378B1 (en) Method and Apparatus for Position Calibation for Robot Using 3D Scanner
US20230191612A1 (en) Coordinate system setting system and position/orientation measurement system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant