WO2022068510A1 - Three-dimensional data splicing method, three-dimensional scanning system, and handheld scanner (三维数据拼接方法及三维扫描系统、手持扫描仪) - Google Patents

Three-dimensional data splicing method, three-dimensional scanning system, and handheld scanner

Info

Publication number
WO2022068510A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
measurement module
point cloud
measured
splicing
Prior art date
Application number
PCT/CN2021/116044
Other languages
English (en)
French (fr)
Inventor
赵晓波
王文斌
Original Assignee
先临三维科技股份有限公司
Priority date
Filing date
Publication date
Application filed by 先临三维科技股份有限公司 filed Critical 先临三维科技股份有限公司
Publication of WO2022068510A1 publication Critical patent/WO2022068510A1/zh

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/50 - Depth or shape recovery
    • G06T 7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/50 - Depth or shape recovery
    • G06T 7/55 - Depth or shape recovery from multiple images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10004 - Still image; Photographic image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10028 - Range image; Depth image; 3D point clouds

Definitions

  • the invention relates to the field of three-dimensional digitization, in particular, to a three-dimensional data splicing method, a three-dimensional scanning system, and a hand-held scanner.
  • in the related art, 3D scanning based on 3D reconstruction of fringe images requires manually attaching marker points or features to the surface of the measured object.
  • photogrammetry is first used to photograph the markers and obtain their 3D data.
  • the 3D data of the marker points or features is then imported, and the scanner is moved around the measured object so that the markers or features can be used for stitched scanning; after scanning, the pasted markers must be removed manually, which wastes time and labor.
  • the present application provides a 3D data splicing method, a 3D scanning system, and a handheld scanner, so as to at least solve the time-consuming and laborious technical problem in the related art that the point cloud data is spliced and scanned in real time by frequent manual sticking.
  • a three-dimensional scanning system configured to obtain three-dimensional data of a measured object, including: a measurement module configured to acquire a target image of the surface of the measured object, the target image including a first image and a second image; and a computer terminal configured to perform the following steps: performing three-dimensional reconstruction on the target image, obtaining a first point cloud based on the first image and a second point cloud based on the second image; determining the splicing transformation relationships among multiple pieces of the first point cloud; and splicing the corresponding multiple pieces of the second point cloud based on those splicing transformation relationships.
  • the measurement module includes: a first measurement module configured to acquire a first image of the surface of the object to be measured; and a second measurement module configured to acquire a second image of the surface of the object to be measured.
  • the first measurement module includes: a speckle projector configured to project a speckle image of a first waveband onto the surface of the object to be measured, and a first acquisition module configured to collect the speckle image projected onto that surface to obtain the first image; and/or the second measurement module includes: a fringe projection module configured to project a fringe image of a second waveband onto the surface of the measured object, and a second acquisition module configured to collect the fringe image projected onto that surface to obtain the second image.
  • splicing the corresponding multiple pieces of the second point cloud includes: acquiring the pose transformation relationship between the first measurement module and the second measurement module, and splicing the corresponding multiple pieces of the second point cloud based on that pose transformation relationship and the splicing transformation relationships of the multiple pieces of the first point cloud.
  • the measurement module includes: a speckle projector configured to project a speckle image of a first waveband onto the surface of the object to be measured; a fringe projection module configured to project a fringe image of a second waveband onto that surface; and a collection module configured to collect a target image projected onto the surface, the target image including a first image corresponding to the speckle image and a second image corresponding to the fringe image.
  • the first measurement module and the second measurement module are calibrated in advance to obtain the pose transformation relationship between the first measurement module and the second measurement module.
  • the measurement module can be moved to measure.
  • a three-dimensional data stitching method configured to obtain three-dimensional data of a measured object, including: acquiring a first image and a second image, where the first image and the second image are target images of the surface of the measured object acquired by a measurement module; performing three-dimensional reconstruction on the target images, obtaining a first point cloud based on the first image and a second point cloud based on the second image; determining the splicing transformation relationships among multiple pieces of the first point cloud; and splicing the corresponding multiple pieces of the second point cloud based on those relationships.
  • a hand-held scanner is also provided. The hand-held scanner is connected to a computer terminal, and a program running in the computer terminal executes the above three-dimensional data stitching method. The hand-held scanner includes a measurement module configured to acquire a target image of the surface of the object to be measured and send it to the computer terminal, where the target image includes a first image and a second image.
  • a computer-readable storage medium includes a stored computer program, where, when the computer program runs, the device where the computer-readable storage medium is located is controlled to execute the above three-dimensional data stitching method.
  • in the present application, three-dimensional reconstruction is performed on the target image to obtain a first point cloud from the first image and a second point cloud from the second image; the splicing transformation relationships among multiple first point clouds are determined; and the corresponding multiple second point clouds are spliced based on those relationships.
  • in other words, the splicing operation between multiple second point clouds is realized by performing 3D reconstruction on the target image and using the multiple first point clouds to determine the splicing transformation relationships among them.
  • FIG. 1 is a schematic diagram of an optional three-dimensional scanning system according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of another optional three-dimensional scanning system according to an embodiment of the present invention.
  • FIG. 3 is a flowchart of an optional three-dimensional data stitching method according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram of an optional three-dimensional scanning system according to an embodiment of the present invention, which is used to obtain three-dimensional data of a scanned object.
  • the three-dimensional scanning system may include: a measurement module 11, a computer terminal 12, of which,
  • the measurement module 11 is configured to acquire a target image of the surface of the object to be measured, and the target image includes a first image and a second image.
  • the measurement module 11 includes: a first measurement module configured to acquire a first image of the surface of the object to be measured; and a second measurement module configured to acquire a second image of the surface of the object to be measured.
  • the first measurement module and the second measurement module are respectively provided with different collection modules.
  • the first measurement module includes a first collection module
  • the second measurement module includes a second collection module.
  • the first collection module and the second collection module are independent of each other.
  • the first measurement module includes: a speckle projector configured to project a speckle image of the first waveband onto the surface of the object to be measured, and a first acquisition module configured to collect the speckle image projected onto that surface to obtain the first image.
  • the relative positions of the first measurement module and the second measurement module (described below) remain unchanged during scanning. The two modules can be built into a fixed 3D scanner, with the measured object moving within the scanner's measurement range to realize multi-angle measurement; alternatively, they can be built into a mobile 3D scanner in which both modules are integrated, so that the scanner itself is moved and measures the object from different angles at different measurement times.
  • the speckle projector can project a speckle image to the surface of the object to be measured, and collect a two-dimensional image through the first acquisition module to obtain the first image, and obtain the first image for each measurement angle (each measurement moment), thereby obtaining the Multiple frames of the first image.
  • the optional first acquisition module includes at least one camera.
  • the measurement module may further include a filter, a fill light, and the like.
  • the first measurement module 11 and the second measurement module described below include a filter, a fill light, and the like, respectively.
  • the camera only collects images of the corresponding waveband.
  • the first acquisition module only collects the speckle image but not the fringe image.
  • the second acquisition module only collects fringe images but not speckle images; or the same acquisition module can collect both speckle images and fringe images.
  • a first image (which can be directly understood as an object surface image including a speckle image) can be obtained through the first measurement module, and a first point cloud can be obtained through three-dimensional reconstruction of the first image.
  • the number of points in the point cloud is large and dense, which can reflect the rich characteristics of the measured object, and realize the feature splicing of multiple first point clouds.
  • the scanner captures images of the surface of the object to be measured from multiple measurement angles, and obtains the first image.
  • Each frame of the first image can correspond to a measurement angle.
  • a single first point cloud is reconstructed from each frame of the first image, so multiple first point clouds are obtained; that is, first image, first point cloud, measurement moment, and measurement angle correspond one to one. In the embodiments below, one frame of the first image per measurement angle is taken as an illustrative example.
  • feature splicing is performed among the multiple first point clouds to determine the splicing transformation relationships among them, i.e., the pose transformation of the first measurement module over the course of acquiring the multiple first images (between the corresponding measurement angles/moments); this relationship is then used to assist the second point clouds in completing a high-precision stitching operation.
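  • the patent does not spell out how this feature splicing is computed. As one possible illustration (not the patent's stated algorithm), a splicing transformation [Ri, Ti] can be estimated from matched 3D features of two speckle point clouds with a closed-form least-squares rigid fit (Kabsch/SVD). The Python/NumPy sketch below assumes the point correspondences have already been established by an upstream feature-matching step; in practice such an estimate would typically be refined (e.g. by ICP) inside an outlier-rejection loop.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, T) with dst ~= R @ src + T.

    src, dst: (N, 3) arrays of matched 3D feature points from two first
    (speckle) point clouds; correspondences are assumed given.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                           # proper rotation, det = +1
    T = c_dst - R @ c_src
    return R, T
```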
  • the second measurement module includes: a fringe projection module, configured to project the fringe image of the second waveband to the surface of the object to be measured; a second acquisition module, configured to collect the fringe image projected to the surface of the object to be measured, to obtain second image.
  • the type of the fringe image includes at least one of the following: a laser fringe image and a structured light fringe image.
  • the fringe projection module in the second measurement module projects a fringe image (for example, a laser fringe image formed by a laser light source, or a structured-light fringe image formed by a structured-light source) onto the surface of the scanned object, and the second acquisition module then collects the image of the object surface; the second acquisition module includes at least two cameras, each of which captures an image of the surface of the object to be measured.
  • the target image includes the first image and the second image
  • the speckle projector and the fringe projection module synchronously project the image to a certain part of the surface of the object to be measured
  • the speckle image and the fringe image are presented on the part together.
  • the first acquisition module and the second acquisition module synchronously capture images of the surface of the object to be measured, obtaining a first image corresponding to the speckle image and a second image corresponding to the fringe image.
  • the first acquisition module only acquires the first image corresponding to the speckle image
  • the second acquisition module only acquires the second image corresponding to the fringe image
  • the first image and the second image are acquired synchronously.
  • the second measurement module and the first measurement module acquire the image of the surface of the object to be measured synchronously.
  • for example, the scanner acquires images of the measured-object surface from measurement angle A as follows: the speckle projector projects a speckle image onto the surface, the fringe projection module synchronously projects a fringe image onto the surface, the first acquisition module collects the speckle image to obtain first image A, and the second acquisition module synchronously collects the fringe image to obtain second image A; that is, the scanner acquires first image A and second image A simultaneously from angle A.
  • acquisition from measurement angle B proceeds in the same way, yielding first image B and second image B synchronously; as the scanner moves, the first and second images are acquired synchronously from multiple measurement angles.
  • the second image is reconstructed to obtain a second point cloud.
  • the scanner captures images of the surface of the object to be measured from multiple measurement angles to obtain the second image.
  • Each frame of the second image can correspond to a measurement angle.
  • 3D reconstruction of each frame of the second image yields a single corresponding second point cloud, so multiple second point clouds are obtained; that is, second image, second point cloud, measurement moment, and measurement angle correspond one to one.
  • it can be seen that the first point cloud and the second point cloud obtained at each measurement angle correspond to each other. When splicing the multiple second point clouds, the pose transformation relationship between the first measurement module and the second measurement module (i.e., between the first and second acquisition modules) and the splicing transformation relationships among the multiple first point clouds are used to achieve high-accuracy splicing of the second point clouds, and thus a high-precision 3D model of the measured object.
  • specifically, the scanner acquires images of the measured-object surface from measurement angle A: the first measurement module acquires first image A and the second measurement module synchronously acquires second image A; 3D reconstruction of first image A yields first point cloud A, and 3D reconstruction of second image A yields second point cloud A.
  • likewise, from measurement angle B the first measurement module acquires first image B and the second measurement module synchronously acquires second image B, yielding first point cloud B and second point cloud B; from measurement angle C, first image C and second image C are acquired synchronously, yielding first point cloud C and second point cloud C.
  • feature splicing of first point cloud A with first point cloud B gives the splicing transformation relationship [R1, T1], and feature splicing of first point cloud A with first point cloud C gives [R2, T2]; the pose transformation relationship [R, T] between the first and second measurement modules is obtained from calibration.
  • second point cloud A and second point cloud B are then spliced through [R, T] and [R1, T1], and second point cloud A and second point cloud C are spliced through [R, T] and [R2, T2], as sketched below.
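  • as a concrete illustration of how [R, T] and [R1, T1] combine (a sketch under assumed direction conventions, not a formula stated in the patent): writing each pair as a 4x4 homogeneous transform, the splicing transform of the second point clouds is the first-cloud transform conjugated by the fixed inter-module extrinsic, and it can then be applied directly to the fringe points.

```python
import numpy as np

def to_homogeneous(R, T):
    """Pack a 3x3 rotation R and 3-vector translation T into a 4x4 matrix."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = T
    return M

def second_cloud_transform(M_12, S1):
    """Derive the splicing transform [R1', T1'] of the second point clouds.

    M_12 : fixed 4x4 extrinsic [R, T] between the two measurement modules,
           assumed here to map second-module coordinates into first-module
           coordinates (the patent does not fix this convention).
    S1   : 4x4 splicing transform [R1, T1] between two first point clouds
           (first-module frame B -> frame A).
    """
    return np.linalg.inv(M_12) @ S1 @ M_12

# Usage sketch with hypothetical calibration and splicing values;
# cloud_B stands in for the fringe reconstruction at angle B.
M_12 = to_homogeneous(np.eye(3), np.array([0.05, 0.0, 0.0]))
S1 = to_homogeneous(np.eye(3), np.array([0.0, 0.1, 0.0]))
S1_prime = second_cloud_transform(M_12, S1)
cloud_B = np.random.rand(100, 3)
cloud_B_h = np.c_[cloud_B, np.ones(len(cloud_B))]
cloud_B_in_A = (S1_prime @ cloud_B_h.T).T[:, :3]   # second cloud B in A's frame
```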
  • preferably, the scanner acquires images from measurement angle A: the first measurement module acquires first image A, the second measurement module synchronously acquires second image A, and 3D reconstruction yields first point cloud A and second point cloud A; from measurement angle B, first image B and second image B are acquired synchronously and reconstructed into first point cloud B and second point cloud B.
  • feature splicing is performed between first point cloud A and first point cloud B (that is, between the first point cloud of the current frame and that of the previous frame), giving the splicing transformation relationship [R1, T1]; the pose transformation relationship [R, T] between the first and second measurement modules is obtained, and second point cloud A and second point cloud B are spliced through [R, T] and [R1, T1] (that is, the second point cloud of the current frame is spliced with that of the previous frame), yielding spliced second point cloud AB.
  • the scanner then acquires images from measurement angle C and reconstructs first point cloud C and second point cloud C; the spliced second point cloud AB is feature-spliced with first point cloud C to obtain the splicing transformation relationship [R2, T2]; the spliced second point cloud AB and second point cloud C are then spliced through [R, T] and [R2, T2], yielding spliced second point cloud ABC, and so on, until the overall point cloud of the measured object is obtained. A loop of this frame-by-frame process is sketched below.
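  • the frame-by-frame variant can be summarized as a small processing loop. The sketch below is illustrative only and simplifies the preferred example (it registers each new dense speckle cloud against the accumulated dense cloud rather than against the spliced second point cloud): `register(src, dst)` stands in for the feature splicing step (for instance the rigid-fit routine sketched earlier), `M_12` is the calibrated inter-module transform, and all direction conventions are assumptions.

```python
import numpy as np

def incremental_splice(first_clouds, second_clouds, register, M_12):
    """Frame-by-frame splicing sketch in the spirit of the preferred example.

    first_clouds, second_clouds : lists of (N, 3) arrays per measurement angle.
    register(src, dst)          : callable returning a 4x4 transform mapping
                                  src into dst's frame (feature splicing).
    M_12                        : fixed 4x4 inter-module extrinsic [R, T].
    """
    model_first = first_clouds[0].copy()     # accumulated dense (speckle) cloud
    model_second = second_clouds[0].copy()   # accumulated fringe cloud
    for k in range(1, len(first_clouds)):
        S_k = register(first_clouds[k], model_first)      # [Rk, Tk]
        S_k_prime = np.linalg.inv(M_12) @ S_k @ M_12      # [Rk', Tk']
        fh = np.c_[first_clouds[k], np.ones(len(first_clouds[k]))]
        sh = np.c_[second_clouds[k], np.ones(len(second_clouds[k]))]
        model_first = np.vstack([model_first, (S_k @ fh.T).T[:, :3]])
        model_second = np.vstack([model_second, (S_k_prime @ sh.T).T[:, :3]])
    return model_second
```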
  • the first waveband of the speckle projector projected speckle image is the same as the second waveband of the fringe image projected by the fringe projection module, or the first waveband and the second waveband do not interfere with each other. That is, the light bands projected by the two projection modules may be the same or different.
  • the light types emitted by the first wavelength band and the second wavelength band include, but are not limited to, visible light and invisible light.
  • the first wavelength band is an invisible light wavelength band.
  • the first wavelength band is the 815-845 nm wavelength band in the invisible light band.
  • the speckle image of the first band adopts a specific wavelength, which is 830nm.
  • if the speckle projector and the fringe projection module project light of the same waveband and work simultaneously, the single-frame image acquired by the first acquisition module includes both the first image and the second image, and the single-frame image synchronously acquired by the second acquisition module likewise includes both; that is, the speckle image and the fringe image appear in the same frame, and in practice only one of the acquisition modules needs to be used.
  • if the speckle projector and the fringe projection module project light of different wavebands and work simultaneously, the single-frame image acquired by the first acquisition module includes only the first image and the single-frame image acquired by the second acquisition module includes only the second image, so the first and second images obtained by the measurement module are two mutually independent frames.
  • of course, by configuring different filters, the first acquisition module and the second acquisition module can each also acquire the first image and the second image at the same time, i.e., a single frame contains both.
  • the first acquisition module and the second acquisition module may respectively include two cameras.
  • the first measurement module and the second measurement module are pre-installed in the handheld scanner with their positions fixed; therefore, the pose transformation relationship between the two modules can be determined directly by scanner calibration.
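  • the patent only states that this fixed pose transformation comes from scanner calibration. One common way to recover such an inter-module extrinsic (an assumption for illustration, not the patent's procedure) is to have both modules observe the same calibration target and chain the estimated poses:

$$ M_{12} = P_1 P_2^{-1}, \qquad X_1 = M_{12}\, X_2, $$

  where $P_i$ is the 4x4 pose mapping calibration-target coordinates into module $i$'s frame, so $M_{12}$ maps second-module coordinates into first-module coordinates.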
  • the first acquisition module and the second acquisition module can simultaneously perform image acquisition on the same part on the surface of the object to be measured, and after the acquisition is completed, continue to acquire the next part.
  • the first measurement module and the second measurement module share the same acquisition module.
  • the measurement module includes: a speckle projector configured to project a speckle pattern of the first waveband onto the surface of the object to be measured; a fringe projection module configured to project a fringe image of the second waveband onto that surface; and an acquisition module configured to collect the target image projected onto the surface.
  • the speckle projector and the fringe projection module project onto the surface of the object synchronously, so that both a speckle image and a fringe image are formed on the surface.
  • the acquisition module collects the image of the object surface, and a single-frame target image includes both the first image corresponding to the speckle image and the second image corresponding to the fringe image.
  • the speckle projector and the acquisition module form the first measurement module
  • the fringe projection module and the acquisition module form the second measurement module
  • the first measurement module and the second measurement module share the same acquisition module
  • the speckle image and the fringe image are imaged in the same frame image.
  • the first acquisition module and the second acquisition module may share one group of cameras (including at least two cameras); when a group of cameras is shared, the first waveband in which the speckle projector projects the speckle image is the same as the second waveband in which the fringe projection module projects the fringe image, and the single-frame target image obtained by each camera contains both the speckle image and the fringe image.
  • FIG. 2 is a schematic diagram of another optional three-dimensional scanning system according to an embodiment of the present invention.
  • in this three-dimensional scanning system, the acquisition modules of the first measurement module and the second measurement module share one group of cameras; multiple images are captured through this group of cameras, and each image contains both the speckle image and the fringe image.
  • one acquisition module includes two cameras, and the two cameras construct a binocular stereo vision system.
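  • the two cameras of an acquisition module recover a 3D point by triangulating a feature matched across their two images. Below is a minimal linear (DLT) triangulation sketch in Python/NumPy, given as background rather than as the patent's implementation; the 3x4 projection matrices `P1`, `P2` (from camera calibration) and the matched pixel coordinates are assumed to be available.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one matched point.

    P1, P2 : 3x4 projection matrices of the two cameras of one acquisition
             module (intrinsics combined with extrinsics).
    x1, x2 : (u, v) pixel coordinates of the same speckle/fringe feature
             observed in the two cameras.
    Returns the 3D point in the module's reference frame.
    """
    u1, v1 = x1
    u2, v2 = x2
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # dehomogenize
```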
  • since the speckle image and the fringe image projected onto the surface are acquired simultaneously by the same acquisition module, the splicing transformation relationship of the first point clouds is exactly the splicing transformation relationship of the second point clouds; the corresponding multiple second point clouds can therefore be spliced directly based on the splicing transformation relationships of the multiple first point clouds.
  • the above-mentioned handheld scanner can establish a communication connection relationship with the computer terminal, and then transmit the collected first image and second image to the computer terminal in real time.
  • the first point cloud and the second point cloud are obtained respectively.
  • the corresponding multiple second point clouds can then be stitched with the assistance of the feature stitching of the multiple first point clouds, yielding a three-dimensional digital model of the measured object.
  • the measurement module can move to measure.
  • the measurement module can be installed in the handheld scanner, so it can realize mobile measurement and realize 3D scanning measurement of the measured object.
  • the computer terminal 12 performs the following steps: performing three-dimensional reconstruction on the target image, obtaining a first point cloud based on the first image, and obtaining a second point cloud based on the second image; determining the splicing transformation relationship between multiple pieces of the first point cloud; The corresponding multiple second point clouds are spliced based on the splicing transformation relationship of the multiple first point clouds.
  • when the computer terminal performs the step of splicing the corresponding multiple second point clouds based on the splicing transformation relationships of the multiple first point clouds, it: acquires the pose transformation relationship between the first measurement module and the second measurement module, and splices the corresponding multiple second point clouds based on that pose transformation relationship and the splicing transformation relationships of the multiple first point clouds.
  • when images are acquired, each time point corresponds to a measurement angle, and at each measurement angle the measurement module acquires a target image; that is, measurement time point, measurement angle, and target image correspond one to one, and multiple frames of the target image are obtained by capturing the image projected onto the object surface from different angles at different time points.
  • for example, taking the first measurement module as the reference, a pose transformation relationship corresponding to the first measurement module is constructed, and its pose transformation across different time points is determined from the multiple first point clouds: the speckle images projected onto the object surface are acquired from different measurement angles at different time points to obtain the first images, and when analyzing the first point clouds it is necessary to determine the pose transformation the first measurement module undergoes between successive first-image acquisitions.
  • the multiple second point clouds are then spliced using the pose transformations of the first measurement module across time points together with the pose transformation relationship between the two measurement modules.
  • the above three-dimensional scanning system uses the measurement module to acquire a target image of the measured-object surface, the target image including a first image and a second image, and uses the computer terminal 12 to perform the following steps: perform three-dimensional reconstruction on the target image, obtaining a first point cloud from the first image and a second point cloud from the second image; determine the splicing transformation relationships among the multiple first point clouds; and splice the corresponding multiple second point clouds based on those relationships.
  • in this way, the large number of points in the first point clouds assists the splicing of the second point clouds corresponding to the second images, so that high-accuracy splicing of the second point clouds can be completed without externally attached marker points, realizing marker-free laser scanning and solving the time-consuming and laborious problem in the related art of frequently attaching markers by hand to stitch scanned point cloud data in real time.
  • regarding the pose transformation relationship between the first and second measurement modules: the device can be calibrated before scanning, and the pose transformation relationship [R, T] between the two modules is determined from this calibration; it reflects the positional relationship between the two measurement modules and can be represented by a rotation-translation matrix, as written out below.
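  • in standard matrix form (consistent with, though not spelled out in, the description), the rotation-translation pair [R, T] can be written as a single 4x4 homogeneous transform acting on homogeneous points:

$$ M = \begin{bmatrix} R & T \\ 0^{\top} & 1 \end{bmatrix}, \qquad \begin{bmatrix} X_1 \\ 1 \end{bmatrix} = M \begin{bmatrix} X_2 \\ 1 \end{bmatrix}, $$

  where $X_2$ is a point in the second measurement module's coordinate frame and $X_1$ the same point in the first module's frame (which frame maps to which is an assumption; the patent does not fix the direction).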
  • after receiving the target images collected by the measurement module, the computer terminal 12 can obtain multiple first point clouds of the measured object through three-dimensional reconstruction; feature splicing of these first point clouds yields the splicing transformation relationships [Ri, Ti] among them, i.e., the pose transformations of the first measurement module over the course of acquiring the corresponding frames of the target image.
  • the multiple second point clouds are unified into one coordinate system using [R, T] and [Ri, Ti], completing the splicing without attached markers. For example, splicing first point cloud A with first point cloud B gives the splicing transformation relationship [R1, T1] between them, i.e., the pose transformation of the first measurement module from the moment first image A was acquired to the moment first image B was acquired; splicing first point cloud A with first point cloud C gives [R2, T2], i.e., the pose transformation of the first measurement module from the acquisition of first image A to the acquisition of first image C.
  • because the pose transformation relationship [R, T] between the first and second measurement modules remains unchanged throughout scanning, [R, T] together with [R1, T1] determines the pose transformation of the second measurement module from the acquisition of second image A to the acquisition of second image B, i.e., the splicing transformation relationship [R1', T1'] between second point cloud A and second point cloud B, and the splicing of those two clouds is completed based on [R1', T1']; similarly, [R, T] together with [R2, T2] determines [R2', T2'] between second point cloud A and second point cloud C, and their splicing is completed based on [R2', T2']. The derivation is written out below.
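  • the reasoning in the previous bullets can be made explicit. Under the assumed conventions that $M = [R, T]$ maps second-module coordinates into first-module coordinates and that $[R_1, T_1]$ maps the first module's frame at angle B into its frame at angle A, a point observed by the second module satisfies

$$ X_{2A} = M^{-1}\big(S_1(M(X_{2B}))\big) \;\Longrightarrow\; R_1' = R^{\top} R_1 R, \qquad T_1' = R^{\top}\big(R_1 T + T_1 - T\big), $$

  i.e. the second-cloud splicing transform $[R_1', T_1']$ is the first-cloud transform conjugated by the fixed extrinsic; with the opposite convention for $[R, T]$ the conjugation simply flips to $M S_1 M^{-1}$.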
  • through the above embodiment, the measurement module 11 acquires the target images projected onto the surface of the measured object; the computer terminal 12 then performs three-dimensional reconstruction to obtain multiple first point clouds and determines the splicing transformation relationships among them, and reconstructs the second images to obtain multiple second point clouds.
  • based on the pose transformations of the measurement module while capturing the different images and the pose transformation relationship between the two measurement modules, the multiple second point clouds are spliced to obtain a three-dimensional digital model of the measured object, realizing marker-free scanning and splicing in the same waveband or in multiple wavebands, without externally attached marker points.
  • the first image is three-dimensionally reconstructed based on the binocular vision reconstruction principle to obtain the first point cloud
  • the second image is three-dimensionally reconstructed based on the binocular vision reconstruction principle to obtain the second point cloud.
  • the speckle image is reconstructed based on the binocular vision reconstruction principle to obtain the first point cloud
  • the fringe image is three-dimensionally reconstructed based on the binocular vision reconstruction principle to obtain the second point cloud.
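  • for reference (standard rectified stereo geometry, not stated in the patent), the depth of a matched point in a binocular pair follows directly from its disparity, which is why both the speckle and the fringe images can be reconstructed with the same binocular principle:

$$ Z = \frac{f\, b}{d}, $$

  where $f$ is the focal length in pixels, $b$ the baseline between the two cameras of the acquisition module, and $d$ the disparity of the matched feature.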
  • since the pose transformation relationship [R, T] between the first measurement module and the second measurement module remains unchanged during scanning, the splicing between the multiple second point clouds can be completed from [R, T] and the pose/splicing transformation relationships [Ri, Ti] among the multiple first point clouds.
  • this resolves the problems caused by fringe-image reconstruction alone: the resulting point cloud is small, sparse, and feature-poor, so stitching would otherwise require pasted marker points, and frequent manual attachment is time-consuming and labor-intensive.
  • meanwhile, the point cloud reconstructed from the fringe images is more accurate than that reconstructed from the speckle images, so the three-dimensional digital model of the measured object obtained by splicing the multiple second point clouds has high accuracy.
  • a three-dimensional scanning system is also provided, configured to obtain three-dimensional data of an object to be measured, and including: a first measurement module including at least a speckle projector configured to project a speckle image of a first waveband onto the surface of the object; a second measurement module including at least a fringe projection module configured to project a fringe image of a second waveband onto that surface; an acquisition module configured to collect images projected onto the surface at different time points; and a computer terminal that, after receiving the images, performs the following steps: step 1, performing three-dimensional reconstruction on the images to obtain multiple first point clouds and multiple second point clouds, where the first point clouds correspond to the speckle images projected by the first measurement module and the second point clouds correspond to the fringe images projected by the second measurement module; step 2, determining the pose transformation relationships of the first measurement module at different time points based on the multiple first point clouds; and step 3, splicing the multiple second point clouds based on those pose transformation relationships.
  • FIG. 3 is a flowchart of an optional three-dimensional data stitching method according to an embodiment of the present invention.
  • the three-dimensional data stitching method is used to obtain the three-dimensional data of the scanned object. As shown in FIG. 3 , the three-dimensional data stitching method includes:
  • Step S302 obtaining a first image and a second image, wherein the first image and the second image are target images of the surface of the object to be measured obtained by the measurement module;
  • Step S304 performing three-dimensional reconstruction on the target image, obtaining a first point cloud based on the first image, and obtaining a second point cloud based on the second image;
  • Step S306 determining the splicing conversion relationship between multiple pieces of the first point cloud
  • Step S308 splicing the corresponding multiple second point clouds based on the splicing transformation relationship between the multiple first point clouds.
  • in the present application, a first image and a second image are acquired, where they are target images of the measured-object surface acquired by the measurement module; three-dimensional reconstruction is performed on the target images to obtain a first point cloud from the first image and a second point cloud from the second image; the splicing transformation relationships among multiple first point clouds are determined; and the corresponding multiple second point clouds are spliced based on those relationships.
  • by performing 3D reconstruction on the target images and using the multiple first point clouds to determine the splicing transformation relationships among them, the splicing operation between the corresponding multiple second point clouds is realized: the large number of points in the first point clouds assists the splicing of the second point clouds, so higher-precision splicing can be completed without externally attached marker points, realizing marker-free laser scanning and solving the time-consuming and laborious problem in the related art of stitching scanned point cloud data in real time through frequent manual marker attachment.
  • This embodiment is applied to a computer terminal.
  • the computer terminal can be connected to a handheld scanner, receive the first image and the second image respectively collected by the measurement module in the handheld scanner, and complete the point cloud splicing of the object to be measured.
  • the above measurement module includes: a first measurement module configured to acquire a first image of the surface of the object to be measured; and a second measurement module configured to acquire a second image of the surface of the object to be measured.
  • the first measurement module includes: a speckle projector, configured to project a speckle image of the first waveband to the surface of the object to be measured; a first acquisition module, configured to collect the speckle image projected to the surface of the object to be measured, to obtain the first an image.
  • the second measurement module includes: a fringe projection module, configured to project the fringe image of the second waveband to the surface of the object to be measured; a second acquisition module, configured to collect the fringe image projected to the surface of the object to be measured, to obtain second image.
  • the type of fringe image includes at least one of the following: laser fringe image, structured light fringe image.
  • the first waveband and the second waveband are the same, or the first waveband and the second waveband do not interfere with each other.
  • when the computer terminal performs the step of splicing the corresponding multiple second point clouds based on the splicing transformation relationships of the multiple first point clouds, it: acquires the pose transformation relationship between the first measurement module and the second measurement module, and splices the multiple second point clouds based on that pose transformation relationship and the splicing transformation relationships of the multiple first point clouds.
  • the first measurement module and the second measurement module are calibrated in advance to obtain the pose transformation relationship between the first measurement module and the second measurement module.
  • optionally, the measurement module includes: a speckle projector configured to project a speckle image of the first waveband onto the surface of the object to be measured; a fringe projection module configured to project a fringe image of the second waveband onto that surface; and a collection module configured to collect a target image projected onto the surface, the target image including a first image corresponding to the speckle image and a second image corresponding to the fringe image.
  • the measurement module can be moved to measure.
  • an electronic device including: a processor; and a memory configured to store executable instructions of the processor; wherein the processor is configured to execute any one of the above by executing the executable instructions Item's 3D data stitching method.
  • a hand-held scanner is also provided.
  • the hand-held scanner is connected to a computer terminal, and a program running in the computer terminal executes the above-mentioned three-dimensional data stitching method.
  • the hand-held scanner includes a measurement module configured to acquire a target image of the surface of the object to be measured and send it to the computer terminal, where the target image includes a first image and a second image.
  • a computer-readable storage medium is also provided, where the computer-readable storage medium includes a stored computer program, wherein, when the computer program runs, the device where the computer-readable storage medium is located is controlled to execute any of the above 3D data stitching method.
  • the present application also provides a computer program product that, when executed on a data processing device, executes a program initialized with the following method steps: acquiring a first image and a second image, where the first image and the second image are target images of the measured-object surface acquired by the measurement module; performing three-dimensional reconstruction on the target images to obtain a first point cloud from the first image and a second point cloud from the second image; determining the splicing transformation relationships among multiple first point clouds; and splicing the corresponding multiple second point clouds based on those relationships.
  • the disclosed technical content can be implemented in other ways.
  • the device embodiments described above are only illustrative; for example, the division of units may be a logical functional division, and other divisions are possible in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of units or modules, and may be in electrical or other forms.
  • the units described as separate components may or may not be physically separated, and components shown as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium.
  • the technical solution of the present invention is essentially or the part that contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product, and the computer software product is stored in a storage medium , including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present invention.
  • the aforementioned storage medium includes: U disk, read-only memory (ROM, Read-Only Memory), random access memory (RAM, Random Access Memory), mobile hard disk, magnetic disk or optical disk and other media that can store program codes .
  • the solution provided by the embodiments of the present application can be applied to a three-dimensional scanning system and a handheld mobile three-dimensional scanner: three-dimensional reconstruction is performed on the target images, the splicing transformation relationships among the first point clouds are determined from the multiple first point clouds, and the splicing operation between the multiple second point clouds is thereby realized.
  • because the splicing of the second point clouds corresponding to the second images is assisted by the large number of points in the first point clouds, high-precision splicing of the second point clouds can be completed without externally attached marker points, realizing marker-free laser scanning and solving the time-consuming and laborious problem in the related art of stitching scanned point cloud data in real time through frequent manual marker attachment.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a three-dimensional data splicing method, a three-dimensional scanning system, and a handheld scanner. The method includes: acquiring a first image and a second image, where the first image and the second image are target images of the surface of a measured object acquired by a measurement module; performing three-dimensional reconstruction on the target images, obtaining a first point cloud based on the first image and a second point cloud based on the second image; determining the splicing transformation relationships among multiple pieces of the first point cloud; and splicing the corresponding multiple pieces of the second point cloud based on those relationships.

Description

Three-dimensional data splicing method, three-dimensional scanning system, and handheld scanner
This application claims priority to Chinese patent application No. 202011057282.0, entitled "Three-dimensional data splicing method, three-dimensional scanning system, and handheld scanner", filed with the Chinese Patent Office on September 29, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of three-dimensional digitization, and in particular to a three-dimensional data splicing method, a three-dimensional scanning system, and a handheld scanner.
Background
In the related art, 3D scanning based on 3D reconstruction of fringe images requires manually attaching marker points: marker points or features are pasted onto the surface of the measured object, photogrammetry is used to photograph the markers and obtain their 3D data, and the 3D data of the markers or features is then imported so that the scanner, moved around the measured object, can use them for stitched scanning. To ensure that all data can be unified into one coordinate system, the markers must be used for real-time stitching of the scanned point clouds, and after scanning the pasted markers must be removed manually, which wastes time and labor.
No effective solution to the above problem has yet been proposed.
Summary of the Invention
The present application provides a three-dimensional data splicing method, a three-dimensional scanning system, and a handheld scanner, so as to at least solve the technical problem in the related art that real-time stitching of scanned point cloud data relies on frequent manual attachment of marker points and is therefore time-consuming and laborious.
According to one aspect of the present application, a three-dimensional scanning system is provided, configured to obtain three-dimensional data of a measured object, and including: a measurement module configured to acquire a target image of the surface of the measured object, the target image including a first image and a second image; and a computer terminal configured to perform the following steps: performing three-dimensional reconstruction on the target image, obtaining a first point cloud based on the first image and a second point cloud based on the second image; determining the splicing transformation relationships among multiple pieces of the first point cloud; and splicing the corresponding multiple pieces of the second point cloud based on those splicing transformation relationships.
Optionally, the measurement module includes: a first measurement module configured to acquire a first image of the surface of the measured object, and a second measurement module configured to acquire a second image of the surface of the measured object.
Optionally, the first measurement module includes: a speckle projector configured to project a speckle image of a first waveband onto the surface of the measured object, and a first acquisition module configured to collect the speckle image projected onto the surface to obtain the first image; and/or the second measurement module includes: a fringe projection module configured to project a fringe image of a second waveband onto the surface of the measured object, and a second acquisition module configured to collect the fringe image projected onto the surface to obtain the second image.
Optionally, when the computer terminal performs the step of splicing the corresponding multiple pieces of the second point cloud based on the splicing transformation relationships of the multiple pieces of the first point cloud, it: acquires the pose transformation relationship between the first measurement module and the second measurement module, and splices the corresponding multiple pieces of the second point cloud based on that pose transformation relationship and the splicing transformation relationships of the multiple pieces of the first point cloud.
Optionally, the measurement module includes: a speckle projector configured to project a speckle image of a first waveband onto the surface of the measured object; a fringe projection module configured to project a fringe image of a second waveband onto the surface of the measured object; and a collection module configured to collect a target image projected onto the surface of the measured object, the target image including a first image corresponding to the speckle image and a second image corresponding to the fringe image.
Optionally, before scanning, the first measurement module and the second measurement module are calibrated in advance to obtain the pose transformation relationship between them.
Optionally, the measurement module can be moved to measure.
According to another aspect of the present application, a three-dimensional data splicing method is provided, configured to obtain three-dimensional data of a measured object, and including: acquiring a first image and a second image, where the first image and the second image are target images of the surface of the measured object acquired by a measurement module; performing three-dimensional reconstruction on the target images, obtaining a first point cloud based on the first image and a second point cloud based on the second image; determining the splicing transformation relationships among multiple pieces of the first point cloud; and splicing the corresponding multiple pieces of the second point cloud based on those relationships.
According to another aspect of the present application, a handheld scanner is provided. The handheld scanner is connected to a computer terminal, and a program running in the computer terminal executes the above three-dimensional data splicing method. The handheld scanner includes a measurement module configured to acquire a target image of the surface of the measured object and send the target image to the computer terminal, where the target image includes a first image and a second image.
According to another aspect of the present application, a computer-readable storage medium is provided. The computer-readable storage medium includes a stored computer program, and when the computer program runs, the device where the computer-readable storage medium is located is controlled to execute the above three-dimensional data splicing method.
In the present application, a first image and a second image are acquired, where they are target images of the measured-object surface acquired by the measurement module; three-dimensional reconstruction is performed on the target images to obtain a first point cloud from the first image and a second point cloud from the second image; the splicing transformation relationships among multiple first point clouds are determined; and the corresponding multiple second point clouds are spliced based on those relationships. By performing 3D reconstruction on the target images and using the multiple first point clouds to determine the splicing transformation relationships among them, the splicing operation between the multiple second point clouds is realized: the large number of points in the first point clouds assists the splicing of the second point clouds corresponding to the second images, so high-precision splicing of the second point clouds can be completed without externally attached marker points, realizing marker-free laser scanning and solving the time-consuming and laborious technical problem in the related art of stitching scanned point cloud data in real time through frequent manual marker attachment.
Brief Description of the Drawings
The drawings described here are provided for further understanding of the present invention and form a part of the present application; the illustrative embodiments of the present invention and their description are used to explain the invention and do not unduly limit it. In the drawings:
FIG. 1 is a schematic diagram of an optional three-dimensional scanning system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of another optional three-dimensional scanning system according to an embodiment of the present invention;
FIG. 3 is a flowchart of an optional three-dimensional data splicing method according to an embodiment of the present invention.
Detailed Description
To enable those skilled in the art to better understand the solution of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative work shall fall within the protection scope of the present invention.
It should be noted that the terms "first", "second", and the like in the specification, claims, and drawings of the present invention are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data so used may be interchanged where appropriate, so that the embodiments of the present invention described here can be implemented in orders other than those illustrated or described. In addition, the terms "including" and "having" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to the listed steps or units, but may include other steps or units that are not listed or that are inherent to the process, method, product, or device.
FIG. 1 is a schematic diagram of an optional three-dimensional scanning system according to an embodiment of the present invention, used to obtain three-dimensional data of a scanned object. As shown in FIG. 1, the three-dimensional scanning system may include a measurement module 11 and a computer terminal 12, where:
the measurement module 11 is configured to acquire a target image of the surface of the measured object, the target image including a first image and a second image.
Optionally, the measurement module 11 includes: a first measurement module configured to acquire a first image of the surface of the measured object, and a second measurement module configured to acquire a second image of the surface of the measured object.
In this embodiment of the invention, the first measurement module and the second measurement module are provided with different acquisition modules; specifically, the first measurement module includes a first acquisition module, the second measurement module includes a second acquisition module, and the two acquisition modules are independent of each other.
Optionally, the first measurement module includes: a speckle projector configured to project a speckle image of the first waveband onto the surface of the measured object, and a first acquisition module configured to collect the speckle image projected onto the surface to obtain the first image.
In this embodiment of the invention, the relative positions of the first measurement module and the second measurement module described below remain unchanged throughout scanning. The two modules can be built into a fixed 3D scanner, with the measured object moving within the scanner's measurement range to realize multi-angle measurement; alternatively, they can be built into a mobile 3D scanner in which both modules are integrated, so that the scanner can be moved to measure, i.e., the scanner is moved so that it measures the object from different angles at different measurement moments. The speckle projector projects a speckle image onto the surface of the measured object, and the first acquisition module collects a two-dimensional image to obtain the first image; a first image is acquired at each measurement angle (each measurement moment), so multiple frames of the first image are obtained. Optionally, the first acquisition module includes at least one camera.
Optionally, the measurement module may further include filters, fill lights, and the like; specifically, the first measurement module 11 and the second measurement module described below each include filters, fill lights, and the like. Through the choice of filters, a camera collects only images of the corresponding waveband: for example, when the speckle projector and the fringe projection module project onto the measured object synchronously, the first acquisition module collects only the speckle image and not the fringe image, and the second acquisition module collects only the fringe image and not the speckle image; alternatively, the same acquisition module may collect both the speckle image and the fringe image.
The first measurement module acquires the first image (which can be understood directly as an image of the object surface containing the speckle pattern), and the first point cloud is obtained by three-dimensional reconstruction of this first image. The present application exploits the fact that the point cloud reconstructed from the speckle image contains many densely distributed points and reflects rich features of the measured object, enabling feature splicing of multiple first point clouds. The scanner captures images of the object surface from multiple measurement angles to obtain first images, each frame of which can correspond to one measurement angle; three-dimensional reconstruction of each frame yields a single corresponding first point cloud, so multiple first point clouds are obtained from the first images, i.e., first image, first point cloud, measurement moment, and measurement angle correspond one to one. In the embodiments of the invention, one frame of the first image captured per measurement angle is taken as an illustrative example.
Through splicing among the multiple first point clouds (in this embodiment, feature splicing among them), the splicing transformation relationships among the multiple first point clouds can be determined, that is, the pose transformation of the first measurement module over the course of acquiring the multiple first images, i.e., its pose transformation between the corresponding measurement angles (measurement moments); this is used to assist the second point clouds in completing a high-precision splicing operation.
Optionally, the second measurement module includes: a fringe projection module configured to project a fringe image of the second waveband onto the surface of the measured object, and a second acquisition module configured to collect the fringe image projected onto the surface to obtain the second image.
In this embodiment of the invention, the type of fringe image includes at least one of the following: a laser fringe image and a structured-light fringe image.
The fringe projection module of the second measurement module projects a fringe image (for example, a laser fringe image formed by a laser light source or a structured-light fringe image formed by a structured-light source) onto the surface of the scanned object, and the second acquisition module then collects the image of the object surface; the second acquisition module includes at least two cameras, each of which captures an image of the surface of the measured object.
At this point the target image includes the first image and the second image. The speckle projector and the fringe projection module project synchronously onto a given part of the object surface, so that the speckle image and the fringe image appear on that part together; the first acquisition module and the second acquisition module synchronously capture images of the surface, obtaining a first image corresponding to the speckle image and a second image corresponding to the fringe image. In this embodiment, the first acquisition module acquires only the first image corresponding to the speckle image, the second acquisition module acquires only the second image corresponding to the fringe image, and the first and second images are acquired synchronously.
In this embodiment, the second measurement module and the first measurement module acquire images of the object surface synchronously. Specifically, the scanner (a handheld mobile three-dimensional scanner) acquires images of the object surface from measurement angle A as follows: the speckle projector projects a speckle image onto the surface, the fringe projection module synchronously projects a fringe image onto the surface, the first acquisition module collects the speckle image on the surface to obtain first image A, and the second acquisition module synchronously collects the fringe image on the surface to obtain second image A; that is, the scanner acquires first image A and second image A synchronously from angle A. Acquisition from measurement angle B proceeds in the same way, yielding first image B and second image B synchronously from angle B, and so on: as the scanner moves, it acquires the first image and the second image synchronously from multiple measurement angles.
The second image is reconstructed in three dimensions to obtain a second point cloud. The scanner captures images of the object surface from multiple measurement angles to obtain second images, each frame of which can correspond to one measurement angle; three-dimensional reconstruction of each frame of the second image yields a single corresponding second point cloud, so multiple second point clouds are obtained, i.e., second image, second point cloud, measurement moment, and measurement angle correspond one to one. It can be seen that the first point cloud and the second point cloud obtained at each measurement angle correspond to each other. When the multiple second point clouds are spliced, the pose transformation relationship between the first and second measurement modules (i.e., between the first and second acquisition modules) and the splicing transformation relationships among the multiple first point clouds are used to achieve high-accuracy splicing of the second point clouds, and thus a high-precision 3D model of the measured object.
Specifically, the scanner acquires images of the object surface from measurement angle A: the first measurement module acquires first image A, the second measurement module synchronously acquires second image A, and three-dimensional reconstruction yields first point cloud A and second point cloud A. From measurement angle B, first image B and second image B are acquired synchronously and reconstructed into first point cloud B and second point cloud B; from measurement angle C, first image C and second image C are acquired synchronously and reconstructed into first point cloud C and second point cloud C. Feature splicing of first point cloud A with first point cloud B gives the splicing transformation relationship [R1, T1], and feature splicing of first point cloud A with first point cloud C gives [R2, T2]; the pose transformation relationship [R, T] between the first and second measurement modules is obtained. Second point cloud A and second point cloud B are then spliced through [R, T] and [R1, T1], and second point cloud A and second point cloud C are spliced through [R, T] and [R2, T2].
Preferably, the scanner acquires images of the object surface from measurement angle A: the first measurement module acquires first image A, the second measurement module synchronously acquires second image A, and reconstruction yields first point cloud A and second point cloud A. From measurement angle B, first image B and second image B are acquired synchronously and reconstructed into first point cloud B and second point cloud B. Feature splicing is performed between first point cloud A and first point cloud B (i.e., between the first point cloud of the current frame and that of the previous frame), giving the splicing transformation relationship [R1, T1]; the pose transformation relationship [R, T] between the first and second measurement modules is obtained, and second point cloud A and second point cloud B are spliced through [R, T] and [R1, T1] (i.e., the second point cloud of the current frame is spliced with that of the previous frame), yielding spliced second point cloud AB. The scanner then acquires images from measurement angle C: first image C and second image C are acquired synchronously and reconstructed into first point cloud C and second point cloud C; the spliced second point cloud AB is feature-spliced with first point cloud C to obtain the splicing transformation relationship [R2, T2]; the spliced second point cloud AB and second point cloud C are then spliced through [R, T] and [R2, T2], yielding spliced second point cloud ABC; and so on, until the overall point cloud of the measured object is obtained. Each time the data of the current frame is acquired it can be processed (for example, spliced) with the data of previous frames, so that scanner data is acquired and processed in real time and the scan data of the measured object is obtained and displayed in real time.
In this embodiment of the invention, the first waveband in which the speckle projector projects the speckle image is the same as the second waveband in which the fringe projection module projects the fringe image, or the first waveband and the second waveband do not interfere with each other; that is, the light bands projected by the two projection modules may be the same or different. The light emitted in the first and second wavebands includes, but is not limited to, visible light and invisible light. In an optional embodiment, the first waveband is an invisible-light waveband; preferably, it is the 815-845 nm band within the invisible-light range, and further, the speckle image of the first waveband uses a specific wavelength of 830 nm. If the speckle projector and the fringe projection module project light of the same waveband and work simultaneously, the single-frame image acquired by the first acquisition module includes both the first image and the second image, and the single-frame image synchronously acquired by the second acquisition module likewise includes both; that is, the speckle image and the fringe image appear in the same frame, and in practice only one of the acquisition modules needs to be used. If the speckle projector and the fringe projection module project light of different wavebands and work simultaneously, the single-frame image acquired by the first acquisition module includes only the first image and that acquired by the second acquisition module includes only the second image, so the first and second images obtained by the measurement module are two mutually independent frames. Of course, by configuring different filters, the first acquisition module and the second acquisition module can each also acquire the first image and the second image at the same time, i.e., a single frame includes both the first image and the second image.
本申请中,第一采集模块和第二采集模块可以分别包括两个相机。
第一测量模组和第二测量模组预先安装在手持扫描仪中,两个模组的位置是固定,因此,可以通过扫描仪标定直接确定第一测量模组和第二测量模组之间的位姿变换关系。在拍摄时,第一采集模块与第二采集模块可以同步对被测物体表面上的同一部位进行图像采集,在采集完成后,再继续对下一部位进行采集。可选地,第一测量模组与第二测量模组共用同一采集模块。
Optionally, the measurement module includes: a speckle projector configured to project a speckle pattern in a first waveband onto the surface of the measured object; a fringe projection module configured to emit a fringe image in a second waveband onto the surface of the measured object; and an acquisition module configured to capture the target image projected onto the surface of the measured object. The speckle projector and the fringe projection module project onto the measured object surface synchronously, so that both the speckle image and the fringe image are formed on the surface; the acquisition module captures the surface, so that a single frame of the target image contains both the first image corresponding to the speckle image and the second image corresponding to the fringe image. In other words, the speckle projector and the acquisition module constitute the first measurement module, and the fringe projection module and the acquisition module constitute the second measurement module, but the two measurement modules share the same acquisition module and the speckle image and the fringe image are imaged in the same frame.
In the embodiments of the present invention, the first acquisition module and the second acquisition module share one group of cameras (including at least two cameras). When a group of cameras is shared, the first waveband in which the speckle projector projects the speckle image is the same as the second waveband in which the fringe projection module projects the fringe image, and each single frame of the target image acquired by each camera contains both the speckle image and the fringe image.
Fig. 2 is a schematic diagram of another optional three-dimensional scanning system according to an embodiment of the present invention. As shown in Fig. 2, in this three-dimensional scanning system the acquisition modules of the first measurement module and the second measurement module share one group of cameras; the group of cameras captures the multiple images, and each image contains both the speckle image and the fringe image. In this embodiment, one acquisition module includes two cameras, and the two cameras form a binocular stereo vision system.
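As a rough sketch of what the binocular reconstruction step amounts to (assuming the two cameras of the acquisition module have been calibrated so that their 3x4 projection matrices are known, and that pixel correspondences have already been found in the speckle or fringe images; the function below is illustrative and simply uses OpenCV's standard triangulation routine):

```python
import cv2
import numpy as np

def triangulate(P_left, P_right, pts_left, pts_right):
    """Triangulate matched pixels from the two cameras into a 3D point cloud.

    P_left, P_right:     3x4 projection matrices of the calibrated stereo pair.
    pts_left, pts_right: (N, 2) arrays of matched pixel coordinates.
    """
    X_h = cv2.triangulatePoints(P_left, P_right,
                                pts_left.T.astype(np.float64),
                                pts_right.T.astype(np.float64))
    return (X_h[:3] / X_h[3]).T      # de-homogenise to an (N, 3) point cloud
```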
As an optional implementation of the present invention, since the speckle image and the fringe image projected onto the measured object surface are acquired simultaneously by the same acquisition module, the splicing transformation relationship of the first point clouds is also the splicing transformation relationship of the second point clouds, and the corresponding multiple second point clouds can be spliced directly on the basis of the splicing transformation relationships of the multiple first point clouds.
The above handheld scanner can establish a communication connection with a computer terminal and transmit the acquired first and second images to the computer terminal in real time. A three-dimensional scanning application or three-dimensional analysis software can be installed on the computer terminal to obtain the first point cloud and the second point cloud from the first image and the second image respectively; the feature splicing of the multiple first point clouds is then used to assist the splicing of the corresponding multiple second point clouds, yielding a three-dimensional digital model of the measured object.
In the embodiments of the present invention, the measurement module supports mobile measurement. The measurement module can be installed in a handheld scanner and can therefore be moved during measurement to perform three-dimensional scanning of the measured object.
The computer terminal 12 performs the following steps: performing three-dimensional reconstruction on the target image, obtaining the first point cloud from the first image and the second point cloud from the second image; determining the splicing transformation relationships between the multiple first point clouds; and splicing the corresponding multiple second point clouds on the basis of the splicing transformation relationships of the multiple first point clouds.
Optionally, when performing the step of splicing the corresponding multiple second point clouds on the basis of the splicing transformation relationships of the multiple first point clouds, the computer terminal obtains the pose transformation relationship between the first measurement module and the second measurement module, and splices the corresponding multiple second point clouds on the basis of that pose transformation relationship and the splicing transformation relationships of the multiple first point clouds.
Since each acquisition time point corresponds to one measurement angle and the measurement module acquires a target image at each measurement angle, the measurement time point, the measurement angle and the target image are in one-to-one correspondence. Images projected onto the measured object surface are acquired from different measurement angles at different time points, yielding multiple frames of the target image; when analysing the first point clouds, it is necessary to determine the pose transformation that the measurement module undergoes between the earlier and later time points at which it acquires the target images.
For example, taking the first measurement module as the reference, a pose transformation relationship corresponding to the first measurement module is constructed, and the pose transformations of the first measurement module at different time points are determined from the multiple first point clouds. Speckle images projected onto the measured object surface are acquired from different measurement angles at different time points to obtain the first images; when analysing the first point clouds, the pose transformation that the first measurement module undergoes between the earlier and later time points at which it acquires the first images must be determined. The multiple second point clouds are then spliced using these pose transformations of the first measurement module at different time points together with the pose transformation relationship between the two measurement modules.
In the above three-dimensional scanning system, the measurement module acquires the target image of the measured object surface, the target image including the first image and the second image, and the computer terminal 12 performs the following steps: performing three-dimensional reconstruction on the target image, obtaining the first point cloud from the first image and the second point cloud from the second image; determining the splicing transformation relationships between the multiple first point clouds; and splicing the corresponding multiple second point clouds on the basis of those relationships. In this embodiment, the first image and the second image are three-dimensionally reconstructed, the pose transformations that the measurement module undergoes while capturing the images are determined from the multiple first point clouds, and the splicing of the multiple second point clouds is thereby achieved. In this way, the comparatively numerous points of the first point clouds assist in splicing the second point clouds corresponding to the second images, so that high-precision splicing of the second point clouds is completed without externally attached marker points, achieving marker-free laser scanning and thereby solving the time-consuming and labour-intensive technical problem in the related art of splicing scanned point cloud data in real time by frequent manual attachment of markers.
Regarding the pose transformation relationship between the first measurement module and the second measurement module in the embodiments of the present invention, the device can be calibrated before scanning; after calibration, the pose transformation relationship [R, T] between the two modules is determined. This pose transformation relationship reflects the positional relationship between the two measurement modules and can be represented by a rotation-translation matrix.
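As one illustrative way of performing such a calibration (the actual procedure used by the scanner may differ): with both modules rigidly mounted, one camera of the first measurement module and one camera of the second measurement module can observe a common calibration target, and a standard stereo calibration then yields the fixed rotation R and translation T between them, for example:

```python
import cv2

def calibrate_module_extrinsic(obj_pts, img_pts_1, img_pts_2,
                               K1, dist1, K2, dist2, image_size):
    """Estimate R, T between one camera of each measurement module from shared
    views of a calibration target, keeping the known intrinsics fixed."""
    rms, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, img_pts_1, img_pts_2,
        K1, dist1, K2, dist2, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return R, T, rms
```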
After receiving the target images acquired by the measurement module, the computer terminal 12 can reconstruct multiple first point clouds of the measured object; by feature splicing of the multiple first point clouds, the splicing transformation relationships [Ri, Ti] between them are obtained, i.e. the pose transformations of the first measurement module during acquisition of the corresponding frames of the target image are obtained.
Using [R, T] and [Ri, Ti], the multiple second point clouds are unified into one coordinate system to complete the splicing, thereby achieving marker-free splicing. For example, splicing first point cloud A with first point cloud B gives the splicing transformation relationship [R1, T1] between them, i.e. the pose transformation of the first measurement module from the acquisition of first image A to the acquisition of first image B; splicing first point cloud A with first point cloud C gives the splicing transformation relationship [R2, T2], i.e. the pose transformation of the first measurement module from the acquisition of first image A to the acquisition of first image C. Since the pose transformation relationship [R, T] between the first and second measurement modules remains unchanged throughout the scan, the pose transformation of the second measurement module from the acquisition of second image A to the acquisition of second image B can be determined from [R, T] together with the pose transformation of the first measurement module from first image A to first image B; that is, the splicing transformation relationship [R1', T1'] between second point cloud A and second point cloud B is obtained, and the splicing of second point cloud A with second point cloud B is completed on the basis of [R1', T1']. Likewise, from [R, T] together with the pose transformation of the first measurement module from first image A to first image C, the pose transformation of the second measurement module from second image A to second image C is determined, i.e. the splicing transformation relationship [R2', T2'] between second point cloud A and second point cloud C, and the splicing of second point cloud A with second point cloud C is completed on the basis of [R2', T2'].
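Written out under one common convention (a sketch only; the exact composition depends on the direction in which [R, T] and [R1, T1] are defined): if [R, T] maps second-module coordinates into first-module coordinates and [R1, T1] maps first point cloud B into the coordinate system of first point cloud A, then the corresponding splicing transformation [R1', T1'] for the second point clouds can be obtained by conjugating [R1, T1] with the fixed extrinsic:

```python
import numpy as np

def second_splice_transform(R, T, R1, T1):
    """Derive [R1', T1'] for the second point clouds from the fixed extrinsic
    [R, T] (second-module frame -> first-module frame) and the splicing
    transform [R1, T1] between two first point clouds (frame B -> frame A)."""
    R1p = R.T @ R1 @ R
    T1p = R.T @ (R1 @ T + T1 - T)
    return R1p, T1p
```

Second point cloud B is then mapped into the coordinate system of second point cloud A by p -> R1' p + T1'; the same composition with [R2, T2] gives [R2', T2'] for second point cloud C.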
Through the above embodiments, the target images projected onto the measured object surface are acquired by the measurement module 11, the computer terminal 12 performs three-dimensional reconstruction to obtain the multiple first point clouds and determines the splicing transformation relationships between them, the second images are three-dimensionally reconstructed to obtain the multiple second point clouds, and the multiple second point clouds are spliced on the basis of the pose transformations of the measurement module between the different images and the pose transformation relationship between the two measurement modules, so as to obtain a three-dimensional digital model of the measured object, achieving same-waveband or multi-waveband marker-free scanning and splicing without externally attached marker points.
In this embodiment, the first image is three-dimensionally reconstructed into the first point cloud on the basis of the binocular vision reconstruction principle, and the second image is reconstructed into the second point cloud on the same basis. Specifically, the speckle image is reconstructed into the first point cloud and the fringe image into the second point cloud by binocular vision reconstruction. Because a point cloud reconstructed from a speckle image has many, densely distributed points with rich features, the multiple first point clouds can be feature-spliced to obtain the splicing transformation relationships [Ri, Ti] between them; and because the pose transformation relationship [R, T] between the first and second measurement modules remains unchanged throughout the scan, the splicing of the multiple second point clouds can be completed on the basis of [R, T] and the pose transformation/splicing transformation relationships [Ri, Ti] between the multiple first point clouds. This also solves the problem that a point cloud reconstructed from fringe images has few, sparse points and poor features and therefore requires attached marker points to assist splicing, with the attendant time-consuming and laborious frequent manual attachment of markers; meanwhile, the point cloud reconstructed from fringe images is more accurate than the point cloud reconstructed from speckle images, so the three-dimensional digital model of the measured object obtained by splicing the multiple second point clouds has high accuracy.
According to an optional embodiment of the present invention, a three-dimensional scanning system configured to obtain three-dimensional data of a measured object is further provided, including: a first measurement module including at least a speckle projector configured to project a speckle image in a first waveband onto the surface of the measured object; a second measurement module including at least a fringe projection module configured to emit a fringe image in a second waveband onto the surface of the measured object; an acquisition module configured to capture the images projected onto the surface of the measured object at different time points; and a computer terminal which, after receiving the images, performs the following steps: step one, performing three-dimensional reconstruction on the images to obtain multiple first point clouds and multiple second point clouds, where the first point clouds correspond to the speckle images projected by the first measurement module and the second point clouds correspond to the fringe images emitted by the second measurement module; step two, determining the pose transformations of the first measurement module at different time points on the basis of the multiple first point clouds; and step three, splicing the multiple second point clouds on the basis of those pose transformations.
Fig. 3 is a flowchart of an optional three-dimensional data splicing method according to an embodiment of the present invention; the method is used to obtain three-dimensional data of the scanned object. As shown in Fig. 3, the three-dimensional data splicing method includes:
Step S302: acquiring a first image and a second image, where the first image and the second image are target images of the measured object surface acquired by a measurement module;
Step S304: performing three-dimensional reconstruction on the target images, obtaining a first point cloud based on the first image and a second point cloud based on the second image;
Step S306: determining the splicing transformation relationships between multiple first point clouds;
Step S308: splicing the corresponding multiple second point clouds based on the splicing transformation relationships between the multiple first point clouds.
Through the above steps, a first image and a second image, being target images of the measured object surface acquired by the measurement module, are acquired; three-dimensional reconstruction is performed on the target images to obtain the first point cloud from the first image and the second point cloud from the second image; the splicing transformation relationships between the multiple first point clouds are determined; and the corresponding multiple second point clouds are spliced on the basis of those relationships. In the present application, the target images are three-dimensionally reconstructed, the splicing transformation relationships between the multiple first point clouds are determined from them, and the splicing of the corresponding multiple second point clouds is thereby achieved. In this way, the comparatively numerous points of the first point clouds assist in splicing the second point clouds corresponding to the second images, so that high-precision splicing of the second point clouds is completed without externally attached marker points, achieving marker-free laser scanning and thereby solving the time-consuming and labour-intensive technical problem in the related art of splicing scanned point cloud data in real time by frequent manual attachment of markers.
This embodiment is applied to a computer terminal, which can be connected to a handheld scanner, receive the first image and the second image respectively acquired by the measurement modules in the handheld scanner, complete the point cloud splicing of the measured object, and obtain a three-dimensional point cloud model corresponding to the measured object.
The above measurement module includes: a first measurement module configured to acquire the first image of the measured object surface; and a second measurement module configured to acquire the second image of the measured object surface.
The first measurement module includes: a speckle projector configured to project a speckle image in a first waveband onto the surface of the measured object; and a first acquisition module configured to capture the speckle image projected onto the surface of the measured object so as to obtain the first image.
Optionally, the second measurement module includes: a fringe projection module configured to project a fringe image in a second waveband onto the surface of the measured object; and a second acquisition module configured to capture the fringe image projected onto the surface of the measured object so as to obtain the second image.
Optionally, the type of fringe image includes at least one of the following: a laser fringe image and a structured-light fringe image.
Optionally, the first waveband and the second waveband are the same, or the first waveband and the second waveband do not interfere with each other.
Optionally, when performing the step of splicing the corresponding multiple second point clouds on the basis of the splicing transformation relationships of the multiple first point clouds, the computer terminal obtains the pose transformation relationship between the first measurement module and the second measurement module, and splices the multiple second point clouds on the basis of that pose transformation relationship and the splicing transformation relationships of the multiple first point clouds.
Optionally, before scanning, the first measurement module and the second measurement module are calibrated in advance to obtain the pose transformation relationship between the first measurement module and the second measurement module.
Optionally, the measurement module includes: a speckle projector configured to project a speckle image in a first waveband onto the surface of the measured object; a fringe projection module configured to emit a fringe image in a second waveband onto the surface of the measured object; and an acquisition module configured to capture the target image projected onto the surface of the measured object, the target image including a first image corresponding to the speckle image and a second image corresponding to the fringe image.
In another option, the measurement module supports mobile measurement.
According to another aspect of the present application, an electronic device is further provided, including a processor and a memory configured to store executable instructions of the processor, where the processor is configured to perform any one of the above three-dimensional data splicing methods by executing the executable instructions.
According to another aspect of the present application, a handheld scanner is further provided, the handheld scanner being connected to a computer terminal, a program running on the computer terminal performing the above three-dimensional data splicing method; the handheld scanner includes a measurement module configured to acquire a target image of the measured object surface and send the target image to the computer terminal, the target image including a first image and a second image.
According to another aspect of the present application, a computer-readable storage medium is further provided, the computer-readable storage medium including a stored computer program, where, when the computer program runs, the device on which the computer-readable storage medium resides is controlled to perform any one of the above three-dimensional data splicing methods.
The present application further provides a computer program product which, when executed on a data processing device, is adapted to execute a program initialised with the following method steps: acquiring a first image and a second image, the first image and the second image being target images of the measured object surface acquired by a measurement module; performing three-dimensional reconstruction on the target images, obtaining a first point cloud from the first image and a second point cloud from the second image; determining the splicing transformation relationships between multiple first point clouds; and splicing the corresponding multiple second point clouds on the basis of those relationships.
The serial numbers of the above embodiments of the present invention are for description only and do not indicate the relative merits of the embodiments.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis; for parts not described in detail in a particular embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units may be a division by logical function, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through interfaces, units or modules, and may be electrical or take other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e. they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit. The above integrated units may be implemented in the form of hardware or in the form of software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
The above are only preferred embodiments of the present invention. It should be pointed out that a person of ordinary skill in the art may make several improvements and refinements without departing from the principles of the present invention, and such improvements and refinements should also be regarded as falling within the scope of protection of the present invention.
Industrial Applicability
The solutions provided by the embodiments of the present application can be applied to three-dimensional scanning systems and handheld mobile three-dimensional scanners. They perform three-dimensional reconstruction on target images, determine the splicing transformation relationships between the first point clouds from multiple first point clouds, and thereby achieve the splicing of multiple second point clouds. The comparatively numerous points of the first point clouds assist in splicing the second point clouds corresponding to the second images, so that high-precision splicing of the second point clouds is completed without externally attached marker points, achieving marker-free laser scanning and thereby solving the time-consuming and labour-intensive technical problem in the related art of splicing scanned point cloud data in real time by frequent manual attachment of markers.

Claims (10)

  1. A three-dimensional scanning system for obtaining three-dimensional data of a measured object, comprising:
    a measurement module configured to acquire a target image of a surface of the measured object, the target image comprising a first image and a second image; and
    a computer terminal configured to perform the following steps:
    performing three-dimensional reconstruction on the target image, obtaining a first point cloud based on the first image, and obtaining a second point cloud based on the second image;
    determining splicing transformation relationships between multiple first point clouds; and
    splicing the corresponding multiple second point clouds based on the splicing transformation relationships of the multiple first point clouds.
  2. The three-dimensional scanning system according to claim 1, wherein the measurement module comprises:
    a first measurement module configured to acquire the first image of the surface of the measured object; and
    a second measurement module configured to acquire the second image of the surface of the measured object.
  3. The three-dimensional scanning system according to claim 2, wherein the first measurement module comprises:
    a speckle projector configured to project a speckle image in a first waveband onto the surface of the measured object; and
    a first acquisition module configured to capture the speckle image projected onto the surface of the measured object so as to obtain the first image, and/or,
    the second measurement module comprises:
    a fringe projection module configured to emit a fringe image in a second waveband onto the surface of the measured object; and
    a second acquisition module configured to capture the fringe image projected onto the surface of the measured object so as to obtain the second image.
  4. The three-dimensional scanning system according to claim 2, wherein, when performing the step of splicing the corresponding multiple second point clouds based on the splicing transformation relationships of the multiple first point clouds, the computer terminal: obtains a pose transformation relationship between the first measurement module and the second measurement module; and splices the corresponding multiple second point clouds based on the pose transformation relationship between the first measurement module and the second measurement module and the splicing transformation relationships of the multiple first point clouds.
  5. The three-dimensional scanning system according to claim 4, wherein the first measurement module and the second measurement module are calibrated before scanning to obtain the pose transformation relationship between the first measurement module and the second measurement module.
  6. The three-dimensional scanning system according to claim 1, wherein the measurement module comprises:
    a speckle projector configured to project a speckle image in a first waveband onto the surface of the measured object;
    a fringe projection module configured to emit a fringe image in a second waveband onto the surface of the measured object; and
    an acquisition module configured to capture the target image projected onto the surface of the measured object, the target image comprising a first image corresponding to the speckle image and a second image corresponding to the fringe image.
  7. The three-dimensional scanning system according to claim 1, wherein the measurement module supports mobile measurement.
  8. A three-dimensional data splicing method for obtaining three-dimensional data of a measured object, comprising:
    acquiring a first image and a second image, wherein the first image and the second image are target images of a surface of the measured object acquired by a measurement module;
    performing three-dimensional reconstruction on the target images, obtaining a first point cloud based on the first image, and obtaining a second point cloud based on the second image;
    determining splicing transformation relationships between multiple first point clouds; and
    splicing the corresponding multiple second point clouds based on the splicing transformation relationships between the multiple first point clouds.
  9. A handheld scanner, wherein the handheld scanner is connected to a computer terminal, and a program running on the computer terminal performs the three-dimensional data splicing method according to claim 8, the handheld scanner comprising:
    a measurement module configured to acquire a target image of a surface of the measured object and send the target image to the computer terminal, wherein the target image comprises a first image and a second image.
  10. A computer-readable storage medium comprising a stored computer program, wherein, when the computer program runs, a device on which the computer-readable storage medium resides is controlled to perform the three-dimensional data splicing method according to claim 8.
PCT/CN2021/116044 2020-09-29 2021-09-01 Three-dimensional data splicing method, three-dimensional scanning system, and handheld scanner WO2022068510A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011057282.0A CN112330732A (zh) 2020-09-29 2020-09-29 Three-dimensional data splicing method, three-dimensional scanning system, and handheld scanner
CN202011057282.0 2020-09-29

Publications (1)

Publication Number Publication Date
WO2022068510A1 (zh)

Family

ID=74314370

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/116044 WO2022068510A1 (zh) 2020-09-29 2021-09-01 Three-dimensional data splicing method, three-dimensional scanning system, and handheld scanner

Country Status (2)

Country Link
CN (1) CN112330732A (zh)
WO (1) WO2022068510A1 (zh)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112330732A (zh) * 2020-09-29 2021-02-05 Shining 3D Tech Co., Ltd. Three-dimensional data splicing method, three-dimensional scanning system, and handheld scanner
CN113487721B (zh) * 2021-06-18 2023-08-22 Zhejiang University Automated identification method for prefabricated components based on three-dimensional point clouds
CN118236182A (zh) * 2021-07-01 2024-06-25 Shining 3D Tech Co., Ltd. Three-dimensional scanning system, and scan data processing method, apparatus, device and medium
CN115797659B (zh) * 2023-01-09 2023-05-02 Scantech (Hangzhou) Co., Ltd. Data splicing method, three-dimensional scanning system, electronic device and storage medium
CN118264757A (zh) * 2023-12-15 2024-06-28 Shining 3D Tech Co., Ltd. Scan reconstruction data generation method and device, and non-volatile storage medium


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN206311076U (zh) * 2017-01-04 2017-07-07 Suzhou Xibo 3D Technology Co., Ltd. Speckle-based high-speed three-dimensional human body scanner
CN109357633B (zh) * 2018-09-30 2022-09-30 Shining 3D Tech Co., Ltd. Three-dimensional scanning method and apparatus, storage medium, and processor

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150070468A1 (en) * 2013-09-10 2015-03-12 Faro Technologies, Inc. Use of a three-dimensional imager's point cloud data to set the scale for photogrammetry
CN108151671A (zh) * 2016-12-05 2018-06-12 Hangzhou Shining 3D Tech Co., Ltd. Three-dimensional digital imaging sensor, three-dimensional scanning system and scanning method thereof
CN106802138A (zh) * 2017-02-24 2017-06-06 Hangzhou Shining 3D Tech Co., Ltd. Three-dimensional scanning system and scanning method thereof
CN109141289A (zh) * 2018-08-01 2019-01-04 Shining 3D Tech Co., Ltd. Three-dimensional scanning method and system
CN111023970A (zh) * 2019-12-17 2020-04-17 Hangzhou Scantech Co., Ltd. Multi-mode three-dimensional scanning method and system
CN112330732A (zh) * 2020-09-29 2021-02-05 Shining 3D Tech Co., Ltd. Three-dimensional data splicing method, three-dimensional scanning system, and handheld scanner

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115330652A (zh) * 2022-08-15 2022-11-11 Beijing Chengshi Wanglin Information Technology Co., Ltd. Point cloud splicing method, device and storage medium
CN115330652B (zh) * 2022-08-15 2023-06-16 Beijing Chengshi Wanglin Information Technology Co., Ltd. Point cloud splicing method, device and storage medium
CN116206069A (zh) * 2023-04-28 2023-06-02 Scantech (Hangzhou) Co., Ltd. Image data processing method and device in three-dimensional scanning, and three-dimensional scanner
CN116206069B (zh) * 2023-04-28 2023-10-13 Scantech (Hangzhou) Co., Ltd. Image data processing method and device in three-dimensional scanning, and three-dimensional scanner
CN117579754A (zh) * 2024-01-16 2024-02-20 Scantech (Hangzhou) Co., Ltd. Three-dimensional scanning method and device, computer apparatus and storage medium
CN117579753A (zh) * 2024-01-16 2024-02-20 Scantech (Hangzhou) Co., Ltd. Three-dimensional scanning method and device, computer apparatus and storage medium
CN117579754B (zh) * 2024-01-16 2024-05-07 Scantech (Hangzhou) Co., Ltd. Three-dimensional scanning method and device, computer apparatus and storage medium
CN117906910A (zh) * 2024-03-20 2024-04-19 Ji Hua Laboratory Underwater flow field information measurement system and method
CN117906910B (zh) * 2024-03-20 2024-06-04 Ji Hua Laboratory Underwater flow field information measurement system and method

Also Published As

Publication number Publication date
CN112330732A (zh) 2021-02-05


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21874171; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21874171; Country of ref document: EP; Kind code of ref document: A1)