CN115493512B - Data processing method, three-dimensional scanning system, electronic device and storage medium - Google Patents



Publication number
CN115493512B
Authority
CN
China
Prior art keywords
scanning
tracking
positioning devices
data processing
scanning device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210956321.3A
Other languages
Chinese (zh)
Other versions
CN115493512A (en)
Inventor
陈尚俭
郑俊
戴明
周强
蒋鑫巍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Silidi Technology Co ltd
Scantech Hangzhou Co Ltd
Original Assignee
Hangzhou Silidi Technology Co ltd
Scantech Hangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Silidi Technology Co ltd, Scantech Hangzhou Co Ltd
Priority to CN202210956321.3A
Publication of CN115493512A
Application granted
Publication of CN115493512B
Legal status: Active

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras

Abstract

The application relates to a data processing method, a three-dimensional scanning system, an electronic device and a storage medium. The data processing method comprises the following steps: while the scanning device scans the object to be detected, acquiring, for each of the scanning device's different scanning postures, the tracking results obtained by at least two positioning devices synchronously tracking the identifier in that same scanning posture; performing, for each scanning posture, a fusion calculation on the at least two positioning devices' tracking state information and tracking results for the identifier, to obtain a tracking fusion result for each of the scanning device's different scanning postures; and completing the three-dimensional reconstruction of the object to be detected according to the tracking fusion results for all scanning postures and the scanning results obtained by the scanning device during scanning. The method fuses the tracking results of different positioning devices, so that the scanning accuracy is no longer limited by the tracking stability of a single tracking head, and the stability of tracked scanning is improved.

Description

Data processing method, three-dimensional scanning system, electronic device and storage medium
Technical Field
The present application relates to the field of three-dimensional scanning, and in particular, to a data processing method, a three-dimensional scanning system, an electronic device, and a storage medium.
Background
In tracked scanning, a tracking device tracks a scanning device in real time, and each frame of scan data is thereby indirectly transformed into a unified coordinate system, so that three-dimensional scanning is achieved without attaching markers to the surface of the object being measured. Current tracked-scanning systems usually have a single tracking device track a single scanning device in real time, so the scanning accuracy is limited by the tracking stability of that single tracking device, and the stability of tracked scanning is consequently low.
No effective solution has yet been proposed for this problem of low tracked-scanning stability in the related art.
Disclosure of Invention
In this embodiment, a data processing method, a three-dimensional scanning system, an electronic device, and a storage medium are provided to solve the problem of low stability of tracking scanning in the related art.
In a first aspect, in this embodiment, there is provided a data processing method, for a three-dimensional scanning system, where the three-dimensional scanning system includes a scanning device and at least two positioning devices, where the scanning device is fixedly provided with an identifier, and the at least two positioning devices are used to synchronously track the identifier; the method comprises the following steps:
in the process of scanning the object to be detected by the scanning device, acquiring tracking results obtained by synchronously tracking the identifiers of the at least two positioning devices in the same scanning posture under different scanning postures of the scanning device;
under the same scanning posture, carrying out fusion calculation on the tracking state information of the identifier and the tracking result by the at least two positioning devices to obtain tracking fusion results of the at least two positioning devices on each scanning posture in different scanning postures of the scanning device;
and according to the tracking fusion results under all scanning postures and the scanning results of the scanning device, which are obtained in the scanning process, completing the three-dimensional reconstruction of the object to be detected.
In some embodiments, the acquiring the tracking result obtained by the at least two positioning devices by synchronously tracking the identifier in the same scanning posture under different scanning postures of the scanning device includes:
and acquiring coordinate information of the identifier under the same scanning posture under the corresponding coordinate systems of the at least two positioning devices by the at least two positioning devices under different scanning postures of the scanning device, so as to obtain the tracking result.
In some embodiments, performing, in the same scanning posture, the fusion calculation on the tracking state information and tracking results of the identifier from the at least two positioning devices, to obtain a tracking fusion result for each of the scanning device's different scanning postures, includes:
in the same scanning posture, determining the fusion weight of each of the at least two positioning devices according to the tracking state information of the identifier;
based on the fusion weights and the tracking results, calculating the posture of the scanning device under a preset optimal fusion condition, and taking that posture as the tracking fusion result of the at least two positioning devices for the scanning device in that scanning posture, thereby obtaining the tracking fusion result for each of the different scanning postures.
In some of these embodiments, the tracking state information is derived from a numerical distribution of image identification information of the identifier acquired by the positioning device.
In some of these embodiments, the numerical distribution state includes a covariance matrix, the method further comprising:
performing covariance matrix calculation on the image identification information based on a calculation rule preset by the image identification information to obtain the tracking state information; the preset calculation rule comprises the following steps: a predetermined linear relationship between the variance and the image identification information, or a predetermined nonlinear relationship between the covariance and the image identification information.
In some of these embodiments, the numerical distribution state includes a covariance matrix, the method further comprising:
acquiring at least two different image identification information of the identifier;
based on the calculation rule corresponding to each image identification information in the at least two different image identification information, covariance matrix calculation is carried out on the at least two different image identification information respectively, and the tracking state information is obtained; the calculation rule includes: a predetermined linear relationship between the variance and the image identification information, or a predetermined nonlinear relationship between the covariance and the image identification information.
In a second aspect, in this embodiment, there is provided a three-dimensional scanning system comprising a scanning device, a data processing device, and at least two positioning devices, wherein the scanning device is fixedly provided with an identifier, and the identifier lies in the common field of view of the at least two positioning devices;
the scanning device is used for scanning the object to be detected and transmitting the scanning result to the data processing device;
the at least two positioning devices are used for synchronously tracking the identifier in the same scanning posture under different scanning postures of the scanning device, and transmitting tracking results to the data processing device;
the data processing apparatus is configured to perform the data processing method according to the first aspect.
In some embodiments, different positioning devices of the at least two positioning devices are different in distribution direction relative to the scanning device under the same scanning posture of the scanning device.
In a third aspect, in this embodiment, there is provided an electronic device including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the data processing method described in the first aspect when executing the computer program.
In a fourth aspect, in this embodiment, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the data processing method of the first aspect described above.
Compared with the related art, the data processing method, the three-dimensional scanning system, the electronic device and the storage medium provided in the embodiment acquire tracking results obtained by synchronously tracking identifiers of at least two positioning devices in the same scanning posture under different scanning postures of the scanning device in the process of scanning the object to be detected by the scanning device; under the same scanning posture, the tracking state information and the tracking result of the identifier by at least two positioning devices are fused and calculated to obtain the tracking fusion result of each scanning posture in different scanning postures of the scanning device by at least two positioning devices; and according to the tracking fusion results under all scanning postures and the scanning results of the scanning device, which are obtained in the scanning process, completing the three-dimensional reconstruction of the measured object. The method realizes the fusion of tracking results of different positioning devices, so that the limitation of the tracking stability of a single tracking head on the scanning precision can be relieved, and the stability of tracking scanning is improved.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below; other features, objects, and advantages of the application will become more apparent from the description and the drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
fig. 1 is a hardware configuration block diagram of a terminal of a data processing method of the present embodiment;
FIG. 2 is a flowchart of a data processing method of the present embodiment;
FIG. 3 is a schematic diagram of a plurality of binocular tracking heads simultaneously tracking a scanning device;
FIG. 4 is a flow chart of the tracking scanning method of the preferred embodiment;
fig. 5 is a schematic structural diagram of the three-dimensional scanning system of the present embodiment.
Detailed Description
For a clearer understanding of the objects, technical solutions and advantages of the present application, the present application is described and illustrated below with reference to the accompanying drawings and examples.
Unless defined otherwise, technical or scientific terms used herein shall have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terms "a," "an," "the," "these," and the like in this application are not intended to be limiting in number, but rather are singular or plural. The terms "comprising," "including," "having," and any variations thereof, as used in the present application, are intended to cover a non-exclusive inclusion; for example, a process, method, and system, article, or apparatus that comprises a list of steps or modules (units) is not limited to the list of steps or modules (units), but may include other steps or modules (units) not listed or inherent to such process, method, article, or apparatus. The terms "connected," "coupled," and the like in this application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference to "a plurality" in this application means two or more. "and/or" describes an association relationship of an association object, meaning that there may be three relationships, e.g., "a and/or B" may mean: a exists alone, A and B exist together, and B exists alone. Typically, the character "/" indicates that the associated object is an "or" relationship. The terms "first," "second," "third," and the like, as referred to in this application, merely distinguish similar objects and do not represent a particular ordering of objects.
The method embodiments provided herein may be executed in a terminal, a computer, or a similar computing device. Taking execution on a terminal as an example, fig. 1 is a block diagram of the hardware structure of the terminal for the data processing method of the present embodiment. As shown in fig. 1, the terminal may include one or more processors 102 (only one is shown in fig. 1) and a memory 104 for storing data, where the processor 102 may include, but is not limited to, a microprocessor (MCU), a programmable logic device (FPGA), or the like. The terminal may also include a transmission device 106 for communication functions and an input-output device 108. Those skilled in the art will appreciate that the structure shown in fig. 1 is merely illustrative and does not limit the structure of the terminal; for example, the terminal may include more or fewer components than shown in fig. 1, or have a different configuration.
The memory 104 may be used to store a computer program, for example, a software program of application software and a module, such as a computer program corresponding to a data processing method in the present embodiment, and the processor 102 executes the computer program stored in the memory 104 to perform various functional applications and data processing, that is, to implement the above-described method. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory remotely located relative to the processor 102, which may be connected to the terminal via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. The network includes a wireless network provided by a communication provider of the terminal. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, simply referred to as NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is configured to communicate with the internet wirelessly.
In this embodiment, a data processing method is provided for a three-dimensional scanning system. The three-dimensional scanning system comprises a scanning device and at least two positioning devices, wherein the scanning device is fixedly provided with identifiers, and the at least two positioning devices are used for synchronously tracking the identifiers. Fig. 2 is a flowchart of the data processing method of the present embodiment, and as shown in fig. 2, the flowchart includes the steps of:
step S210, during the process of scanning the object to be detected by the scanning device, obtaining the tracking result obtained by synchronously tracking the identifiers in the same scanning posture by at least two positioning devices under different scanning postures of the scanning device.
The scanning device collects point cloud data of the surface of the object being measured during scanning; it may be, for example, the scanning head of a tracking-type three-dimensional scanner. The identifier may specifically be a set of marker points fixed relative to the scanning device: for example, a plurality of marker points may be stuck on the surface of the scanning device, the scanning device may be embedded in a rigid frame provided with a plurality of marker points, or the scanning device may be rigidly connected to a device provided with a plurality of marker points. The manner in which the identifier is arranged on the scanning device is not limited here. Additionally, the positioning device obtains tracking results by observing and tracking the identifier, so as to determine the posture of the scanning device relative to the positioning device; the positioning device may be, for example, the tracking head of a tracking-type three-dimensional scanner.
Specifically, tracking of the scanning device by the positioning device will be described below taking the positioning device as a binocular tracking head as an example. The scanning device model of the scanning device may be a local three-dimensional model composed of a plurality of marker points, and it is understood that the plurality of marker points composing the local three-dimensional model belong to the identifier, and the local three-dimensional model may be predetermined before scanning. Then, solving for the pose of the scanning device can be referred to by:
E = \sum_{k}\sum_{i} \left\| z_{i,k} - \pi\!\left(K_{i,k},\, T \cdot P_i^W\right) \right\|^2    (1)

where E is the random error in tracking the scanning device, z_{i,k} denotes the coordinates of the i-th marker point acquired by the k-th positioning device, K_{i,k} denotes the intrinsic parameters of the positioning device, P_i^W is the known local three-dimensional model described above, T is the current pose of the scanning device, and \pi(\cdot) denotes the camera projection. Optimizing this loss function yields the posture of the scanning device at which the random error is minimized. It follows that the accuracy of a positioning device's tracking result is affected by random errors from factors such as image extraction and camera noise; this embodiment therefore proposes synchronously tracking the same scanning device with multiple positioning devices, so as to reduce the error of tracking with a single positioning device.
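Since a binocular tracking head triangulates three-dimensional coordinates for the observed marker points, the per-head pose solve can, in a simplified setting, be reduced to rigid registration between the known local three-dimensional model and the triangulated points. The following sketch is illustrative only and is not taken from the disclosure (which minimizes a reprojection loss); it recovers a pose with the Kabsch algorithm:

```python
import numpy as np

def solve_pose(model_pts, observed_pts):
    """Estimate the rigid pose (R, t) mapping model points onto the
    triangulated observations (Kabsch / SVD-based registration)."""
    cm = model_pts.mean(axis=0)               # centroid of the local 3D model
    co = observed_pts.mean(axis=0)            # centroid of the observations
    H = (model_pts - cm).T @ (observed_pts - co)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = co - R @ cm
    return R, t
```

With noise-free observations this recovers the pose exactly; with camera noise, the registration residual plays the role of the random error E in the loss above.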
Specifically, errors in the three directions of the X axis, the Y axis and the Z axis of the positioning device in the three-dimensional coordinates of the space will cause errors in solving the pose of the scanning device. Illustratively, FIG. 3 is a schematic diagram of a plurality of binocular tracking heads simultaneously tracking a scanning apparatus. For the binocular tracking head, the random error in the Z axis direction is significantly higher than the error in the X axis and Y axis directions, so that on the premise that the binocular tracking head 301 is already present to track the scanning device, a binocular tracking head 302 can be further introduced into the three-dimensional scanning system, and the error of the binocular tracking head 301 in the Z axis direction can be compensated based on the newly introduced measurement result of the binocular tracking head 302 in the X axis direction. Similarly, the measurement result of the binocular tracking head 301 in the X-axis direction will also compensate for the error of the binocular tracking head 302 in the Z-axis direction. Based on this, from the viewpoint of reducing the random error of solving the posture of the scanner, the accuracy of the fusion result obtained by fusing the tracking results of the two binocular tracking heads will be higher than the tracking result of the single binocular tracking head. Therefore, in the process of scanning the object to be detected by the scanning device, the embodiment obtains the synchronous tracking result of the scanning device by the at least two positioning devices.
Step S220, in the same scanning posture, perform a fusion calculation on the tracking state information and tracking results of the identifier from the at least two positioning devices, to obtain a tracking fusion result for each of the scanning device's different scanning postures.
It should be noted that, because their working conditions differ, different positioning devices produce different random errors. The fusion process therefore needs to take into account each positioning device's own tracking state information. Tracking state information is the state information of each positioning device's image acquisition and identification of the identifier on the scanning device under its particular working conditions. Specifically, it may be the numerical distribution state of the acquired image identification information of the identifier, where the image identification information includes, but is not limited to, the identifier's brightness information, position information, and degree-of-inclination information. Once each positioning device's tracking state information for the identifier is obtained, the fusion weight of each positioning device in the fusion process can be determined from it, so that the tracking results of the different positioning devices are fused into a tracking fusion result for each scanning posture of the scanning device.
Step S230, completing three-dimensional reconstruction of the measured object according to tracking fusion results under all scanning postures and scanning results of the scanning device, which are obtained in the scanning process.
Based on the tracking fusion result, each scanning gesture of the scanning device in the process of scanning the object to be detected can be obtained, and the scanning data of the object to be detected, which are acquired by the scanning device, are combined, so that the reconstruction of the three-dimensional model of the object to be detected can be realized.
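Illustratively (this sketch is not part of the disclosure; the data layout is assumed), once a fused pose (R, t) is available for each scanning posture, reconstruction amounts to transforming every scan frame into the unified coordinate system and merging the point clouds:

```python
import numpy as np

def register_frames(frames, poses):
    """Transform each scan frame (an N x 3 array in scanner coordinates)
    into the unified coordinate system using that frame's fused pose
    (R, t), then merge all frames into one point cloud."""
    clouds = [pts @ R.T + t for pts, (R, t) in zip(frames, poses)]
    return np.vstack(clouds)
```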
Step S210 to step S230 are performed, wherein in the process of scanning the object to be detected by the scanning device, tracking results obtained by synchronously tracking identifiers in the same scanning posture by at least two positioning devices under different scanning postures of the scanning device are obtained; under the same scanning posture, the tracking state information and the tracking result of the identifier by at least two positioning devices are fused and calculated to obtain the tracking fusion result of each scanning posture in different scanning postures of the scanning device by at least two positioning devices; and according to the tracking fusion results under all scanning postures and the scanning results of the scanning device, which are obtained in the scanning process, completing the three-dimensional reconstruction of the measured object. The method realizes the fusion of tracking results of different positioning devices, so that the limitation of the tracking stability of a single tracking head on the scanning precision can be relieved, and the stability of tracking scanning is improved.
In one embodiment, based on the step S210, the tracking result obtained by synchronously tracking the identifier in the same scanning posture by at least two positioning devices in different scanning postures of the scanning device may specifically include the following steps:
step S211, acquiring coordinate information of identifiers under the same scanning posture of at least two positioning devices under different scanning postures of the scanning device, and obtaining a tracking result by synchronously measuring the coordinate information of the identifiers under the corresponding coordinate systems of the at least two positioning devices.
In order to improve the tracking stability of the scanning device, the coordinate information of the identifier of the scanning device in the same scanning posture can be synchronously measured through a plurality of positioning devices, so that the coordinate information of the identifier in a coordinate system corresponding to the plurality of positioning devices is obtained. Preferably, a plurality of different positioning devices can be distributed in different directions to track the scanning device, so that the stability of a follow-up tracking fusion result is improved.
Additionally, in one embodiment, based on step S220 above, performing the fusion calculation in the same scanning posture on the tracking state information and tracking results of the identifier from the at least two positioning devices, to obtain a tracking fusion result for each of the scanning device's different scanning postures, may specifically include the following steps:
Step S221, in the same scanning posture, determine the fusion weight of each of the at least two positioning devices according to the tracking state information of the identifier;
Step S222, based on the fusion weights and the tracking results, calculate the posture of the scanning device under a preset optimal fusion condition, take that posture as the tracking fusion result of the at least two positioning devices for the scanning device in that scanning posture, and so obtain the tracking fusion result for each of the different scanning postures.
Illustratively, after determining the fusion weights for each positioning device, the tracking results may be fused based on the following equation:
E = \sum_{k}\sum_{i} \left( z_{i,k} - \pi\!\left(K_{i,k},\, T \cdot P_i^W\right) \right)^{\mathsf{T}} \Sigma^{-1} \left( z_{i,k} - \pi\!\left(K_{i,k},\, T \cdot P_i^W\right) \right)    (2)

where \Sigma is the covariance matrix representing the tracking state information, z_{i,k} denotes the coordinates of the i-th marker point acquired by the k-th positioning device, K_{i,k} denotes the intrinsic parameters of the positioning device, P_i^W is the known local three-dimensional model described above, and T is the current pose of the scanning device. In this embodiment, the fusion weight of each positioning device is determined by the covariance matrix, and the posture of the scanning device under the preset optimal fusion condition is calculated on that basis. It will be appreciated that the optimal fusion condition is the condition that minimizes the random error.
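The effect of the weighted fusion can be seen in a simplified special case: fusing two positioning devices' estimates of the same point, each weighted by the inverse of its covariance. This sketch is illustrative only; all numerical values are hypothetical:

```python
import numpy as np

def fuse_estimates(estimates, covariances):
    """Information-form fusion: the estimate minimizing the summed
    Mahalanobis distances, i.e. a covariance-weighted average."""
    infos = [np.linalg.inv(S) for S in covariances]
    total_info = np.sum(infos, axis=0)
    weighted = np.sum([W @ x for W, x in zip(infos, estimates)], axis=0)
    return np.linalg.solve(total_info, weighted)

# Head A is precise in X/Y but noisy in Z; head B is the opposite.
x_a = np.array([1.00, 2.00, 3.50])
x_b = np.array([1.50, 2.50, 3.00])
S_a = np.diag([0.01, 0.01, 1.00])
S_b = np.diag([1.00, 1.00, 0.01])
fused = fuse_estimates([x_a, x_b], [S_a, S_b])
```

The fused estimate stays close to head A in X and Y and to head B in Z, mirroring how the X-axis measurement of one tracking head compensates the Z-axis error of the other.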
Additionally, in one embodiment, the tracking state information is obtained by statistics over the numerical distribution state of the identifier's image identification information acquired by the positioning device. The image identification information may be, for example, the numerical distribution state of the identifier's brightness information, of its position information, or of its degree of inclination.
Further, in one embodiment, the numerical distribution state includes a covariance matrix, and the data processing method may further include the following steps:
step S240, covariance matrix calculation is carried out on the image identification information based on a calculation rule preset by the image identification information, so as to obtain tracking state information; the preset calculation rule comprises the following steps: a predetermined linear relationship between the variance and the image identification information, or a predetermined nonlinear relationship between the covariance and the image identification information.
Next, the present embodiment is described taking as an example that the image identification information of the identifier includes the image gray information of the marker points, the positions of the marker points in the image, and the degree of inclination of the marker points. Based on a predetermined linear relationship between a marker point's gray value and its gray variance, the variance of the random jitter of the marker point can be calculated from its gray value in the image; these variances are then placed on the diagonal of a covariance matrix corresponding to this image identification information:

\delta = k_1 \cdot v_{gray} + b_1    (3)
\Sigma_1 = \mathrm{diag}(\delta_1, \delta_2, \ldots)    (4)

where \delta is the variance of the random jitter of a marker point's gray value, k_1 is the slope and b_1 the offset of the predetermined linear relationship, and \Sigma_1 is the covariance matrix corresponding to the distribution state of the marker points' gray information. When extracting gray values, the marker points must be screened against a preset gray-value threshold: a marker point whose gray value does not meet the threshold is identified as disqualified, and when solving with \Sigma_1 it is assigned the lowest weight, or its weight is set to 0.
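Illustratively, the gray-value rule of equations (3) and (4) together with the threshold screening could be sketched as follows; the calibration constants k1, b1 and the gray threshold are hypothetical, as the disclosure gives no numerical values:

```python
import numpy as np

def gray_covariance(gray_values, k1=0.002, b1=0.01, gray_threshold=30.0):
    """Diagonal covariance from marker gray values via delta = k1*v + b1
    (hypothetical constants). Markers failing the gray threshold are
    'disqualified': a huge variance gives them effectively zero weight
    in the subsequent fusion."""
    v = np.asarray(gray_values, dtype=float)
    delta = np.where(v >= gray_threshold, k1 * v + b1, 1e9)
    return np.diag(delta)

# One bright (qualified) marker and one dark (disqualified) marker.
sigma1 = gray_covariance([200.0, 10.0])
```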
Similarly, for the degree of inclination of the mark points, the angles of each mark point in the image with respect to the abscissa and the ordinate can be calculated respectively, the variance corresponding to the degree of inclination is obtained according to the predetermined linear relationship between the angle and the corresponding variance, and the variances are then used as the diagonal elements of the covariance matrix to obtain the covariance matrix corresponding to the degree of inclination. Specifically, this can be represented by the following formulas:
δ_x = k₂ · angle_x + b₂    (5)
δ_y = k₂ · angle_y + b₂    (6)
Σ₂ = diag(δ_x, δ_y)    (7)
where angle_x is the angle between the mark point and the abscissa, angle_y is the angle between the mark point and the ordinate, k₂ is the slope in the predetermined linear relationship, and b₂ is the offset in the predetermined linear relationship. δ_x is the variance of the angle between the mark point and the abscissa, and δ_y is the variance of the angle between the mark point and the ordinate. Σ₂ is the covariance matrix representing the numerical distribution state of the degree of inclination of the mark points.
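For illustration only, the inclination branch (equations (5) to (7)) might be sketched as follows, again with assumed placeholder values for k2 and b2:

```python
import numpy as np

def inclination_covariance(angles_x, angles_y, k2=0.02, b2=0.05):
    """Build Sigma_2 from the per-point angles to the abscissa and ordinate
    using delta_x = k2 * angle_x + b2 and delta_y = k2 * angle_y + b2
    (equations (5) and (6)); the variances form the diagonal of Sigma_2
    (equation (7))."""
    dx = k2 * np.asarray(angles_x, dtype=float) + b2
    dy = k2 * np.asarray(angles_y, dtype=float) + b2
    return np.diag(np.concatenate([dx, dy]))

# Two hypothetical mark points, with angles in degrees to each axis.
sigma2 = inclination_covariance([10.0, 20.0], [5.0, 15.0])
```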
In addition, based on the coordinates P(x, y) of an identified mark point in the image, the covariance matrix corresponding to the position information may be obtained from a preset nonlinear relationship:
Σ₃ = F(P)    (8)
where Σ₃ is the covariance matrix corresponding to the position information, and F(·) denotes the nonlinear relationship between the coordinates of a mark point and the corresponding covariance matrix, which can be determined in advance from a lookup table or a piecewise function.
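As a sketch of one possible piecewise choice of F(P) (the text leaves F to a lookup table or piecewise function, so the regions, the image size, and the variance values below are invented for illustration):

```python
import numpy as np

def position_covariance(points, image_size=(1920, 1080)):
    """Sigma_3 = F(P) (equation (8)): a hypothetical piecewise rule in which
    mark points near the image border receive a larger variance than points
    near the image center, where lens distortion is typically smaller."""
    w, h = image_size
    center = np.array([w / 2.0, h / 2.0])
    variances = []
    for p in np.asarray(points, dtype=float):
        # normalized distance of the point from the image center
        r = np.linalg.norm((p - center) / np.array([w, h]))
        if r < 0.2:        # central region: low position variance
            variances.append(0.1)
        elif r < 0.4:      # middle region
            variances.append(0.5)
        else:              # border region: high position variance
            variances.append(2.0)
    return np.diag(variances)

sigma3 = position_covariance([[960, 540], [100, 100]])
```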
It will be appreciated by those skilled in the art that, in addition to the image identification information listed in this embodiment, the tracking state information may also be determined based on other types of image identification information to complete the fusion of the tracking results of different positioning devices. By determining the tracking state information from the covariance matrix corresponding to the image identification information, this embodiment improves the rationality of the fusion weight distribution among the positioning devices, and thus the accuracy of the fusion of the tracking results.
Additionally, in one embodiment, the numerical distribution state includes a covariance matrix, and the data processing method may further include the steps of:
S241, acquiring at least two different pieces of image identification information of the identifier;
S242, performing covariance matrix calculation on each of the at least two different pieces of image identification information based on the calculation rule corresponding to that piece of information, so as to obtain the tracking state information; the calculation rule includes: a predetermined linear relationship between the variance and the image identification information, or a predetermined nonlinear relationship between the covariance and the image identification information.
After the covariance matrices corresponding to the different pieces of image identification information are obtained (for example, the covariance matrices of the gray information of the mark points, the degree of inclination of the mark points, and the position information of the mark points), the different covariance matrices may be fused. The specific fusion manner may be determined according to the requirements of the actual application scenario. In an exemplary embodiment, the three covariance matrices are fused according to the following formula:
Σ_all = Σ₁ + Σ₂ + Σ₃    (9)
where Σ_all is the total covariance matrix that is ultimately used to determine the fusion weights. By fusing the covariance matrices of the different pieces of image identification information, this embodiment characterizes the tracking state of the identifier by each positioning device more accurately, so that more reasonable fusion weights are assigned to the positioning devices during fusion, which improves the accuracy and precision of the fusion result and the stability of the tracking scanning.
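The step from Σ_all to the fusion weights is not fixed by the text beyond lower uncertainty yielding higher weight; an inverse-variance weighting sketch, with the trace used as a scalar uncertainty proxy, is one assumed possibility:

```python
import numpy as np

def fusion_weights(per_device_sigmas):
    """Given the total covariance Sigma_all of each positioning device
    (equation (9): Sigma_all = Sigma_1 + Sigma_2 + Sigma_3), assign each
    device a normalized fusion weight. Inverse-variance weighting is an
    assumption here; any rule in which lower uncertainty yields higher
    weight would fit the description."""
    inv = [1.0 / np.trace(s) for s in per_device_sigmas]  # scalar precisions
    total = sum(inv)
    return [v / total for v in inv]

# Two hypothetical tracking heads; the second is twice as uncertain.
w = fusion_weights([np.diag([1.0, 1.0]), np.diag([2.0, 2.0])])
```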
The present embodiment is described and illustrated below by way of preferred embodiments.
Fig. 4 is a flowchart of the tracking scanning method of the present preferred embodiment. As shown in fig. 4, the tracking scanning method includes the steps of:
step S401, in the process of scanning the object to be detected by the scanning head, arranging a plurality of binocular tracking heads in different directions of the scanning head so that the scanning head is in the visual field range of the plurality of tracking heads at the same time;
step S402, synchronously tracking, by the plurality of tracking heads, the mark points fixedly arranged on the scanning head while the scanning head is in the same scanning posture, so as to obtain tracking results;
step S403, calculating a covariance matrix of the image identification information of each tracking head to the mark point, determining the fusion weight of each tracking head, and fusing the tracking results under the scanning posture to obtain tracking fusion results;
step S404, tracking fusion results under different scanning postures of the scanning head are sequentially obtained;
step S405, based on the scanning data of the detected object obtained by the scanning head and the tracking fusion results under the different scanning postures, the detected object is subjected to three-dimensional reconstruction.
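The fusion of steps S403 and S404 can be sketched as a weighted average of the per-head pose estimates; reducing a pose to a 3-D translation vector is a simplification for illustration (a full implementation would also fuse rotations, e.g. by quaternion averaging):

```python
import numpy as np

def fuse_tracking_results(poses, weights):
    """Sketch of steps S403/S404: combine the pose estimates of several
    tracking heads for one scanning posture into a single tracking fusion
    result, using the fusion weights derived from each head's covariance."""
    poses = np.asarray(poses, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()             # normalize fusion weights
    return (weights[:, None] * poses).sum(axis=0)  # weighted average pose

# Three tracking heads observing the same scanner pose with small offsets.
fused = fuse_tracking_results(
    [[1.00, 2.00, 3.00], [1.02, 2.01, 2.99], [0.98, 1.99, 3.01]],
    [0.5, 0.25, 0.25],
)
```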
Also provided in this embodiment is a three-dimensional scanning system 50, and fig. 5 is a schematic structural diagram of the three-dimensional scanning system 50 of this embodiment. As shown in fig. 5, the three-dimensional scanning system 50 includes: a scanning device 52, a data processing device 54 and at least two positioning devices 56, wherein the scanning device 52 is fixedly provided with an identifier, and the identifier is in the common field of view of the at least two positioning devices 56; the scanning device 52 is used for scanning the object to be measured and transmitting the scanning result to the data processing device 54; the at least two positioning devices 56 are used for synchronously tracking the identifier in the same scanning posture under different scanning postures of the scanning device 52, and transmitting the tracking results to the data processing device 54; the data processing device 54 is configured to perform the data processing method provided in any of the above embodiments.
The three-dimensional scanning system 50 realizes the fusion of the tracking results of different positioning devices, so that the limitation that the tracking stability of a single tracking head imposes on the scanning precision can be relieved, and the stability of tracking scanning can be improved.
Further, in one embodiment of the three-dimensional scanning system 50, different positioning devices of the at least two positioning devices 56 are distributed in different directions relative to the scanning device 52 in the same scanning posture of the scanning device 52. Because the positioning devices are distributed in different directions relative to the scanning device, the measurement results of the different positioning devices can compensate each other in the directions with larger random errors, which improves the precision of the tracking fusion result and the stability of the three-dimensional scanning system.
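The compensation effect described here can be illustrated with information-form fusion of two hypothetical devices whose error ellipses are elongated in complementary directions (the numerical values are invented for illustration):

```python
import numpy as np

def fuse_two_devices(sigma_a, sigma_b):
    """Information-form fusion of two devices: summing the inverse
    covariances and inverting gives a fused covariance that is tighter
    in every direction than either device alone."""
    info = np.linalg.inv(sigma_a) + np.linalg.inv(sigma_b)
    return np.linalg.inv(info)

# Device A is noisy along x, device B along y (complementary directions).
fused = fuse_two_devices(np.diag([9.0, 1.0]), np.diag([1.0, 9.0]))
```

The fused variance along each axis is smaller than the better device's variance along that axis, which is the mutual-compensation effect the embodiment relies on.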
There is also provided in this embodiment an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
s1, acquiring tracking results obtained by synchronously tracking identifiers of at least two positioning devices in the same scanning posture under different scanning postures of a scanning device in the process of scanning the object to be detected by the scanning device;
s2, under the same scanning posture, carrying out fusion calculation on tracking state information and tracking results of identifiers by at least two positioning devices to obtain tracking fusion results of each scanning posture in different scanning postures of the scanning device by at least two positioning devices;
and S3, completing three-dimensional reconstruction of the object to be detected according to tracking fusion results under all scanning postures and scanning results of the scanning device, which are obtained in the scanning process.
It should be noted that, specific examples in this embodiment may refer to examples described in the foregoing embodiments and alternative implementations, and are not described in detail in this embodiment.
In addition, in combination with the data processing method provided in the above embodiment, a storage medium may be provided in this embodiment. The storage medium has a computer program stored thereon; the computer program, when executed by a processor, implements any of the data processing methods of the above embodiments.
It should be understood that the specific embodiments described herein are merely illustrative of this application and are not intended to be limiting. All other embodiments obtained by a person of ordinary skill in the art without creative effort, based on the embodiments provided herein, fall within the scope of protection of the present application.
It is evident that the drawings are only examples or embodiments of the present application, from which a person skilled in the art can adapt the present application to other similar situations without inventive effort. In addition, it should be appreciated that while such development effort might be complex and time-consuming, it would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure, and this specification should therefore not be construed as providing insufficient detail.
The term "embodiment" in this application means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive. It will be clear or implicitly understood by those of ordinary skill in the art that the embodiments described in this application can be combined with other embodiments without conflict.
The above examples represent only a few embodiments of the present application, which are described in detail but are not to be construed as limiting the scope of the patent. It should be noted that various modifications and improvements can be made by those skilled in the art without departing from the spirit of the present application, and these all fall within the scope of protection of the present application. Accordingly, the scope of protection of this patent shall be subject to the appended claims.

Claims (9)

1. A data processing method for a three-dimensional scanning system, which is characterized in that the three-dimensional scanning system comprises a scanning device and at least two positioning devices, wherein the scanning device is fixedly provided with identifiers, and the at least two positioning devices are used for synchronously tracking the identifiers; the method comprises the following steps:
in the process of scanning the object to be detected by the scanning device, acquiring tracking results obtained by synchronously tracking the identifiers of the at least two positioning devices in the same scanning posture under different scanning postures of the scanning device;
under the same scanning gesture, determining the fusion weight of each positioning device in the at least two positioning devices according to the tracking state information of the identifier;
based on the fusion weight and the tracking result, calculating the gesture of the scanning device under a preset optimal fusion condition, and determining the gesture of the scanning device under the optimal fusion condition as a tracking fusion result of the scanning device under the same scanning gesture by the at least two positioning devices to obtain a tracking fusion result of each scanning gesture in different scanning gestures;
and according to the tracking fusion results under all scanning postures and the scanning results of the scanning device, which are obtained in the scanning process, completing the three-dimensional reconstruction of the object to be detected.
2. The method according to claim 1, wherein the acquiring the tracking result obtained by the at least two positioning devices by synchronously tracking the identifier in the same scanning posture under different scanning postures of the scanning device includes:
and acquiring coordinate information of the identifier under the same scanning posture under the corresponding coordinate systems of the at least two positioning devices by the at least two positioning devices under different scanning postures of the scanning device, so as to obtain the tracking result.
3. A data processing method according to claim 1 or 2, wherein the tracking state information is obtained from statistics of the numerical distribution state of the image identification information of the identifier acquired by the positioning devices.
4. A data processing method according to claim 3, wherein the numerical distribution state comprises a covariance matrix, the method further comprising:
performing covariance matrix calculation on the image identification information based on a calculation rule preset by the image identification information to obtain the tracking state information; the preset calculation rule comprises the following steps: a predetermined linear relationship between the variance and the image identification information, or a predetermined nonlinear relationship between the covariance and the image identification information.
5. A data processing method according to claim 3, wherein the numerical distribution state comprises a covariance matrix, the method further comprising:
acquiring at least two different image identification information of the identifier;
based on the calculation rule corresponding to each image identification information in the at least two different image identification information, covariance matrix calculation is carried out on the at least two different image identification information respectively, and the tracking state information is obtained; the calculation rule includes: a predetermined linear relationship between the variance and the image identification information, or a predetermined nonlinear relationship between the covariance and the image identification information.
6. A three-dimensional scanning system, comprising: a scanning device, a data processing device and at least two positioning devices, wherein the scanning device is fixedly provided with an identifier, and the identifier is in the common field of view of the at least two positioning devices;
the scanning device is used for scanning the object to be detected and transmitting the scanning result to the data processing device;
the at least two positioning devices are used for synchronously tracking the identifier in the same scanning posture under different scanning postures of the scanning device, and transmitting tracking results to the data processing device;
the data processing apparatus is configured to perform the data processing method of any one of claims 1 to 5.
7. The three-dimensional scanning system of claim 6, wherein different ones of the at least two positioning devices are different in a direction of distribution relative to the scanning device in a same scanning pose of the scanning device.
8. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the data processing method of any of claims 1 to 5.
9. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the data processing method of any one of claims 1 to 5.
CN202210956321.3A 2022-08-10 2022-08-10 Data processing method, three-dimensional scanning system, electronic device and storage medium Active CN115493512B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210956321.3A CN115493512B (en) 2022-08-10 2022-08-10 Data processing method, three-dimensional scanning system, electronic device and storage medium


Publications (2)

Publication Number | Publication Date
CN115493512A (en) | 2022-12-20
CN115493512B (en) | 2023-06-13




Legal Events

Code | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant